I think a current bridge slowly being crossed is the move away from OO programming (and the GoF design patterns that go with it) towards functional programming.
GoF patterns provide structure, at the cost of complexity. FP patterns are simpler and more succinct, but they aren't as cookie-cutter and are therefore less structured.
It's probably the tick-tock of a swinging pendulum. When one-off software became too hard to maintain, standard patterns were adopted; when the standard patterns became overly complex, we swung back towards more expressive constructs.
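To make the contrast concrete, here's a minimal sketch (in Java, with made-up DiscountStrategy / Checkout names, not from any particular codebase) of the same behaviour written once as a GoF Strategy pattern and once as a plain higher-order function:

```java
import java.util.function.DoubleUnaryOperator;

// GoF Strategy: an interface, a concrete class per variation, and a context object.
interface DiscountStrategy {
    double apply(double price);
}

class HolidayDiscount implements DiscountStrategy {
    public double apply(double price) { return price * 0.8; }
}

class Checkout {
    private final DiscountStrategy strategy;
    Checkout(DiscountStrategy strategy) { this.strategy = strategy; }
    double total(double price) { return strategy.apply(price); }
}

public class PatternsDemo {
    // Functional equivalent: the "strategy" is just a function passed as an argument.
    static double total(double price, DoubleUnaryOperator discount) {
        return discount.applyAsDouble(price);
    }

    public static void main(String[] args) {
        // OO version: one new class per behaviour variation.
        System.out.println(new Checkout(new HolidayDiscount()).total(100.0)); // 80.0
        // FP version: one lambda per variation, no extra types or wiring.
        System.out.println(total(100.0, price -> price * 0.8));               // 80.0
    }
}
```

Both print the same result; the difference is how much ceremony it takes to add a new variation, which is roughly the structure-versus-succinctness trade-off described above.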
Yes, web to mobile. Native mobile development is a hugely different beast from web programming, just as web development was a huge shift from desktop GUIs in 2000, desktop GUIs were a huge shift from DOS and other microcomputers in 1990, microcomputers were a huge shift from UNIX and VMS and other timesharing OSes in 1980, and timesharing OSes were a huge shift from batch-processing on IBM & other mainframes in 1970.
It's interesting that you say the transition has happened between web and mobile, because a lot of people I know are getting up to speed on HTML5 and betting the next phase of their careers on it. Maybe it's because South Africa is a few years behind the curve, but is there evidence that the share of web-based jobs in the first world is declining enough to be concerning?
The interesting thing about waves of technology disruption is that there is usually more than one wave in play at a time, so a technology can be both disrupting and disrupted. The web was invented in 1989 and popularized in 1995, and arguably the point where people realized "Hey, desktop apps are in trouble" was around 2004, when Gmail and Google Maps came out. Even so, Dropbox was founded in 2007 as a desktop software company, and that hasn't stopped it from being worth a few billion.
Mobile apps were invented in the 1990s (Newton, Palm) and popularized in 2008, and (at least in Silicon Valley) it's now a big topic of debate whether the web is dead. In my old job - Google Search - I was still pretty secure as a web guru. In my new job - startup founder - I feel I at least have to do my due diligence and evaluate the technology.
Whether it's a problem for your career depends on exactly what you want to be doing with your career. There are still people making a living off of COBOL and IBM mainframes. In general, consumer markets peak about 5 years after the technology is introduced to the general public (so microcomputer apps came into their own around 1980 with VisiCalc, windowing apps around 1990 with MS Word and various office suites, web apps around 2000 with Google, and mobile is just starting to peak now with Uber, Instacart, etc.). If you're founding a startup you need to account for the time it takes to build your product, so in general you want to use technologies no more than 2 years old. If you're working at a company, you don't want to adopt a technology until the companies that adopted it early have gotten big, which is frequently 10 years or so after it's introduced.
From web to mobile? What was the tough bridge to cross in the last 10 years?