I have this theory that we took technology for granted in the late '90s; we thought we were on top of the world. That's because of the Dotcom Bubble, which burst in 2000 and bottomed out around 9/11. The downturn made us waste the next 3 to 4 years trying to figure out what we would do next.
Apple had the luck to dodge the burst. They had already hit rock bottom back in 1997. Cleaning up their hardware lineup and releasing the iPod created their own small bubble. Steve Jobs was back, and he brought in good developers who made OS X a joy to use not only for creatives, but for developers as well.
The Microsoft antitrust case of 2000, the Dotcom Bubble burst in 2000/2001, the release of the iPod in 2001, the switch from Mac OS 9 to the BSD-based OS X in 2001, the announcement of the PowerPC-to-Intel transition in 2005, the release of the ultra-desirable Titanium PowerBook G4 in 2001. All of those contributed to giving Apple the edge. And while everybody else was down, it made them feel even larger than life.
Then in 2007/2008 they had a new push: the iPhone, the App Store, and the need to use Xcode - available only on Apple computers. Unlike being stuck on Windows because of Visual Studio, needing Xcode was actually an extra reason to buy a Mac.
Linux distros were still struggling. Until very recently, we were stuck with the ancient X11. The Linux kernel may have been cutting edge, but X11 was the worst possible display layer to put on top of it. Apple was smart enough not to adopt it, instead building Quartz from scratch and improving it step by step with each new OS X release. It took about five years to fully stabilize. Only now, in 2017, are we finally moving away from X to Wayland/Weston.
Microsoft also struggled after Bill Gates left the reins to Steve Ballmer: the long march from XP to the current Windows 10, through the disastrous attempts of Windows Vista and Windows 8, Windows Mobile, and the many attempts to breathe new life into the dominant but defunct Internet Explorer - the largest Walking Dead in technology. The Ballmer era was a disaster.
Then the iPad got released and Steve Jobs died.
Apple is stuck in 2010.
And so are we.
Major Linux distros, Microsoft, PC hardware manufacturers, big industry turnarounds (Nokia, Blackberry, Motorola), the dawn of the GPU era, the end of the Moore's Law clock-speed wars and the shift to parallel computing, the dawn of ARM. It took 10 years for all the smoke to finally settle, and it took Apple standing still for 6 years for everybody else to finally catch up.
Ballmer left the reins to Satya Nadella. Microsoft acquired Xamarin, made huge strides into open source, and Ubuntu now even runs natively on top of Windows 10.
Ah, and the Asians: Samsung, LG, Huawei - the South Koreans and the Chinese are all forcing the North Americans to push harder.
One last legacy of the Steve Jobs era was killing Adobe Flash (finally!) while pushing HTML5 technologies forward - an acceleration of 10 years in 5. It is because of the iPhone that we have the Web as it is.
Google jumped in and picked up the ball that Internet Explorer left behind. WebKit became the quintessential web browser reference. JavaScript became the new application platform, pushed by the many evolving social networks.
For better or worse, the social networks forced the new dawn of complex, interactive web applications. Gmail set the example in the Web 2.0 days, and with the new mobile environment, everything converged into building cross-platform, ubiquitous apps.
It's now easier than ever to switch between desktop OSes, because we already made it easy to jump between desktop and mobile devices.
Why is it easy now to move between macOS or Windows or Linux distros? Or between macOS and iOS, or between Linux and Android? Because most of the software is in "the cloud". Linux dominates the Cloud world that Amazon debuted in 2006 with S3 and EC2, now closely followed by Google Cloud and Microsoft Azure and other smaller competitors.
Because there is at least a good Chrome browser on every platform.
Because the apps we use are mostly web enabled: Gmail, Spotify, Slack, Hangouts, Twitter, Facebook, YouTube, Netflix, Amazon. The rest are all available natively on iOS or Android: Whatsapp, Waze, Swarm, Snapchat, Instagram.
10 years ago, with the release of the iPhone, the technology industry finally started to get its shit together.
It was a cat-and-mouse chase until at least 5 years ago, and then everybody started to settle down.
After Jobs died, we still expected Apple to keep pushing forward, albeit at a slower pace without Jobs' whip.
It didn't happen. They made bad decision after bad decision, and we keep feeling that we are back in the early '90s, under Sculley, Spindler or Amelio.
But now there are fallbacks readily available: Windows 10 Anniversary Edition is finally a competent Windows, bringing back the glory days of the mid-'90s. And major Linux distros are cohesive, requiring little to no nerdy tweaking of obscure config files - pushed by Canonical, for better or worse.
And most of the apps we use in our routines are available everywhere. Heck, nowadays you can even find a good web-based, cross-platform text editor in the form of Atom.
Apple also left the legacy of ubiquitous cross-platform compilation in the form of LLVM. It served their own need to have Objective-C available on PowerPC, Intel, ARM and Apple's Ax processors. But LLVM is now the best backbone for languages such as Rust, RubyMotion and Crystal to exist and easily produce fast binaries for multiple platforms.
"Apple's current board of directors needs to go." "Tim Cook needs to step down." I am sure many are crying out like this, but there is no Steve Jobs waiting in the wings to return. Apple is so big that inertia alone will keep them going for a few more years, but they are in danger of falling into irrelevance.
Other industries such as energy, health and genetics are evolving in their own ways, but computer technology is at a dangerous standstill.
Computer languages stopped evolving in the mid-'90s. All "new" languages are amalgams of features that have been around since the '70s and '80s. CPUs are stuck at just making transistors smaller and more energy efficient. And we are already there: $9 computers are good enough for trivial tasks such as web browsing. So much so that we are putting them everywhere - even in a Bluetooth toothbrush that keeps track of your brushing sessions.
Did you see the CES 2017 products? Pillows and beds that track your sleep and sync the data to an app. Washing machines. Vacuum cleaners. Fitness devices. Health devices. Personal assistants such as Alexa. Everything takes data from you and syncs it to "the cloud".
"The cloud" has also peaked. It's ubiquitous. It's not only affordable but downright cheap. Technology in general is dirt cheap nowadays.
Wearables and VR are good gimmicks, but even if they become ubiquitous, they are not a revolution - just an extension of the use cases that the iPhone revolution bootstrapped.
Mobile, wearables and social networks only made people more anxious. Allowing everybody to have a "voice" online didn't improve things. Opinions are in such large supply that they are worth less and less.
Every person online is now a source of terabytes of information, fed into pseudo-statistics that are useless for the most part. Every person is now defined by a bunch of data in the cloud.
I have no idea where we go from here. As a witness to history, I have to say that I miss the excitement of exploring new frontiers. We had that from the mid-'90s onwards. We took a short break after the Dotcom burst, but we got it back full blown in 2004. We lost it after 2010.
2010 has been the longest year I've ever seen. Maybe settling down is a good thing. Who knows?