Like any software professional, I am sure that I spend too much time making things "work" as far as "upgrades" are concerned.
iOS and Mac OS X in particular have problems with critical software, e.g., Xcode, Adobe products, and so on, that is required for making a day-to-day living yet conflicts with "the latest OS."
The short movie below (note: lots of foul British language) is one person's answer to the misery of upgrading Apple's Final Cut software.
"Kernel Panic" is about what happens when you take the joy of automatic software upgrades, required "feature" upgrades, hardware and software compatibility, and so into the future.
Now this video is set in a time when space travel is common - but really - you have to ask yourself how different this all is from, say, a laptop, car, or home entertainment system?
And what about a medical system?
Imagine this same scenario in, say, an emergency room or surgery...
Like everything else, humanity is ceding its control in the realm of technology to various vendors whose only job is to retain you as a customer.
This scenario was based on Apple's Final Cut Pro, a professional movie platform.
As an Apple Logic user I am sure I face the same perils.
You make an investment in time and money over multiple years in a particular software platform only to have it yanked out from under you by the latest OS upgrade.
It's not hard to imagine a day in the not-too-distant future when your Tesla or other hybrid vehicle refuses to start or leave the house without the same level of nonsense as described in this video. Or perhaps the new fridge or washing machine will simply decide its working conditions no longer warrant its operation without an upgrade.
This is not the first time a video has addressed the notion of humans fighting with technology.
Here's another clip from the 1974 movie called "Dark Star:"
The "bomb" which is about to detonate and destroy the humans in the "Dark Star" spaceship is talked through a philosophical argument by a crew member. Basically the crew member convinces the bomb that its "sensory input" (the bomb is an intelligent device) may be flawed and there the "detonation order" it received could be false.
Again, even in 1974, it was clear that at least some could see where technology was taking us.
Fortunately or unfortunately for the "Dark Star" the "bomb" at least did not require a "paid upgrade" to reach the point of giving the situation "further thought."
Free software is little better.
Personally I have found that free software is never really free except perhaps when it's very old and basically no longer interesting.
When it's new it's constantly under "change" by the "contributors," who seem to care little about what it does save for their own interests - which is just fine.
After all, they wrote it and they use it. If I happen to use it as well I should expect nothing less.
Something like Firefox or Apache, though, over time morphs into a "standard" where the entrenched user base becomes so large it begins to slow and ultimately stalls the development. Basically users see no need for further changes and so change becomes limited. Sometimes products stall and sometimes they are simply "branched" into a new and different version.
The bottom line is that eventually the inertia of "new releases" is going to slow down Apple, for example.
Incremental improvement (as well as the business of charging for it) works only so long before people realize that upgrades are being released merely for upgrade's sake.
Lion to me is really the first sign of this failure, as I have mentioned before. Mountain Lion makes things worse. Both forced me to upgrade a number of software components needlessly. (And while I understand a company's need to charge for something like a CS upgrade to match, this simply compounds the problem.)
From a graphic arts perspective these new upgrades offer me very little of value. I don't need the incremental features they offer, and even finding out about them and using them is a pain.
So from what I can see we are reaching the inertia point on Apple OS X.
I don't see iOS as too far behind: now the changes are to make the devices more acceptable to different user demographics - bigger or smaller screens, for example.
Take the demise of the 17" MacBook Pro as a good example. The more skilled you become the more you can do and the less like everyone else you are. I can do wonders with multitasking and 17" of display - but I am no longer the norm.
(Think of it this way: I need the highest bandwidth between the device and my brain to be my best. The new, smaller "retina" displays attempt to provide this - but the pixels are smaller than the 17" model's, so while the letter "s" might be sharper, this does nothing to improve the bandwidth between the display and my brain. My eyes (old as they are) and brain can distinguish the letters of the English alphabet just fine. I need to be able to see literally the "bigger picture" to lay out my work.)
The norm today is some youngster watching videos in the coffee shop when he should be working - a smaller, lighter device makes this sort of loafing convenient and free from the physical activity of carrying things about.
We'll see where the inertia of Apple, for example, takes us.
How far is the next Apple or Firefox upgrade from "Entropic Cheese?"
Contrast this with the HAL 9000 of "2001: A Space Odyssey" fame. He kills because conflicts in his design cannot reconcile the need for his mission to succeed with the need to keep the humans alive.
We should be so lucky.
I expect a "Kernel Panic" soon.