After a long debate with myself, I have decided that I need to upgrade to Apple's Lion.
Most of the computers I use at this point are Macs, so on the one hand upgrading to Lion would seem like a no-brainer.
But on the other hand, Lion is different to a significant degree.
First of all, it invalidates certain applications (list here).
Now, as a vendor with many customers for whom I need my Mac to do certain things, I feel a certain trepidation about replacing one OS with another that could shut off some capability I need in order to service a customer.
If I could somehow know what was going to break, that would be one thing - but I don't. So I cannot put myself in the position of being on a service call, needing something, and suddenly and unexpectedly finding out I cannot do what I need to do.
From a developer's perspective, each new OS brings new features and breaks old ones at the programming level as well. For example, I cannot use the latest Xcode on a non-Lion system. To me this is just wrong. Why on earth does Xcode require Lion? Ditto for iOS 5 development - for some unknown reason I need Lion for that as well.
So at this point I have a 10.5 (Leopard) laptop, a 10.6 (Snow Leopard) laptop, and now I will have a 10.7 (Lion) laptop.
Great.
The older ones just sit around as hot backups - at least until now.
What is troubling to me is this: why do these systems have to be incompatible at any level?
Take, for example, the move from 32 bits to 64 bits. Suddenly with Snow Leopard, things you built in the development environment might not run on a slightly older machine - and all you got was a cryptic message when they failed.
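To make that concrete, here is a minimal sketch of my own (the file name and build flags are just illustrations, not anything from a particular project) showing how the 32-/64-bit split surfaces in plain C:

/* sizes.c - a small illustration of how a 32-bit and a 64-bit build
   of the same source differ.  Nothing here is app-specific; it just
   shows the kind of silent change the transition introduced. */
#include <stdio.h>

int main(void)
{
    /* Under -arch i386 these are 4 bytes; under -arch x86_64 they are 8.
       Code that quietly assumes "long is 32 bits" breaks on the switch. */
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(void *) = %zu\n", sizeof(void *));

#if defined(__LP64__)
    printf("this is a 64-bit build\n");
#else
    printf("this is a 32-bit build\n");
#endif
    return 0;
}

Building it with something like "gcc -arch i386 -arch x86_64 -mmacosx-version-min=10.5 sizes.c -o sizes" produces a universal binary that will still launch on Leopard, and "lipo -info sizes" (or "file sizes") tells you which architectures actually ended up inside. That is roughly the check I wish the tools surfaced, instead of a cryptic failure on the older machine.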
I think that companies like Apple are still in a position of dictating to their customers.
Microsoft, on the other hand, with such a large customer base (especially compared to Apple's), has had to bend to the "compatibility" wind and make sure that each new version of Windows, across a very broad swath, remains exactly backward compatible. (Yes, I know that 16-bit Windows apps no longer work outside of Windows XP, but, well, what can I say...)
Today, with all the VM-based technology available, I really don't see why we don't have systems that are, at some level, exactly backward compatible.
Certainly Apple, in its transition from PowerPC to Intel, accomplished this with Rosetta.
And today we have things like VMware and Parallels that provide nearly perfect virtual-machine representations of other machine environments.
Why are we still saddled with version upgrades breaking things?
One reason is money - every new version sells more machines (as in my case) as well as more software.
I think, though, that there is a developer issue here as well. Developers only look forward - what new features can I add, what new things can my box do, and so on.
Therefore they don't look back unless the boss tells them to.
So anyway, all things considered, I have to go and get a new machine.