On Jan 17, 2006, at 11:38 PM, Robert Ameeti wrote:

> Every computer a user bought would most always bring with it a new
> version of their software. As often as users purchased computers,
> they too wanted new versions of their software. Rarely would a user
> reinstall their old version of software when they purchased a new
> computer with a new OS.

Possibly, but I see situations like large government agencies, such as the USDA, that run custom-developed application suites on Windows 98 and 2000. In 2003 they upgraded to new Dell hardware and Windows XP, and their custom-developed application suite continued to work perfectly, both client and server.

If that agency had been running Mac OS X Jaguar and transitioned to Tiger, even on the same hardware, the chances are very, very good that their application suite would have broken due to changes in core OS X frameworks and required further development cost to make it work again. When you're looking at 130,000 seats running this software in every county in 50 states, plus the fact that the USDA hires a third-party developer to write its custom software, the cost of Apple changing a framework and breaking the application is huge.

Microsoft, with the exception of SP2 for Windows XP, has taken extreme pains to make sure this doesn't happen. And with SP2, IT departments were warned to thoroughly test application suites before large-scale deployment. I've never seen this with Apple; things just break without notice, like the WorldBook application that shipped with an iBook we bought for our daughter about three years ago. It stopped working when she upgraded to Panther, and it hasn't worked since, unless she shells out the 40 bucks to MacKiev to upgrade to a new version that works with the updated libraries in OS X.

When you're talking large-scale deployments, this lack of version-increment reliability is disastrous.

-- 
Chris

-------------------------
PGP Key: http://astcomm.net/~chris/PGP_Public_Key/
-------------------------