To my mind, software upgrades are an economically efficient and pro-user offering. They are good for both the production side and the use side of the equation, allowing users to pay directly for the additional cost of development since their last version rather than for all the original work and value that went into the product. They allow developers to reward their own supporters and to allocate resources more efficiently. Additionally, "upgrades" should be (again, from a user perspective) simply full versions, identical except cheaper and restricted to existing users. This is how all the commercial software I use works as well.
However, the entire concept of upgrades depends completely on legal licensing: on my being able to include a clause that says "you may not use this unless you previously owned a full version". I already see a number of posts, both here on Slashdot and on other forums (such as the comments accompanying the Ars Technica article on this story), that are enraged at the result, and that argue that Psystar was "adding value" by "lowering hardware costs". The underlying argument is that, once a piece of software is sold, that should be that. But how do those of you who argue for that square it with upgrade pricing? Do you simply agree with the App Store approach, where upgrades don't exist at all? Or do you have some other way of squaring things away?
As things stand, Mac OS X offerings have all been upgrades and have been priced accordingly. There seems to be a reasonable consideration on both sides here: buyers pay less money, but in exchange accept the restriction of needing to own a Mac, since Apple has chosen to build its development around an integrated model. Do some of you think that such integrated models should be illegal, regardless of what benefits they offer? Should Apple be required by law to sell a "full" version of Mac OS X, and would you actually be willing to pay what that might cost (i.e., if they said "full version, $400")? I'm genuinely curious about people's thoughts on this.