The Importance of OS Backwards Compatibility

gbjbaanb writes "Raymond Chen (of ancient Microsoft heritage) has a blog where he describes some of the things he's worked on, as well as oddments of obscure code and design decisions in Windows. Regardless of what anyone thinks of Windows, it is informative and often thought-provoking. Recently, Raymond posted an entry about backwards compatibility, and why it is such a big deal for large corporations. Backwards compatibility is something I have read about on Slashdot regularly (where Windows is criticized for bothering with it at all), so I thought readers would be interested in exactly why Microsoft spends so much effort on it, and by inference, why it is an important topic for getting Linux adopted by big business."
  • Omelette? (Score:3, Insightful)

    by caluml ( 551744 ) <slashdot&spamgoeshere,calum,org> on Tuesday November 14, 2006 @12:27PM (#16838870) Homepage
    You've got to break a few eggs to make an omelette, I always say.
    For a tasty omelette, add cheese, tabasco sauce, and ground black pepper.
  • by InsaneProcessor ( 869563 ) on Tuesday November 14, 2006 @12:28PM (#16838890)
    I am still working on 2.4 to 2.6 kernel issues. Linux and its authors have no concept of backwards compatibility. We have to redo everything, and our purchased software suffers even more.
  • Huh? (Score:5, Insightful)

    by Salvance ( 1014001 ) * on Tuesday November 14, 2006 @12:30PM (#16838932) Homepage Journal
    Something's not right with these links ... is someone just trying to /. their own blog here, or am I being redirected? The first link is completely off topic and goes to a post about SoftMac (a Mac emulator); the second is just an example of how one company has 9,000 legacy scripts that require older versions of Windows (plus 400 16-bit programs). So what? This hardly seems like a front-page-of-Slashdot argument for OS backwards compatibility ... there's really no argument other than stating the 9,000 figure and the three years it would take to convert them.

    Pure and simple, Microsoft has protected their market share by remaining backwards compatible, and will continue to do so for that reason only. A company like Apple can afford to ignore backwards compatibility to some extent, as this actually drives greater revenue from their loyal customer base buying new software. Microsoft though, cannot afford to give their corporate users a chance to make a migration decision.

    If Microsoft eliminated backwards compatibility, thousands of companies would be in a position where they needed to include the cost of migrating software in the upgrade decision. All of a sudden, Linux would become a viable option for these corporate clients, which Microsoft can't afford. For example, my company currently has over 900 16-bit applications that we haven't touched in ~10 years. Almost all of these run fine under XP and the beta versions of Vista, so upgrading to Vista will be a cheap option. However, if Vista didn't support these 16-bit apps, we'd have to spend years of time and millions of dollars upgrading ... in this case Linux would likely become our new O/S.

    For this reason, Linux advocates (and many others) would love to see Microsoft remove backwards compatibility, but from a business standpoint Microsoft just can't do it.
  • by Neil Watson ( 60859 ) on Tuesday November 14, 2006 @12:32PM (#16838976) Homepage
    If corporate data were stored in more open formats, legacy applications would not be such a problem.
  • by Pantero Blanco ( 792776 ) on Tuesday November 14, 2006 @12:34PM (#16839004)
    Backwards compatibility is something I have read about on Slashdot regularly (where Windows is criticized for bothering with it at all), so I thought readers would be interested in exactly why Microsoft spends so much effort on it, and by inference, why it is an important topic for getting Linux adopted by big business.


    Perhaps I misunderstand, but is the submitter trying to say that Linux should learn from Windows in this area? Backwards compatibility is one of Linux's strong points, and Windows' performance in the area is terribly weak. Where are these people complaining about Microsoft spending TOO MUCH time on backwards compatibility? I've only seen people complaining about things being broken with every upgrade.

    Raymond's blog is interesting, though. I'm glad to see someone at MS interested in backwards compatibility.
  • It's the data! (Score:3, Insightful)

    by cpinto ( 1027148 ) on Tuesday November 14, 2006 @12:35PM (#16839044)
    Actually, I see a much bigger need to be able to access information than to be able to run old applications. The big problem in all of this is that a lot of (if not all) legacy applications have closed data repositories, so being able to run those applications on a modern OS is imperative.
  • Re:Omelette? (Score:5, Insightful)

    by CockMonster ( 886033 ) on Tuesday November 14, 2006 @12:36PM (#16839054)
    no point breaking eggs if no-one will be able to eat your omelette
  • by PadRacerExtreme ( 1006033 ) on Tuesday November 14, 2006 @12:38PM (#16839086)
    Not true at all. All it takes is for a supporting library to change, and the code doesn't work. Two examples:

    I am looking into a problem we have here where my software works fine with GD 2.0.23. If I upgrade GD to 2.0.28 (compiling from source, not using binaries) my code stops working. Everything compiles fine. Everything links fine. Just doesn't work.

    Look at the FOX toolkit. The interface completely changed from 1.X to 2.X. No backwards compatibility. I need to re-write all of my source to handle the new interface.
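    Something like the sketch below can at least turn "just doesn't work" into a loud failure at startup, by probing the library for the entry point the app depends on before committing to it. libfoo and foo_render are made-up stand-ins (not GD's actual API), and the build line is an assumption:

        /* abi_probe.c - sketch: check a shared library for a needed symbol
         * before trusting it. libfoo / foo_render are hypothetical names.
         * Assumed build: gcc abi_probe.c -ldl -o abi_probe */
        #include <stdio.h>
        #include <dlfcn.h>

        int main(void)
        {
            /* RTLD_NOW forces full symbol resolution up front, so undefined
             * references surface here instead of mid-run. */
            void *lib = dlopen("libfoo.so.2", RTLD_NOW);
            if (!lib) {
                fprintf(stderr, "libfoo not loadable: %s\n", dlerror());
                return 1;
            }

            /* Then verify the specific entry point still exists in this build. */
            void *sym = dlsym(lib, "foo_render");
            if (!sym)
                fprintf(stderr, "foo_render missing (ABI drift?): %s\n", dlerror());

            dlclose(lib);
            return sym ? 0 : 1;
        }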

  • by NineNine ( 235196 ) on Tuesday November 14, 2006 @12:39PM (#16839122)
    I can run the original VisiCalc on Windows XP Pro. Can you name one 27-year-old program that you can run on a Mac? I know you can't do that on Linux without having to recompile, which isn't at all acceptable for end-users.
  • by Jennifer York ( 1021509 ) on Tuesday November 14, 2006 @12:41PM (#16839168) Homepage
    Any argument against Backwards Compatibility (BC) is usually made from the point of view of the developer, not the user. A developer decides that it is too hard to implement a feature, or some such thingy, and that a "re-write" is the best option. This is quite often the wrong decision if you are developing software for money. Throwing away the mountain of existing code so that a developer "feels" more comfortable is bad business. There are often better ways to solve this problem, and refactoring the code early and often is a good strategy.

    Now, if you happen to write code for fun, like me, then you are of course free to chuck it all out and start again. The Fun Factor outweighs the cost, since the cost is zero. So govern yourselves accordingly: if you want to make money, do as little new development as possible; if you are in it for love and glory, then do it for yourself.

  • by ohearn ( 969704 ) on Tuesday November 14, 2006 @12:44PM (#16839212)
    Trust me, in a corporate setting not having backwards compatibility is a big deal. For home users it is even worse.

    If you lose backwards compatibility you run into the same problem that all the smaller OSs have in getting the corporate world to adopt them on a larger scale. An IT manager may be able to convince the bean counters to give enough money to do the OS swap, but try asking them for the money to swap every application you have as well, because the old ones won't work anymore. Then on top of that, try getting the funding and the authorization for the time to retrain all your employees on new applications that they are not as familiar with. Beyond that, productivity will go down for a while until people get used to the new systems, even with training. Now if the new system is truly more efficient (from a worker-bee point of view, in time to complete a task) this may eventually pay for itself, but do you really think most upper managers are going to be that patient before firing the IT manager for screwing over the entire company's productivity rates? If you do, then you are dealing with much more patient managers than I am used to, or just fooling yourself. That is the true cost of losing backwards compatibility, especially when a large number of your key applications are built in house; then you can't just go purchase the new version that will run on the new platform with minimal retraining needed. For companies that develop a lot of their own applications in house, you have a lot of down time until the IT departments can recode to whatever new standard the new OS requires.

    Home users are better and worse off in some senses. While a lot of home users will probably just buy the new versions of major software (office software, email, etc.) when they purchase a new computer, that still adds a lot to the cost of a machine. For the more technically savvy people out there (this is /. after all), even if you like the new OS and its features, do you really want to have to replace every app you've got because they don't work properly anymore? Yes, VM solutions can mitigate a lot of this, but those solutions are not perfect.

    I was one of the people that MS picked up on a 6-month contract for the extra support load when XP SP2 came out. Take it from someone who was there: the biggest complaint (especially from corporate customers) was when it broke compatibility with old apps they were using. I heard a lot of people (including people I knew personally) say that they would install SP2 once the compatibility issues were fixed, when they eventually swapped to a newer version of the key app that they couldn't live without. Now, MS did fix some of those issues with patches shortly after SP2 came out, but imagine that problem scaled up to replacing an entire OS without regard to backwards compatibility. I know everyone around here likes to bash MS, but they are not nearly stupid enough to piss off their corporate customers that badly. They know that would be the fastest way to push people away from Windows in a heartbeat, or at least to ensure that not many people bothered swapping to the new version instead of just staying with what they already had that works.
  • by Chris Burke ( 6130 ) on Tuesday November 14, 2006 @12:44PM (#16839218) Homepage
    Because I know they need it. Without compatibility with old Windows software, the next buying cycle every CTO on earth would have a chance to consider whatever operating system they wanted, since all options would involve switching to entirely new software, as opposed to now, where buying Windows to replace older Windows is the automatic choice. It isn't just Windows' market share that maintains their monopoly, it's the resulting market share of software for Windows that really does it. Drop that, and Windows' real advantage is gone.

    I do make fun of them for not being able to stabilize and secure some of the old code, though I understand it's tough when the code is old, complicated, and crufty, and a lot of old programs require "bug for bug" compatibility.

    I think the solution for them is deprecation. Old interfaces that are stuffed with crufty code, and which were often insecure by design anyway, should simply be made unavailable for new applications. You have an old app that wants to use it? Fine. You want to code a new app using OLE or ActiveX or what have you? Tough. Eventually software gets replaced, and eventually you wouldn't have any software using the old systems and you could disable them. Sadly there's still plenty of new development using ActiveX and other crap, so MS shows no interest in doing this, nor may they be able to.

    Now how does this apply to Linux? In Free Software Happy Land, you have the source to all the software on your system, so with the ability to recompile you don't need binary backwards compatibility. Being able to link your source code against new libraries can fix a lot of the problems with having to support really old binaries. There's still the issue of interface compatibility (which is tied up in binary compatibility in the Windows world), but a lot of the interfaces are stable.

    This isn't Free Software Happy Land, though; we're talking about businesses. Personally I think that adopting Linux should also come with adopting some of its philosophies, in particular that having source code is much, much better than not having it, so if you aren't getting source then it had better be much, much better than the program you can get with source. Linux does not have a strong history of binary compatibility, and I'm not sure it's best for Linux to start establishing such a history. Business may not like it, but maybe they need to adapt to Linux, not the other way around. Who knows, they might figure out that they like getting source code from their vendors and accidentally discover a better way to do things.
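    For what it's worth, the "closed to new code, open to old binaries" idea can be approximated at the API level. A sketch in C: GCC's deprecated attribute is real, but the header and the widget_* names are invented for illustration:

        /* legacy_api.h - sketch: steer new code away from a crufty interface
         * while already-built binaries keep working. Names are hypothetical. */
        #ifndef LEGACY_API_H
        #define LEGACY_API_H

        /* Old entry point: still exported, so existing binaries keep running,
         * but any *new* compile that touches it gets a warning (with GCC). */
        int widget_open(const char *name) __attribute__((deprecated));

        /* Replacement interface for new development. */
        int widget_open_v2(const char *name, unsigned flags);

        #endif /* LEGACY_API_H */

    A warning is obviously weaker than making the interface truly unavailable, but it is the same direction: old apps keep linking against the exported symbol while new development gets pushed toward the replacement.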
  • FUDmeister! (Score:3, Insightful)

    by NineNine ( 235196 ) on Tuesday November 14, 2006 @12:49PM (#16839304)
    Pick any random Linux distribution or *BSD and you will find many "30 year old programs"

    Well that's a good argument, isn't it?

    Stop being stupid - most 16 bit windows software does NOT run on windows XP.

    Really? Are you just flat out lying in order to accomplish something, or are you really that clueless? Would you like to see a screenshot of what I have running right now, my intelligence-challenged little friend?
  • Re:Huh? (Score:2, Insightful)

    by SillyNickName4me ( 760022 ) <dotslash@bartsplace.net> on Tuesday November 14, 2006 @12:55PM (#16839402) Homepage
    Pure and simple, Microsoft has protected their market share by remaining backwards compatible, and will continue to do so for that reason only.

    Since they are a business, in it to make money, NOT to make cool technology, that is the only correct decision for them to begin with.

    A company like Apple can afford to ignore backwards compatibility to some extent, as this actually drives greater revenue from their loyal customer base buying new software.

    Aha? Apple invested a lot of time and effort into making at least the important OS9 applications work on OSX, and for the exact same reason. Again, now that they have moved to Intel hardware, they are spending time and effort to ensure PowerPC-based apps will work.

    Microsoft though, cannot afford to give their corporate users a chance to make a migration decision.

    Migration is expensive anyway, and is not what this is about. This is about selling upgrades and not shooting yourself in the foot.

    If Microsoft eliminated backwards compatibility, thousands of companies would be in a position where they needed to include the cost of migrating software in the upgrade decision.

    migration: move to another platform/application
    upgrade: move to a newer version of the same platform/application

    Yeah, you can arguably say the latter is a special case of the former, but the conversation is a lot easier to follow when you don't do that.

    All of a sudden, Linux would become a viable option for these corporate clients, which Microsoft can't afford.

    It becomes a viable option as soon as the applications for Linux exist. Where they do, it becomes a matter of which choice is more attractive, but all choices that offer the desired functionality are 'viable'.

    For example, my company currently has over 900 16-bit applications that we haven't touched in ~10 years. Almost all of these run fine under XP and the beta versions of Vista, so upgrading to Vista will be a cheap option. However, if Vista didn't support these 16-bit apps, we'd have to spend years of time and millions of dollars upgrading ... in this case Linux would likely become our new O/S.

    A large part of the business world still runs Windows 2000, and I'm pretty sure that if those apps won't run on Vista, your company would stay on XP, possibly paying Microsoft for extended support, until such time as there are new applications that will run on Vista.

    The cost of migration in a large environment is first of all related to training people for an entirely different system.

  • by PFI_Optix ( 936301 ) on Tuesday November 14, 2006 @12:55PM (#16839406) Journal
    Allow me to do a variation on a Ballmer-related meme:

    Dependencies! Dependencies! Dependencies! Dependencies! Dependencies! Dependencies! Dependencies!

    I'm a broken record on this subject, but I've had quite a few nightmare compiles on Linux that have resulted in me abandoning it on my laptop in favor of Windows. At least there the software I want to use works. They have *got* to fix that problem if Linux is going to become a mainstream desktop product.
  • by NineNine ( 235196 ) on Tuesday November 14, 2006 @12:58PM (#16839460)
    I don't ever plan to run 27 year old programs, and neither do a large amount of people

    I disagree. There are tons and tons of old DOS programs that are still at work today in many, many businesses. I use several myself, along with a few 16-bit programs. Not using something simply because it's old is pretty dumb. Do you just throw out old things once they get too old, even if they're still working?
  • by abigor ( 540274 ) on Tuesday November 14, 2006 @12:58PM (#16839464)
    "For MS, backwards compatibility is delusional lip service - but that is what IT managers like, after all."

    Er, have you ever worked in software or IT? Microsoft places the highest emphasis on backwards compatibility, running 16-bit software in virtual machines and 32-bit software natively. I'll ignore your inane comment about IT management, since you sound like you're just making stuff up anyway.

    Can you explain the precise mechanism of this Unix binary backwards compatibility you're talking about? I work with Linux every day and have since 1998. I'm pretty sure that if I grabbed a version of, say, PostgreSQL from those days and tried to launch it now, it wouldn't work, nor would it even compile. Same with any GUI desktop app or driver. I haven't worked a lot with AIX or HP-UX since the '90s, but I'd be pretty surprised if a CDE app from 10 years ago just fired up without any problems.
  • Backwards compatibility is one of Linux's strong points

    Hmm.. you sure about that?

    I seem to recall a few issues with this...

    e.g., multithreaded applications built for 2.4 may encounter problems on 2.6 unless you set a specific environment variable.
    Or how about the nice compatibility between glibc versions?

    As long as you can recompile your software, Linux provides excellent backwards compatibility, but that is a non-option in many situations.
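    (The environment variable alluded to is presumably LD_ASSUME_KERNEL, which makes glibc's loader pick the old LinuxThreads implementation instead of NPTL.) A small glibc-specific probe, as a sketch, shows which threading library and C library a binary actually gets at run time:

        /* threadprobe.c - report the glibc and pthread implementation in use.
         * Uses glibc-specific APIs. Sketch: run it with LD_ASSUME_KERNEL=2.4.19
         * set (where compat LinuxThreads libs exist) and compare the output. */
        #include <stdio.h>
        #include <unistd.h>
        #include <gnu/libc-version.h>

        int main(void)
        {
            char buf[128];

            printf("glibc: %s\n", gnu_get_libc_version());

            /* Yields e.g. "NPTL 2.3.6" or "linuxthreads-0.10". */
            if (confstr(_CS_GNU_LIBPTHREAD_VERSION, buf, sizeof buf) > 0)
                printf("threads: %s\n", buf);
            else
                printf("threads: unknown\n");

            return 0;
        }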
  • by plopez ( 54068 ) on Tuesday November 14, 2006 @01:05PM (#16839588) Journal
    Probably the best in the industry
    Only if you ignore IBM. Their customers spend huge sums on very large, capable machines and expect a huge ROI (banks, oil companies, gov't institutions), so there is old JCL and COBOL running on mainframes going back to the '70s. If it doesn't run natively, you can always run it under VM.

    I for one haven't found any useful applications I can run from before the mid-'90s.
  • Re:Two ways (Score:3, Insightful)

    by beuges ( 613130 ) on Tuesday November 14, 2006 @01:11PM (#16839684)
    Mr Chen has addressed that issue at length as well. There are many problems with simply including an emulator in the new OS that can host the old OS, for example supporting copy/paste between apps in the native OS and apps in the emulated OS. Simply providing an emulated solution for backwards compatibility with the old OS was considered, but rejected because it provided a crappy user experience.

    Another example: you have a network share mapped on your native OS. You double-click a file in the network share which is associated with a program written for a previous version of the OS, so it needs to be launched in the emulator. The emulator launches, and launches the old application. How does the emulator know that this drive needs to be network mapped?

    Not to mention the fact that you'd have two task bars, two system trays, two start menus, etc.
  • by AdamKG ( 1004604 ) <slashdot&adamgomaa,com> on Tuesday November 14, 2006 @01:13PM (#16839712) Homepage
    The difference is, you still have 2.0.23, and you can keep it forever. With restrictively-licensed software, that isn't the case. With MS moving to yearly licenses (especially for companies), F/OSS is going to look more attractive to companies who can't afford to rewrite their software at a whim; they will love the option of sticking with the earlier library.

    Notice that even though it would be convenient for you to be able to upgrade GD, it's not required. You have an eternal license to it (assuming it's under an OSS license, which I don't know for sure as I have no idea what GD is :D, but it wouldn't make sense in the context if it were proprietary... but whatever).
  • by archaic0 ( 412379 ) on Tuesday November 14, 2006 @01:15PM (#16839774) Homepage
    You're sort of missing the point here, though. This statement is pretty common in the Linux world... "all you have to do is recompile..." ALL you have to do? Really? All I have to do is re-compile? Assuming I have the proper kernel to begin with, and the proper libraries, along with all their dependencies... I just want to run a program, man; I don't want to become a programmer just to use my computer. (90% of office workers speaking, I.T.-specific roles excluded.)

    The truth is that compiling a program (under ANY platform) IS rocket-science-level stuff as far as computers are concerned. Yes, a programmer does it in his sleep (as do most daily Linux users). But Joe CEO and Jack Employee can click Next all day; the moment they have to type ./configure or make or whatever, their eyes will glaze over, and they'd rather spend a million dollars on a setup wizard app than go to school just to learn how to install a free app. Not to mention the fact that 6 times out of 10, you'll get an error and have to track down what went wrong and fix something before you can attempt to compile again.

    Compiling source to a binary IS complex stuff, despite what any mainstream Linux supporters might think. And having to re-compile something EVER will keep it from being accepted by the mainstream.

    I've even heard people say "well, all ya gotta do is just make your own kernel and there ya go, piece of cake." As easy as that might sound to the Linux guru, it is exactly equal to telling a driver to build their own car. There are plenty of mechanics out there who can do it, yes, but not EVERYONE is a mechanic, nor should they be. People who build their own kernels and compile software are the mechanics of the computer world and shouldn't expect all computer users to be mechanics.

    I for one do not ever want my bosses to be as 'smart' as us I.T. guys are. Why then would they have any use for me if THEY know how to compile a kernel? If the world was so enlightened as some people seem to want it to be, then a lot of people would be without jobs because their skills wouldn't be needed.

    Don't get me wrong though, I'm not saying one way is the end-all better way. I'm just saying that as long as the corporate world is run by people who just "want it to work" (which will be forever), software that requires the user to do the programmer's job will not fly. All my servers are Linux servers, but that's only because myself and the other techs here are skilled at making them work. I wouldn't DREAM of putting Linux on our desktops. Are you crazy?! Will YOU then be the one who babysits every soccer mom who comes to work for us and teaches them how to use it, for free?

    If products could be packaged such that they get compiled during install, but in the background with the user being none the wiser, then it might fly. Like say your installer went out and did all the work on its own of gathering the required libraries while the user only ever saw a wizard with a next button.

    My two cents, take it or leave it...
  • by letxa2000 ( 215841 ) on Tuesday November 14, 2006 @01:18PM (#16839830)
    I agree. From 2002 to early 2006, I was using Linux exclusively on my laptop. I got used to things not working. And although I am a geek, my goal in life is not to Google and investigate to see if I can tweak my OS to do what I want. This year I got a new laptop with XP on it. Everything just worked, so I stayed with that.


    I still have my Linux laptop which I ironically use as a Linux server--something that Linux is good at.

  • Re:Huh? (Score:3, Insightful)

    by rainman_bc ( 735332 ) on Tuesday November 14, 2006 @01:18PM (#16839838)
    If Microsoft eliminated backwards compatibility, thousands of companies would be in a position where they needed to include the cost of migrating software in the upgrade decision.

    As compared to the upgrade path to OSX, where non-native apps run like a slug in emulation mode?

    Flame away here, but Microsoft has been fairly good with their backwards compatibility. At least as good as OSX, if not better.

    If a law firm for example continues to insist they should be running Word Perfect 5.1 because that's all the legal secretaries want to use, that isn't Microsoft's problem. Moving to Linux isn't going to make that any different either.
  • by MrSteveSD ( 801820 ) on Tuesday November 14, 2006 @01:19PM (#16839852)
    I thought readers would be interested in exactly why Microsoft spends so much effort on backwards compatibility

    Shame they don't do that with other products. They discontinued the VB language (forget about VB.NET; it's not the same!) and left thousands of companies in the lurch. Millions of dollars were invested in writing VB products around the world, and many of the companies do not have the finances to completely rewrite their products in a new language. I suspect that many of them are keeping quiet, since making a big noise about it would frighten both customers and investors.

    I was therefore happy to hear that Java is going open source. Perhaps we can now consider it "safe" to use.
  • by 0racle ( 667029 ) on Tuesday November 14, 2006 @01:23PM (#16839924)
    Your point? The NT line does not ship with DOS, only a command interpreter, so the fact that the NT OSes run DOS programs at all is a very good indication of MS's backwards compatibility.
  • by Anonymous Coward on Tuesday November 14, 2006 @01:27PM (#16840002)
    The fact that most software found on a typical Linux install has source available only mitigates the backwards incompatibility. In the same way that Internet Explorer vulnerabilities are mitigated by "the attacker must trick the user into visiting a web page". Sure it helps, but... not really.
  • by Magada ( 741361 ) on Tuesday November 14, 2006 @01:32PM (#16840074) Journal
    Products can be, and are, packaged like that. Ever heard of Gentoo, for instance? True, you are left staring at that "Next" button for a long time ;-) but it does work, there's no dependency hell, and when it doesn't work, it's usually a bug in the app, not missing kernel header files or some such inanity. Under the GPL, you are supposed to have the sources. Why not use them?

    Also, soccer mom doesn't need kernel upgrades, or tweaked kernels, or upgrades to apps. She needs security patches, no more, no less. Apps can and should stay the way they are, because soccer mom doesn't really want to "upgrade". She wants things to JustWork(tm), 'cause she's a luddite.
  • by Darkforge ( 28199 ) on Tuesday November 14, 2006 @01:33PM (#16840084) Homepage
    So don't upgrade major versions. [...] Something breaking between 1.x and 2.x is expected.
    You and I have come to expect that, because we're used to it. But that's not the way it has to be; Microsoft has billions of dollars and enormous market share because they don't break backwards compatibility, even with major releases. (If they did, nobody would upgrade, just as you say.)
  • by Savage-Rabbit ( 308260 ) on Tuesday November 14, 2006 @01:34PM (#16840106)
    With Free/Open Source Software, you are free to upgrade the applications with the OS, and it's the distros' business to make it work.

    In an enterprise environment it isn't always as simple as upgrading to the latest bleeding-edge Linux, downloading the newest software versions, compiling, installing, uploading your configs and scripts, and having everything work flawlessly. That's the point TFA was making. Unless your distro is 100% backwards compatible (OK, 90% compatible, there are always problems) back to, say... the 2.2 kernel, many corporations won't take Linux seriously as a solution, because the cost of debugging the broken-compatibility problems that accompany each upgrade would be prohibitive. Today some enterprise-grade Linux distros offer this kind of guarantee, at least up to a point. So if you have ever wondered why people shell out all that money for Red Hat, SuSE and Co. and their enterprise-quality distributions when they can download Slackware for free, that's **one** of the reasons why. While they may not be on the bleeding edge technologically or perform as well as some other distributions, the stability and continuity these enterprise distros offer reduces costs appreciably.
  • by mha ( 1305 ) on Tuesday November 14, 2006 @01:40PM (#16840176) Homepage
    So don't upgrade major versions.


    Not possible - no one supports that old version. If there are any important fixes (not just security, anything) they are always for the latest version. Open source people don't bother supporting older versions... ;-)

    It's also open-source, so you're free to keep your own development and bug fixes going if you can fund it yourself.


    This argument isn't even worth refuting, it's so obviously childish.
  • Re:Public Enema #1 (Score:3, Insightful)

    by swordgeek ( 112599 ) on Tuesday November 14, 2006 @01:45PM (#16840236) Journal
    I sense a lot of anger coming from this one.

    Seriously, you've got a point. An honest OS vendor guarantees backwards compatibility only for published, official programming interfaces. Application vendors turn around and find some obscure undocumented library call that makes life easier for them, and use it everywhere. When the OS is upgraded, the app breaks and "backwards compatibility" gets shat upon unfairly.

    A major problem, though, is that many of these applications are niche products, which makes them a de facto monopoly. If the customer needs functionality "x", then they're the only game in town. Thus the customer says, "I don't care if it's the application vendor's fault, fix it in the OS or I'm switching to Windows!"

    Result: backwards compatibility is guaranteed in cases where it _should_ break. Unpublished interfaces become official and supported standards. Cruft carries forwards.

    As for "don't upgrade your computers," that's lovely--as long as you don't want any support for your OS. It's kind of hard to get (unrelated) patches for OSes that have been EOSL for four years, though. Next year, a significant part of the western world is going to change daylight saving time, which will require a patch. Any machines running old OSes are going to be unable to keep proper time.
    Pretty close to a metaphorical gun, I'd say.
  • by Rodness ( 168429 ) on Tuesday November 14, 2006 @01:52PM (#16840352)
    This isn't Free Software Happy Land, though, we're talking about businesses. Personally I think that adopting Linux should also come with adopting some of its philosophies...

    That's an idealistic view, and completely out of whack for most people. Remember that most businesses (at least in the US) are small businesses, meaning that the IT person usually wears other hats. That IT person is more likely to be a Mom or Pop type who just wants the damn thing to work and doesn't want to go fiddling with source. The moment they upgrade and there's breakage, if they don't switch outright then they're going to make a mental note that it fucked up the business for a week, and the next time there's upgrading to be done, they're going to consider something with better backwards compatibility, i.e. Windows Server or something.

    The analogy to your view would imply that just because I drive a car, I'm interested in doing the maintenance myself.

    If you use Windows, do you automatically share Redmond's philosophies? I didn't think so, or you wouldn't be reading Slashdot. So why should Linux users necessarily share Linux's philosophies?
  • by Sancho ( 17056 ) on Tuesday November 14, 2006 @01:58PM (#16840428) Homepage
    Can you point me to a place that shows that MS is moving to yearly licenses? Also, point me to a clause that says I can't continue running an older version of their software? Thanks in advance.
  • Scale Models (Score:3, Insightful)

    by Doc Ruby ( 173196 ) on Tuesday November 14, 2006 @02:00PM (#16840472) Homepage Journal
    Without backwards compatibility, however incomplete, different versions of Windows would be different platforms. Which would compete with each other, and especially with new releases of apps and OS. And interfere with Microsoft's claims to represent 90% of installed computers/users/whatever. Which claims are the main factor when most people decide whether to buy/install/develop MS apps.

    Backwards compatibility, even if not extensive enough to make all installed MS hosts (including WinCE) into a single platform for the largest imagined application scale economies, is enough to create the illusion of those benefits. Which keeps people buying and building MS at the large scales which actually do deliver those lesser, but still extremely competitive, scale economies.
  • Small businesses? (Score:2, Insightful)

    by mha ( 1305 ) on Tuesday November 14, 2006 @02:08PM (#16840602) Homepage
    A friend of mine has a very nice Mexican restaurant, and the software for the cash register is DOS-based. It looks good (it even uses a touch screen), it works - why would he ever invest 10,000 euros to buy a new one? (That much because it's special software - you can't buy cash-register software for restaurants at CompUSA - and he has to buy new hardware too: even if the new stuff would run on his machine, the companies who sell it to him and support him require this, to lower their own support load by refusing to support any hardware/software combinations but their own.)

    I myself still own half of a (German) MBE (Mailboxes Etc.) store (95% business customers, 75% digital printing), and the only reasons our PCs were up to date were: a) we did a lot of digital printing and some design, and Adobe Creative Suite CS(2) needs LOTS of resources (but it's great software - proprietary, bloated, etc. - but I like it anyway, except for the price :-) ); b) I'm an IT guy, and had quit my employee life only half a year before starting this business together with a friend; c) we had just started, so everything was new. But the reality is, you're busy all day, and thinking about which patch, what software, or what nice piece of hardware one could buy for one's business is the LAST thing one wants to worry about in such a situation! There just is no time to deal with IT, and little money (onsite IT support especially is *very* expensive if you have a question that cannot be answered by the outsourced call center in India), and if there's money left one can think of many more important investments than IT. Do you know how expensive even very basic equipment used for making brochures or booklets is? (Tip: don't EVER buy plastic; such machines must be STEEL or they'll fall apart - paper is very tough, hard to process, you wouldn't believe it!)

    There are many, many SMALL businesses like that. You /. guys keep talking about huge data centers and big businesses with a big budget and an IT department. Do you have any idea how hard any update is for the much larger part of the economy, the SMALL businesses? They cannot afford to hire someone, and they cannot afford to spend much time on their software (even I need more than a full day to finish setting up a new PC - I just got a new Dell, and installing all my software and setting up my work environment is SUCH a pain!).

    You (in general, the majority here) don't like large corporations, or at least treat them with a LOT of suspicion, but all I ever read here is about the big companies.
  • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday November 14, 2006 @02:10PM (#16840628)

    If products could be packaged such that they get compiled during install, but in the background with the user being none the wiser, then it might fly. Like say your installer went out and did all the work on its own of gathering the required libraries while the user only ever saw a wizard with a next button.

    I think modern package management is weak on every platform, but combining several approaches in the OS would bring the benefits you crave. I'm a huge fan of OpenStep/GNUstep/OS X, where applications are folders ending in ".app" with a specific structure. This allows for multiple binaries for different platforms in one "file" that can just be copied and run anywhere. There is an increase in size, but given modern hard drives it is tiny, especially since all resources can be shared rather than duplicated. And having those resources separate is great for those of us who want to grab a song or image used in some game we have. The portability and lack of an installation step is really, really nice, especially for novice users.

    To get back to your specific point, I see no reason why source code and build instructions cannot likewise be included in this package. Add a little "build custom binary" option for each application and you have the ability to do just what you ask, but without the drawback of having to wait for an install process. In fact, the OS could automatically schedule the compiling of a custom binary the first time a program is run. There is no need for a wizard at all. Drag the program where you want it and double click. There's no need for dependency checking either since needed libraries are in the package. I really think this level of simplicity would benefit everyone.
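    As a sketch of that idea, assuming the package ships its source under src/ with a Makefile whose default target produces bin/app (all paths and commands here are illustrative, not any real package format):

        /* launcher.c - sketch: build a bundled app on first run, then exec it.
         * Assumes the package directory contains src/ with a Makefile that
         * produces ./bin/app. Paths and the make invocation are illustrative. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>
        #include <sys/stat.h>

        int main(int argc, char **argv)
        {
            struct stat st;
            (void)argc;

            /* First run: no binary yet, so build one. The user never types
             * ./configure or make; at worst they see a short wait. */
            if (stat("./bin/app", &st) != 0) {
                if (system("make -C src > build.log 2>&1") != 0) {
                    fprintf(stderr, "first-run build failed, see build.log\n");
                    return 1;
                }
            }

            /* Hand over to the freshly built (or cached) native binary. */
            execv("./bin/app", argv);
            perror("execv");
            return 1;
        }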

  • by doodlebumm ( 915920 ) on Tuesday November 14, 2006 @02:13PM (#16840688)

    Even after billions and billions of development dollars, Microsoft still breaks lots of applications with their major releases. I've been working on a Server 2003 system that we've had to tweak and fiddle with for over a month to get a couple of applications to work properly, and we're still working on them. There are a couple more that will not work and have to be abandoned. These are older applications, so that could be the problem, but they were running on Server 2000. No one can tell me that they are 100% compatible, because they are not.

    Which would I rather do, try to get a program to work that is proprietary on a proprietary system or open source on an open source system? Hmmmmm. Let me think.

    Also, if you want an open source application to be backwards compatible, send a little money to the authors. I bet you'll get a much better response from them than you would from a company that charges you an arm and a leg for a proprietary application. Try getting Microsoft to make a change to their system! Even large companies usually have to take whatever gets pushed on them by Microsoft.

  • by tchuladdiass ( 174342 ) on Tuesday November 14, 2006 @02:17PM (#16840750) Homepage
    The fix is simple, but most people argue against it. Use shared libraries where appropriate, and static libraries where it makes sense. The argument for shared libraries everywhere is that it saves disk space, and it makes updating all your apps easier by just upgrading the libraries they depend on. This is OK for widely deployed and tested libraries, where it is very unlikely that something will break between minor versions. But if a library is likely to be used by only a handful of apps (such as audio/video codecs), then by all means compile them into the app. That way you reduce the number of dependencies and the breakage, at the expense of a bit more disk space and the inconvenience of upgrading a few more packages when a bug is found in the library in question.

    Note that this is merely an issue for third-party packages for your OS. Everything that comes with the OS can follow whatever rules are best suited to it, as the whole thing is developed together. But I see no advantage to having to track down and install a dozen separate library packages in order to get a particular app installed, especially if those libraries are used _only_ by that app. Also, the worst offense a package can commit is to require major updates to packages that are included as part of the OS. If the OS ships with a supported version of libfoo-1.7.2.so, then the app package should be compiled against that version and not against (and requiring) libfoo-1.7.5.so (or worse, libfoo-1.8.x.so). If it actually requires functionality that is only present in the newer lib version, then it should ship with a private copy, either statically linked or installed in the app's lib directory. Because if something else uses that lib, chances are it will need libfoo-1.7.4.so and be incompatible with libfoo-1.7.5.so (even though minor version increases shouldn't break compatibility, it happens anyway).

    [ /soapbox ]
  • by The Great Pretender ( 975978 ) on Tuesday November 14, 2006 @02:18PM (#16840764)
    ...a non-coding standpoint. As a business we have a huge amount of data in the archives, in our case going back only to '91. One of our biggest issues is that if we need to access that data, we need the current platforms and application software to be backwards compatible. If the systems were not backwards compatible, we would have to dedicate one or several computers and then load the old software to access the data. I suppose we could use emulators. The bottom line is that it would be a huge pain for IT. In addition, it would be nice if any of the Project Managers could access the data from the server directly without worrying about how to view it.
  • by Anonymous Coward on Tuesday November 14, 2006 @02:26PM (#16840952)
    twitter, please read this carefully. Following this advice will make Slashdot a better place for everyone, including yourself.

    • As a representative of the Linux community, participate in mailing list and newsgroup discussions in a professional manner. Refrain from name-calling and use of vulgar language. Consider yourself a member of a virtual corporation with Mr. Torvalds as your Chief Executive Officer. Your words will either enhance or degrade the image the reader has of the Linux community.
    • Avoid hyperbole and unsubstantiated claims at all costs. It's unprofessional and will result in unproductive discussions.
    • A thoughtful, well-reasoned response to a posting will not only provide insight for your readers, but will also increase their respect for your knowledge and abilities.
    • Don't bite if offered flame-bait. Too many threads degenerate into a "My O/S is better than your O/S" argument. Let's accurately describe the capabilities of Linux and leave it at that.
    • Always remember that if you insult or are disrespectful to someone, their negative experience may be shared with many others. If you do offend someone, please try to make amends.
    • Focus on what Linux has to offer. There is no need to bash the competition. Linux is a good, solid product that stands on its own.
    • Respect the use of other operating systems. While Linux is a wonderful platform, it does not meet everyone's needs.
    • Refer to another product by its proper name. There's nothing to be gained by attempting to ridicule a company or its products by using "creative spelling". If we expect respect for Linux, we must respect other products.
    • Give credit where credit is due. Linux is just the kernel. Without the efforts of people involved with the GNU project, MIT, Berkeley and others too numerous to mention, the Linux kernel would not be very useful to most people.
    • Don't insist that Linux is the only answer for a particular application. Just as the Linux community cherishes the freedom that Linux provides them, Linux-only solutions would deprive others of their freedom.
    • There will be cases where Linux is not the answer. Be the first to recognize this and offer another solution.

    From http://www.ibiblio.org/pub/linux/docs/HOWTO/Advocacy [ibiblio.org]

  • by tchuladdiass ( 174342 ) on Tuesday November 14, 2006 @02:45PM (#16841290) Homepage
    But they could ship the dynamic versions and store them in their own app directory. Then use LD_LIBRARY_PATH, or LD_PRELOAD, or even use dlopen().

    Also, even open source apps have that issue when you are downloading pre-built packages. MythTV is an example (of course, it is an unfair example as it is still considered beta). I would rather download one RPM that has the needed shared libraries in a private directory, or has most of them statically compiled, in this case. That way it would be easy enough to make one package that works on multiple OS distros (one RPM should work with RHEL/CentOS 3.x and 4.x, and Fedora Core 4, 5 and 6, for example).
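    A minimal sketch of the dlopen() route, assuming the app ships a private copy of a (made-up) libfoo under its own lib/ directory; link with -ldl:

        /* private_lib.c - sketch: load the library copy bundled with the app
         * instead of whatever version the distro installed system-wide.
         * libfoo / foo_init are hypothetical names. */
        #include <stdio.h>
        #include <dlfcn.h>

        typedef int (*foo_init_fn)(void);

        int main(void)
        {
            /* An explicit relative path sidesteps the system search order,
             * so a distro library upgrade can't change the ABI under us. */
            void *lib = dlopen("./lib/libfoo.so.1", RTLD_NOW | RTLD_LOCAL);
            if (!lib) {
                fprintf(stderr, "bundled libfoo failed: %s\n", dlerror());
                return 1;
            }

            /* POSIX permits casting the void * from dlsym() to a
             * function pointer. */
            foo_init_fn init = (foo_init_fn)dlsym(lib, "foo_init");
            if (!init) {
                fprintf(stderr, "foo_init missing: %s\n", dlerror());
                return 1;
            }
            return init();
        }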
  • by Moraelin ( 679338 ) on Tuesday November 14, 2006 @03:33PM (#16842228) Journal
    While I'll fully agree with your points about the common user, I'd argue that IMHO it's not that much different for programmers and generally those in IT roles. Sure, for a programmer it's a lot less intimidating and a lot less "rocket science", but that doesn't mean he/she will automatically enjoy it.

    Speaking as a programmer, the interesting and challenging part is the _programming_ part. The tweaking of algorithms, the thrill of learning some new technique, etc. That's the fun part. The compiling itself is _not_ the exciting part. Sitting and watching Joe's Own Toy Program (TM) compile is about as exciting as watching paint dry. Tracking down the dependencies for it is even less exciting.

    In fact, I'll even go ahead and say that anyone whose great feat was compiling some 3rd party program, probably isn't really a programmer to start with. There are a ton of people who just like to pretend they're oh-so l33t because they can run someone else's build script. Maybe they even configured (through the nice supplied GUI) and compiled (by running the commands supplied in the readme) a _kernel_. Wow, that makes them sooo great computing gurus. Not. That's to programming what script-kiddies are to real security experts. A sad joke.

    And even as programming goes, the fun is doing the things _I_ want to do, learning the things that _I_ find interesting at the moment. Maybe I'll toy with this great new algorithm or language I just heard about, or maybe I'll mod a game just for the sake of seeing if I can get to the ballistics code, or whatever. Whatever tempts me at the moment. But that's a personal, subjective and transient thing. What tempts me tonight might be (and usually _is_) a whole other thing than whatever program Tom couldn't be arsed to finish, or Dick couldn't be arsed to test, or Harry couldn't be arsed to port. I want to do _my_ stuff, not debug Tom, Dick and Harry's programs just because they happen to be OSS and on my computer.

    Basically just like a literature buff might choose to spend the evening reading the novel of _his_ choosing instead of coming over to help the neighbour's kid finish a school essay about War And Peace. Sure, he certainly is qualified to help that kid, but it's not necessarily what he'd choose to spend the evening with.

    In the end what I'm saying is that what I want from a computer is no different from what Jack Random and Jane Average want. I want it to just work. Whatever I choose to do with it, whether it's programming my own toy app or watching a DVD or playing a game, I _don't_ want to compile the IDE/media-player/game/whatever first, and that goes double for pointless track-the-dependencies games. If I chose to do X tonight, then anything else that gets in between me and X (like having to first compile some other stuff) is just a waste of my time.
  • by gfxguy ( 98788 ) on Tuesday November 14, 2006 @03:58PM (#16842636)
    But they wouldn't upgrade the OS on those boxes!
  • by suv4x4 ( 956391 ) on Tuesday November 14, 2006 @04:10PM (#16842858)
    I've been working on a server 2003 system that we've had to tweak and fiddle with for over a month to get a couple of applications to work properly, and we're still working on them. There are a couple more that will not work and have to be abandoned. These are older applications, so that could be the problem, but they were running on server 2000. No one can tell me that they are 100% compatible, because they are not.

    Do you know what this means? That in most cases those apps were not coded properly. Let's take IE7 for example. People scream about how it breaks apps left and right.

    I tested 30 of my existing sites, some involving relatively complex CSS and JS ("ajax") functionality. I haven't found a single flaw. That shocked me, you know? No issues whatsoever, after all the doomsday reports about IE7 being so incompatible.

    The reason is that many people don't code to specs, don't code to standards, don't code to APIs. They just code and tweak until it seems to work and then move on. The reasons are plenty: lack of experience, lack of time and so on.

    Still, when you use the tools (in my case HTML/CSS/JS) you're provided with in the way they were meant to be used, there's a huge chance your apps will work with future versions.

    If you rely on race conditions, hacks, or pure luck, it is virtually impossible for anyone to remain backwards compatible with your code, although apparently MS tries hard to do that as well.
  • by Anonymous Coward on Tuesday November 14, 2006 @04:28PM (#16843150)
    What about the 2.6 device driver interface? I saw a good paper on 'collateral evolution' in the 2.6 kernel - every time the driver interface changes, something that runs inside the kernel breaks. And in the 2.6 dev cycle, there have been hundreds of such changes, some of which were very significant... http://www.emn.fr/x-info/coccinelle/eurosys-padioleauy.pdf [www.emn.fr]
  • by westlake ( 615356 ) on Tuesday November 14, 2006 @04:51PM (#16843594)
    In an enterprise environment it isn't always as simple as upgrading to the latest bleeding-edge Linux, downloading the newest software versions, compiling, installing, uploading your configs and scripts, and having everything work flawlessly.

    in what environment could a process as agonizing and fragile as this ever be called "simple"?

  • by westlake ( 615356 ) on Tuesday November 14, 2006 @05:07PM (#16843946)
    She wants things to JustWork(tm), 'cause she's a luddite.

    a "luddite" rebels against technology that threatens her employment. your soccer mom simply wants to get the weekly financial reports out on time so she can get home to her kids.

    she has every right to expect tech that "just works."
