The Importance of OS Backwards Compatibility 380

gbjbaanb writes "Raymond Chen (of ancient Microsoft heritage) has a blog where he describes some of the things he's worked on, as well as oddments of obscure code and design decisions in Windows. Regardless of what anyone thinks of Windows, it is informative and often thought-provoking. Recently, Raymond posted an entry about backwards compatibility and why it is such a big deal for large corporations. This is something I have read about on Slashdot regularly (where Windows is criticized for bothering with it at all), so I thought readers would be interested in exactly why Microsoft spends so much effort on backwards compatibility and, by inference, why it is an important topic for getting Linux adopted by big business."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Omelette? (Score:3, Insightful)

    by caluml ( 551744 ) <slashdot@spamgoe ... minus herbivore> on Tuesday November 14, 2006 @12:27PM (#16838870) Homepage
    You've got to break a few eggs to make an omelette, I always say.
    For a tasty omelette, add cheese, tabasco sauce, and ground black pepper.
  • by InsaneProcessor ( 869563 ) on Tuesday November 14, 2006 @12:28PM (#16838890)
    I am still working on 2.4 to 2.6 kernel issues. Linux and its authors have no concept of backwards compatibility. We have to redo everything, and our purchased software suffers even more.
    • Quite true (Score:5, Interesting)

      by csoto ( 220540 ) on Tuesday November 14, 2006 @02:55PM (#16841494)
      This is one of the reasons "Solaris is better than Linux." There are few things that we've deployed on Solaris 2.5 (possibly none, but I won't swear to my memory) that don't also work under Solaris 10. This is a far cry from the Linux 2.4-2.6 headaches we've experienced.
  • the issue is less worrisome, since the same app (if written well) can be compiled on many different platforms, unlike binary-only applications that are basically immutable unless recompiled.

    -b.

    • by PadRacerExtreme ( 1006033 ) on Tuesday November 14, 2006 @12:38PM (#16839086)
      Not true at all. All it takes is for a supporting library to change, and the code stops working. Two examples:

      I am looking into a problem we have here where my software works fine with GD 2.0.23. If I upgrade GD to 2.0.28 (compiling from source, not using binaries) my code stops working. Everything compiles fine. Everything links fine. Just doesn't work.

      Look at the FOX toolkit. The interface completely changed from 1.X to 2.X. No backwards compatibility. I need to re-write all of my source to handle the new interface.

      • The difference is, you still have 2.0.23, and you can keep it forever. With restrictive-licensed software, that isn't the case. With MS moving to yearly licenses- especially for companies- F/OSS is going to look more attractive to companies who can't afford to rewrite their software at a whim- they will love the option of sticking with the earlier library.

        Notice that even though it would be convenient for you to be able to upgrade GD- it's not required. You have an eternal license to it (Assuming it's under an OSS license, which I don't know for sure as I have no idea what GD is :D but it wouldn't make sense in the context if it were proprietary... but whatever. )
        • Re: (Score:3, Insightful)

          by Sancho ( 17056 )
          Can you point me to a place that shows that MS is moving to yearly licenses? Also, point me to a clause that says I can't continue running an older version of their software? Thanks in advance.
      • by Fastolfe ( 1470 ) on Tuesday November 14, 2006 @01:21PM (#16839888)
        So don't upgrade major versions. Something breaking between 2.0.23 and 2.0.28 is a bug, and should be filed and treated as a bug (unless it was your error). Have you done that? Important applications should undergo testing when new releases of libraries come out, specifically to catch issues like this. The fact that your testing picks up problems doesn't mean there's a flaw in the process. This demonstrates that the process is working. If you had simply upgraded all of your clients with the assumption that things would work, that would indicate a flaw in the process.

        Something breaking between 1.x and 2.x is expected. A lack of compatibility is expressed right there in the version number. Major projects will keep each major version going independently for some time. You should continue to see bug fixes in the 1.x line even though 2.x is out, provided demand and interest are there.

        It's also open-source, so you're free to keep your own development and bug fixes going if you can fund it yourself.
        • Re: (Score:3, Insightful)

          by Darkforge ( 28199 )
          So don't upgrade major versions. [...] Something breaking between 1.x and 2.x is expected.
          You and I have come to expect that, because we're used to it. But that's not the way it has to be; Microsoft has billions of dollars and enormous market share because they don't break backwards compatibility, even with major releases. (If they did, nobody would upgrade, just as you say.)
          • by doodlebumm ( 915920 ) on Tuesday November 14, 2006 @02:13PM (#16840688)

            Even after billions and billions of development dollars Microsoft still breaks lots of applications on their major releases. I've been working on a server 2003 system that we've had to tweak and fiddle with for over a month to get a couple of applications to work properly, and we're still working on them. There are a couple more that will not work and have to be abandoned. These are older applications, so that could be the problem, but they were running on server 2000. No one can tell me that they are 100% compatible, because they are not.

            Which would I rather do, try to get a program to work that is proprietary on a proprietary system or open source on an open source system? Hmmmmm. Let me think.

            Also, if you want an open source application to be backward compatible, send a little money to the authors. I bet you'll get a much better response from them than you would from a company that charges you an arm and a leg for a proprietary application. Try getting Microsoft to make a change to their system! Even large companies usually have to take what gets pushed on them by Microsoft.

            • Re: (Score:3, Insightful)

              by suv4x4 ( 956391 )
              I've been working on a server 2003 system that we've had to tweak and fiddle with for over a month to get a couple of applications to work properly, and we're still working on them. There are a couple more that will not work and have to be abandoned. These are older applications, so that could be the problem, but they were running on server 2000. No one can tell me that they are 100% compatible, because they are not.

              Do you know what this means? That in most cases those apps were not coded properly. Let's ta
        • by mha ( 1305 ) on Tuesday November 14, 2006 @01:40PM (#16840176) Homepage
          So don't upgrade major versions.


          Not possible - no one supports that old version. If there are any important fixes (not just security, anything) they are always for the latest version. Open source people don't bother supporting older versions... ;-)

          It's also open-source, so you're free to keep your own development and bug fixes going if you can fund it yourself.


          This argument isn't even worth refuting, it's so obviously childish.
    • by PFI_Optix ( 936301 ) on Tuesday November 14, 2006 @12:55PM (#16839406) Journal
      Allow me to do a variation on a Ballmer-related meme:

      Dependencies! Dependencies! Dependencies! Dependencies! Dependencies! Dependencies! Dependencies!

      I'm a broken record on this subject, but I've had quite a few nightmare compiles on Linux that have resulted in me abandoning it on my laptop in favor of Windows. At least there the software I want to use works. They have *got* to fix that problem if Linux is going to become a mainstream desktop product.
      • by tchuladdiass ( 174342 ) on Tuesday November 14, 2006 @02:17PM (#16840750) Homepage
        The fix is simple, but most people argue against it. Use shared libraries where appropriate, and static libraries where it makes sense. The argument for shared libraries everywhere is that it saves disk space, and it makes updating all your apps easier by just upgrading the libraries they depend on. This is ok for widely deployed and tested libraries, where it is very unlikely that something will break between minor versions. But if a library is likely to be used by only a handful of apps (such as audio / video codecs), then by all means compile them into the app. That way you reduce the number of dependencies and the breakage, at the expense of a bit more disk space used and the inconvenience of upgrading a few more packages when a bug is found in the library in question.

        Note that this is merely an issue for third-party packages for your os. Everything that comes with the os can follow whatever rules are best suited for it, as the whole thing is developed together. But I see no advantage to having to track down and install a dozen separate library packages in order to get a particular app installed, esp. if those libraries are used _only_ by that app. Also, the worst offense a package can commit is to require major updates to packages that are included as part of the os. If the os ships with a supported version of libfoo-1.7.2.so, then the app package should be compiled against that version and not against (and requiring) libfoo-1.7.5.so (or worse, libfoo-1.8.x.so). If it actually requires functionality that is only present in the newer lib version, then it should ship with a private copy, either statically linked or installed in the app's lib directory. Because if something else uses that lib, chances are it will need libfoo-1.7.4.so and be incompatible with libfoo-1.7.5.so (even though minor version increases shouldn't break compatibility, it happens anyway).

        [ /soapbox ]
        • Re: (Score:3, Interesting)

          by jZnat ( 793348 ) *
          I think the only problem with that is proprietary apps can't statically compile GPL or LGPL libraries, and proprietary apps seem to be the ones that have unfixed dependency issues.
          • Re: (Score:3, Insightful)

            But they could ship the dynamic versions, and store them in their own app directory. Then use LD_LIBRARY_PATH, or LD_PRELOAD, or even use dlopen().

            Also, even open source apps have that issue, when you are downloading pre-built packages. MythTV is an example (of course, it is an unfair example as it is still considered beta). I would rather download one RPM that has the needed shared libraries in a private directory or have most of them statically compiled in this case. That way it would be easy enough t
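A launcher of the kind suggested above (private shared libraries plus LD_LIBRARY_PATH) is only a few lines of shell. A sketch, where `/opt/myapp` and `myapp-bin` are hypothetical names:

```shell
#!/bin/sh
# Hypothetical wrapper installed as /opt/myapp/bin/myapp.
# It makes the dynamic linker prefer the private libraries the
# app ships in its own directory over the distro-provided ones.
APPDIR=/opt/myapp
LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
exec "$APPDIR/libexec/myapp-bin" "$@"
```

The `${VAR:+...}` expansion appends the previous search path only if one was already set, so the app's private lib directory is always searched first; dlopen() or LD_PRELOAD are alternatives when the program needs finer control.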
        • Re: (Score:3, Informative)

          Looks like I'm arguing against it.

          Those who do not understand package management are doomed to reimplement it, poorly. Although you do make a point:

          Note that this is merely an issue for third-party packages for your os.

          Perhaps, but most distros now support adding third-party repositories, and even if you don't, when you download a .deb file manually, it's still going to pull in dependencies when you install it.

          Ultimately, I see compiling things statically as being kind of like offering a WinZip Self-Ex

    • by archaic0 ( 412379 ) on Tuesday November 14, 2006 @01:15PM (#16839774) Homepage
      You're sort of missing the point here though. This statement is pretty common in the Linux world... "all you have to do is recompile..." ALL you have to do? Really? All I have to do is re-compile? Assuming I have the proper kernel to begin with, and the proper libraries, along with all their dependencies... I just want to run a program, man, I don't want to become a programmer just to use my computer. (90% of office workers speaking, I.T. specific roles excluded)

      The truth is that compiling a program (under ANY platform) IS rocket science level stuff as far as computers are concerned. Yes, a programmer does it in his sleep (as do most Linux daily users). But Joe CEO and Jack employee can click next all day but the moment he has to type ./configure or ./make or whatever, his eyes will glaze over and he'd rather spend a million dollars on a setup wizard app than have to go to school just to learn how to install a free app. Not to mention the fact that 6 times out of 10, you'll get an error and have to track down what went wrong and fix something before you can attempt to compile again.

      Compiling source to a binary IS complex stuff despite what any mainstream Linux supporters might think. And having to re-compile something EVER WILL keep it from being accepted mainstream.

      I've even heard people say "well all ya gotta do is just make your own kernel and there ya go, piece of cake". As easy as that might sound for the Linux guru, that is exactly equal to telling a driver to build their own car. Plenty of mechanics out there that can do it, yes, but EVERYONE isn't a mechanic nor should they be. People who build their own kernels and compile software are the mechanics of the computer world and shouldn't expect all computer users to be mechanics.

      I for one do not ever want my bosses to be as 'smart' as us I.T. guys are. Why then would they have any use for me if THEY know how to compile a kernel? If the world was so enlightened as some people seem to want it to be, then a lot of people would be without jobs because their skills wouldn't be needed.

      Don't get me wrong though, I'm not saying one way is the end-all better way, I'm just saying that as long as the corporate world is run by people who just 'want it to work' (which will be forever), software that requires the user to do the programmer's job will not fly. All my servers are Linux servers, but that's only because myself and the other techs here are skilled at making them work. I wouldn't DREAM of putting Linux on our desktops. Are you crazy?! Will YOU then be the one who babysits every soccer mom who comes to work for us and teaches them how to use it for free?

      If products could be packaged such that they get compiled during install, but in the background with the user being none the wiser, then it might fly. Like say your installer went out and did all the work on its own of gathering the required libraries while the user only ever saw a wizard with a next button.

      My two cents, take it or leave it...
      • by 99BottlesOfBeerInMyF ( 813746 ) on Tuesday November 14, 2006 @02:10PM (#16840628)

        If products could be packaged such that they get compiled during install, but in the background with the user being none the wiser, then it might fly. Like say your installer went out and did all the work on its own of gathering the required libraries while the user only ever saw a wizard with a next button.

        I think modern package management is weak on every platform, but combining several approaches into the OS would bring the benefits you crave. I'm a huge fan of OpenStep/GNUstep/OS X, where applications are folders ending in ".app" with a specific structure. This allows for multiple binaries for different platforms in one "file" that can just be copied and run anywhere. There is an increase in size, but given modern hard drives it is tiny, especially since all resources can be shared rather than duplicated. And having those resources separate is great for those of us who want to grab a song or image used in some game we have. The portability and lack of an installation step is really, really nice, especially for novice users.

        To get back to your specific point, I see no reason why source code and build instructions cannot likewise be included in this package. Add a little "build custom binary" option for each application and you have the ability to do just what you ask, but without the drawback of having to wait for an install process. In fact, the OS could automatically schedule the compiling of a custom binary the first time a program is run. There is no need for a wizard at all. Drag the program where you want it and double click. There's no need for dependency checking either since needed libraries are in the package. I really think this level of simplicity would benefit everyone.

      • by Moraelin ( 679338 ) on Tuesday November 14, 2006 @03:33PM (#16842228) Journal
        While I'll fully agree with your points about the common user, I'd argue that IMHO it's not that much different for programmers and generally those in IT roles. Sure, for a programmer it's a lot less intimidating and a lot less "rocket science", but that doesn't mean he/she will automatically enjoy it.

        Speaking as a programmer, the interesting and challenging part is the _programming_ part. The tweaking of algorithms, the thrill of learning some new technique, etc. That's the fun part. The compiling itself is _not_ the exciting part. Sitting and watching Joe's Own Toy Program (TM) compile is about as exciting as watching paint dry. Tracking down the dependencies for it is even less exciting.

        In fact, I'll even go ahead and say that anyone whose great feat was compiling some 3rd party program, probably isn't really a programmer to start with. There are a ton of people who just like to pretend they're oh-so l33t because they can run someone else's build script. Maybe they even configured (through the nice supplied GUI) and compiled (by running the commands supplied in the readme) a _kernel_. Wow, that makes them sooo great computing gurus. Not. That's to programming what script-kiddies are to real security experts. A sad joke.

        And even as programming goes, the fun is doing the things _I_ want to do, learning the things that _I_ find interesting at the moment. Maybe I'll toy with this great new algorithm or language I just heard about, or maybe I'll mod a game just for the sake of seeing if I can get to the ballistics code, or whatever. Whatever tempts me at the moment. But that's a personal, subjective and transient thing. What tempts me tonight might be (and usually _is_) a whole other thing than whatever program Tom couldn't be arsed to finish, or Dick couldn't be arsed to test, or Harry couldn't be arsed to port. I want to do _my_ stuff, not debug Tom, Dick and Harry's programs just because they happen to be OSS and on my computer.

        Basically just like a literature buff might choose to spend the evening reading the novel of _his_ choosing instead of coming over to help the neighbour's kid finish a school essay about War And Peace. Sure, he certainly is qualified to help that kid, but it's not necessarily what he'd choose to spend the evening with.

        In the end what I'm saying is that what I want from a computer is no different from what Jack Random and Jane Average want. I want it to just work. Whatever I choose to do with it, whether it's programming my own toy app or watching a DVD or playing a game, I _don't_ want to compile the IDE/media-player/game/whatever first, and that goes double for pointless track-the-dependencies games. If I chose to do X tonight, then anything else that gets in between me and X (like having to first compile some other stuff) is just a waste of my time.
  • Huh? (Score:5, Insightful)

    by Salvance ( 1014001 ) * on Tuesday November 14, 2006 @12:30PM (#16838932) Homepage Journal
    Something's not right with these links ... is someone just trying to /. their own blog here or am I being redirected? The first link is completely off topic and goes to a post about SoftMac (a Mac emulator); the second is just an example of how one company has 9,000 legacy scripts that require older versions of Windows (plus 400 16 bit programs). So what? This hardly seems like a front-page-of-Slashdot argument for OS backwards compatibility ... there's really no argument other than stating the 9,000 number and the 3 years it would take to convert them.

    Pure and simple, Microsoft has protected their market share by remaining backwards compatible, and will continue to do so for that reason only. A company like Apple can afford to ignore backwards compatibility to some extent, as this actually drives greater revenue from their loyal customer base buying new software. Microsoft though, cannot afford to give their corporate users a chance to make a migration decision.

    If Microsoft eliminated backwards compatibility, thousands of companies would be in a position where they needed to include the cost of migrating software in the upgrade decision. All of a sudden, Linux would become a viable option for these corporate clients, which Microsoft can't afford. For example, my company currently has over 900 16 bit applications that we haven't touched in ~10 years. Almost all of these run fine under XP and the beta versions of Vista, so upgrading to Vista will be a cheap option. However, if Vista didn't support these 16 bit apps, we'd have to spend years of time and millions of dollars upgrading ... in this case Linux would likely become our new O/S.

    For this reason, Linux advocates (and many others) would love to see Microsoft remove backwards compatibility, but from a business standpoint Microsoft just can't do it.
    • Re: (Score:3, Interesting)

      by blindd0t ( 855876 )
      Just know that while your 16 bit apps will run under Vista, this is only true for the 32-bit version of Vista; your programs will always fail to run in the 64-bit version... This may not be a big deal now, but it could become a problem as 64-bit processing becomes more common.
    • Re:Huh? (Score:4, Interesting)

      by TheThiefMaster ( 992038 ) on Tuesday November 14, 2006 @12:48PM (#16839290)
      Then why upgrade the machines running the legacy apps / scripts at all? It's not like the older versions of Windows don't run fine. Making sure they're not connected to the internet is all you need to do to make them secure, or if that's not viable then heavily restrict their access with a firewall (preferably hardware). After all, why should a weather data sensing and reporting machine (for example) be able to connect to anything except the database it's sending the data to? Why should it be able to get any incoming connections at all? Even running unpatched Windows 3.0 it would be safe if set up like that.

      Do small in-line hardware firewalls exist? Just with an incoming and outgoing RJ45 socket and a hardware circuit that only allows data through to or from a single ip (or range)? I can see many businesses could use these.
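Small inline filtering boxes do exist, and a spare Linux machine with two NICs can play the same role. Purely as an illustration, the single-peer restriction described above might look like this in iptables, with 192.0.2.10 (an address from the documentation range) standing in for the database server:

```shell
#!/bin/sh
# Illustrative lockdown for a legacy reporting host: permit traffic
# only to and from the database server, drop everything else.
# Requires root; addresses and policy are examples, not a recipe.
DB=192.0.2.10

iptables -P INPUT DROP
iptables -P OUTPUT DROP
iptables -A OUTPUT -o lo -j ACCEPT
iptables -A INPUT  -i lo -j ACCEPT
iptables -A OUTPUT -d "$DB" -j ACCEPT
iptables -A INPUT  -s "$DB" -m state --state ESTABLISHED,RELATED -j ACCEPT
```

With default-drop policies, the machine can reach nothing but the one peer, and unsolicited inbound connections never arrive, which is the point the parent makes about even an unpatched OS being safe behind such a filter.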
    • Re: (Score:2, Insightful)

      Pure and simple, Microsoft has protected their market share by remaining backwards compatible, and will continue to do so for that reason only.

      Since they are a business, in it to make money, NOT to make cool technology, that is the only correct decision for them to begin with.

      A company like Apple can afford to ignore backwards compatibility to some extent, as this actually drives greater revenue from their loyal customer base buying new software.

      Aha? Apple invested a lot of time and effort into making at le
    • Re: (Score:3, Insightful)

      by rainman_bc ( 735332 )
      If Microsoft eliminated backwards compatibility, thousands of companies would be in a position where they needed to include the cost of migrating software in the upgrade decision.

      As compared to the upgrade path to OSX, where non-native apps run like a slug in emulator mode?

      Flame away here, but Microsoft has been fairly good with their backwards compatibility. At least as good as OSX, if not better.

      If a law firm for example continues to insist they should be running Word Perfect 5.1 because that's all the
    • Re: (Score:3, Informative)

      by Jerf ( 17166 )
      Raymond Chen's blog is worth reading for the technical posts, and the most interesting ones are about Windows reverse compatibility or why certain Windows API things are the way they are, but there is no one link that you can give for them. That's probably why the submitter's links seem unfocused. His blogging software doesn't seem to have any categorization.

      There are some real gems in there and if you are serious about software development you should probably just read the whole blog. Doesn't take that lon
  • If corporate data were stored in more open formats, legacy applications would not be such a problem.
  • One of the problems with maintaining backwards compatibility in Windoze is that the whole mess evolved by accreting guano rather than having a clear path for upgrades. That is, define where the various modules will interface with each other before you start coding.
  • This is something I have read about on Slashdot regularly (where Windows is criticized for bothering with it at all), so I thought readers would be interested in exactly why Microsoft spends so much effort on backwards compatibility and, by inference, why it is an important topic for getting Linux adopted by big business.

    Perhaps I misunderstand, but is the submitter trying to say that Linux should learn from Windows in this area? Backwards compatibility is one of Linux's strong points, and Windows' performance in

    • Backwards compatibility is one of Linux's strong points

      Do you have an example of a 27 year old program that can run on a current install of Ubuntu, without having to do anything else? I'm looking at Visicalc on Windows XP Professional right now.
      • vi, emacs, ls, grep, yacc, lex, tex... hella lot of the code is ancient.

        Maybe you mean a 27 year old binary? It would be pretty difficult to find a 27 year old binary in the Gnu world, since it's easier to just recompile the binary for whatever system it is supposed to run on. Before decrying 'but that's too much hassle for a user' remember: This doesn't have to be done by an end user. I use all of the above software, and a whole lot more, without ever having recompiled the stuff. I install

      • Re: (Score:2, Funny)

        by brunascle ( 994197 )
        #!/bin/sh
        echo "Hello, World"
        what do i win?
      • Do you have an example of a 27 year old program that can run on a current install of Ubuntu, without having to do anything else? I'm looking at Visicalc on Windows XP Professional right now.

        Linux (and indeed Windows) can run quite a lot of old software through emulation, including tons of 27 year old games, Visicalc (using something like dosemu+freedos), CP/M programs under Z80 emulators, and Windows programs under Wine (not that any of those are 27 years old).

        In fact emulation is possibly a better

    • Windows has pretty good backward compatibility for binary programs. In most cases you can dig out a CD or floppy from 10-15 years ago... or longer, and Windows will still know what to do with it. Even with some of the really flaky "copy protection" companies used to build by putting broken code in their EXEs, Windows will still try to recreate those bugs and run the program. It's quite impressive how much time they spend re-engineering their mistakes in each new OS because you CAN'T update your software f
    • Backwards compatibility is one of Linux's strong points

      Hmm.. you sure about that?

      I seem to recall a few issues with this...

      For example, multithreaded applications built for 2.4 may encounter problems on 2.6 unless you set some specific environment variable.
      Or.. how about the nice compatibility between glibc versions?

      As long as you can recompile your software, Linux provides excellent backward compatibility, but that is a non-option for many situations.
      • by hritcu ( 871613 )

        As long as you can recompile your software, Linux provides excellent backward compatibility, but that is a non-option for many situations.

        Tell that to a Gentoo user!

        The only problem is that in the Windows world almost everything is closed, proprietary, binary only. And I'm extremely glad that Microsoft has to spend huge amounts of resources trying to prevent things from going crazy. They created this sick ecosystem!

        And open source is the solution to it.

  • This is something I have read about on Slashdot regularly (where Windows is criticized for bothering with it at all),...

    Yeah, and you'll also see posts by people who don't understand all sorts of basic concepts.

    Why complain about them? Why even bother to mention them at all?

    Whether you need backward compatibility has never been in doubt. You simply cannot expect your customers to re-enter all their data every time you issue a patch.

    The question is whether you should bring the old bugs forward for the sake of "

  • Compatibility freak?
    Retro chic
    Tweak the hardware
    Old-school sleek
    Burma Shave
  • billions at stake (Score:5, Interesting)

    by yagu ( 721525 ) * <yayagu@[ ]il.com ['gma' in gap]> on Tuesday November 14, 2006 @12:35PM (#16839028) Journal

    There is a bit of a "you rub my back..." going on when Microsoft maintains backwards compatibility. While MS is still the 800-lb Guerrilla, they have an audience with which they collaborate to some degree to make billions of dollars. MS holds the reins, but the team would refuse to pull at all if Microsoft cut them all off at the compatibility pass -- that would guarantee a stampede to find alternatives in OS implementations.

    I don't think many are aware how hard Microsoft has to work to maintain compatibility... I once talked with one of the MS engineers -- he said much of the OS code has preamble code to run through a giant "case" statement to accommodate and make allowances for either bad or incorrect coding by outside developers, or bugs in their code that don't execute correctly for the outside software. It's a lot of baggage to carry around, but it's baggage worth billions of dollars.

    Interestingly (to me), I don't think Linux's big task yet is to maintain backwards compatibility with Linux programs (though that would be nice, and seems to mostly be a given anyway); I think the bigger task for Linux is to maintain backwards compatibility with Microsoft programs, specifically legacy Windows software. Unless and until that hurdle is cleared, Linux will always be #2, or #3, etc.

    (Sorry for the paragraph of metaphors.)
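The "giant case statement" described above is easy to picture. A toy sketch in shell, purely for illustration -- the application names and quirks are invented, and the real mechanism is native code keyed on executable metadata, not a script:

```shell
#!/bin/sh
# Toy model of an app-compat layer: look up per-application quirks
# by program name before handing control to the normal code path.
apply_quirks() {
    case "$1" in
        legacyapp.exe)
            # hypothetical app that assumed the OS version would never change
            echo "quirk: report old OS version" ;;
        oldgame.exe)
            # hypothetical app that relied on a long-fixed allocator bug
            echo "quirk: emulate old allocator behavior" ;;
        *)
            echo "no quirks" ;;
    esac
}

apply_quirks legacyapp.exe
apply_quirks notepad.exe
```

Every entry in such a table is, as the parent says, baggage the OS carries on behalf of someone else's bug -- but baggage that keeps paying customers' software running.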

    • Re: (Score:3, Funny)

      >While MS is still the 800-lb Guerrilla
      Please tell me you meant gorilla?
    • I don't think many are aware how hard Microsoft has to work to maintain compatibility... I once talked with one of the MS engineers -- he said much of the OS code has preamble code to run through a giant "case" statement to accommodate and make allowances for either bad or incorrect coding by outside developers, or bugs in their code that don't execute correctly for the outside software. It's a lot of baggage to carry around, but it's baggage worth billions of dollars.

      Interestingly (to me), I don't think

  • It's the data! (Score:3, Insightful)

    by cpinto ( 1027148 ) on Tuesday November 14, 2006 @12:35PM (#16839044)
    Actually, I see a much bigger need to be able to access information than to be able to run old applications. The big problem in all of this is that a lot (if not all) legacy applications have closed data repositories so being able to run those applications in a modern OS is imperative.
  • update treadmill (Score:4, Interesting)

    by Speare ( 84249 ) on Tuesday November 14, 2006 @12:38PM (#16839100) Homepage Journal

    (Also from an ancient Microsoft experience.)

    Microsoft's continued existence has ALWAYS depended on cash cow products such as MS-DOS, Word, Excel and Windows. The only way that a product goes from concept to cash cow is through multiple releases which are sold to end users, offering the vital feedback to improve the product and preparing the market to need the product. The only way a cash cow does not turn into a dead cow is through multiple releases which are sold to end users, offering newer features for devotees and fixing some of the most egregious integration problems for enterprises. Without new versions, people grow out of a product. Users adopt a new methodology entirely, or adopt a new product from someone else.

    An update treadmill necessarily requires that the updates keep coming. Users cannot adopt a new update unless it is nearly seamless to synchronize and integrate with the other treadmills they are running.

  • Two ways (Score:5, Interesting)

    by Bluesman ( 104513 ) on Tuesday November 14, 2006 @12:40PM (#16839138) Homepage
    Backwards compatibility is important, but there are two ways you can do it.

    One is to include all of the old stuff in your new OS, the other is to continue to support the old version, or possibly emulate it on the new version.

    It seems that backwards compatibility significantly impedes progress. Why not continue to support the older versions, but separate them from the new stuff? Our computers are fast enough to run Windows 3.1 in a VM, much faster than it would run on the hardware it was designed for.

    Better yet, include a copy of the old software in the new one, with a built in emulator designed to run it.

    It's important to maintain backwards compatibility, but it's just not a good excuse for bad design decisions in new software.
    • there are two ways you can do it. One is to include all of the old stuff in your new OS, the other is to continue to support the old version

      At the risk of stating the obvious, continuing support for an old product has nothing to do with backwards compatibility. Now obviously, you can continue to work on old systems to keep something running, but that is in no way related to backward compatibility.

      That would be like MS advertising:
      The Xbox 360 as 100% backward compatible with every game ever made *
      * - mo
    • Re: (Score:3, Insightful)

      by beuges ( 613130 )
      Mr Chen has addressed that issue at length as well. There are many problems with simply including an emulator with the new OS that can host the old OS: for example, supporting copy/paste between apps in the native OS and apps in the emulated OS. Simply providing an emulated solution for backwards compatibility with the old OS was considered but rejected because it provided a crappy user experience.

      Another example - you have a network share mapped on your native OS. You double-click a file in the netwo
  • he says, This isn't a company that bought some software ten years ago and don't have the source code. They have the source code for all of their scripts.

    this is a bit confusing to me. is he saying they have the source code of the apps and the install scripts or that they have the 'source code' of install scripts only? and if so- wouldn't that just make sense? Are there people out there running some kind of binary install scripts that they can't read themselves?

    my next question is this -
    Personally, I don't use Windows. I am a self-admitted Mac fanboy (OK, also a professional Mac developer for the past 10 years.) However, I love diversity. I like the fact that Windows provides the ultimate in backwards compatibility, because I can relate to the fact that for some businesses, backwards compatibility is the most important thing.

    I love the fact that Windows and Mac and Linux exist. I wish that Amiga and Be could still be serious choices. Because more choices is better.
  • by Jennifer York ( 1021509 ) on Tuesday November 14, 2006 @12:41PM (#16839168) Homepage
    Any argument against Backwards Compatibility (BC) is usually made from the point of view of the developer and not the user. A developer decides that it is too hard to implement a feature, or some such thingy, and decides that a "re-write" is the best option. This is quite often the wrong decision if you are developing software for money. Throwing away the mountain of existing code so that a developer "feels" more comfortable is bad business. There are often better ways to solve this problem, and refactoring the code early and often is a good strategy.

    Now, if you happen to write code for fun, like me, then you are of course free to chuck it all out and start again. The Fun Factor outweighs the cost, since the cost is free. So govern yourselves accordingly, if you want to make money, then do as little as possible to develop the code, if you are in it for love and glory then do it for yourself.

  • While this all makes sense, and some Linux distros must be sensitive to the practical requirements of these corporations, there is another side...

    It is very useful if you can, occasionally, break backward compatibility. With time, it becomes apparent that past decisions were a mistake. Being able to correct some of those mistakes, and do it in a reasonably clean way, makes for a better system.

    So I wouldn't say that backward compatibility is a rule. After all, look at MS with dotNet. dotNet 2.0 is not bac
  • You need backward compatibility because people still use old apps. You need to eliminate backwards compatibility to get rid of bloat. Seems like the answer is to make the legacy features into optional packages that sys admins could choose to install or not install based upon their needs. They then could balance security against functionality. They could also clearly see what they needed to ask developers to upgrade first. (If we upgrade APP X, we can uninstall LEGACY PACKAGE Y)...
  • by ohearn ( 969704 ) on Tuesday November 14, 2006 @12:44PM (#16839212)
    Trust me, in a corporate setting not having backwards compatibility is a big deal. For home users it is even worse.

    If you lose backwards compatibility you run into the same problem that all the smaller OSs have in getting the corporate world to adopt them on a larger scale. An IT manager may be able to convince the bean counters to give enough money to do the OS swap, but try asking them for the money to swap every application you have as well because the old ones won't work anymore. Then on top of that, try getting the funding and the authorization for the time to retrain all your employees on new applications that they are not as familiar with. Beyond that, productivity will go down for a while until people get used to the new systems, even with training. Now if the new system is truly more efficient (from a worker-bee point of view, in time to complete a task) this may eventually pay for itself, but do you really think most upper managers are going to be that patient before firing the IT manager for screwing over the entire company's productivity rates? If you do, then you are dealing with much more patient managers than I am used to, or just fooling yourself. That is the true cost of losing backwards compatibility, especially when a large number of your key applications are built in house; then you can't just go purchase the new version that will run on the new platform with minimal retraining needed. For companies that develop a lot of their own applications in house, you have a lot of down time until the IT departments can recode to whatever new standard the new OS requires.

    Home users are better and worse off in some senses. While a lot of home users will probably just buy the new versions of major software (office software, email, etc.) when they purchase a new computer, that still adds a lot to the cost of a machine. For the more technically savvy people out there (this is /. after all), even if you like the new OS and its features, do you really want to have to replace every app you've got because they don't work properly anymore? Yes, VM solutions can mitigate a lot of this, but those solutions are not perfect.

    I was one of the people that MS picked up on a 6 month contract for the extra support load when XP SP2 came out. Take it from someone who was there: the biggest complaint (especially from corporate customers) was when it broke compatibility with old apps they were using. I heard a lot of people (including people I knew personally) say that they would install SP2 once the compatibility issues were fixed, when they eventually swapped to a newer version of the key app that they couldn't live without. Now MS did fix some of those issues with patches shortly after SP2 came out, but imagine that problem scaled up to replacing an entire OS without regard to backwards compatibility. I know everyone around here likes to bash MS, but they are not nearly stupid enough to piss off their corporate customers that badly. They know that would be the fastest way to push people away from Windows in a heartbeat, or at least to ensure that not many people bothered swapping to the new version instead of just staying with what they already had that works.
  • by Chris Burke ( 6130 ) on Tuesday November 14, 2006 @12:44PM (#16839218) Homepage
    Because I know they need it. Without compatibility with old Windows software, the next buying cycle every CTO on earth would have a chance to consider whatever operating system they wanted, since all would involve switching to entirely new software, as opposed to now, where buying Windows to replace Older Windows is the automatic choice. It isn't just Windows' market share that maintains their monopoly, it's the resulting market share of software for Windows that really does it. Drop that, and Windows' real advantage is gone.

    I do make fun of them for not being able to stabilize and secure some of the old code, though I understand it's tough when the code is old, complicated, and crufty, and a lot of old programs require "bug for bug" compatibility.

    I think the solution for them is deprecation. Old interfaces that are stuffed with crufty code, and which were often insecure by design anyway, should simply be made unavailable for new applications. You have an old app that wants to use it, fine. You want to code a new app using OLE or ActiveX or what have you? Tough. Eventually software gets replaced, and eventually you wouldn't have any software using the old systems and you could disable them. Sadly there's still plenty of new development using ActiveX and other crap, so MS both shows no interest in doing this nor may they be able to.

    Now how does this apply to Linux? In Free Software Happy Land, you have the source to all the software on your system, so with the ability to recompile you don't need binary backward compatibility. Being able to link your source code against new libraries can fix a lot of the problems with having to support really old binaries. There's still the issue of interface compatibility (which is tied up in binary compatibility in the Windows world), but a lot of the interfaces are stable.

    This isn't Free Software Happy Land, though; we're talking about businesses. Personally, I think that adopting Linux should also come with adopting some of its philosophies, in particular that having source code is much, much better than not having it, so if you aren't getting source then it had better be much, much better than the program you can get with source. Linux does not have a strong history of binary compatibility, and I'm not sure it's best for Linux to start establishing such a history. Business may not like it, but maybe they need to adapt to Linux, not the other way around. Who knows, they might figure out that they like getting source code from their vendors and accidentally discover a better way to do things.
  • by Great_Jehovah ( 3984 ) * on Tuesday November 14, 2006 @01:01PM (#16839512)
    We have several in-house apps that were written recently enough to be done in C# and none of them work in Vista. We have two guys working to convert them and they've had no success after spending dozens of hours on the problem.
    • Re: (Score:3, Informative)

      What's the problem? I converted all our in-house vb.net and c# code to work in vista already. The only hiccups I had were a quirky parallel port library for some external hardware. Most of our apps didn't even need to be recompiled.
  • ...solve the biggest issue why companies don't want to upgrade. Yes, there's compatibility testing and user retraining and so on and so forth, but if every 18 months you need to update your glacier-like distro like RHEL/CentOS, Debian stable or Ubuntu LTS, is that really a big hurdle? How much in linux userspace will break? And even then you got 18 months in which to either migrate it or just stay behind - I think the 2.0 kernel is still maintained, if nothing else. Firewall it down and run what you must. O
  • Is the reference that most everyone here is commenting on.

    For most sysadmins, the best-case scenario is that backward compatibility is the equivalent of a broad fallow field with landmines randomly placed. Everything else is just arguing how many angels fit on the head of a pin.

    I will argue that backward compatibility is something a PHB throws out when she/he wants to say no to something without having to specify their concerns. It's not a reason why the purchase order is cut for your company versus your competitor.
  • by MrSteveSD ( 801820 ) on Tuesday November 14, 2006 @01:19PM (#16839852)
    I thought readers would be interested in exactly why Microsoft spends so much effort on backwards compatibility

    Shame they don't do that with other products. They discontinued the VB language (forget about VB.NET, not the same!) and left thousands of companies in the lurch. Millions of dollars were invested in writing VB products around the world and many of the companies do not have the finances to completely rewrite their products again in a new language. I suspect that many of them are keeping quiet since making a big noise about it would frighten both customers and investors.

    I was therefore happy to hear that Java is going open source. Perhaps we can now consider it "safe" to use.
  • Scale Models (Score:3, Insightful)

    by Doc Ruby ( 173196 ) on Tuesday November 14, 2006 @02:00PM (#16840472) Homepage Journal
    Without backwards compatibility, however incomplete, different versions of Windows would be different platforms. Which would compete with each other, and especially with new releases of apps and OS. And interfere with Microsoft's claims to represent 90% of installed computers/users/whatever. Which claims are the main factor when most people decide whether to buy/install/develop MS apps.

    Backwards compatibility, even if not extensive enough to make all installed MS hosts (including WinCE) into a single platform for the largest imagined application scale economies, is enough to create the illusion of those benefits. Which keeps people buying and building MS at the large scales which actually do deliver those lesser, but still extremely competitive, scale economies.
  • by The Great Pretender ( 975978 ) on Tuesday November 14, 2006 @02:18PM (#16840764)
    ...a non-coding standpoint. As a business we have a huge amount of data in the archives, in our case only from '91. One of our biggest issues is that if we need to access that data, we need the current platforms and application software to be backwards compatible. If the systems were not backwards compatible, we would have to dedicate one or several computers and then load the old software to access the data. I suppose that we could use emulators. Bottom line is that it would be a huge pain for IT. In addition, it would be nice if any of the Project Managers could access the data from the server directly without worrying how to view it.
  • by Ungrounded Lightning ( 62228 ) on Tuesday November 14, 2006 @05:26PM (#16844250) Journal
    Seems to me that backward compatibility issues are an OPPORTUNITY for linux.

    Windows API support in linux (ala Wine) not only CAN be done, but it's EASIER for older, frozen, versions of Windows, which are no longer moving targets.

    Seems to me that a "tested and seems to work" compatibility list for older Windows commercial apps versus an API emulator/kernel/library version number would provide:
      - IT departments with an opportunity to migrate and a starting point for doing their due-diligence checking
      - API emulator project members the feedback they need to find and fix any mis-emulation that is blocking such a migration
      - Linux evangelists a selling point
      - Management the wake-up call that it is now POSSIBLE to migrate away from their addiction to Microsoft and other proprietary software, and
      - Stockholders a hammer to use on management. B-)
