Squaring the Open Source/Open Standards Circle

Andy Updegrove writes "Before there was Linux, before there was open source, there was of course (and still is) an operating system called Unix that was robust, stable and widely admired. It was also available under license to anyone who wanted to use it, and partly for that reason many variants grew up and lost interoperability - and the Unix wars began. Those wars helped Microsoft displace Unix with Windows NT, which steadily gained market share until Linux, a Unix clone, in turn began to supplant NT. Unfortunately, one of the very things that makes Linux powerful also makes it vulnerable to the same type of fragmentation that helped to doom Unix - the open source licenses under which Linux distributions are created and made available. Happily, there is a remedy to avoid the end that befell Unix, and that remedy is open standards - specifically, the Linux Standard Base (LSB). The LSB is now an ISO/IEC standard, and was created by the Free Standards Group. In a recent interview, the FSG's Executive Director, Jim Zemlin, and CTO, Ian Murdock, creator of Debian GNU/Linux, tell how the FSG works collaboratively with the open source community to support the continued progress of Linux and other key open source software, and ensure that end users do not suffer the same type of lock-in that traps licensees of proprietary software products."
  • The REALLY nifty thing about UNIX is the userland. Without it and its tremendously clever, distinctive approach to solving problems, it's REALLY not just "Linux" you should be talking about when referring to a modern UNIX reimplementation. It's GNU/Linux.
    • This post and the dozens below it are arguing the difference between Linux the open source kernel project and Linux the brand.

      When most normal non-dev people talk about Linux they aren't talking about a kernel, separate from the development projects which rely on it; they are talking about Linux the operating system alternative. Linux is actually a really good brand and those of you who try to box it into just the kernel are missing the point of this and many other articles like it.

      If we think of Linux

  • by El_Muerte_TDS ( 592157 ) on Tuesday May 30, 2006 @06:39AM (#15427341) Homepage
    This article is more about standards in Open Source development, specifically Linux.
    To me Open Standards are much more important than Open Source. Open Standards allow Open Source solutions to be created that are compatible with the other solutions.
    • To me Open Standards are much more important than Open Source. Open Standards allow Open Source solutions to be created that are compatible with the other solutions.

      Works both ways - having standards in open source solutions allows other licensed software to be compatible with it :-)
      • Depending on what open source license the standard's implementation is released with, ofcourse.
        The most common license, the (L)GPL, pretty much blocks most other licenses.
        • Depending on what open source license the standard's implementation is released with, of course.
          The most common license, the (L)GPL, pretty much blocks most other licenses.


          I'm not sure I understand what you mean. I've never seen an open standard (documentation) released under a license designed for code. The Free Standards Group (the LSB and other free standards) releases its standards under the GNU Free Documentation License [freestandards.org].

          I'm not exactly an IP lawyer, but I don't see any obstacle to writing code to conform to
        • You are making a common error seen in the open source world -- Open Code is not the same as Open Standards.

          For example, ODF is an open standard that can be implemented by anyone (I think), but the old StarOffice file format wasn't a standard; it just had an open source implementation.
          • That's why I used the word "implementation".

            The grandparent's argument was that open source implementations of open standards lead to interoperability.

            My argument is that such interoperability depends on the open source license used for that particular implementation.

            If an implementation requires all source code to be open, closed-source vendors are forced to create their own implementation, with the added risk of less compatibility between implementations. In fact there is the added problem of closed source programmer
            • The grandparent's argument was that open source implementations of open standards lead to interoperability.

              Utterly incorrect. I wrote:

              Works both ways - having standards in open source solutions allows other licensed software to be compatible with it :-)

              Either way; open source implementation of an open standard does not necessarily result in better interoperability.

              No one said that - the idea is that if there's a standard used to create open software, anyone (open or not) will be able to follow said standar

  • by yeOldeSkeptic ( 547343 ) on Tuesday May 30, 2006 @06:51AM (#15427354)

    Unix was killed by the high price of licenses. Unix during the early 1990s was supposed to be for the big boys - the enterprise customers willing to pay up to 10,000 USD per seat for a Unix license.

    With the license for Windows NT starting at less than 1,000 USD, the enterprises that formed the majority of the paying Unix customer base soon found a way to make do with NT and delete their Unix installations.

    It wasn't open standards and the fragmentation that did Unix in, it was plain hubris among the Unix vendors who could not fathom a future where a cheaper Windows NT would replace the robust, stable and widely admired Unix they were selling.

    • by IntlHarvester ( 11985 ) on Tuesday May 30, 2006 @07:20AM (#15427407) Journal
      While that might have been true, there was a standards brawl called the "UNIX Wars" right before Windows NT showed up. So clearly some people were frustrated with the state of standardization in the Unix world.

      UNIX vendors also basically stopped workstation development (X11, Motif, CDE etc) in the early 90s when NT showed up, giving up the desktop without much of a fight.
    • The high price of the hardware was another cost issue. I never saw the receipt, but my boss said some of the SPARC 20s we give away were well over $20k when they were purchased. That's also part of what drove everyone towards DOS and cheaper PC hardware.

    • It wasn't open standards and the fragmentation that did Unix in, it was plain hubris among the Unix vendors who could not fathom a future where a cheaper Windows NT would replace the robust, stable and widely admired Unix they were selling.


      At that time, IIRC, there was a lot of criticism that Unix wasn't as robust or secure as
      the mainframes.
    • by Greyfox ( 87712 ) on Tuesday May 30, 2006 @10:15AM (#15428027) Homepage Journal
      Fragmentation was never the biggest issue -- you tended to buy one vendor's UNIX for your shop anyway. For the most part a C program compiles cleanly between one UNIX and the next, though HP/UX 9 was a bit odd when it came to networking code. UNIX licenses weren't cheap back in the day and they didn't make UNIX run on cheap PC hardware. Back in '89 a base copy of SCO Xenix for the 286 ran about $1200 through Techdata. If you wanted a C compiler, X11, man pages, or TCP/IP networking you had to spring for all those separately. You could get BSD, apparently, but there wasn't a lot of easily accessible information on installing it and it sounded like it'd be an exercise in arcane lore.

      There was an arrogant attitude toward PC hardware in the mainframe and workstation market. If you wanted to do real computing, you wouldn't use a PC -- those were just toys! Drop 15 grand on our workstation and then we'll talk. Well, PCs WERE toys for a few years, but you had to have blinders on not to see that they were going to make progress. That arrogant attitude persisted while the 386 and then the 486 came out, all the while Windows NT and, to a lesser extent, OS/2 started stealing more and more business from the traditional UNIX vendors.

      And while the UNIX vendors arrogantly believed they had a better product, not a single one of them ever made an effort to push the GUI portion of UNIX beyond CDE (Well... except NeXT and SCO, but SCO's offering was a step back from CDE.) Gnome, KDE and Enlightenment were all efforts of the Open Source community and to my knowledge Sun's really the only one of the old guard to even consider using one of them. Hell, even Afterstep is a step up from the commercial vendors' offerings.

      In the end it was cheap Intel hardware and cheap Intel operating systems that did the old guard in. Windows on a Pentium made a server that worked well enough that it was impossible to justify the price jump of an order of magnitude to get just a little bit more. And I doubt there are more than a handful of companies that would even consider putting UNIX on an employee's desk. Had the old guard of UNIX vendors played their cards right and embraced PCs as a natural extension of their high-end UNIX systems, things might have gone differently.

      The current situation is rather interesting. The cost of Windows licenses is significantly more than the cost of Linux licenses. Microsoft can't really compete with free, so they have to find other avenues of attack. That, more than fragmentation, is the biggest danger to Linux. Most commercial companies only deal with RedHat or SUSE anyway. I don't know what the future will bring, but we most definitely live in interesting times.

  • by vdboor ( 827057 ) on Tuesday May 30, 2006 @06:51AM (#15427355) Homepage
    The talk about the LSB is nice, but one of the major problems of Linux is the diverse locations where KDE and GNOME are installed. Some use /usr, others use /opt/kde3, or /opt/kde/3.x. Does the LSB already address this issue? These diverse paths are the main reason I can't deploy one RPM/DEB/TGZ package for all Linux distributions.

    All mainstream package formats have the full installation path hard-coded in the archive. LSB does not address this yet. The other problem of RPM, namely binary compatibility between different library versions, is already solved by compiling with apbuild [autopackage.org]. This works surprisingly easily, and allows me to provide one single package that can be installed everywhere [1].

    [1] I recommend compiling packages on Slackware because Slackware ships most packages without patches. Compiling an app on SuSE, for example, made the binaries depend on ABI changes caused by SuSE patches.
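
    As a small illustration of the hard-coded-prefix problem described above, here is a minimal sketch; the paths are just examples of two distros' conventions, not a definitive list:

      # Building the same source against two different prefixes:
      ./configure --prefix=/usr && make && sudo make install        # distros that put KDE under /usr
      ./configure --prefix=/opt/kde3 && make && sudo make install   # distros that use /opt/kde3
      # Whichever prefix was chosen at build time gets baked into the binaries and the
      # package's file list, which is why one RPM/DEB/TGZ rarely fits every distribution.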

    • I know this is a major complaint of mine.

      I simply don't understand why this has never been addressed.

      The Linux community is always talking about expanding and competing with the Windows world, but they shoot themselves in the foot on trivial details like this.

      The response I often get when I ask why don't we change to something that makes more sense is, "if you want a product more like Windows, then use Windows. We don't want our product dumbed down."

      However, just because a product is difficult to use does
      • Basic inconsistencies like these frustrate people attempting to switch

        Wrong. Nobody switching to Linux gives a shit what directory their KDE is installed in. Believe it or not most people have more important criteria that they demand from their computers, and are much more likely to switch back to Windows if they are required to look in their KDE directory in the first place.

        "If you want a product more like Windows, then use Windows. We don't want our product dumbed down."

        This is the OSS Godwin. People turn

        • Actually I switched every box in my house to Linux. My boxes were all Gentoo, and Suse for my wife.

          She was incredibly pissed at Windows and wanted to increase her geek cred (which is substantial). However, she is back to Win x64. She is pretty smart. But she'd download an RPM, try to install it, then have no clue where the program was because it didn't create an entry in the menu, and she'd have no clue where the program directory was.

          Are you telling me that isn't something that would annoy someone at
          • Actually I switched every box in my house to Linux. My boxes were all Gentoo, and Suse for my wife.

            She was incredibly pissed at Windows and wanted to increase her geek cred (which is substantial). However, she is back to Win x64. She is pretty smart. But she'd download an RPM, try to install it, then have no clue where the program was because it didn't create an entry in the menu, and she'd have no clue where the program directory was.

            Yes, we all know RPM sucks... so why didn't you just give her Gentoo al

          • However, if in a specific instance the Windows method is better, shouldn't it then be preferable?

            Only if it can be added in such a way that it has zero impact on those
            of us who are not interested in it. Nothing pisses me off more than when
            I have to relearn how to configure fundamental subsystems because they've
            been changed to make things easier for users of software that I don't use.

            Out of curiosity, why didn't you show your girlfriend the find command?
            If that wouldn't have increased her geek-cred, then not
            • As I said in another post, she is impatient. That's why I couldn't install Gentoo on her laptop. She shouldn't have to hunt and find something.

              The funny thing is that I put on Beagle to speed up searches on her laptop, but I don't think she ever used it. Like me, she is anal in how she organizes things. Having a good file structure means you don't have to hunt for items. All Beagle did was take up over 2 gigs with indexing data. Sheesh!

              (Side note, but I think an even happier medium is virtual folders
          • Using SuSE, doesn't YAST list the files installed? I know that Mandriva's urpmi does. Even in the graphical mode (actually, moreso). And why should she download RPMS? Were they specific SuSE packages or simple RPMs packaged for another distro? Aren't there package repositories for SuSE Linux? I understand that these are steps you might have tried already, but since you say nothing about them... Even in the worst case, downloading RPMS from RPMbone or similar sites, they provide a list of the files contained
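
            For reference, a couple of ways to answer "where did that RPM put its files?" from the command line (the package name is hypothetical):

              rpm -ql someapp                # list the files of an already-installed package
              rpm -qlp someapp-1.0.i586.rpm  # list the files inside a downloaded .rpm before installing it
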
          • Your story and others like it ring far too true. For my part, I find Linux and other community-made open-source OS's suitable when I have a stable list of things I expect from the system: file sharing, print sharing, routing, firewall services, web services, etc.

            When I want a computer as a flexible environment, however, in which I will install and uninstall games, media players, various productivity applications that I may be trying out, and the like, I just can't imagine going back to Linux. In the 4+ yea
        • Wrong. Nobody switching to Linux gives a shit what directory their KDE is installed in. Believe it or not most people have more important criteria that they demand from their computers, and are much more likely to switch back to Windows if they are required to look in their KDE directory in the first place.

          Exactly! You seem to have missed the point the poster was getting at. It may well be necessary to look in the KDE, and other, directories to simply get KDE to install on a given system, because the files get
      • Lately, whenever I find a program that doesn't give itself a proper menu item, I've been filing it as a bug. I suggest doing the same!

        I agree partially with your post in that a standard directory structure would be useful, but on the other hand I think it's very important that operating systems have the freedom to reorganize things as they see fit.
        Personally I think all programs should be flexible enough to be relocated easily without being recompiled, but that's another story.
        • I just don't understand the need for binaries to be placed in ten different locations arbitrarily. I understand somewhat why the structure started the way it did, for security purposes. But security permissions have evolved a great deal, and I believe a much simpler tree is best. Do we really need anything past /home, /usr, /var, /bin, /src and /tmp? Maybe /log? Would it be so horrific to have a simple tree with all binaries branching from the /bin folder?
        • Sure, all programs should be relocatable without needing a recompile, so you could run 2 versions on the same box. That would be a big plus for linux.

          However, they should all go in a standard place. Windows has 'Program Files'; if you want a binary, you know exactly where it is. Unless it's something like PHP, which only installs into c:/php and breaks if it goes elsewhere. That is rubbish in the Windows world.

          The same (doubly so) applies to configuration files. Whereas nearly everything ends up in /etc or a
    • Leaving aside the issue of patches, if standard environment variables couldn't solve the non-standard location of things, then it would seem to me you're asking for something akin to the Windows Registry, maybe more elegant; doesn't there need to be a standard place to find the information you're seeking? It seems a better solution than getting all distributions/programs/modules/etc. to standardize where they should be deployed - users tend to have different ideas about such things anyway. Also,
    • This is one thing I've been trying to figure out.

      So different distros will put their files in different places. (Actually, I can't believe programs will actually have the library locations hard-coded in, but whatever; I'll accept that the alternatives have some disadvantages.) So Ubuntu will store its WonderfulLibrary.so in /lib/UbuntuLib/, and Slackware will put it in /var/log/opt/etc/usr/lib/. So why can't we just massively symlink the bloody directories together? Someone create a script file with two
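
      A rough sketch of the kind of compatibility-symlink script being asked for above, reusing the (tongue-in-cheek) example paths from that comment; real distro layouts would obviously differ:

        #!/bin/sh
        # Make one canonical copy of a library visible at alternative locations via symlinks.
        set -e
        LIB="WonderfulLibrary.so"
        CANONICAL="/usr/lib/$LIB"                        # where the file actually lives
        for ALT in /lib/UbuntuLib /var/log/opt/etc/usr/lib; do
            mkdir -p "$ALT"
            [ -e "$ALT/$LIB" ] || ln -s "$CANONICAL" "$ALT/$LIB"
        done
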
  • Fear of fork. (Score:5, Interesting)

    by killjoe ( 766577 ) on Tuesday May 30, 2006 @06:53AM (#15427358)
    The article summary is a bit of flamebait. In order for a product to fork, there must be two forces in action.
    1) Licensing that allows a fork.
    2) Frustrated users who feel like they can't shape the future of the product via existing channels.

    This is why there are at least three forks of Java and none of Perl. I suppose one could argue that the forks of Java are not true forks but attempts at re-engineering, but the end result is the same.

    Will Linux fork like Unix? Well, in a way it already has - there is a real-time kernel, different kernels for devices, etc. - but not in the way the article talks about it. The article isn't talking about forks per se; it's talking about distros. The author seems to have missed the point that the Unix forks were actual forks in the kernel not "just" distros.

    Weird article really. Kind of pointless too.
    • From the users' point of view: can I install a Gnome-based application on a KDE desktop, without having to install anything else, and expect it to work just as well?
    • The author seems to have missed the point that the Unix forks were actual forks in the kernel not "just" distros.

      That's a distinction without a difference. You, as a user or developer, never interact with the kernel. The closest you are likely to come is calling a libc function that is a thin wrapper around a system call. If the included libraries are different, or they use different versions of the compiler which conform to a different ABI, then the same code will not easily run on both.

      Oh, and the

    • Re:Fear of fork. (Score:5, Insightful)

      by Enderandrew ( 866215 ) <enderandrew@NOsPAM.gmail.com> on Tuesday May 30, 2006 @07:30AM (#15427443) Homepage Journal
      Linux is a kernel, not an OS, but in common parlance, Linux might as well refer to the OS.

      As an OS Linux is horribly fragmented. That is why people flock to a popular distro like Ubuntu, regardless of whether or not it is the best distro.

      Personally, I do believe that the community needs fewer distros. There should be three methods for installing, period. Something like apt-get, emerge, and installing from a downloaded RPM (see the examples just after this comment). You shouldn't see different binaries for different distros. A Linux app should be a Linux app, period.

      If we had true standards, we'd have fewer distros. But how many methods and standards do we have for installing programs? For file structures? For menu structures?

      In what I believe to be a perfect world, there would only be maybe 8 major distributions of Linux.

      Home/Personal
      Developer
      Media Center
      Server

      For each of those 4, you get a focus on either GTK or QT apps. Regardless, the file structure, configuration files, menu structure, etc. would be the same for every distro.

      And while this will NEVER happen, I think we need one major development kit, instead of GTK vs QT. When it comes to aesthetics, visual style and usability, I can certainly understand people wanting a choice between Gnome and KDE. But when I design an app, I should build it on one toolkit, and then it should work on both Gnome and KDE, letting Gnome/KDE handle how it looks, etc. As it stands now, the dependency chains are ridiculous. If I use KDE but want a few GTK apps like Firefox or GAIM, I have to install half of Gnome.
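      Side by side, the three install methods mentioned above look roughly like this (the package name is a made-up example):

        apt-get install someapp            # Debian/Ubuntu-style binary install
        emerge someapp                     # Gentoo-style build-from-source install
        rpm -ivh someapp-1.0.x86_64.rpm    # installing a downloaded RPM directly
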
      • Home/Personal
        Developer
        Media Center
        Server

        You missed out 'Ultimate'.

        HTH

        • Actually right after I hit submit I realized I should have put Gamer.

          There are plenty of people who want a completely streamlined, tweaked out build specifically for gaming.
      • I just installed Ubuntu and one of the best reasons for using Ubuntu is Automatix, followed by the Ubuntu wiki and forums.

        One of the big problems with Linux is non-free formats. Simple reassuring things like the ability to play an MP3 or watch a DVD are stumbling blocks for new adopters of Linux.

        I understand the free and open principles and how MP3 has legal problems while Ogg Vorbis is patent-free and anyone can use it. It still doesn't help much when you want to play an MP3 or a DVD.

        Autom
      • I agree with a great deal of what you said, Enderandrew, but I have to disagree with some. This has all been gone over at DistroWatch time and again.

        Yes, there should be standards. We already have good installers and package managers. The Fedora install works great, so why does every distro need a different one? And like you said, a Linux app should be a Linux app. You ought to be able to apt-get what you want and install a package you find at, say, IceWalkers without so much trouble. A standardized file str

        • YaST would still exist as the SuSE installer and configuration tool. But here is why YaST can't be used anywhere else: SuSE puts configuration files, and the whole file structure, in different places than anyone else. So they write this great tool, and most of the Linux community doesn't get to see it.

          How is that really helpful?
      • You shouldn't see different binaries for different distros. A Linux app should be a Linux app, period.

        Amen! Not only is it frustrating figuring out where all the config files are, but having an app fail to install or work because of dependency or lib versions is also frustrating. I remember having fits trying to install Oracle 8i (circa 1999-2000) and having the install fail because the linker was choking over libc version incompatibilities and LD_ASSUME_KERNEL settings. Of course, all the problems cou
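
        For the curious, the workaround alluded to here looked roughly like the sketch below - telling glibc to assume an older kernel so the legacy threading libraries get used; the version value is the commonly cited one and the installer name is only a placeholder:

          export LD_ASSUME_KERNEL=2.4.19   # pretend to be an older kernel (example value)
          ./install.sh                     # placeholder for the vendor's installer
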
      • Re:Fear of fork. (Score:3, Informative)

        by swillden ( 191260 ) *

        And while this will NEVER happen, I think we need one major development kit, instead of GTK vs QT. When it comes to aesthetics, visual style and usability, I can certainly understand people wanting a choice between Gnome and KDE. But when I design an app, I should build it on one toolkit, and then it should work on both Gnome and KDE, letting Gnome/KDE handle how it looks, etc.

        That's nearly how it works now, and the Free Desktop folks are pushing it closer to that ideal all the time. Programmers should

    • 1) Licensing that allows a fork.
      2) Frustrated users who feel like they can't shape the future of the product via existing channels.

      I'd add:
      3) A lack of a passable alternative

      There wouldn't be much point forking product X if product Y met the requirements.

    • You talk about the old Unix forks, treating "kernel forks" as real forks and userland forks ("distributions") as of little relevance. This is the opposite of how I remember my experience of those days, and the opposite of what my theory indicates should happen.

      The kernel is pretty much irrelevant for portability. Userland headers, C library compatibility, file locations, compiler options, linker options, Bourne shell incompatibilities, C compiler incompatibilities, C compiler and library bugs, word

  • by Anonymous Coward
    Those wars helped Microsoft displace Unix with Windows NT, which steadily gained market share until Linux, a Unix clone, in turn began to supplant NT.

    When did this happen? I must have missed it.
  • by wandm ( 969392 ) on Tuesday May 30, 2006 @06:56AM (#15427368)
    I'd like to support the nonfragmentation of Linux - as I guess many would. But the LSB 3.0 certified list at http://freestandards.org/en/Products [freestandards.org] just shows Red Hat, SUSE and Asianux. Are these all the choices I have?

    Could someone please explain this to me?
    • Support standardization where it makes sense.

      For distros that have a regular release cycle, something like LSB makes
      sense. For distros that are moving targets by design (Gentoo, Arch,
      Debian), any standard that specifies specific versions of
      libraries and compilers would reduce the value of these distros and so
      they're better off ignoring those parts of the standard (and thus will
      never be certified).
      • For distros that are moving targets by design (Gentoo, Arch, Debian)...

        Perhaps it's a matter of opinion, but I'd hardly call Debian stable (plus security updates, of course) a "moving target". Isn't the real reason that LSB requires RPM? (Not wanting to start a flame war, the greatest benefit I found when I switched from R.H. to Debian was no longer having to use RPM. But that's just my personal preference, I guess.) In fact a search leads us to Red Hat package manager for LSB package building [debian.org] which says

  • Splintering (Score:4, Insightful)

    by ronanbear ( 924575 ) on Tuesday May 30, 2006 @07:02AM (#15427376)
    Splintering is also something that helps Linux innovate so rapidly. If you have a good idea and are willing to do the work you can pick a distro that suits your needs. If there isn't one for you or the distro maintainers aren't receptive to your ideas you can fork a distro and experiment on your own.

    Sure this leads to some incompatibilities and duplication of work, but there are several ways for developers to mitigate this. Open standards are essential as they allow code to be ported between distros rapidly. Another good idea is for devs to be involved (in some way) with using multiple distros. Different projects could work together more closely to achieve better interoperability.

    It's an essential aspect of forking to accept that many forks are dead ends and should be allowed to die or merge back into the tree where desirable. There are many good projects out there and it isn't really in everyone's interest to reinvent the wheel continuously.

  • by Big Sean O ( 317186 ) on Tuesday May 30, 2006 @07:03AM (#15427380)
    All your (Linux Standards) Base are belong to us.
  • by drsmithy ( 35869 ) <drsmithy&gmail,com> on Tuesday May 30, 2006 @07:08AM (#15427387)
    It displaced Netware.

    Similarly, Linux isn't displacing NT, it's displacing commercial UNIX.

    The overlap of functionality between NT and Linux is, really, quite small. There aren't many cases for which Linux is a good solution, where NT could also be (and vice versa).

    • Absolutely right. It killed Netware completely by offering all the network functionality with an operating system! No more installing drivers (ie buying an OS and then buying a NOS, just buy NT and have both).

      Similarly, NT drove off commercial Unixes - you never hear about AIX or HPUX anymore.

      However, the factors that made the NT market (ie cheap whilst still being good enough for purpose) should be the factors that make Linux kill NT in just the same way. The trouble is that Linux doesn't provide all that
      • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Tuesday May 30, 2006 @08:57AM (#15427699)
        NetWare would broadcast its serial number on the network.
        It killed Netware completely by offering all the network functionality with an operating system! No more installing drivers (ie buying an OS and then buying a NOS, just buy NT and have both).
        You could not install two NetWare boxes with the same serial number. They would kick out all the users. If you had a single license for 25 users, that's all you could have until you purchased more licenses.

        NT did not broadcast its serial number. You could buy a single copy of NT and install it a thousand times. If you needed a new file server or a temporary file server, it was so much easier to setup another NT box.
        Similarly, NT drove off commercial Unixes - you never hear about AIX or HPUX anymore.
        Yes you do. But they're still in the organizations that had them before.

        What has changed is that Windows servers swept through the smaller companies. Those companies never had a *nix box. They might have had LANtastic or NetWare or nothing, but they did not have *nix.
        However, the factors that made the NT market (ie cheap whilst still being good enough for purpose) should be the factors that make Linux kill NT in just the same way.
        Okay, I can agree with you on that.
        The trouble is that Linux doesn't provide all that - although its price trounces Windows, and its feature set is damn good, it just doesn't have the 'polish' or the standardisation that matters to a business.
        I guess that depends upon what business segment you're talking about.

        Linux has been showing double digit growth for the past 5 years (maybe longer). Businesses are deploying it. At the server level.
        No business will go Linux for general purpose use (ie, if you standardise on a distro to run a particular app, then you're fine, but you have a controlled ecosystem) where users use it to do everything because it doesn't have the "shrink-wrapped" approach to apps.
        Now you're talking about the desktop segment.

        The corporate desktop segment is different than the corporate server segment.

        And the biggest problem with the corporate desktop segment is all the Access databases that have been built over the years.

        The 2nd problem is all the not-supported-or-sold-anymore Windows apps that users "absolutely must have to do my job" that they've acquired over the years.

        Changing 10 servers is easier than changing 10 workstations for users who've spent 10 years with the company.
        This is the big issue, and it keeps Linux in the realm of the hobbyist market (we'll ignore the outsourced, this-is-what-you'll-get approach from a big consultancy).
        You might want to take a look at Google before you talk about "hobbyist market".
        Once those 2 things are there, so I can take a binary package and install it on whichever distro I use (it's Linux after all, isn't it? - at least that's what Joe User will say) then Linux will be accepted a lot more readily.
        I'll have to disagree with you on that.

        While that would be nice, it is far more likely that one distribution will become dominant and that distribution's structure will become the de facto "standard".

        And it seems we're already on that path with Red Hat and Ubuntu.
      • This is the big issue, and it keeps Linux in the realm of the hobbyist market (we'll ignore the outsourced, this-is-what-you'll-get approach from a big consultancy). If Linux wants to be taken seriously, the prime things it has to get right are standardisation on some simple features (eg basic directory structure), and binary installers. No 'just ./configure and ./compile', just 'yum install xyz'. (or apt-get, I don't care which) that can install anything.

        I find using apt-get with a decent GUI front-end to

    • The overlap of functionality between NT and Linux is, really, quite small. There aren't many cases for which Linux is a good solution, where NT could also be (and vice versa).

      That does not matter to the manager who wants a particular OS deployed for a particular solution. A few years ago I migrated a Netware printing system that handled tens of thousands of documents per day to an NT solution. It ended up requiring 16 NT servers to replace 3 Netware servers. Of course NT was not the correct solution but
      • NetWare was very very good at File+Print, sure, but it was also not very cheap to run. Server licenses, connection licences, the administrative skillset, supporting the client software, in some cases supporting an IPX WAN, etc. Depending on the situation, buying extra NT servers to replace Novell could well have been cheaper.
  • LSB not opensource (Score:3, Insightful)

    by grahamm ( 8844 ) <gmurray@webwayone.co.uk> on Tuesday May 30, 2006 @07:13AM (#15427394) Homepage
    The thing I do not like about the LSB is that it seems to be pandering too much to the desires of the closed source application suppliers who only ship binaries, not source. When the application is available in source form, many of the issues addressed by the LSB become either irrelevant or much less important. The distributions can build RPMs, debs, etc. for their own distribution, and for the most part users can install from source using the 'standard' "./configure; make; sudo make install" without having to worry about having the exact layout and library versions mandated by the LSB.
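
    Spelled out, the 'standard' source install mentioned above looks like this; passing --prefix is the usual way to keep a from-source build out of the distro's own layout (the tarball name is a made-up example):

      tar xzf someapp-1.0.tar.gz       # unpack the source
      cd someapp-1.0
      ./configure --prefix=/usr/local  # choose where it will be installed
      make
      sudo make install
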
    • Agreed.

      How can packaging be such an issue for the commercial vendors, when huge projects like KDE, GNOME, PostgreSQL, MySQL, etc. manage to have packages for all major distros? I fail to see how hard it is to maintain build scripts for RedHat, SuSE and Debian boxes to automatically generate RPMs, DEBs and tarballs.

      I think that the scenario is pretty much defined: we have RPMs for RedHat/Novell-based distros, DEBs for Debian and its offspring, and TGZs for everybody else.
  • by munro ( 265830 ) on Tuesday May 30, 2006 @07:16AM (#15427399)
    The funny thing about the LSB is that it concerns APIs for use by userland programs - it has _absolutely nothing_ to do with the kernel. All of the requirements for LSB compliance concern calling conventions, executable formats, libc, POSIX facilities, filesystem layout and other extra-kernel configuration, most of which any UNIXoid system could support.

    There are no obstacles to Darwin, *BSD and Solaris systems meeting LSB compliance, because it has nothing to do with kernels and everything to do with the specific details of a UNIX userland environment.

    Generally I don't get into 'Linux' vs 'GNU' discussions, but the LSB is one case where I feel the name 'Linux' is used completely inappropriately.
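
    As a small illustration of how userland-centric the LSB is, its conformance surface is the sort of thing a shell user can poke at directly, e.g. the lsb_release utility that LSB-conformant systems are expected to provide (output will vary by distro):

      lsb_release -a    # distributor ID, release, codename and claimed LSB version
      lsb_release -v    # just the LSB version string(s) the system claims to support
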
    • Well, if you'd read the article you'd see it discussed in the second answer:

      "2. How does FSG work with the Linux development team and the Linux process?

      Actually, the LSB doesn't specify the kernel--it only specifies the user level runtime, such as the core system libraries and compiler toolchain. Ironically, then, the _Linux_ Standard Base isn't Linux specific at all--it would be entirely possible (and probably not altogether hard) for Solaris to be made LSB compliant. The LSB is entirely concerned wi
  • FUD alert (Score:4, Informative)

    by Cardinal Biggles ( 6685 ) on Tuesday May 30, 2006 @07:52AM (#15427495)

    What's this about "various types of licenses" under which Linux is supposed to be available? Linux is GPL, so forking is possible, but there is no risk of UNIX-style fragmentation because the source is open and copyleft. For somebody to create a "closed Linux" they would have to start from scratch. You can't add closed bits to GPL software and keep them hidden, so any incompatible Linuxes ("fragments") could always be re-connected by users irritated about the differences.

    The nonsense about UNIX being displaced by NT and NT in turn being displaced by Linux had already set off my alarm, but the above really is FUD designed to further somebody's personal agenda.

    It is not possible for UNIX-style fragmentation to happen to Linux, because of the GPL.

  • Unix never died (Score:4, Interesting)

    by Steeltoe ( 98226 ) on Tuesday May 30, 2006 @07:53AM (#15427499) Homepage
    From the submission:
    Unfortunately, one of the very things that makes Linux powerful also makes it vulnerable to the same type of fragmentation that helped to doom Unix - the open source licenses under which Linux distributions are created and made available.

    I believe fragmentation has very little to do with the doom of UNIX. My top three reasons are:

    1) Price of purchase
    2) Expensive/hard to administer
    3) Stagnation in development

    Users want the cheapest, easiest and most feature-filled solution. It's pretty straightforward actually, and a Personal Computer with Windows was the first to fill the niche, if you leave out Apple.

    Apple lost because they wanted a monopoly on _both_ hardware and software, while Microsoft only wanted to control the OS (in the beginning). More importantly, Microsoft was better at hyping/marketing their next generation, something that Apple has learned to do better in recent years.

    UNIX and IBM lost because they failed to scale down to personal computers, which is where the commoditization of computing happened in the '90s. IBM and other mainframe dealers refused to understand the Personal Computer (too much vested in big contracts), and thus the clones took over along with Microsoft Windows while the dinosaurs waited it out.

    Without the IBM PC clone, the computing world would probably look very different today. In those days it was very attractive to be able to upgrade the PC, exchange parts and use commoditized hardware for the whole rig. Many tasks which rented expensive CPU time on UNIX mainframes were moved over to PCs during the '90s.

    Fragmentation, no doubt, can be very bad for development, but it is also a boon since it leaves developers free to explore different avenues regardless of politics and limitations. I think once a system becomes popular enough, like "Linux", the demand for standardization will pull it together. Hey, even the BSDs keep compatibility with "Linux".

    What killed UNIX was lack of creativity, focus and commoditization, too much control and, maybe most importantly, arbitrarily high prices just to milk customers.

    Linux may have killed off UNIX (oh, the irony), but NT has been beating the crap out of it for many years. Linux and UNIX never actually competed on even terms, because UNIX had already been pretty much abandoned for a long time - its owners only keeping it around for milking the last drops.

    My pet peeve with bash and the GNU utilities is the lack of standards, and the lack of further development of the command-line. In that regard, I hope "Linux" can progress without having to be beaten by Microsoft releasing a better command-line.

    POSIX is really an antique joke compared to what could be possible via the command-line. So the trap "Linux" might fall into is the same as for UNIX: stagnation, because most users drool over eye-candy and not the actual implementation in the back-end. However, maybe the cost of switching command-lines is not worth the gain; time will tell.

  • It might be that the LSB makes life easier for distribution, but does it also have an effect on developers and users? I don't remember ever having looked at the LSB when designing and coding an application, nor when distributing source files. And I'm quite sure most users don't even know that the LSB exists. While the LSB is very important for binary distribution, its influence on a Linux system is rather limited. Yet the FSG only cares about the LSB, and therefore its importance is also rather limited.

    A
    • Oh, the LSB's influence is a very minor thing, but as with life, it's the little things that count most.

      Look at http://freestandards.org/docs/lsbbook/install-app.html [freestandards.org] and you have something that takes no effort from the developer or installer, but will make a hugely beneficial difference to the sysadmin (see the layout sketch after this comment).

      A GUI standard would be good too - but I doubt its time is here for Linux yet. Reminds me of the Windows Usability Guidelines that used to exist; they made Windows a much more consistent interface and were a Bible for
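      A sketch of the add-on application layout that the install-app guidelines linked above point towards; the package name is hypothetical and the exact rules are in the linked document:

        install -d /opt/example-app/bin     # the application's own executables
        install -d /opt/example-app/lib     # its private libraries
        install -d /etc/opt/example-app     # host-specific configuration
        install -d /var/opt/example-app     # variable data such as logs
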
  • by erktrek ( 473476 ) on Tuesday May 30, 2006 @09:11AM (#15427748)
    I thought forking and merging (thanks to the license) are strengths of Linux. It's an evolutionary development process where the best or most popular ideas rise to the top. Basic standards arise out of the most widespread/adopted projects.

    It's hard for me to see in this chaotic (but necessary) environment how much external control developers are willing to have "imposed" on them by such standards - unless of course from a development/technical standpoint it makes sense.

    My understanding is that Linux really isn't in the game of competing with anybody (Unix, Windows or otherwise) anyway. It's just about the code, love of all things computer, and a new way of doing things.

    E.
  • Where's the script that installs symlinks on any distro's filesystem that makes every "filesystem API" available on any Linux distro?
  • Those wars helped Microsoft displace Unix with Windows NT, which steadily gained market share until Linux, a Unix clone, in turn began to supplant NT.

    Does anyone have figures for this - that Linux has supplanted NT?
    Not that I don't believe it - since it's many years since NT was sold, it may
    be possible that Linux has begun to supplant NT. Hopefully by 2010 or so (when
    Vista will be released ;-)) Linux will have begun to supplant Windows 2000.
  • Ah yes but (Score:4, Informative)

    by The Cisco Kid ( 31490 ) * on Tuesday May 30, 2006 @10:00AM (#15427952)
    Linux has one thing Unix never did - if someone forks it and does something innovative, then other forks/branches can use it too, thanks to the GPL. The various 'Unix' flavors didn't allow that.
  • by The Pim ( 140414 ) on Tuesday May 30, 2006 @10:38AM (#15428151)
    Happily, there is a remedy to avoid the end that befell Unix, and that remedy is open standards - specifically, the Linux Standard Base (LSB).
    Let's see, Linux has been around 15 years, and there has been a competitive commercial marketplace for, what, at least 10 of those. The LSB has been relevant for how many of these? Oh wait, it's still not relevant! Whatever has saved Linux from "the end that befell Unix" so far, it ain't the LSB.
