GNU is Not Unix

Linux to Fragment? 175

King_B writes "news.com has an article in which Sun's COO Ed Zander addresses the competition. One point to note is his prophecy concerning the eventual fragmentation of linux into non-compatible vendor-specific linuces. " Doesn't really say anything new, but nothing else seems to be happening today *grin*. People have been preaching about fragmenting Linux for years but it hasn't happened. And even if it did, I somehow doubt it would matter all that much. But it still gives COOs something to talk about I guess.
This discussion has been archived. No new comments can be posted.


  • A while ago, I would have replied to this with the standard "It's open source yadda yadda yadda choice is good". However, after seeing Red Hat's latest hijinks with gcc, I'm starting to worry that this sort of thing will become a trend. So first it's gcc that's incompatible. Then what? XFree86? glibc?

    More work needs to be done to adhere to standards [linuxbase.org] and distros not doing their own thing just for the hell of it. I'd really hate for Microsoft's 'mutant' ads to ring true.
  • Exactly!

    It's the freedom of choice of the whole thing that makes it all work.. With 'an entity' in charge of 'all things Linux', of course it can get perverted. Deals are made. Things are made to work a particular way without any alternative. In Linux land that is different and there will always BE alternatives. I think the same was said about RedHat a while ago 'Oh!.. what if RedHat try to take over Linux!?!?' .. Bzzzt.. Wrong. Not happening. Why? .. Because someone else will just do things THEIR way and in the end its the user that decides what they put on their machine, and if there's better alternatives then we use them. That's why a lot of us are using Linux anyway in the first place.
    --
  • Yeah, that's like saying "Don't elect Dick Cheney as VP as his heart might explode if you sneeze loudly" .. it's possible but ..

    Umm... yea..

    Hmm.

    --
  • Why are people even worrying about this? Sun represents the old and largely failing UNIX industry where hardware vendors try to get a competitive edge over other vendors by being odd.

    I think it's a miracle that Sun even exists today. Perhaps this goes to show how valuable a good brand name is. I mean, Sun hardware is expensive and delivers less bang for the buck than other hardware solutions -- yet they survive.

    If you are going to listen to the generation of business people who did their best to run UNIX into the ground, to make it marginal, to make it expensive, to make it exclusive, you should have your head examined. These people were not responsible for the great UNIX surge of late. It wasn't because they did something great. They were just lucky to still be in business.

    So when Ed Zander or Bill Joy talk about UNIX, or Linux or open source or even Java I can't really say I get very excited because I don't think they have much important to say.

    So what if Linux fragments. It has fragmented already. There are many different Linux kernel projects and if people fail to see that this is beneficial to the Linux kernel development they need to get off the drugs they are on.

    If Zander is trying to get attention by Metcalfing then so be it, but people should be able to recognize it for what it is.

  • As a *nix developer and system specialist, I already have to know the difference between HP-UX, AIX, Solaris, Linux, et al. While they are similar, coding an application for each requires knowing the caveats of each and using tons of #ifdef statements. Why should the idea of Linux fragmentation be any scarier?
  • The only incompatibility I've heard of is with compiled binaries. I don't see how this is the start of some evil trend. Unless you are using a whole bunch of proprietary precompiled binaries, any package that needs it can simply be recompiled using the newer libraries or compiler.

    In fact, I have a hunch that Red Hat took care of most of the dirty work in doing that when they put 7.0 together. If anything, they've made more work for themselves, since they now have to recompile any patched 3rd party software with each compiler/library combination.
  • It doesn't matter that Redhat is using a version of GCC that's not compatible with Debian, or that Mandrake has different configuration files than Stormix. It doesn't even matter that redhat 7 RPM's don't work with anything else. Why? Because no one needs to take any of these parts and interchange them! As long as I can still compile the same source on each, it really doesn't matter.

    Did Linux fork?
    no
    Are distributions different?
    yes.

    Ultimately, whether corporations create their own versions of Linux and purposefully make them incompatible with the main one doesn't matter at all. Whatever version they create is GPL'd. If it's worth it, compatibility gets added to the mainstream; if not, no one will be using their crap-ass distro anyway! Think about it; how many distros out there are just gathering dust in some dark corner? That's just because they suck. Imagine if they broke compatibility with EVERYTHING else out there! There is VERY strong pressure to keep things compatible and no need for Quote: "strong cohesive force to keep Linux...on track."

    I can see it now! you get a brand new sparc machine that kicks ass, Oh but it came with Sun-Linux which is incompatible with Linus-Linux. Abracadabra! fdisk ; apt-get Debian! :O)

  • I say it's already happened, for all practical purposes, but not over the desktops (pbbbt!). It's in the debates over "which Linux should I get", and "will this binary package work on my Linux", or "damn! why doesn't this work on my Linux", and "Should I use the Red Hat package system, or the Debian?" and a few others I've come across among my friends in the Linux world... I know we've lost many newcomers to all this confusion, I've watched it happen.

    I try to get them to try NetBSD [netbsd.org], which has one (series of) kernel, one core SW install distribution, one package system (which beats Red Hat's all to hell, but could take a couple of lessons from Debian's if what I've read is accurate), runs on almost anything, is supposed to be able to run Linux binaries (I've never tried), and other Good Things (TM). It's far less overwhelming/daunting for a newbie than the Linux menagerie. I think that the one major technical point holding it back is the install; it's not pretty from any angle...

  • Yeah, that's what I said above, only a little better :) Someone mod this up.
  • PCs dominate the world today because the spec was opened up

    Partly... but more because it bore a name that carried big weight in the business sector, which had more money "laying around" for trying expensive new toys, introducing them at work to people who otherwise would never have considered a computer, etc.

    And don't forget that IBM didn't open the entire machine; the BIOS had to be reverse engineered; and very carefully, to avoid legal hassles.

  • Linux will fragment when the PC fragments. The PC is commodity hardware, made from interchangeable parts acquired from different vendors according to known specifications and assembled by people who stamp a brand name on it. Linux is the exact same thing in software. The kernel, XFree86, the GNU tools, whatever you raid from freshmeat: integrate it all into one big package.

    Anybody who wants to can become a PC vendor, just like anybody who wants to can become a linux vendor. Same difference. It's not a cause of fragmentation, the little guys have to be MORE standard because they don't have the clout to push for changes in the standard base. Only by BEING standard can they get anybody to pay attention to them.

    If Seagate made an incompatible hard drive that didn't conform to the ATA spec, Dell, Gateway and Compaq wouldn't use them. They'd fold. If Dell put out a computer that wasn't compatible with Gateway's and Compaq's, they'd get bad PR and lose customers. The PC HAS fragmented before, and the offshoots died because the main base simply outgrew them and rendered them obsolete.

    It's the exact same thing with Linux. Compatibility is evaluated by consumers and enforced by consumers who decide what they want to use. It's that simple. Tandy didn't make compatible stuff, they lost out. -IBM- stopped making compatible stuff (PS/2, PC Jr.), they lost out.

    Any enhancement that can spread and be adopted by other vendors becomes a new standard. Any that can't diffuse in this way is eventually ostracized (even initially successful stuff like US Robotics HST modems: if it's single-vendor proprietary it is DOOMED to inevitably fall by the wayside. The commodity stuff out-evolves it over time. Guaranteed.)

    We've got decades of history here, the trend's not hard to spot. Even for guys in suits.

    Rob


  • I don't think a complete study of linux binaries on FreeBSD has been done, so unless you have some facts to back up your statement I'd suggest you blow me.

    Ranessin
  • Unfortunately, his quote, removed of context, implies that it doesn't. In the context of the original article he wrote, he was pointing out that he could get an IRIX system working quicker than he could get Linux working (aside: is his time really worth the cost of an SGI box? Did he really find Linux that hard?); that's a fair point.

    However the quote, which is what I'm referring to, is typically used, out of context, to imply that other options are somehow free of time costs.

  • We're already seeing incompatible, irreconcilable differences between linuces and linuxes.

    --
  • JWZ's quote is snappy, but absurd, since it assumes that everything except Linux requires no time. The reality is that it takes time to get any system running correctly.

  • The only real threat to Linux is the Linux community itself. We will be OK as long as the community treats its differences as options and different ways of thinking, instead of fighting religious wars over them. There is a lot of room for different opinions.
  • He's just ticked off because older unix variations could not keep it together. Maybe he just doesn't want to stick a "fork" in Solaris yet, but he knows it's in his future.
  • I have experience coding for Windows 98, NT, and 2000, and I can tell from personal experience that there are many ways these OSes are incompatible. For example, writing some simple 16-bit assembly programs, I found numerous places where my program would just work on NT or 2000, no bugs, but put it on 98 and it hangs or generates an exception. The same binary will work on NT 4 and 2000 but will not execute under DOS or Windows 98.
  • This only would happen if there is only _one_ good Linux tree. But what if there were two, the first one supported by five companies and the second one supported by five other companies, both with a large number of users? What if these two trees became incompatible? The Linux user base would be split...
  • > Somehow I think something was lost in the comparison :/

    maybe the mileage after which they change the grease? :)

    --
  • You're right, but saying "Don't bother with Linux because it might fragment one day" is like saying, "Don't elect Al Gore, he might die in office!" -- it's possible, but unforeseeable.

    -Omar

  • But he never said that it doesn't take time to get any other system running. Nor does his quote assume that other things are free even if they take time. He's just pointing out that time is valuable and that anything that takes time has an inherent cost to it.

    Ranessin

  • And you've run every linux binary to test this theory of yours? Didn't think so. End of fucking story.

    Ranessin
  • Actually, most of the changes in the RH7 file system were made to bring it closer to the FHS2.1 standard. There is, in fact, an agreed upon standard for where things are supposed to go, and AFAIK RedHat is as close to conformant as anybody. They even added a bunch of symlinks to their rc.d directory structure so that programs from other distributions would find inits where they thought they should be.

  • > However, having it all in one tree still seems to make some sense: It will all be the same kernel, but if I compile for my pentium I won't have to include stuff for alphas, handhelds, or supercomputers
    Sure, but it'll be a huge tarball (which is not a big deal). What is worse is that it will _always_ be broken.

    Let's see the different levels of modification possible:

    1/ A change that is good for everyone. No problem, it goes into the mainstream kernel.

    This is what your original post was about

    2/ An addition ("include stuff") that is good for some people, but useless for other people. No problem, you wrap it in a CONFIG option.
    This is what you are talking about now.

    3/ A change that is good for some people but that would harm others (even if not enabled, it would populate the kernel with hundreds of #ifdefs). Here, I am talking about big changes, like real time. Those are maintained off-line, in patches.

    But such changes are harder and harder to maintain. At some point, it will be more work to tweak the code so the patch still works than it would be to re-implement (cut'n'paste) the kernel's new features. And the patches would touch so many parts of the kernel that they would conflict with other patches out there (i.e., if, when you apply the handhelds patch, you can't apply most of the other patches out there, it means that the result is hardly Linux).

    This would be the 4th kind of kernel modification:

    4/ Modifications that are so invasive that the result cannot be called Linux anymore. Those deserve forks. And in that case it would be a good thing (note that you can bet that the fork would stay compatible with the model used for drivers and filesystems, and won't be a total alien).
    (And there is always the classical ego-fork. Linux is probably safe from this because Linus is uncontested. But, if he was hit by a bus...)

    Cheers,

    --fred
  • Even if fragmentation occurs, it's all GPL'd so it will still be possible to make the various distributions/kernels compatible.
  • Actually, because it implied a falsehood through illogic, but that wouldn't fit on the subject line.

    The post in question boils down to:

    A. Any one can submit patches for linux.

    D. Only the BSD inner circle can put patches into releases.

    E. Therefore BSD has better quality code.

    It left out the following clauses:

    B. Only Linus or Alan can put patches into Linux releases.

    C. Anyone can submit patches for BSD.

    When you add the missing clauses, the conclusion E. is obviously not a logical result of the premises. Since the post is attacking Linux in a Linux subject with false statements, it counts as a troll.

  • Well, Linux will necessarily fragment. 'Experts' and 'industry insiders' have been predicting it for years ... it's just a matter of time, now!

    Oh and there will be a major havoc on January 1st 2000, they predict, too.

    --

  • Does anyone know what is happening with the Linux Standard Base? Are the main distributions embracing it? I haven't heard anything on websites. If not, why not, or when?
  • Do they mean "fragment" in the same way that Windows already has, under the direction of but one company?

    No two versions of windows are completely binary compatible due to dll version differences.

    Most people haven't upgraded to WinME so right now most home users are split amongst Win95/98/ME and even NT in some cases.

    I know businesses that run NT 3.51 and 4.0 and 2000 in the same freaking room. Don't even begin to tell me they are compatible...

    And to a lesser extent what about the crap Microsoft pulled with shipping 2000 itself in 30 different versions (Professional, Server, Advanced Server, Super Dooper Advanced Server).

    Anyway, just trying to draw a parallel... As long as Linux stays with a common C library (glibc) and keeps one windowing system (X), and Gnome/KDE don't let themselves become incompatible, then things can only become so fragmented. Although a uniform configuration/filesystem standard(s) may be a good idea.

    Justin Dubs
  • With regards to the kernel:
    There has been talk of forking the kernel for ages. This hasn't happened yet, and I can't see it happening anytime soon. All the major Linux vendors seem to believe Linus will make the best (right?) choices for what should and should not be in it. So far he has done a great job, by most people's reckoning....


    Unless there is a Microsoft distribution.....

  • Maybe they'll be able to back their claim that they invented an open source revolution if this happens
  • > Although it does make me wonder why the world according to the Bible is only 6000 years old,

    *sigh*

    The bible NOWHERE says the Earth is 6000 years old.

    Gen 1:2 describes the RE-CREATION of the earth. The word 'was' really should be translated "became"

    http://www.custance.org/hidden/6ch1.html [custance.org]

    In Genesis 1:2 the first "was" is printed in ordinary type, the second "was" in italics. Similarly in verse 3, the first "was" is in ordinary type, but in verse 4 it is in italics. We are by this to understand that the Hebrew original supplies the appropriate form of the verb in the first instances, but omits the verb in the second. This signifies that a change had occurred with respect to the earth in verse 2 and a change occurred in respect to the coming of light. What was a perfect earth became a ruin; what was dark became light.


    And here is one possible explanation for the "missing history" between Gen 1:1 and Gen 1:2

    http://www.homeworship101.com/recreation_of_the_earth.htm [homeworship101.com]

    Yahweh Bless
  • I have found more information regarding perverted practices and terms of the 'Open Sauce' community

    see here [lwn.net]!!!!
    This document details a new disgusting practice called grope, which is short for GNU rope!!!!
  • You're right, but saying "Don't bother with Linux because it might fragment one day" is like saying, "Don't elect Al Gore, he might die in office!" -- it's possible, but unforeseeable.

    Actually, if you look at the 'Zero Curse' numbers, whoever is elected has a good chance of dying in office.

    For those who don't know, the 'Zero Curse' is that every U.S. President, starting with Lincoln, elected in a year ending with a zero has died in office. The only exception was Reagan, and not for a lack of trying.
  • ...that most people would be smart enough not to fork the kernel. Even if they did, they would have to release the code, and anything good would eventually make it back into the main tree anyway, so why not just put it there in the first place? Although, I guess that might be a problem with a monolithic kernel: all changes have to go through one person, basically, and that can take some time...
  • ...but nothing else seems to be happening today *grin*

    How could you possibly say that? CNN is reporting [cnn.com] that helicopters are tailing [cnn.com] the ballot-toting Ryder truck in Florida at this very moment! This is better than O.J.!

    Who knows, maybe the helicopters are following the wrong truck and some guy moving to a new apartment is fearing for his life as these helicopters chase him.

  • For years I heard people saying that a monopoly like Microsoft is eventually bad for the market, because of a lack of competition. And now, when there are rumours about Linux forking, it is suddenly bad to have more competitors on the same marketplace. Does this make any sense?


    How to make a sig
    without having an idea
  • I guess that quote is true if you're talking about the "free as in beer" cost of Linux. But that's not what makes the OS attractive; it's the "free as in speech" aspect. And that makes Linux free no matter how long you spend with it or how much you pay for a copy of it.

  • This is a textbook case where the term FUD (Fear, Uncertainty, and Doubt) could be used correctly. Lately it seems as though people try to call any negativity "FUD" -- a careless use of the acronym. Here we have a major company attempting to frighten anyone who will listen uncritically about the uncertain future of linux, by planting the seed of doubt within their minds.

    They are doing this by using the dreaded word "fragmentation", which is merely the negative spin for heterogeneity. "Strength through diversity" is the positive spin, and would be the appropriate dogma to respond with. 8^)

  • by VegeBrain ( 135543 ) on Thursday November 30, 2000 @06:40AM (#591989)
    "Linux is different than HP's Linux is different than Dell's Linux and (a customer) will have to recompile five times. You've broken it effectively. So you cannot depend on one Linux."

    What he's saying here is that if you want your software to run on Solaris you only have to compile and test it for Solaris, while with Linux you have to do this for several distros. There's one teeny eensy detail he left out: Solaris is only one "fragment" of Unix. To get your software going on AIX, HP-UX, Ultrix, and SCO Unices you have to guess what: get it to compile and then test!

    Then there's other issues he's conveniently left out. Getting software to run on different Linux distros is a lot easier than doing the same thing for Unix variants simply because the amount of variation between Linux distros is much smaller than Unix variants. Different Linuces have the same kernel and C library while Unices don't, among other things.

    He's grumbling about the fragmentation of Linux while claiming that his own fragment of Unix is the one that will solve all your software compatibility problems. It should be obvious that if you only use one variant of Unix then you don't have to deal with any other variants. DUH!

    It's the same old marketese that Micro$oft is always saying: use our stuff and all your problems will go away. You'll be able to retire at 15 to a deserted desert isle where bodacious babes will attend to your every need and want. It's also amusing to hear someone from Sun grumbling about Linux fragmentation while at the same time holding up their own fragment of Unix as the solution to the fragmentation problem!

  • Bag the opposition.

    I don't see them bagging Linux; more that they're bagging Sun's hardware and software competitors (IBM et al.), suggesting they are the ones that are going to screw GNU/Linux.

    BTW, a fork of Linux means forking the kernel. He never described that.
  • Actually, William Henry Harrison (elected in 1840) was the first to die in office.

    Tecumseh's Curse (named for the Shawnee Chief defeated by Harrison) is described in detail here [yowusa.com].

    --

  • by mftuchman ( 66894 ) <{moc.liamg} {ta} {namhcutfm}> on Thursday November 30, 2000 @06:49AM (#591992) Homepage
    I would argue that the fundamental question should be developer expectations.

    An expectation is a requirement to use a certain library, or programming methodology to get the job done. If I am contemplating creating an open source application, what am I required to know to run under OpenBSD? FreeBSD? GNOME? KDE? Self contained environments such as LispWorks?

    Two operating systems are sufficiently different/fragmented if there are sufficiently different expectations and requirements for making applications run under them. If resolving the differences requires a full-time job, then they are fragmented as far as I am concerned.

    On the other hand, with the vast number of programmers willing to tweak my brilliant program :-) to run on their favorite *nix variant, perhaps the differences aren't so great in terms of cost after all. So this is really subjective, and I do realize this.

    Thus, what is 'sufficient' is deliberately left vague. Or perhaps we can define a metric - the distance between two operating systems is the amount of work required to get a program running identically on both OSes.

    The verification of the Triangle Inequality is left as an exercise for the reader.

    I would be the first to admit there are some problems with the above way of thinking, but as with many questions involving language, they will never really be resolved satisfactorily.
    ---
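    A half-serious way to write down the metric proposed above, with the promised exercise sketched. The definition of d is the parent's; the rest is my reading of it:

```latex
% d(A,B): the (minimum) amount of work needed to get a program
% running identically on operating systems A and B.
% Triangle inequality: porting from A to C via an intermediate
% system B is one possible (not necessarily optimal) route, so
d(A,C) \le d(A,B) + d(B,C)
% Two systems are "fragmented" when d exceeds, say, one
% full-time job's worth of work.
```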

  • by josepha48 ( 13953 ) on Thursday November 30, 2000 @06:57AM (#591994) Journal
    Uh, news flash: this guy is behind the times. The distributions are already, in some sense, binary-incompatible. That is one of the reasons the LSB was formed.

    I think that today they are more compatible than two years ago. However, if you look at any two distributions, it does not take a rocket scientist to figure out that they are binary-incompatible. Different libraries exist in each distro. Each program can be compiled against different versions of the libraries with different parameters and settings (configure with whatever options you pick). That is why Linux is Open Source and you get the source. You then compile the program yourself. This then becomes a non-issue. So what? So I cannot take a binary from SuSE and install it in Red Hat. I can still build the RPM myself or get the tarball. It's not that difficult.

    I don't want a lot, I just want it all!
    Flame away, I have a hose!

  • Fighting for fragmentation or against it?

    Seriously, these guys never solved anything, and despite the Single UNIX Standard, they never healed any wounds. Instead, the problem dissipated when they ceded the desktop to Microsoft and the UNIX boys just hunkered down to sell big high-profit servers. So, Zander's seen it all before, but his take comes from the background that Sun is just as clueless as Red Hat or anyone on how to sell multivendor UNIX to desktops and small servers. At least Linux has a philosophical solution to this problem - open software.
    --
  • Why the fuck do you people always say that?

    Why?

  • Wait... let me explain.

    It all depends on whether you want to talk about the user experience or the kernel. Of course the kernel isn't fragmented, much. But the user experience very much is and has been for a while.

    Different distributions default to different X environments, different system tools, and most importantly, dramatically different ways to add/remove programs. This is what the user sees already.

    For example: Take two distros aimed at desktop users: Corel & RedHat. These things are so far apart as to be nearly incompatible. Moving an application between the two pretty much requires that you be an expert. Your typical user isn't gonna want to compile, iron out libc conflicts, work out differences in file system structure, etc. And that's if the software maker provided the source. Otherwise, good luck installing it on a distro other than what it was packaged for.

    Yes, at its core and by its licensing, it'll be hard to truly fragment Linux, but by the time it reaches the user (where it counts), it's practically broken already.

    --
  • "concerning the eventual fragmentation of linux into non-compatible vendor-specific linuces"

    Oh, let's see here... DOS (6 versions), Windows 3.1, Windows 95 (3 versions), Windows NT (two versions, I won't even go into the service packs), Windows ME, Windows 2000, Whistler.......

    It seems fragmentation hasn't hurt some OS's marketability. Sure, some of these versions are compatible on the same machine, but there is usually a fair amount of screwing around that has to go on.

  • I hardly see any reason for Linux fragmenting. About the only thing which might cause this is if Linus got run over by a bus tomorrow, where there might be a scrabble for control, and a divergence of opinion in which way the kernel ought to go. But I personally believe and hope the leading developers would be able to get it together enough for the kernel to go on.

    There is plenty of scope for divergence in Linux already by making a different distribution. A major example of this happened with the formation of the Mandrake distro, which IIRC was specifically to incorporate KDE on top of a standard RH distro when there were arguments over KDE licensing. Distributions often attempt to emphasise different things, e.g. Bastille emphasises security, Debian tries to stay as GPL as possible, RedHat tries to be as buggy as possible :-) ....etc.

    In summary, there is plenty of leeway for all sorts of Linux enthusiasts to make and get the exact type of Linux that they want or need.
  • Why has this post been marked as a troll?

    It seems any post criticising Linux is marked troll.

    Why - it's completely true. Linux is not even the best free Unix clone, never mind the best OS.

    What Linux does have is:

    a good name
    a great publicity team
    a cute penguin

    Hell if FreeBSD was called Linux, it would do well too.

    Why don't these morons who don't know shit about kernels or operating systems, but who just instinctively censor the anti-Linux posts keep their mod points to themselves.

    The point about professionals is very true: a controlled program is the way it should be.

    This isn't a troll - it's the truth - why the hell should companies like Adobe and Corel invest their money in Linux when they have three hundred different versions of Linux already - for example, Photopaint doesn't install with Mandrake 7.2.

    This doesn't happen with Windows - with it, when you release a new program, you're pretty damn sure Microsoft have taken the trouble to make sure it works with all the software.

    A controlled OS made by professionals is better for everyone - just try telling me that Linux is as good as Windows or OS X.

    Although the established parts of Linux are often well written (the kernel, things like mail utilities), the newer stuff, like KDE, is cobbled together by a bunch of amateurs, many of whom are writing their first programs for KDE.

    PS. I'm sure that someone will mark this as troll as well, but it's not.

    The fact is that Linux is a massive black hole of resources and effort - people trying to cobble layers of stuff onto decades of cruft - whereas a proper OS like Solaris, Windows or OSX is actually
    managed - people say Windows sucks and Linux rules, but it's just a lie - you can't even configure the thing without using a hundred different text files, each with different formats; even projects like linuxconf have to be maintained separately because of the *massive* existing fragmentation.

    Linux is already more fragmented than anything - how else can each different distribution be configured differently, therefore presenting a nightmare for developers.

    Linux doesn't stand a chance while we have a hundred different, poorly tested, distributions deterring developers.

    Those who say that Windows only succeeds through its publicity department are lying - the fact is that Linux has much better publicity than Windows - how else could people seriously promote it as a usable GUI when I can't even do something as simple as copying something from the best web browser, Mozilla, to the best interface, KDE, because of their using different toolkits.

    I mean c'mon people. If Windows' success is really due to MS' publicity, then Linux's publicity department must be run by an army of Goebbels clones.
  • Why has this post been marked as a troll?

    It seems any post criticising Linux is marked troll.

    Why - it's completely true. Linux is not even the best free Unix clone, never mind the best OS.

    What Linux does have is:

    a good name
    a great publicity team
    a cute penguin

    Hell if FreeBSD was called Linux, it would do well too.

    Why don't these morons who don't know shit about kernels or operating systems, but who just instinctively censor the anti-Linux posts keep their mod points to themselves.

    The point about professionals is very true: a controlled program is the way it should be.

    This isn't a troll - it's the truth - why the hell should companies like Adobe and Corel invest their money in Linux when they have three hundred different versions of Linux already - for example, Photopaint doesn't install with Mandrake 7.2.

    This doesn't happen with Windows - with it, when you release a new program, you're pretty damn sure Microsoft have taken the trouble to make sure it works with all the software.

  • What we need to do is set up a standards committee, a group of experienced linux users/coders who know the basic AND the advanced structures, etc. along with some basic users who can draft a set of standards along the lines of cross-distro compatibility.

    NOT, I repeat NOT a governing body, but more a voluntary process by which each distro can proudly announce "We're Linux2000 compatible" and the end user can look for the committee's seal and know that when they purchase it, it's not going to be a waste of their money and time. They will be able to install it without great pains, and that the software they download, or the hardware they have, will be compatible with ALL distributions that have passed the inspection process.

    I've had this thought for a while and will put more time into it later.
    If you're interested in bouncing ideas around about it, email me [mailto]

  • by Hard_Code ( 49548 ) on Thursday November 30, 2000 @04:45AM (#592003)
    The thing with Linux today--I call it the bathtub. I can throw source in there. It's all floating around and it's available to everybody. But I as a vendor can take anything I want out of that bathtub and call it Linux.


    Now if you think that's going to work for application developers, call me in a year or two when IBM's Linux is different than HP's Linux is different than Dell's Linux and (a customer) will have to recompile five times. You've broken it effectively. So you cannot depend on one Linux.


    How is this not true? RedHat decides to take a certain version of the kernel, KDE, a peculiar flavor of gcc, and some other stuff, RedHat-ize it, and make a distribution out of it. Debian chooses another version of the kernel, Gnome, uses apt-get, and has a different distribution. Mandrake throws in some nice Mandrakish features. Others yet, take whatever other pieces they want and create a customized flavor of "Linux" (ok, perhaps not a customized codebase). We champion this as serving different needs. But isn't it still true that the same process has the potential for many conflicts? File system formats, hierarchy standards (file system standard and LSB notwithstanding), versions of applications, system policies, configuration tools, init scripts, custom scripts, etc. For all intents and purposes, Linux, as seen by the consumer, is fragmented. I think there should be a strong cohesive force to keep Linux, as the gestalt system, not just kernel, on track. Maybe LSB is it. Maybe not.
  • Two scenarios :-

    Someone adds something worthwhile to Linux - the other distributions will simply incorporate it as allowed by the GPL.
    Someone adds something which nobody uses. Yes, Linux has forked, but who cares if nobody uses it.
    The GPL ensures that, while forks are allowed and perhaps even encouraged, the best of all forks becomes common to all of them fairly quickly.

    I was more concerned by his comments on Java. Saying that Java would become open source but that nobody would be allowed to make any changes to it that Sun didn't like. That's hardly open.

    I'm no expert, but I think it would take fairly major changes to fundamental pieces of the system (like a serious kernel change, or a change in the runtime) to make two versions of Linux incompatible. Considering this, is it even probable that some organization (excluding Microsoft) would go through the trouble of making an incompatible version of Linux?

    I mean, just look at what companies like Indrema (DV Linux), Palmpalm (Tynux), and countless others have accomplished with relatively minor (if any) modifications to the base Linux system.
    Behold the 2 cents.
  • I wasn't being sarcastic at all; I meant that. Compliments are rare enough around here, you should try to accept them well when they're given.

  • The real fragmentation in today's world of computers is the complete and utter incompatibility between UNIX and Windows.

    This is of immediate concern to me (and I mean really immediate) because I'm currently working on packaging our company's software. We support UNIX (Solaris, Irix, Linux) and Windows (NT, 2000). The headaches caused by differences between the various Unices pales in comparison to the headaches caused by the differences between Windows and UNIX.

    I wish Microsoft would follow Apple's lead and adopt BSD for their next OS... (heh, yeah, right.)


    --

  • Is the goal of Linux to make substantial inroads into the desktop market space now held by Windows? If so, I'd argue that's a target audience that thinks the OS equates to the GUI and that a "kernel" is a little piece of corn. From that perspective, if it looks different, it is different. All these different install routines, the variations in directory use and structure (why oh why do they do this??), and the games going on with startup scripts drive me nuts, and I like to pretend I sorta know what I'm doing. Expecting a newbie -- who sees this Linux thing as just another tool and is no more fascinated by the OS itself than most people are in the workings of their car's gearbox -- to somehow put in the effort to chase down all these apparently pointless differences between distributions is expecting too much. Take a lesson from McDonald's -- or Windows, for that matter. Wherever you buy it, it's still the same thing.
  • It's fashionable on Slashdot to point out "Fragmentation!? Let me list all 200 versions of Windows blah, blah, blah" (Thanks for holding back the dramatic flourish by not reciting every service pack and B version with UL elements, BTW!)

    It's an OK point, but it ignores that fragmentation has hurt Microsoft from a technical standpoint. Specifically, the WinNT versus Win9x divide that we've been living through for the last 5 years (and for 2 more at least if we are lucky) has screwed both casual 9x users by dumping a crap product on them and the professional NT users by refusing modern hardware support (etc).

    The only thing MS's fragmentation has helped is their bottom line. Because they can segment the market with their monopoly, they can charge three times as much for the product that actually works (NT) and deliver it only to moneyed corporations sophisticated enough to pay for it and deploy it. Everyone else gets a compromised hack for their $50 OEM fee.

    As for Win3.1, DOS, OS/2, the original Win95 and all of the other bizarro turns in Microsoft's historical OS strategy, it's been bad for users, but it can sorta be explained away by the fact that PCs were pretty limited machines until fairly recently, and PC OS design was always a compromise for backcompat and low memory requirements.

    Of course, as bad as MS fragmentation has been, the UNIX side has always been worse, which is the big reason that MS won the desktop wars.
    --
  • This guy's entire argument is a straw man! Linux will fragment... buyer beware! You mean like how Solaris is fragmented from BSD, IRIX, Tru64, HPUX, AIX and... oh yes, Linux?

    In short, so what. This is Sun FUD. Sun is clearly afraid of Linux and this is the best response they can come up with. Pathetic. Frame the argument in their own terms, and hope that everyone takes the bait and wants to argue the point about why Linux won't end up being fragmented.
    Python

  • The Linux development methodology is broken. There's quite obviously an inner core of developers who do whatever they want, and then there's the public list, which is largely a decoy.

    It's broken, is it?

    How many other development methodologies have produced mature, stable, reliable, highly portable operating systems in under ten years? Do better, and then tell Linus his methodology is broken.

    Let's face it, of course Linus listens mainly to people who've earned his trust and become his friends over a long period of years. That's human nature. He doesn't have time to listen to all the people who want to grind their own particular axe. He's a dictator. This is a good thing: there is one person who takes the final decisions. He doesn't have to get them past the technical architecture committee. He doesn't have to get them agreed by marketing. He doesn't have to get the board to buy in. He just decides. And because he decides, we get a decent platform in a reasonable time.

    Like I say, if you can do better, go ahead and do it. There is nothing stopping you.

  • Perhaps a system that mimics the RFC process should be created and a 'reference standard' implementation of the 'core' operating system should be defined.


    Or, perhaps the way that the BSDs do it -- unified kernel + userland development. The ongoing struggle between the glibc people and the kernel people doesn't happen in the BSD world. Look at the way that threads aren't supported fully under Linux because Linus refuses to provide anything more than clone(), regardless of the glibc people's need for more support. Linus doesn't care about userspace and is unwilling to help its development. It's unfortunate.
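    To show just how little the kernel hands userland here, a minimal sketch of the clone() primitive that LinuxThreads-style threading has to be built on. The flag choice and stack size are my own illustration, not anything from the kernel list:

    ```c
    #define _GNU_SOURCE
    #include <sched.h>
    #include <signal.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>

    static int shared = 0;

    /* Child entry point: because we pass CLONE_VM, this write lands in
       the parent's address space -- the whole basis of Linux threading. */
    static int child_fn(void *arg) {
        (void)arg;
        shared = 42;
        return 0;
    }

    /* Spawn a CLONE_VM child and return what it wrote into our memory. */
    int run_clone_demo(void) {
        const size_t stack_size = 1024 * 1024;
        char *stack = malloc(stack_size);
        if (stack == NULL)
            return -1;

        /* On x86 the stack grows down, so pass the *top* of the buffer.
           SIGCHLD in the flags lets the parent waitpid() on the child. */
        pid_t pid = clone(child_fn, stack + stack_size, CLONE_VM | SIGCHLD, NULL);
        if (pid == -1) {
            free(stack);
            return -1;
        }
        waitpid(pid, NULL, 0);
        free(stack);
        return shared;
    }

    int main(void) {
        printf("child wrote: %d\n", run_clone_demo());
        return 0;
    }
    ```

    Everything else a threads library needs -- mutexes, signals per thread, thread IDs -- has to be faked in userspace on top of this one call, which is exactly the complaint.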

  • by davecb ( 6526 ) <davec-b@rogers.com> on Thursday November 30, 2000 @05:10AM (#592019) Homepage Journal
    Hey, Ed Zander lived through the BSD/Bell religious schism, the fragmentation of the vendor Unixes and the Unix International -vs- OSF standards wars. Of course he's going to worry about fragmentation: his career's been spent fighting it. That said, I think he's wrong: the older members of the Linux community also remember those years, and will "educate" the community. With a large hammer, if necessary (:-))
  • Brilliant! That's a very astute observation, well put.

  • Zander stated in the article:
    The thing with Linux today--I call it the bathtub.

    <perv mode>
    I prefer to think of it as a hottub with lots of compliant co-eds in there willing to perform my every whim! :-)

    </perv mode>
  • At last! Incontrovertible proof that Darwin [apple.com] is useless and we should all avoid Evolution [helixcode.com] in favour of the superior Creation [creationengine.com].

  • Sun believed and invested enough in Linux to buy Cobalt and to put major effort into a Java port. (Yes, I know some of the original work was done by Blackdown, but Sun has teams working on it now side by side with their Windows and Solaris teams.)

    IMO there seems to be some segment of the Linux community that overlaps with the conspiracy lunatic fringe. Neither can be happy unless they can find some way to think everyone's "out to get them."

    Remember, Sun doesn't make money on Solaris. They make money on Sparcs. They are still fundamentally a hardware company.

    Frankly I think Zander was just expressing some very honest concerns.
  • It's not so much that they are binary-incompatible, but that different distros put different things (such as KDE) in different places. The LSB is supposed to lessen this somewhat by specifying where common libraries and scripts should be.
  • >It's broken, is it?

    Compared to BSD... yes. At some point Linus will have to admit the kernel needs CVS or some form of version control. It will be interesting to see how Linus handles the transition.

    >How many other development methodologies have produced mature, stable, reliable, highly portable operating systems in under ten years?

    Let's see, what was the methodology?

    1) used SYSV Unix as a model (not much DESIGN here)
    2) Used other people's BSD and GPL code. (again, falls short on design)
    3) Used Minix as a base (again.... design)

    Methodology - copying and using parts that already work from others. Not a lot of heavy mental lifting on design when you use others' code.

    Stable and reliable. Sure, compared to Windows 3.1 or Windows 95, or older versions of itself. But Linux is 'reliable and stable' compared to BSD? How about Solaris? AIX? Tru-Unix? Sco? QNX? (this is subject to debate....debate away)

    Mature - BSD has the WHOLE CODE HISTORY of UNIX behind it. Linux - A unix copy. BSD wins here....no argument.

    Highly portable - NetBSD says they have the highest portability.

    >And because he decides, we get a decent platform in a reasonable time.
    The long-delayed release of the 2.4 kernel is an example of this?
    Or how about all the userspace programs that make the kernel useful? Mostly Unix code... and nothing that can't and doesn't exist on other Unix kernels (BSD/Sun/SCO/QNX et al).
    The stuff that makes Linux useful is all userland... and nothing Linus has control over. I maintain your 'decent platform in a reasonable time' is the hard work of the 100+ Linux distro companies.

    >Like I say, if you can do better, go ahead and do it. There is nothing stopping you.
    Looks like it has been done. It is called BSD.
  • Totally. Sun's trying hard to sow the seeds of FUD over fragmentation of Linux in a lame-ass attempt to convince everyone that we're better off with them holding on to Java rather than releasing it to an international standards body. I like Java but I ain't buying Sun's BS.

  • by gregholt ( 90624 ) on Thursday November 30, 2000 @04:56AM (#592042)
    If you're referring to Linux as the whole OS with all the good little tools, applications, etc., then it has already fragmented.

    I've been using Red Hat Linux for quite a while now, and I could comfortably work in most any version of the distribution. But plop me on a Caldera machine and I start to get lost quickly. Debian uses yet another file structure and configuration scheme. I haven't even used Slackware since the days of downloading 40+ floppies, but I know they've got their own standards. And don't forget the other distributions: Mandrake, StormLinux, Corel, etc, etc. Although many of them are just modifications of other distros.

    I think this will just get worse over time. Right now, it doesn't take too much time to learn how a new distro is put together. But, with the addition of all these graphical configuration tools (linuxconf, yast, etc) that are very particular to each distro, it won't be too long before you're spending an hour just to figure out how to tell sshd to not allow root logins.
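    For what it's worth, the setting itself is the same everywhere; only the GUI wrappers differ. A sketch of the relevant line, assuming the usual OpenSSH config location (the exact path varies by distro):

    ```
    # /etc/ssh/sshd_config -- path varies by distribution
    PermitRootLogin no      # refuse direct root logins over ssh
    ```

    The hour gets spent finding which menu of which tool writes that one line for you.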
  • This all comes down to the question that no one seems to be able to answer: is Linux a kernel, or is Linux a distribution?

    The kernel has shown no signs of fragmenting, something that I really attribute to Linus.

    RH made some interesting/debatable decisions with RH 7.0. Is that fragmentation? Only if Linux is an operating system.

    To be honest, in a 'commercial' OS environment, I'm starting to think that the definition of Linux has to be a combination of the kernel AND a set of libraries.

    Perhaps a system that mimics the RFC process should be created and a 'reference standard' implementation of the 'core' operating system should be defined. By 'core' I'm thinking things like the kernel and a set of libraries and compiler(s) (i.e.- gcc, libc glibc, gtk, etc.). Call it the GNU/Linux Reference Implementation.

    That would allow app developers a reference point when stating compatibility. It still leaves room for the distro manufacturers to 'value add' to the product, but it's a little better than just saying '2.2.x compatible'.

  • Cardinal Biggles wrote: "the GPL [...] makes irreversible forking-fests like the UNIX wars less likely with Linux". I rather disagree: the GPL helps reduce the advantage of forking, but it doesn't prevent large competing camps (e.g., UI vs OSF) from growing up, each with favorite sets of components. To a limited degree, this is what happened with KDE and Gnome: that break very much reminds me of the Bell -vs- Berkeley split.
  • > Someone adds something worthwhile to linux - the other distributions will simply incorporate it as allowed by the GPL.

    For the kernel yes. But not for userland.

    If a vendor makes a Linux distribution with a proprietary thing on it (say a critical user-space library), he doesn't have to make it GPL. Or a vital application (say a distribution with a bundled VMWare).

    What keeps Linux together is that the GPL prevents linking with a proprietary component, so any proprietary add-on must be self-contained.

    But, I repeat myself, IBM could do a BlueLinux distribution, with a libibm (containing interfaces to the transaction manager, MQ-series, anything you want), and get application suppliers (or their own db2) to link and use those libraries. Another case would be a very hypothetical AppleLinux with Quartz. Could be free-as-beer, but if you buy an AppleLinux application, it is only going to work with AppleLinux.

    This is a threat once big names produce their own tweaked Linux. Don't think it will all be GPLed.

    Cheers,

    --fred
  • doesn't break memory management for smaller computers

    But that would take talented people.

    Perhaps exceeding the ability of the Uber-hacker Linus to do. If it was easy, it would have been done.

    But who gives a damn about Big iron? (Ok, IBM and its users do.) The bigger number of sales of units and total profit is the embedded world. When you are writing your autobiography about your talent and helping the computer world, the metric of others (not to mention your employer) will measure you by the total profit.

    Now, try to say with a straight face that the kernel is going to be the same for a limited-resource machine (4-8 meg DRAM, 32-bit address, 8- or 16-bit data bus) and your average desktop machine (128 meg+, 700+ MHz).
  • This only would happen if there is only _one_ good Linux tree. But what if there were two, the first one supported by five companies and the second one supported by five other companies, both with a large number of users? What if these two trees became incompatible?

    Actual fragmentation is unlikely with open source. The only way you'd get such a scenario would be where the two different groups had users with different requirements.
  • If we get lots of different versions, then some of them are going to be better than others. The bad versions will die. The good versions will merge into a "Best of" version. And I think that's a worst-case scenario.
  • by FeeDBaCK ( 42286 ) on Thursday November 30, 2000 @04:13AM (#592060) Homepage
    ...big, bad Sun has said that Linux is going to fragment!!! Well, since Sun said it, it must be true... after all, they *are* the dot in dot com.
    *snicker*

    This is one of the great scare tactics used by both Microsoft and Sun to get the PHBs to avoid Linux. Linux has not fragmented, and probably won't for a long, long time, if ever. Too many of the key players (Red Hat, Caldera, Mandrake, Turbolinux, et al) have too much at stake with Linux to allow it to fragment into incompatible operating systems. I think it is more likely that Microsoft will give up on their appeal than for this to happen... hehe
  • by bbay ( 192854 ) on Thursday November 30, 2000 @04:13AM (#592061)
    IMMINENT DEATH OF USENET!!! (these caps are important for the sake of the joke, it's a quote. stick this in your filter.)
  • ...and no, I'm not swearing. If it's suspected that there might be a rift in the Linux community, then let's investigate right now.

    An ounce of prevention...

  • Don't say anything negative about teh Holy OS!

    Moderation Totals:Flamebait=1, Troll=1, Insightful=1, Total=3.

  • A long time ago, in an almost but not quite pre-email era, I read an article suggesting that we need several new kinds of punctuation to augment the familiar exclamation point, question mark, etc. I don't remember most of them, but the one that stuck in my mind is the "irol" (a play on "irony" and "eye-roll") to indicate sarcasm. The author even proposed a glyph for it, but I can't quite remember what it looked like.

    If anyone knows anything about the article, and particularly if they know of a copy online, please send me email.

  • Each with incompatible API's and behaviors. The problem is how fragmentation is defined.
  • "Linux is only free if your time has no value"

    This adds to my bottom line: I can charge a lot for my time. Now my client can afford a stable platform that is easily troubleshot remotely, as opposed to, say, hundreds or thousands of dollars spent on an operating system that may or may not be stable. Thus I make more, and the customer pays less.

    I guess the probable source of a split would be if some Linux people take training to the MS extreme, i.e. memorize a hundred questions and here's your certificate that says you're an "engineer". This by its very nature brings people into the technical world as workers that are ill-equipped to deal with real-world problems. They also would not be equipped at all to deal with a Linux distro they are unfamiliar with.

    A failure to understand the underlying principles or be able to THINK gives us a world where techs can only deal with what they know by rote. They are slow to adapt to new things; they are unable to read manuals and glean basic understanding from them. They make the job harder for those of us that know more than "point and click". These people would drive any Linux schism. Usually these people are also the MOST vehement defenders of any one distribution, simply because they don't know anything else.

  • by DeepDarkSky ( 111382 ) on Thursday November 30, 2000 @05:26AM (#592081)
    umm...that's controlled, market-driven, half-ass backward-compatibility, forward-API-breaking-so-you-can-get-an-edge-on-your-vendors-to-eventually-grab-their-business fragmentation to you.
  • I wonder if it'd be useful for some sort of a standards body to set a distro standard. Kind of like ANSI C (only you could probably get by with a more expedient process than they use for ANSI stuff). Just a broad-based effort to say "You can stick whatever you want into your own Linux, but you can't call it Linux Standard (or whatever) unless it meets all these conditions."

    Assuming the process had the right mix of being 1) open to most voices in the community and 2) fast enough to incorporate new innovations into the standard core, it might be helpful to prevent fragmentation.

  • > ...that most people would be smart enough not to fork the kernel. Even if they did, they would have to release the code and anything good would eventually make it back into the main tree anyway,

    I hope not. You can't reasonably expect _everything_ to be right for both embedded markets and 32 processors. Or the latency/bandwidth trade-offs are definitely not the same in normal and real-time environments.

    So basically, kernel forks won't be that bad.

    Much more painful would be vendor forks (i.e., where there is no technical reason for the fork) that deliberately make incompatible versions to lock users into their marketshare. Nothing that can't be hacked around, but it would be painful to have to use specific distributions for specific applications. You'd end up emulating 'flavors' of each distro on each other, and well, it would suck.

    Cheers,

    --fred
  • I think UNIX forked into so many slightly incompatible vendor-specific distributions (one of which is SunOS BTW) because the original Berkeley UNIX was licensed very liberally.

    Linux is not so liberally licensed (namely, under the GPL) and that makes irreversible forking-fests like the UNIX wars less likely with Linux.

    Proprietary (==non-free==closed-source) Linuxes can't happen because of the GPL. So if an incompatibly forked version is ever released, the itch that this creates can and will be scratched.

  • I'm sure to lose karma for this, but:

    The only people who see a distance between Gen 1:1 and 1:2 are those who came up with some form of gap theory and needed to justify it.

    1:1 looks a lot more like a chapter heading to anyone else reading it. After all, it is a book being written that didn't have the nice chapter and book headings we use now. That's the original text, and it seems the book was called "In the Beginning: God Created the Heavens and the Earth".

    In verse 2, you have the beginning of the details.

    note: there are two copies of the story of creation in Genesis ... one with less detail, one with more -- so a summary followed by details.
  • Newton's experiments don't prove an old earth either. Considering modern realisations about carbon dating (it's not accurate -- you need surrounding evidence to substantiate the possible dates) and the fact that we've basically created a circular argument (we must have evolved, evolution takes a long time, the earth is very old, that's time to evolve, we must have evolved ...), the teachings of modern evolution theory really need to be revisited.
  • I don't think Linux is as susceptible to fragmentation as others make it out to be. I believe that the cream tends to rise to the top of the kernel, so to speak. There are definitely some nice new features that I would like to see make it into the kernel, things such as a journaling file system [linuxworld.com]. But the benefit of Linux is that you can simply recompile your kernel to add the features you really need as well as remove the features you don't. In a sense, Linux is already fragmented and has been from day 1, but this is its most powerful asset.

    Penguins need lovin too. The Linux Pimp [thelinuxpimp.com]
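
    For anyone who hasn't done it, the recompile amounts to roughly this 2.2-era sequence (a sketch from memory; paths and steps are the usual ones for an x86 tree in /usr/src/linux, but check your distro's docs):

    ```
    cd /usr/src/linux
    make menuconfig            # pick the features you need, drop the rest
    make dep                   # dependency pass (required on 2.2.x trees)
    make bzImage               # build the compressed kernel image
    make modules modules_install
    cp arch/i386/boot/bzImage /boot/vmlinuz-custom
    # then add an entry for /boot/vmlinuz-custom to /etc/lilo.conf and rerun lilo
    ```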

  • PS, what's wrong with just admitting "we don't know" ???
  • by abdulwahid ( 214915 ) on Thursday November 30, 2000 @04:21AM (#592103) Homepage
    What they seem to forget is that Open Source projects like Linux excel at bringing out standards. Furthermore, it is standards that allow the highly compatible environments (like the Internet) that we all enjoy working in. Linux developers have always tried to bring Linux in line with standards and have contributed to creating new standards. Most of what we use - HTTP, FTP, SMTP, POP3 - wasn't developed by closed-source developers like Microsoft and Sun. Rather, it was hacked out by a group of distributed people working openly to produce something that is not fragmented. In contrast, it is people like Microsoft that always try to do things against the standards and hence fragment themselves from the rest (FrontPage server extensions as an example).

    It therefore seems absurd to even talk about Linux fragmenting. In reality they should talk more about Linux providing a solution that will work on many different architectures and providing high interoperability with other Operating Systems like Windows and Mac (through SAMBA, Appletalk, etc) let alone other Unices. Let alone other Linux distributions!!
  • These kernel patches (Mandrake and SuSE) are not fragmentation: they are applied by groups of developers whose patches `track' the official kernel release, to provide features that in general Linus has agreed will go into the kernel at some future point. If any incompatibility is found between these patched kernels and the official kernel, the patches will be fixed (which is not what happens if there is fragmentation).
  • yes maybe you're right.

    but are we suffering? i'm certainly not.

    i run redhat on some machines, debian on the ones i like, and something obscure and small on my router, and i don't have any problems.

    the only real problem is binary incompatibility, which seems to be creeping in at the moment, but may be just a temporary thing.

    although of course, binary incompatibility only creates problems for people shipping binaries and no source. and that's a very small portion of the software available for linux.

    matt
  • The `peculiar flavour' of gcc (egcs) is the official compiler for the linux kernel.
  • Sun COO Ed Zander pooh poohs Linux as not suitable for use over his company's proprietary version of UNIX. Says it will ``fork'' or ``fragment''. This is news?

    Linux fragment? Says who? Oh! Wait a second! I moved around some code in /usr/src/linux/drivers/scsi/hosts.c to override the default controller detection order on one of my servers. I guess Linux has forked! Looks like he's right after all.



    --

  • (from the article)
    > what makes a McDonald's french fry is there is a spec and you have to conform to it

    Doh... I don't get it? What makes a McDonald's french fry is some fake potato slices and a TON 'o grease.

    Somehow I think something was lost in the comparison :/

    Oh well...
