Linux Business

IBM, HP, Intel, NEC Announce Open Source Lab 83

cmuncey writes: "Salon has an Associated Press article reporting that IBM, HP, Intel, and NEC have announced an 'Open Source Testing Lab' for testing Linux for large corporate systems, to open by the end of the year in Portland, OR. The main four sponsors are putting up a couple of million, and Red Hat, Turbolinux, Linuxcare, VA Linux, Dell and SGI are also kicking in. The lab itself will be run by a nonprofit corporation that will be neutral in picking the projects to be tested. Sounds a bit better than Mindcraft, doesn't it?"

In case you were wondering, the article tells us that "Linux is seen as an alternative to proprietary operating systems like Microsoft's Windows and Apple [sic] OS." Certifications, labs like this, and Official Stamps of Approval mean perhaps more than they ought (corporate decision making being what it is) but that's hard to get around. And it sounds like they'll get to play with cool toys! ;)

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    BIOS information is not stored on the hard drive. What is stored on the hard drive is a diagnostic utility that the BIOS can call up immediately after it loads. That allows quick diagnostics to be run on the drive even when there's no partition on it.
    The reason this is done is to save on hard drives being replaced when there's nothing wrong with them. Guess what, it worked. Dell saved money on it. RedHat 7.0 is coming out soon. Right? Don't you think Dell will push the hardware vendors a little to start writing drivers for certain devices?
    Guess what, they already are.
  • Compaq still has a 'Compaq diagnostics' partition on the Proliant servers, but it doesn't inhibit the use of Linux at all. It's simply so that if you hit F10 on boot you can get to a bunch of their utilities, no matter what OS you run.

    To show it, here's output from fdisk on /dev/ida/c0d0:
    Device           Boot  Start   End   Blocks  Id  System
    /dev/ida/c0d0p1  *        10   138   526320  83  Linux
    /dev/ida/c0d0p2          139  2176  8315040   5  Extended
    /dev/ida/c0d0p3            1     9    36704  12  Compaq diagnostics

    ----------------------------

  • I haven't looked at the CREDITS lately, but I know just from the LKML that a lot of the people working on the kernel have real Linux-related jobs too, like working for SuSE...

    Is it still "Mostly Volunteers"? Even in lines of code? I'm sure a lot of these people are still doing it out of love, but realize that they're also getting paid now. :)
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • What, you don't think I realize that? Don't be a moron.

    I want to do *exactly* that; I was just pointing out that now they aren't necessarily 'volunteers' all the time, and might have other conflicts of interest as well.
    ---
    pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
  • Salon didn't do its homework. Here is the full list of supporters:

    IBM
    NEC
    Intel
    SGI
    Dell
    HP
    Caldera Systems
    SuSE
    Turbolinux
    Red Hat
    VA Linux
    Linuxcare
    LynuxWorks

    Why Salon didn't report all the names is beyond me.
  • I dunno about this, but if you're looking for Linux work in Portland, check out WireX [wirex.com]. We're hiring, and it's a pretty good place to work.


    Wil
    --
  • This CNET [yahoo.com] article, found on Yahoo, makes this far more significant than a testing lab. They claim it is a development lab, to be used to coordinate efforts to get Linux working well on multiprocessing hardware and mainframes. If this is correct, the impact could be huge.
  • Now, the article seems to indicate that this will be a separate company with backing from various larger interests: Anyone know if this is true, or will this end up as some kind of 'holding company'? Also, if it is a separate company, any word on an IPO?

    The article says that this will be a nonprofit organisation. So, an IPO doesn't make any sense here.

    Chilli

  • Great innovations come from a few individuals, not that often from committees.
    I hope it makes Linux better, but that it also allows for innovation.
  • like fun that's offtopic, try "+2: Hilarious". Time to metamoderate again.
  • I'd really like them to set up a usability lab. Videocameras, staff that knows how to DO usability testing, test subjects (or contracts with companies to find them). All that.
  • RM101 should read the ipchains howto and stop writing about himself in the third person.

  • From the AP article, it sounds more like sourceforge's compile farm, where you basically get an account where you'll download, compile, and test stuff:

    "The founding companies said the lab will be run by a nonprofit organization that will select the software projects that gain access to the lab in an "open, neutral process."

    That's all they'll do -- they're simply going to screen the applicants, so that they don't waste their resources on every joe-shmoe with a helloworld.c. They're not going to do any kind of testing or benchmarking, a la Mindcraft. They'll just screen applicants, and it'll be up to the applicants to get the software loaded and tested.

  • You're either karma whoring or trolling, and I'll give you the benefit of the doubt on the former. So let's see...:

    Is it just me, or is assuring the quality of open source projects (both in terms of openness and functionality) more or less impossible? I mean, by its nature, open source holds no associations to any governing bodies that carry sway.

    Nope, it's just you. There's nothing inherently anti-authoritarian about open-source software (unlike perhaps for how you can make such a case for free software). In fact, open-source software often strictly adheres to the will of one central author who deigns to accept patches from others and folds them into his own project. Linux is perhaps a complicated example, but other projects like Ghostscript illustrate the point.

    Industries that market tangible products have no problems creating standardization bureaus and bodies, usually because these sorts of things can be governed in turn by governments, by qualified authorities, by laws.

    While this may be true, it's misleading. Standardization bureaus are created all the time in the absence of official government intervention -- they normally go by the name "cartels". If companies can benefit through collusion and can especially marginalize those who do not agree to participate, then they consistently have done so. All you need is a mechanism to keep them from stabbing each other in the back and fragmenting the resulting efforts, and the GPL adequately addresses that problem.

    Could the FCC have been created without respected, universally trusted leadership? Doubtful. Who then will take on the challenge of developing an overseer for open-source?

    The FCC was created by fiat, not by consensus. Moreover, its purposes --regulating a scarce commodity (spectra) and preventing broadcasters from degrading each other's signals through collision-- make for no remarkable analogy in the software industry.

    Software is quickly becoming a commodity, and initiatives like these merely encourage companies to take up an open-source project, rebrand it, and sell it. It's much more like IETF standards in that typically, an existing effort is recognized and given an official version that other commoditized versions can be patterned after. Occasionally a little kick in the pants is necessary to keep people from merely churning out yet another instant-messaging clone, but it's hardly the sort of heavy-handed operation your comment would seem to imply.
  • Each distro has its own way of customizing programs and directories -- would you want to see that standardized?

    When it comes to directories? YES. I want to know, no matter what distribution I'm using, that /etc/XYZ is in /etc/XYZ, period. Ok, doing a find / -name XYZ isn't all that hard, but depending on how big the disk is (and how full), that can take a long time. Even just moving between different releases of Red Hat can be confusing, because things aren't always in the same place.

    This especially applies to things like init scripts, which IMHO should be in the same place no matter what (in Red Hat they're in /etc/rc.d; in other distros they're elsewhere). Same goes for where you put the kernel. For the things that are essential to the system, there should be a standard place for them.
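
    A minimal sketch of the kind of probing the lack of a standard forces on portable scripts (the candidate paths below are illustrative, not a complete list):

    # Hypothetical helper: probe the usual suspects for the init script
    # directory, since there is no single standard location across distros.
    import os

    CANDIDATE_DIRS = [
        "/etc/rc.d/init.d",   # Red Hat style
        "/etc/init.d",        # Debian style
        "/sbin/init.d",       # some other SysV variants
    ]

    def find_init_dir():
        for d in CANDIDATE_DIRS:
            if os.path.isdir(d):
                return d
        return None

    if __name__ == "__main__":
        d = find_init_dir()
        print(d if d else "no known init script directory found")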


    Not reading .sig
  • While it's true that the strength of Linux is that it will always belong to everybody, it's also true that most people are only ever going to buy (or download for free) an off-the-shelf linux distribution from one of half a dozen commercial companies. This means potential fragmentation and several incompatible versions of linux, or possibly one or two "Linuxsoft" super-companies that get to decide the future direction of Linux in the same way that we hate M$ for doing with Windows.
    I would argue that some kind of semi-official standards-setting body would therefore be a good thing, as it gives the Linux community a forum to discuss the standards it would like to see in Linux. Remember, the GPL means that people will always be free to ignore the standard if they want to, and if enough people ignore it, it won't be the standard anymore. This acts as a constraint on any standards body as well: they are accountable to the Linux community in this manner.
  • > RM101 grumbles

    Hahaha. Until I read the replies, I didn't know you were referring to yourself. I thought you were talking about RMS. (RM101 = RM5 = RMS)

    --
    The fact that Linus has not been too authoritarian in the development of the kernel has lent the impression that all projects are constructed in a "will of the people" manner. But who didn't understand that Enlightenment was Rasterman's baby? Most of these projects are carefully controlled by a few individuals. Once some of these apps begin to mature, to catch up with some of their Windows counterparts, we will see some truly phenomenal work. How old are some of these projects started by single individuals?? KDE is maybe 3 years old and is coming close to getting the desktop to the mark Microsoft set in 7 years.

    Produced on a budget that would make Microsoft shudder, patched by home users and sysadmins, refined after release, and freely available. When we have truly caught Microsoft and KDE4 is on a 3-year program schedule, not a 1-year one, we're going to see some really revolutionary ideas come out of open source projects.

    For these companies, they can see that Linux will soon be on more equal footing with Microsoft. Would XFree benefit from this dandy testing lab?? You bet. Wanna write drivers for our hardware?? Here, try it. Let's face it, Linux is no longer the towel boy; it is now officially a contender.

    My boss was braggin' to his boss today that we run our company intranet on Linux, when he used to beg me to move it to NT. I was one of those guys who always craved a *nix box back in the 8086 days. Our generation grew up with computers and has learned to expect more from them.

    Most of these companies have already announced that Linux is going to play a major role in their futures; this effort is their way of helping Linux spring into prime time. We still have a lot of gaps to fill before we can go toe to toe with the champ.

    A hearty huzzah(!) to all the folks involved. Now I'm craving an SGI box with Linux.... Blender... And a big mehonchin' Raid box for my MP3's and porn.

    Thank you and goodnight.
    ~Hammy
  • Is it just me, or is assuring the quality of open source projects (both in terms of openness and functionality) more or less impossible?

    Well, I don't think so. First, there are known standards for freedom and openness - take Debian's guidelines, for example, or the Open Source Initiative you mentioned. The License Wars have left most people a little shell shocked though, so I suspect most people just want stuff to be under one of the well known licenses.

    More interesting is the idea of assuring the quality of open source software.

    Certainly, nothing's stopping them from taking some particular distribution(s) and doing really extensive testing with them. They can audit the source code, just like the OpenBSD people have. They can publish MD5 checksums of "approved" binaries, together with the sources. That will let users ensure they have the "approved" software, if that's what they want.
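
    For illustration, a minimal sketch of that verification step, assuming a published checksum list (the file name and digest below are hypothetical):

    # Sketch: verify a downloaded binary against a published MD5 checksum.
    import hashlib

    def md5sum(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    published = "9e107d9d372bb6826bd81d3542a419d6"  # from the lab's approved list
    if md5sum("approved-binary") == published:
        print("checksum matches the approved build")
    else:
        print("WARNING: binary differs from the approved build")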

    Then they can state with confidence that their hardware and software works with it. Nothing wrong with that, of course, but hopefully they will be "open" enough to work with several significant distributions, including Debian. If they do a good job, and are seen to be helping the community (finding and fixing bugs and contributing the fixes back to the maintainers of the projects) I'm sure that they will earn the respect of the Linux community (such as it is).

    This will be useful for big companies - and that's probably the target audience of this effort. Also, they could produce a Posix-compliant distribution of Linux, which might be helpful for government work. From what I hear this is mostly a matter of applying a bunch of patches (and then testing, of course).

    At the very least, this will help deal with the style of FUD that says "open source is dangerous because anyone could modify it! You wouldn't know what's running on your computer!"


    Torrey Hoffman (Azog)
  • And here is Intel's [intel.com] press release [intel.com]
  • Quoth the Associated Press:
    > Linux is an "open source" operating system
    > that anyone can modify, as long as the
    > modifications are made available for free
    > on the Internet

    Oh dear. Why do so many journalists write about this stuff without understanding it at all?
  • This may sound better than Mindcraft to us, but what about a reverse perspective?

    Imagine the following scenario.

    A couple big companies that have been using Linux as their mainstream OS for a while have started to embrace an alternate OS. (Not too big a deal)

    Now, these companies partner up with some vendors of this alternative OS to form an unbiased testing platform? I say unbiased because bias seems to be our major complaint about Mindcraft. Would this scenario seem unbiased to you folks? It doesn't to me ... it would sound like the odds were being stacked in favor of this other OS...

    Just my $.02. 8)

  • *sigh* Why can't you kids play nice?

    And as to your response: No.

    Multics was a multi-user OS that never really took off. Most of the guys that developed UNIX had started off with Multics. Even the name 'UNIX' is a play on the name of Multics. A quick web search will enlighten you.
  • Each time I hear that this, that, and the other company have formed an agenda to make a common effort to achieve something, it never happens.

    But sometimes it spins off neat projects afterwards (like Multics -> UNIX)

  • Quoth the poster:
    "You may suggest to recompile the drivers. Good idea, unless you've got a binary only driver from a company that got sued into oblivion as happened with Aureal (they won the lawsuit but went broke on expenses for their attorneys)."

    Linus Torvalds has in the past expressed no sympathy for your position. His argument is that it's easy to recompile a kernel module to support a new kernel. If you're using a binary-only module, then you have to bug your vendor - that's your "penance" for buying non-open hardware.

    I tend to agree with Linus (on this). Windows is a standard that's ossified - its standard is to suck. Bugs can't be fixed because programs depend on them. The thing about a free OS is that programs that depend on bugs can be fixed when those bugs are (pardon the unparsability).


    -Dave Turner.
  • why would Company F dish out the money for some HP-RISC with HP-UX licenses when they could get some cheaper x86 with Linux on it?

    One big reason is downtime. I realize that a good portion of downtime is related to hardware, but some of the commercial UNIX OS's might have a better record against downtime than Linux and *BSD's. Downtime == loss of lots of money. When a day's worth of downtime is equivalent to several million dollars, paying $100,000 for an OS sounds cheap.

    OT: when did HP-UX decide to rename rsh and replace it? I spent my first day on HP-UX (a year ago) trying to 'rsh' over to a different machine. GRRRR!
  • by NoWhere Man ( 68627 ) on Tuesday August 29, 2000 @05:14PM (#817858) Homepage
    Take 1 Large Room.
    Add some open sourced software.
    Add 5 tons of computer equipment.
    Stir in a number of different techs over a number of months
    Add caffeine, pizza and other assorted junk food.
    Bring to boil

    Serves large corporations

    Of course it's important that the kernel developers get hold of these machines, but people seem to be missing the significance of the application developers also having access to MP machines. Applications can be developed easily enough on uniprocessor machines, but actual hardware is needed to make sure they take advantage of the multiple processors and bandwidth offered by the big iron.

    I think it's the fact that application developers can make applications to take advantage of MP that is the main focus of this effort, and the attraction for the big names.
  • Yahoo has an article about it as well. You can find it here. [yahoo.com]

  • by CMU_Nort ( 73700 ) on Wednesday August 30, 2000 @01:40AM (#817861) Homepage
    There's also a NYTimes article about this here. [nytimes.com]

    blah blah free registration required blah blah

  • by CMU_Nort ( 73700 ) on Wednesday August 30, 2000 @01:44AM (#817862) Homepage
    CNet is also carrying a version of this story here. [cnet.com] Unlike some of the others, this isn't just a copy of the Reuters story.

  • Sorry, I parsed your previous "error in the config" as "error in the configuration file", and not an "error in the hardware configuration as the program currently sees it".

    In the case you just described, yes, it was rather idiotic; either the driver or the userland program should've acknowledged the problem and done *something* other than just blanking out.


    --
  • How hard can it be to detect an error in the config and drop back to the console - IANACP (I am not a C programmer)

    ITYM, "IANAP" (I am not a programmer). As anyone with even a small grounding in language design will tell you, syntactic errors are bloody easy to detect. Just look at all the typos that your compiler will spit out. Semantic errors, however, are another story. Bloody things are almost impossible to detect at the machine level.

    Just about any monkey (or a Turing Machine, whichever) can find that something is syntactically amiss. Tools like lex and yacc make it rather simple to parse even the most complicated of configuration formats. However, validating for semantic correctness in even a moderately complex configuration script is another matter; it is multiple orders of magnitude more difficult and computationally intensive to verify this (and for some languages, like English, whose syntactic definitions certainly do exist, the technology for verifying semantic correctness doesn't yet exist).
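
    To make the distinction concrete, here's a toy sketch using an invented key=value format; the syntax check is mechanical, while the semantic check needs domain knowledge the parser doesn't have:

    # Toy illustration of syntax vs. semantics in a key=value config.
    # The grammar check is mechanical; the semantic check needs domain
    # knowledge (here: a legal color depth), and such checks are never complete.
    def parse(text):
        conf = {}
        for n, line in enumerate(text.splitlines(), 1):
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if "=" not in line:  # syntax error: trivial to catch
                raise SyntaxError("line %d: expected key=value" % n)
            key, _, value = line.partition("=")
            conf[key.strip()] = value.strip()
        return conf

    def validate(conf):
        # Semantic check: the parser can't know that 17 is not a real depth.
        if conf.get("depth") not in ("8", "16", "24", "32"):
            raise ValueError("depth %r parses fine but is meaningless" % conf.get("depth"))

    conf = parse("depth = 17")   # parses without complaint...
    try:
        validate(conf)           # ...but fails the semantic check
    except ValueError as e:
        print("semantic error:", e)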


    --
  • by Nexx ( 75873 )

    You're forgetting that both IBM and HP are rather large companies with rather large support services divisions. Take, for example, IBM. IBM Global Services, their consultancy group, will support anything from CP/M to IBM SP2 (and probably Sun E10k) to OS/390 and AS/400 mainframes. Same with HP. They tout themselves as a one-stop support center for everything. Large companies pay huge amounts of money to these organisations for support.

    So why are they bulling ahead with this certification lab? It makes supporting these applications (yes, they support applications too) easier if they know the ins and outs of the app. They'll of course find this out while they "certify" the piece of software.

    Currently, because of the lack of certification, they will support any app, even the ones that I built at home while drinking wine that are unmaintainable clusterf*cks, if it makes them some dough. Suppose they certify someone else's OSS project, which does essentially the same thing, except missing features x and y. Will they support my app for the company? Hell no. They will tell them to move to that other app, because it's actually possible to maintain the bloody thing.

    See Caldera? See IBM? See HP? See SCO? Hell, see MS? They're all trying to move to a constant revenue model that comes from providing a service, not maintaining the one-time revenue of a product.


    --
  • Very true, but if the PHBs really valued technology they wouldn't be using x86 to the extent they are. x86 is a 20-year-old arch with MANY flaws compared to something modern like a sun4u, for example.
  • The reason these big companies are starting to back Linux is mainly a GREAT reduction in R&D costs. Plus, Linux has a lot of hype going for it, whether it deserves it or not. Many commercial UNIXes are technically superior to Linux, but why would Company F dish out the money for some HP-RISC with HP-UX licenses when they could get some cheaper x86 with Linux on it?
  • Re: Now, the article seems to indicate that this will be a separate company with backing from various larger interests: Anyone know if this is true, or will this end up as some kind of 'holding company'?
  • Um thanks. Just a little advice to Kids in the Hall fans out there: Never get drunk and read slashdot.
  • And how long after that before we start hearing people moaning on slashdot about "paper" CLE's?
  • Not really. Open Source companies are as inclined as MS to release FUD. Do you really think this thing will be totally 'independent'? Some bad findings/benchmarks, and suddenly someone gets the shits and pulls funding.

    Linux just isn't suited to this big name corporate stuff. It's hard enough trying to get small/medium companies to take a more open source attitude. Hell, it's impossible around here to get Linux installed on a single box, even as a file/print server. But I guarantee you, it's even more difficult to try to apply big-business practices to open source development.


    Simon
  • What's on the Compaq partition is the diagnostic utilities. Hit F10 or F12 when you see the block in the upper right corner.
  • Sure, there may be bad standards around, but at least you have something against which you can compare your code or your application.

    For instance, you can use an old windows driver or old windows code (or even dos) under all different versions of Windoze. This may not always work perfectly and sometimes will hang the system but at least in 80% of all cases you can. Now try to do the same thing with a device driver you got for linux 2.2.12 and use it with 2.2.16 (I'm not even talking about major releases). In about 80% of all cases it will hang the system or not even load since lots of the interfaces changed.

    You may suggest recompiling the drivers. Good idea, unless you've got a binary-only driver from a company that got sued into oblivion, as happened with Aureal (they won the lawsuit but went broke on expenses for their attorneys).

    In other words, what open source needs now are open standards, and by that I mean standards that are documented and that do not change whenever someone decides he wants to add on yet another feature. Or at least keep them backwards compatible.

  • Two of these companies already have their own version of UNIX. Why don't they just open-source AIX and/or HP-UX? I mean, not to say anything bad about the Linux team; I use it all the time. But is it really that far ahead, technologically, of the commercial unices? The whole thing sounds more like a product of the marketing departments than of R&D.
    --
  • Do we need Norton Utilities as such? The info required is already there, albeit not presented in a window, and the tools are there for monitoring. All that's really required is crash protection for XFree86 - sorry guys, DRI is cool and all, but fixing the 'screwed up config file black screen of death' problem should have had some time devoted to it. How hard can it be to detect an error in the config and drop back to the console - IANACP (I am not a C programmer) however, so any justified flames will be noted.
  • I am a mainframe programmer with 10 years' experience (yes, a dinosaur), and, to me, leaving show-stopper bugs in several consecutive releases of a system shows that something is fundamentally wrong there. The emphasis with 4.0 seemed to be more about writing a DirectX competitor than about cleaning up previous messes. My problem with X came when I upgraded to 2.2.16 with the USB backport. On 2.2.14 the mouse worked fine; on 2.2.16 it didn't, and X crashed, locking up the whole machine, not even allowing me to switch screens and kill it. Are you seriously telling me that it is hard to write a piece of code that recognises a problem with the mouse, spits out a message 'no mouse detected', and drops back to the console? Please. That sounds more like an MS excuse to me.
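
    For what it's worth, the fail-soft behavior being asked for is a few lines in any language. A hedged sketch (the device path is hypothetical, and this is not how XFree86 is actually structured):

    # Sketch of fail-soft device handling: if the pointer device can't be
    # opened, report it and exit cleanly instead of wedging the console.
    import sys

    MOUSE_DEV = "/dev/mouse"  # hypothetical device path

    def open_mouse():
        try:
            return open(MOUSE_DEV, "rb")
        except OSError as e:
            sys.stderr.write("no mouse detected (%s): %s\n" % (MOUSE_DEV, e))
            return None

    if __name__ == "__main__":
        if open_mouse() is None:
            sys.exit(1)  # drop back to the console instead of hanging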
  • Howdy, What I saw in the article was that this would be a non-profit corporation backed by all the companies named. The likelihood of there being an IPO for a non-profit company seems pretty slim to me. Anybody know of a non-profit that offers stock?
  • you say "I doubt very many of the PHBs in the world are going to pay much attention to the alternatives when there's a "Certified" option out there. "

    i answer, yeah, but PHBs now say "Linux? what's that? oh, that's hippie evil anti-profit stuff". Even if they say "we will only use software that's certified by some big corp", it's still better than admining an NT box, ne?

  • This means potential fragmentation and several incompatible versions of linux, or possibly one or two "Linuxsoft" super-companies that get to decide the future direction of Linux in the same way that we hate M$ for doing with Windows.
    This strikes me as an excellent point: adopted standards on such basic things as the directory structure level the playing field between distributions that much more. In fact, I believe it would greatly reduce the temptation for companies to standardize on one distribution. I would even argue that it gives us (SAs, DBAs, and other IT professionals) more freedom.
    As an Oracle DBA, and one who wholeheartedly endorses Oracle on Linux, I've learned that Oracle doesn't install or administer the same on all distros. In fact, the introduction of more than one distribution makes things that much more complex than they really need to be. As a result, I've decided that in the future I would standardize on one distro. If there were even minimal standards, it would make it that much easier to administer multiple distributions. I know it's a stretch for some, but I really think a base standard would be even more liberating.
  • Because, they're playing both sides of the fence. If Linux and the Open Source model ultimately fails, then they haven't given anything up, like their source code to their rivals. On the other hand, if they give Linux a nudge and it really takes off they are in a perfect position to "ride the wave". In that case, the time will come when Linux will far surpass their proprietary offerings, at which time they no longer have anything to lose by opening their code.

    I know I'm being a little naive when I say it, but Linux could finally bring the other Unix vendors together. I just hope that IBM and HP are really serious about making Linux work.
  • Right, and Compaq wrote their own BIOS because blah blah blah. Doesn't explain why they still do it 18 years later when every random boxshop just pays a nominal fee to Phoenix.

    When somebody thinks they're being smart by telling you about PC clones is where the sighing starts on my end.
  • Why is using a disk-based config program "proprietary", and why does it hinder Unix installs? The reason that Compaq does it that way is to make the configuration OS-independent, because they've always supported things like SCO UNIX (although there's a copy of Windows 95 in that system partition).

    My suspicion is that the reason that Compaq, Dell, and IBM write their own BIOSes is that they feel that the generic Phoenix sorts aren't very good. Since there really isn't a PC BIOS spec to speak of, of course there will be small proprietary differences and different bugs.
  • Maybe because Windows has Dell-supplied drivers? I don't know, I just know I have trouble :)
  • by abrager ( 175240 ) on Tuesday August 29, 2000 @04:59PM (#817884) Homepage
    Does anyone think that this will have an effect on the amount of proprietary hardware (for lack of a better word) that is included with their systems? For example, Dell saves some of its BIOS information on a hidden hard drive partition (much cheaper than CMOS). Compaq used to do the same thing on their Proliant servers (I don't know if they still do). Hopefully we'll get some standard hardware here and less trouble getting systems to work with *nix.

    Comments, anyone?

  • RM101 should read the ipchains howto and stop writing about himself in the third person.

    So what you're saying is that Linux is so insecure that I have to block services in order for it to be secure.

    RM101 does not subscribe to that philosophy. RM101 wants to use the services, not turn them off.


    --

  • RM101 grumbles that he hopes that Linux will finally get some professional testing, and the security holes in his up-to-date version of Linux will finally be fixed, so his system won't be broken into again (and he still doesn't know exactly how it was done, which is what really scares him).


    --

  • write their own BIOSes is that they feel that the generic Phoenix sorts aren't very good.

    Your suspicion would be slightly off there. It isn't that the Phoenix sort of BIOSes aren't very good, it's in fact that they are too 'good' (they offer too much flexibility).

    Here's the story we got when I was working at Gateway. A 'normal' BIOS (a Phoenix type of generic BIOS) is made for maximum performance and maximum customizability on the given hardware. An OEM doesn't necessarily want that. An OEM wants a BIOS that is 'stable' (in other words, with as little customizability as possible), and anything that gets in the way of that is 'removed' from the generic BIOS.

    Of course, this also causes some problems with performance, and is one of the reasons that Gateway is no longer allowed to show the name of the original BIOS maker on their new systems. They have to list it as a 'Gateway' specific BIOS. And if you find out what generic BIOS you can load on your system, you usually will see performance gains right away (I know I have on the two Gateways that I own). This is true of motherboard BIOSes as well as vid card BIOSes.

    An OEM wants a simple and stable BIOS, a generic BIOS maker wants a high performance, extremely customizable BIOS. That's the difference.
  • From the article it appears that the lab will be providing OSS developers the chance to have their software tested on high end hardware.

    I think the corporations are starting to realise that OSS software is good - but they need some way of testing it on enterprise-size servers before they jump into putting this software on live systems.

    I don't think their intention here is to start defining any 'standard' for Linux applications in general. Even so, perhaps some guidelines might be a good idea - more so for client apps.

    The one thing MS does well is have a relatively consistent interface. While I don't care much about UIs for a server, having some form of client app UI guidelines might be useful. I mean guidelines, not some kind of absolute, restrictive standard.

  • What, you don't think some people don't get lucky and get to do the very thing they love? Don't be a cynic.
    >What Linux lacks the most, as compared to the *BSDs (especially OpenBSD), is the AUDIT, both in security and in scalability contexts.

    OpenBSD's AUDIT is crap. Their code base doesn't grow at 1/100th of the speed Linux's does. Not only that, but they have never finished the audit, it's just one of those forever "work in progress" things (unless some miracle has happened since my last review of Open BSD).

    OpenBSD is a niche OS for security only; any requirements beyond that are always better served by another *BSD or Linux.

    By the way, Linux comes secure too, damnit (the box must be delivered turned off for Linux, just like all the abilities are turned off in OpenBSD).

    Laugh dangit.

    -Nathan
  • *shakeno*

    Linux is still missing some features in the mainstream Kernel for C2.

    Auditing & ACL...

    I know you can do ACLs with special setups - I am talking about the core kernel source.

    C2 is crap anyway, what Linux could use is the LSB putting out some requirements for security based configurations by default where you would have to open something up to get nailed by it.

    (OpenBSD people -all three of you-, I know you have that now, I don't care. If the system can't do anything but be tight, then it's useless anyway. Got SMP?)

    At any rate, does anyone know the status of the work SGI is doing for the B level certs in the Kernel?

    -Nathan
  • Aw hell, now I'm going to have to quit my job in radio and start camping out on their doorstep. Something like this going on in MY town? Hell yeah!

    *ahem* I will now come back from dreamworld and return to my regularly scheduled humdrum existence...

    ---
    Karel P Kerezman

  • aureal.sourceforge.net [sourceforge.net] if these don't work then well....
    Reading these posts, I start to wonder if the Linux community wants to get paid or not. I am sure I am going to get flamed, but are standards really that bad? Would it be all that bad if everyone used a "system" directory for all of the standard libraries? Would it be all that bad if software "vendors" were required to put their libraries in a standard directory to be Linux certified? And is it all that bad to have major hardware companies funding some real Linux work? The community is Linux's strong point. If you can focus that community a little bit with some basic standards, we wouldn't have problems like "oh, you didn't compile for Debian", or the constant problems with dependencies. Why would making things easy for a "user" (as opposed to a guru) to install, thanks to a few standards, be so bad? If people in the Linux community really hate Microsoft so much, why are they opposed to an effort that will help get Linux into the mainstream faster? We can always work on the bugs later; after all, isn't that what we do anyway, submit our code waiting and hoping for peer review so it can all get better?

    DBLO_P
  • I hate to have to be the wet blanket here, but I think these kinds of standards are something that the Linux community is better off without. As soon as we start letting major corporations assign labels as to what's "Linux Certified" and "Not Linux Certified" for Linux use, we run the risk of losing Linux's independence.

    Part of what makes working with Linux so exciting is that everybody's free to do with what they will. Each distro has its own way of customizing programs and directories -- would you want to see that standardized? IBM, Intel, and the other companies involved here are all hardware manufacturers; what's to prevent them from refusing to certify programs by competitors or that favor competitors? Sure, you don't have to use "Linux Certified" software... but let's face it, as soon as a bunch of big corporations start pushing a standard, it catches on whether it's a good standard or not -- just look at the Kerberos fiasco! I doubt very many of the PHBs in the world are going to pay much attention to the alternatives when there's a "Certified" option out there.

    Linux may remain an open source project forever, but the freedom to change it doesn't matter much if no one will use those changes. And that's exactly what will happen when you adopt standards as to what's "Linux Certified" and what's not.

  • Embrace and assimilate...

    ---
    From: HAL
    To: BigBoss, my dear partner
    Re: Linux compatibility with our new S/361 hw

    HALinux/361(R) 100% compatible, certified
    Linux1 90% compatible, certified
    Linux2 84% compatible, certified
    ...
    LinuxN 50% compatible, certified.
    ---

    Now, which one do you think your boss would choose?

    Think!

  • Yeah, maybe I am a moron, but I have my reservations about trusting this "independent" party, which is essentially formed by the big players.

  • I've been working, albeit nominally, in the Linux SH* project, providing hardware information for HP's Jornada 420, in an attempt to aid in retrofitting pocket PC's with the Linux operating system... So far, all attempts made by those working in the project have reached a "If you aren't working for Microsoft, we aren't telling you anything" response...
  • From the article:
    But an economist who has surveyed the personal computer software market contends that the 70,000 figure is grossly overstated. There are probably fewer than 10,000 Windows programs in use today and most users probably have no more than a handful on their computers, according to the economist, Richard B. McKenzie, a conservative scholar at the University of California at Irvine.
    There are hundreds of thousands, or millions, of enterprises out there that have written their own Windows applications. There is no way in hell you can justifiably limit this kind of survey to shrinkwrapped applications when you're trying to judge what the barriers to entry really are.

    The 70,000 figure is conservative if anything.

    This looks like just another piece of Microsoft-sponsored astroturfing to me.
    --
  • If Microsoft can get it for 'some configuration' of NT, we should be able to get it for Linux. The only sticking point is the testing, documentation, time and money required. Maybe this lab can help.

    Negative results are ok. Give us some negative results stated in nice clear terms and we'll fix them.

    Maybe we can go Microsoft one better and get some kind of security clearance for a machine that's actually connected to a network. :-)
    --
  • What's this "we" crap? Are you a linux developer? Just because you like the whole idea of open source doesn't mean you make one bit of difference.

    Oh yes.

    Get over yourself.

    You're in more trouble than you think, Mr. Coward
    --
  • If corporate interests turn Linux into a de facto proprietary system, which I believe is semi-inevitable, that system will ossify and become less useful. Linux users will peel off, reverting to an open, standard-free version, or ditch the thing entirely. Remember, Linux is only the means; freedom is the end.
  • I'm tired of your humor you fucking american
  • I think they need to set up a lab where I can test my open source software on the typical enterprise administration staff.
  • Certifications, labs like this, and Official Stamps of Approval mean perhaps more than they ought (corporate decision making being what it is) but that's hard to get around.

    Very, very true. I wonder how long it will be before a company like CompTIA begins to offer "standard" Linux software technician certifications? Are we going to start seeing John Doe, CLE?

    Or is this already going on and I'm a day late, dollar short as usual?
  • You're either karma whoring or trolling, and I'll give you the benefit of the doubt on the former.

    Whoring...i think. Give the newbie some time (look at my big number). I'd prefer to be known as a karma pimp, now that i think about it.

    Standardization bureaus are created all the time in the absence of official government intervention -- they normally go by the name "cartels".

    Touché. Of course, no one who is aware of a cartel trusts it (for those of you who haven't had enough coffee, i mean any sort of economic or political collusion, in the dirty sense of the word, not drug organizations. Necessarily.), and what i really meant to imply in my comment was that it was difficult to establish a legitimate organization with widespread influence on the web without a consensus (or close to it) among industry leaders. I really hope that when you say the GPL will keep developers from stabbing each other in the back, you're not insinuating that the GNU project is a cartel ;)

    Moreover, its purposes --regulating a scarce commodity (spectra) and preventing broadcasters from degrading each other's signals through collision-- make for no remarkable analogy in the software industry.

    I knew someone would bust me for this...i meant that really large standards organization that brands all sorts of things in the US, but being Canadian i can't remember what it's called (ANSI?). In any case, our version is the CSA. Everything from bike helmets to Barbie dolls gets CSA stamped north of the border. If the point you're making is that commercial industry standards organizations have no relationship to software standards, then i would have to disagree. In both cases the enterprise fails without a general acceptance by industry, government, and consumers.

    Occasionally a little kick in the pants is necessary to keep people from merely churning out yet another instant-messaging clone, but it's hardly the sort of heavy-handed operation your comment would seem to imply.

    Honestly, you're probably right, and that's why i put in the brief disclaimer ("do we even need the standards?" to paraphrase myself). Evidence of the fact is that we're here, right now, and i'm perfectly happy on my linux box with un-certified software written by code monkeys like me to do all sorts of things without any sort of 'official' approval whatsoever.

    -j

  • by Jon_Sy ( 225913 ) <{big_guy_} {at} {hotmail.com}> on Tuesday August 29, 2000 @05:09PM (#817908)
    Certifications, labs like this, and Official Stamps of Approval mean perhaps more than they ought (corporate decision making being what it is) but that's hard to get around. And it sounds like they'll get to play with cool toys! ;)

    Is it just me, or is assuring the quality of open source projects (both in terms of openness and functionality) more or less impossible? I mean, by its nature, open source holds no associations to any governing bodies that carry sway. There's the argument that accepted standards organizations for open source just don't exist, but that's not even true...it's more a case of public trust being a fickle thing.

    Industries that market tangible products have no problems creating standardization bureaus and bodies, usually because these sorts of things can be governed in turn by governments, by qualified authorities, by laws. Could the FCC have been created without respected, universally trusted leadership? Doubtful. Who then will take on the challenge of developing an overseer for open-source?

    It has been tried...there are any number of open-source websites that act as collectives for development. There have been attempts to create institutions of authority as well, notably the group led by Eric S. Raymond, the Open Source Initiative [opensource.org], which has had undetermined effectiveness, as far as i can tell. Still, i can't help but think that, currently, excellent open source becomes accepted by reputation, and reputation alone.

    I wonder if this lab will have the power to start the responsible monitoring of open source...just an interesting idea. Really, do we even need such a system, or can the open Freshmeat bazaar and word of mouth serve as adequate testing grounds? Sometimes i think it would take an organization with direct influence over the net, like the IETF [ietf.org] or ISOC [isoc.org] to get the ball rolling...from innovators to watchdogs.

    If anyone else knows of any other certification programs for open source, i'd like to hear about them.

    -j

  • by Kierthos ( 225954 ) on Tuesday August 29, 2000 @05:18PM (#817909) Homepage
    Potentially, it could be a very Good Thing if these companies are all working together, and as the contributor points out, the current plan beats the heck out of Mindcraft (the more you pay, the better your benchmark). It will also, if it doesn't go up in smoke, provide for a lot more portability of applications between various Unix flavours. (I've seen a few Unix apps that didn't work quite the same depending on what flavour you had running.) Or at least, that's the theory...

    Now, the article seems to indicate that this will be a separate company with backing from various larger interests: Anyone know if this is true, or will this end up as some kind of 'holding company'? Also, if it is a separate company, any word on an IPO?

    Kierthos
  • man, due diligence is SUCH a binding term.
  • What Linux lacks the most, as compared to the *BSDs (especially OpenBSD), is the AUDIT, both in security and in scalability contexts. Hopefully, the new "neutral" lab will carry out such a task, and let other developers continue with their innovative tasks in creating even more new and exciting utilities. BTW, one thing the Linux scene still lacks is a utility that resembles Norton Utilities from the DOS/Windoze arena. Will someone begin such a thing?


  • Monitoring per se is definitely NOT enough!

    What Linux needs the most is someone to do a neutral AUDIT, a massive and thorough audit is what will make Linux be respected in many corporate boardrooms.

    Linux may have the reputation of being a "replacement" for M$, but it has yet to acquire the reputation of being a "safe" and "scalable" OS.

    *BSD has passed the "scalability" tests, and one flavor of it, OpenBSD, has even passed a rigorous security audit. Therefore, it is high time that Linux did the same thing.

    I am not here to jump start a Linux vs. *BSD debate. I am merely pointing out what Linux needs to do to gain the trust of the suits who have the ultimate control over what the Fortune-500 will do in the coming years.

  • I think some of you are misinterpreting the motivation of these companies to standardize open software. Facts:

    1. All of the listed "partners" in this project have a vested interest in the success of Linux. The interest of the Linux distributors is obvious. The interest of the hardware OEMs is that they've all announced support for the OS and development within it.
    2. Open software is cheaper for a hardware OEM to support than proprietary software, so there is a natural desire to gravitate towards it.
    3. The majority of clients of hardware OEMs (particularly UNIX hardware OEMs) are large companies.
    4. Large companies do not, in general, blindly install large-scale software/patch updates just because their hardware/software distributor tells them to. Such installations require an extensive certification process. If the software does not pass the certification process, the software does not get installed and the distributor does not get their money.
    5. There is currently no standard way of certifying that Linux (for instance) meets the various performance and security requirements that the average business customer demands.
    6. Certification requirements are largely the same for a given type of application.
    7. Linux (at least the important bits) will largely be the same no matter which vendor is supplying it. Thereby multiple certifications by multiple vendors would be a massive waste of time, effort, and money if they're all hawking basically the same software.
    8. A politically-balanced certification authority will give business customers a high degree of confidence that the software meets a wide degree of standards and performance requirements.
    9. An "objective" certification authority (one independent of the interests of the OEMs/vendors) may end up being ignored/discredited if its results are not politically acceptable to the vendor. This is not politically possible if the vendor sponsors the certification process.
    10. Because its agenda is not always obvious, an "objective" authority is more susceptible to corruption than a "not-so-objective" authority.

    Therefore: a joint effort amongst the interested companies seems to be the most economical way of ensuring that an open software product is commercially viable in an industrial setting.
