
 



GNU is Not Unix

Linux and GNU at their best 42

Mapc writes "...A couple of years ago (in 1990) there was a study called fuzz which checked the quality of UNIX utilities. It has been revisited since then - in 1995 (and in May 1998?). Interestingly enough, the study shows that the GNU and Linux utilities are the best made (lowest fail ratio, at 7-9%). The paper is here, and the original fuzz program lives on the ftp as well. Test it on Solaris 7 and tell us the results"
This discussion has been archived. No new comments can be posted.

  • It does indeed work fine on Aladdin Ghostscript 5.50 (MGv, the best GS frontend, works quite nicely too, I might add :).
  • ...which was the point alluded to before.

    Funny. Quality is impossible in business, but unstoppable as a non-business activity.
  • Fuzz is really just one form of black-box testing. It's pretty primitive, surprisingly informative, and would be well suited to a standard set of GNU or OSS test utilities. In the same way projects run continuous builds of submitted software, a fuzz server (or multiple servers) could hammer away at the resulting binaries.

    Are there other tests in the GNU arsenal?
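A minimal sketch of the black-box idea described above (this is an illustrative loop, not the original fuzz program from the paper): generate random input, run a utility on it, and treat a signal exit as a failure.

```shell
#!/bin/sh
# Illustrative fuzz loop (an assumption of how such a test could look,
# not the paper's actual harness). Feed random bytes to a utility and
# flag crashes: a shell exit status above 128 means the process died
# on a signal (e.g. SIGSEGV).
for i in 1 2 3; do
    head -c 1024 /dev/urandom > fuzz-input.$i
    sort fuzz-input.$i > /dev/null 2>&1
    status=$?
    if [ "$status" -gt 128 ]; then
        echo "sort crashed on fuzz-input.$i (status $status)"
    fi
    rm -f fuzz-input.$i
done
echo "fuzz pass complete"
```

A continuous "fuzz server" would simply run such a loop forever against freshly built binaries.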

  • by Dom2 ( 838 )
    The BSD utilities are in the BSD distributions. What's more, their source code is under ftp.freebsd.org, ftp.openbsd.org and ftp.netbsd.org.

    They are usually a lot simpler than their GNU counterparts.
  • Isn't the failure rate of UNIX utilities affected by whether you have root access or not? Since most Solaris boxes are owned by companies with very few root users, and most Linux boxes are owned by individuals with root access, of course the failure rate is going to be lower.
  • GNU Ghostscript 4.03 (1998-5-1) works fine with this document. Be sure you're running the latest version before reporting bugs.
  • What would be useful is to have a small team of people doing this sort of testing continually, rather than every five years, to provide bug reports and feedback to coders. Most people find formal testing a chore so they don't bother, but it looks like this sort of "low-knowledge" testing could be done by anybody for any application.

    Why don't we go through that list of utilities the good Professor said failed under Linux and get those bug reports filed, rather than hope that the coders see this paper for themselves? This looks like a perfect means by which non-coders can give something back to the Open Source community.

    Worth a thought?
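    The "low-knowledge" sweep proposed above could be as simple as a shell loop over a list of utilities (the utility names here are just examples):

```shell
#!/bin/sh
# Hypothetical sweep: pipe random bytes into each utility on a list and
# report any that die on a signal (exit status above 128). Non-coders
# could run something like this periodically and file bug reports for
# any crashes it turns up.
for util in sort uniq head tail wc; do
    head -c 512 /dev/urandom | "$util" > /dev/null 2>&1
    if [ $? -gt 128 ]; then
        echo "$util: crashed on random input -- worth a bug report"
    fi
done
```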
  • You can view HTML in a command-line environment as well, either with Lynx, an HTML-to-text converter, or just with more (crude but effective). Those old man pages need converting.
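    For instance (the sed fallback is the "crude but effective" route; the lynx branch assumes it is installed):

```shell
#!/bin/sh
# Make a tiny HTML page and render it as plain text on the console.
printf '<h1>Fuzz Revisited</h1><p>A summary of the results.</p>\n' > page.html
if command -v lynx >/dev/null 2>&1; then
    lynx -dump page.html            # proper text rendering
else
    sed -e 's/<[^>]*>//g' page.html # crude tag stripping
fi
rm -f page.html
```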
  • by Gus ( 2568 )
    I had the good fortune to see this paper presented after the 1995 revisiting. Professor Miller gave an excellent talk, and is the best CS professor I ever had.

  • A brief summary of the results can be found here [wisc.edu].

    Also, there is no mention of any 1998 revisit to the study, nor did Prof. Miller mention it while I was working for him in the first half of last year. The "last modified" date on the file is in 1995, so there was probably not a revisit last year.

  • The failure rate of Unix utilities under these conditions has nothing to do with root access. These tests measured the ability to handle strange and unusual input data; they could be run as any user.

  • This is not really a cause for celebration. A 7-9% failure rate on an incredibly primitive test is far too high. That commercial vendors are even worse does not make it any better. This kind of stuff damages the reputation of *nix as a whole. If anything it should be a call to arms. It would be interesting to get the source code and see how it performs on an up-to-date Linux system. I wish I had more time to do a bit of this sort of testing/fixing.
  • Regarding the duality of options, frankly, '-i' means very little to a new user, but '--initialize' provides a hint as to what the option might accomplish. The idea behind long options is to provide a human-parsable alternative to the shorter versions. The overhead of adding this support is extremely low, and the benefits for first-time users more than pay for it.


    Info format is awful. You'll hear no argument from me. But think about when it was proposed; the only real alternative was roff. Man pages address only half of an important two-part problem: how do I learn this, and how do I look up information after I've learned it? Man pages solve the latter. SGML manuals, properly written, can solve both, providing both a tutorial and a reference.


    Attracting newbies isn't the goal; helping people use the system is, on both counts.
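    The short/long duality in practice, illustrated with GNU ls (both spellings are equivalent; the long one documents itself):

```shell
# Two spellings of the same option in GNU ls: a first-time user can
# guess what --all does, while -a has to be looked up.
ls -a       # terse form
ls --all    # GNU long form, same behavior
```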

  • Those failure rates are old, and if run on the current GNU tools, you should get better results (since some of the failures have been fixed since then).

    It would be interesting to re-run the tests with the latest tool versions.

  • Failure reading a paper about the quality and low failure rate of GNU utilities on a GNU utility.
  • I just thought you might want to know that there is a project [pacbell.net] specifically aimed at fixing the bugs described in this paper (the 6-9% failure rate). (In fact, many of them have been fixed.)
  • Here.. I converted the file to PDF 3.0 if anyone is interested. If you need it in 2.x e-mail me. It's only 50kb in 3.0 format.

    http://lonestar.texas.net/~landrum/fuzz-revisited.pdf [texas.net]
  • My web usage meter has gone up a bit since I posted this, so I guess y'all have used it.

    Yesterday my total usage for the week was 18.45 megabits (my ISP is weird and measures it in bits) and today my total for the week is 72.43 megabits. Since my other sites on there basically suck, I guess this thing is popular :)
  • Can I get his report as a zipped RTF, or as an acrobat file, or HTML, or anything other than a Tar/Gzipped _Postscript_ file?

    Kris.
  • First of all, this test dates from 1995, if it's true that it wasn't revisited in 1998.
    GNU utilities have gotten a LOT better since then, and measuring by the size of the patch clusters for Solaris, Sun hasn't made much progress.
    And secondly, the source is on the ftp as well; fetch it and try it on anything you want.
    When I get a new computer, the first thing I do is wipe the native utilities and install the GNU ones.
    They're much more consistent, faster and better.
    Linux wouldn't be what it is today if not for GNU.
    I think that the GNU people deserve a big compliment.
  • What ? GNU the worst set of utilities ?

    I don't agree a single bit with that. Without all the wonderful GNU utilities, my life would be really painful.

    The -- for long options is a nice feature. Why do we need two options, you ask? Why not? What's wrong with it? I think it's well worth it. They do everything your normal non-GNU utilities do, and more. I don't use long options that much, but I don't have a problem with 'em. I think the typical newbie you want to attract will find programs easier to use with long options.

    Well, I wouldn't completely agree about the info stuff either. While it's true that GNU should put more attention into the manuals, documentation in a hypertext format is really better when you want to learn how to use *new* stuff. As I see it, both things, manual and info, are useful and have different objectives. Maybe info is not the format to use, but hypertext is the thing. And I have found info useful anyway.

    So, you see *data* saying GNU behaves better than anything else and you say they're the worst set of utilities out there? You think they cheated, or do you simply care about "-- and -" and "manual pages missing" more than having a utility do what it's meant to do?

    Azulejo.
  • by mattc ( 12417 )
    The BSD utilities ARE the GNU utilities. At least on the newer FreeBSD systems this is true.
  • Sorry folks, but the GNU utilities are the worst set of utilities out there. What is all this -- and - crap? Why do we need two options that do the same thing? Why are half the manual pages missing? "Read the info page instead," a.k.a. "our programs are too bloated to use normal documentation methods." Why does it seem that everyday programs like "tar" have been made deliberately difficult to use? I sure don't know. Linux is my favorite OS, but we are NEVER going to attract newbies until we get our basic tool set cleaned up!


    I suggest proponents of "creeping featurism" read Gancarz's "The Unix Philosophy" -- and someone send the FSF a copy while you're at it ;)
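    For what it's worth, the "-- and -" duality being complained about looks like this in GNU tar, where both commands produce the same archive:

```shell
#!/bin/sh
# GNU tar accepts terse and long spellings of the same options;
# neither is required, they are simply two names for one thing.
mkdir -p demo && echo hello > demo/file.txt
tar -czf short.tar.gz demo
tar --create --gzip --file long.tar.gz demo
tar -tzf short.tar.gz    # both archives list identical contents
tar -tzf long.tar.gz
rm -rf demo short.tar.gz long.tar.gz
```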
