Editorial

Systems Research Is Dead?

Manoj writes "Rob Pike of Bell Labs (Yes, that [Rob Pike]) says that systems software research is irrelevant. At a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant. Where is innovation? At Microsoft, mostly. Exercise: compare MS software in 1990 vs MS software today. He states that Microsoft has been working hard, and that on many (not all) dimensions, their products are superior technically. Linux is the hot new thing, but it is merely a copy of the same old stuff. And anyway, the exciting thing about Linux is the development model, and researchers contributed little to that. He states that the excitement, the art, of systems research is gone. "
  • by Jason Earl ( 1894 ) on Tuesday June 06, 2000 @08:16AM (#1022023) Homepage Journal

    Mr. Pike is just upset because his beloved innovative Plan 9 is being completely and totally eclipsed by Unix-like operating systems like Linux and FreeBSD.

    He bemoans the fact that computer scientists still tend to use such crufty old tools as Unix, Emacs, and TeX, despite the fact that these tools are still very capable and that they grow more capable every day. The Novelty that Mr. Pike is so obsessed with is only useful when it constitutes an improvement over what is currently in use.

    Perhaps the most absurd part of Mr. Pike's lament is his comments on standards. He states that "With so much externally imposed structure, there is little slop left for novelty." Clearly, however, this is both untrue and unfair. Mr. Pike could easily create his own little completely novel computer world. He undoubtedly has the talents. However, unless this new system allows end users to do the things that they traditionally use their computers for, what's the point? Why would I want an innovative new system that can't send email or surf the web (or run Emacs for that matter)? I wouldn't, no matter how novel it might be.

    More importantly it is certainly possible to innovate within the standards that he complains about. Napster, Gnutella, and FreeNet are all innovative ways of sharing files, yet all of them rely on existing standards. Mr. Pike also totally belittles the success of scripting languages like Perl, Tcl, and Python. I certainly don't use C or C++ on my new development projects. These languages all represent innovation, but they probably aren't available for Plan 9, so they don't count.

    If Plan 9 were in Linux's position you can bet that Mr. Pike would be singing a different tune, even if all Plan 9 were being used for was to run Emacs, TeX, and Netscape.

  • They gave us a new UI to Windows 3.1 and called it Windows 95.
    -They created NT by adding features of VMS and Unix to Windows 3.1.


    Now wait a second, let's not get carried away. I'm sure you realize these statements (especially the second one) are false. Windows 95, as horrific as it is, isn't just Windows 3.1 with a new UI. About all you can say is they borrowed some of the code. They are different.

    And NT has about 0 to do with Windows 3.1. Hell, Dave Cutler (the architect of NT) didn't want *any* compatibility with Win16 or DOS. The compatibility exists now as a layer on top of NT.

    --GnrcMan--
  • I think Microsoft started to make strides in improving their software through research when they hired Nathan Myhrvold, way back in 1986. Myhrvold took a leave of absence from MSFT in mid-1999 [microsoft.com] (which may be permanent), but he seems to have been the driving force behind the development of a research business at Microsoft.

    Microsoft often doesn't get much credit from this community, and they certainly don't receive praise from me too frequently. But, I think Rob Pike deserves credit for having the guts to talk about their research in positive terms. Microsoft has improved the state of end-user computing in a number of ways, even if many of us disagree with them on their design decisions.
    --

    Dave Aiello

  • by YoJ ( 20860 ) on Tuesday June 06, 2000 @07:43AM (#1022027) Journal
    Rob Pike is right. Systems research is waning, both in the amount of research done and in its influence. New operating systems are hard to find (there are lots of experimental kernels, but not too many complete systems). This is sad, because I think operating systems have a long way to go before they are finished evolving. The Unix model has stood the test of time. Other systems have come and gone, but the Unix model remains. It is like an old cathedral. Many cathedrals fall, so new builders base their designs on the ones that kept standing.

    Here are some random features I want to see in an operating system.

    • You turn on the computer and it works instantly.
    • Whenever you feel like stopping you turn off the computer.
    • You never have to save your files, you always work with up-to-date information
    • You never have to drag or resize a window
    • My mom can store her weaving project ideas on it without any help
    • You never have to remember obscure names, everything is built out of a small set of simple blocks
    • You never think about "connecting to the internet", you just work with data that happens to be located somewhere else
    • The kernel continually decides how to configure itself for maximum efficiency for the task at hand
    • There are never any library problems -- library specifications are written in a formal mathematical language, which can be verified against the library code
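    On that last item, one existing flavor of the idea is worth a pointer: contract annotations such as ACSL, which tools like Frama-C attempt to verify against the C implementation. A rough, hypothetical sketch (the function and its contract are invented for illustration, and the syntax should be treated as approximate):

        /* find.h -- a made-up library interface with a machine-checkable spec.
         * The annotation states what the function promises, independently of
         * any implementation; a verifier then checks the code against it.
         */

        /*@ requires n >= 0;
          @ requires \valid_read(a + (0 .. n-1));
          @ assigns \nothing;
          @ ensures \result == -1 || (0 <= \result < n && a[\result] == key);
          @ ensures \result == -1 ==>
          @   \forall integer i; 0 <= i < n ==> a[i] != key;
          @*/
        int find(const int *a, int n, int key);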
    nojw
  • by Matthew Weigel ( 888 ) on Tuesday June 06, 2000 @07:44AM (#1022030) Homepage Journal
    Give Microsoft credit for attempting to create an object-oriented operating system, while trying to maintain compatibility with the past. If you look at the internals, there is a considerable amount of power in their object methods.
    No. I'll give them that credit when they deserve it; as it is, OS/2 provides a real object-oriented API, and maintains roughly equivalent compatibility with the past (I am comparing OS/2 and Windows98 here, so 'compatibility' is used in reference to DOS and Windows 3.1). OS/2 also doesn't have the DOS background that Windows98 does, had preemptive multitasking at its inception, and is mostly dead because of a feud between IBM and MS.

    Far more than Apple, I might point out, who has taken 16 years to give us preemptive multitasking (technically, they still haven't, of course).
    First of all, MacOS X Server has been shipping for, I believe, a year now. If you regard Jobs as being Apple, then Jobs has been shipping preemptive multitasking since before OS/2 had a GUI (which is before Windows had preemptive multitasking too), and he's been working on bringing that back to Apple since '96.

    Further, Apple hasn't delivered yet because they're more ambitious. They've gone through several attempts that simply didn't live up to their standards. If you want to look at places that innovate with backwards compatibility, look at Apple -- architecture change, no problem. Complete replacement of everything from the kernel to the userland, your old programs will still work. MacOS X is really innovative; it's more modern than UNIX, more stable than Windows NT or OS/2, and easier to use than anything.
  • by jpallas ( 119914 ) on Tuesday June 06, 2000 @08:17AM (#1022032)
    Your example doesn't support your argument:
    Is there anything new that Linux has added, beyond the model? (Which it just borrowed from the FSF, anyway!) Yes! The aforementioned CODA is a good example.
    According to the Coda documentation,
    Coda was originally implemented on Mach 2.6 and has recently been ported to Linux, NetBSD and FreeBSD.
    So, Linux didn't contribute anything to Coda except a means of distributing the work. Also, Coda is more than ten years old as an idea (the first paper is from 1987), which is in line with Pike's claim that research has been stagnant for ten years.
  • Virtual consoles, one of the delights of many a Linux user, don't exist for DOS, Windows, or many flavours of Unix.

    SCO (yes, SCO) has had virtual consoles since at least 1987 with the Xenix 2.2 system. Which, incidentally, ran on a 286 (it used swapping, not paging).

  • There is certainly value and merit in having made UNIX more accessible to "mere mortals."

    But that isn't systems research.

    Pike is observing that there isn't effort going into finding the abstractions and combinations thereof that can provide systems that might be downright better than UNIX.

    Making Linux more "user friendly" may be a worthy enough goal, but that falls into the category of "Development," not "Research."

  • He states that the excitement, the art, of systems research is gone

    That sounds... horrible to my ears! I really enjoy digital electronics/CPU/system software development. That's what I studied at the Uni, and I dream of working in that field. Please don't tell me it's too late!

  • I see much misinterpretation of the term "systems programming." It has become a euphemism for low-level bit twiddling and device driver hacking. In fact, the term is quite literal, meaning "the programming of complete systems." If you look back at the original UNIX, it was an OS, a user environment, and a programming language, all of which were developed to work together.

    A goal of systems programming is the usability of the entire package. This is one of the stumbling blocks for what Linux has grown into. The basic command line environment is comparable to the original UNIX in this regard. Cryptic, but powerful for a certain type of user and application. But that's where the plan stops. GNU tools tend to be messy critters, with way too many command line options. Look at the man page for ls, you'll see a note about how the man page is no longer maintained (use texinfo instead). X Windows feels like a giant hack to bring UNIX into the bitmapped age. The desktop environments for Linux all feel like attempts to bring Microsoft Windows into the realm of free software, but they're duplicating the same mistakes: too many gadgets, too many meaningless icons, too much focus on fiddling, too many questionable interface decisions. In the end, we have an interface that's not sure what it wants to be, outside of a slap at Microsoft's market share, put on top of a byzantine graphics system that is generally disliked, on top of mostly unrelated early 1970s era nuts and bolts. It is free, and this is good, but it can be viewed as a way of keeping programmers running in circles until something truly new comes along and makes everyone re-assess their goals.

    It is also worth considering that the era of big operating systems is coming to a close. Who could have believed that a Palm, with its little 68K chip, would be so bloody useful? Or that the Game Boy is closing in on 100,000,000 units (that's 100 million)? You could argue that web browsing and word processing and coding are outside the realm of such toys, but is that really true? There's a cool little Forth for the Pilot that's probably the slickest programming system I've seen in years. And there's going to be cool stuff in the future from other non-desktop hardware. Is trying to tack Windows onto Linux really such a worthwhile endeavor?
  • by GnrcMan ( 53534 ) on Tuesday June 06, 2000 @07:49AM (#1022043) Homepage
    Is there even one core product on the Microsoft line that is an original Microsoft innovation?

    Of course there is. Bob.

    --GnrcMan--
  • Where is innovation? At Microsoft, mostly. Exercise: compare MS software in 1990 vs MS software today.

    But ask yourself this: where are all those companies which invented most (if not all) of this new technology today? Consumed by Microsoft. Take the most recent example: Microsoft is now a major player when it comes to voice control in software. Surprise, surprise; I happen to remember a company in Belgium which was (very) small but a very strong player in this market sector. And they started an alliance with Microsoft. Today that company is no more and Microsoft presents a new technology.

    Innovation? Yeah right... But not in the market sector we are talking about now.

    He states that the excitement, the art, of systems research is gone.

    And you are truly surprised? C'mon, where lies the fun if all your hard work will be swallowed by one massive "monster" and you can do absolutely *nothing* about it? And to prove this, IMHO, very narrow-minded statement totally wrong: if this were true, why is it that there are a lot of people out there who still enjoy developing stuff and actually make progress? Dunno about you guys, but I call stuff like figuring out how to operate DVDs (read: black boxes) pretty impressive systems research. If this were no fun, as claimed, that could not happen.

  • I'd submitted this story a while back, and HTMLized it [rice.edu] for easy viewing. HTH.
  • by b_pretender ( 105284 ) on Tuesday June 06, 2000 @08:21AM (#1022050)
    Get off your stupid soapbox about amiga!

    If anyone should Be on a Soapbox bragging about innovative OSes, it should Be someone who's talking about a high-caliBer OS. Behold the one true Multimedia Befitting OS. Benchmarking shows that this OS Belongs among the Best OSes in performance. Of course, I'm speaking on Behalf of...
    darn I forgot!

    Of course, the company essentially took a briBe by ditching their desktop platform. Now we have to scriBe on an emBedded version if we even want to Begin to use this OS.

    (Please don't laBel me a BeOS fan, I just thought this was funny ;-)
    It's amazing what 'grep be /usr/dict/linux.words' will do!
  • Most of these "IT" curricula are "learn the latest language" crap that can be outdated in a couple of years. You end up with zombies that don't understand the fundamentals and just know how to script/code certain things.

    Good computer science curricula focus on theory, and theory is something that lasts a lot longer than learning language X. I can't stand the number of people coming out of school claiming to compete in this field who barely know what an algorithm is, let alone how to analyse one. (Big-O complexity, what???)

    Come to think of it, we had so much theory we didn't even get formally introduced to programming in C. It was just expected of you to pick it up. Same with Ada and other languages.
  • by b_pretender ( 105284 ) on Tuesday June 06, 2000 @07:51AM (#1022052)
    If it ain't broke, then don't fix it!

    As far as I'm concerned, the parts of Unix that Linux copies are for the most part, good.

    I do think that Microsoft has come a long way with their line of products. This is a product of having a huge market share and constantly increasing it by adding new features and expanding into new markets. It's easy for large corporations to spend money making their products better (or at least making them look better).

    Now it is time for Linux to show the power of a huge base of developers. Since the '90s, Linux has had the power of many programmers putting their (not necessarily marketing) skills into a product that they like and use.

    It is apparent that Microsoft has put a lot of money into products when you look at what they did/do. They started with some existing technology and continuously morphed it with new technology. With lots of money and market share they were willing to add new features without always ensuring that the existing features were refined well. Having a large developer population, on the other hand, means that you don't necessarily have as much capital, i.e. you're not as willing to invest in new 'features'. Concentration is focused on making things work smoothly. Ideas are not purchased, they are discovered/invented. Sure, Linux is almost Unix, but that sure was a good place to start.

    It'll be great to see where the current Linux momentum takes us. Will there be a large new generation of coders, all willing to contribute to the open-source movement?

    --
  • Scientists who work in the natural world are always bringing new technology (i.e. hardware!) to bear on "old" problems and discovering new things. This seems to me to be little different from updating UNIX to run on new microprocessors. It's not invitation-to-Stockholm research, certainly, but it is research. I'm surprised at Pike's attitude. Perhaps he was one of those folks who urged their Bell Labs colleagues Penzias and Wilson to stop working on that old radio telescope microwave noise problem and do something new for a change. And I really think that research into multiprocessor high-performance computing *is* thriving, ongoing systems research into new things. I don't mean to say people shouldn't try and develop radical new approaches to systems, that would be great I agree, but it's also exciting to explore new heights in old systems as well.
  • Hey, Linux has come a long way since 1990. In fact, 100% of the features in the new 2.4 kernel did not exist then.

    He does have a bit of a point that Linux has mostly been about copying others. But that's not necessarily the rule. ReiserFS is pretty innovative.
  • I do not hate Microsoft, only their business practices; i.e., I do not think that they play fair.

    Did they really invent COM? Or did they get the idea from someone else?

    They did not invent the IDE either; I remember using Pascal and C IDEs way before Microsoft did anything with an IDE. I thought Borland had its IDEs way before Microsoft.

    Other than COM, I cannot think of anything that was not already in existence; all they did was extend and embrace.

    send flames > /dev/null

  • Read my (BoLean) translation-to-text comment here [slashdot.org]. For a quick method of converting PostScript to text:

    Step 1: View the source of the file in a text editor.

    Step 2: Replace every "(" (open paren) with an opening script HTML tag.

    Step 3: Replace every ")" (close paren) with a break tag followed by a closing script tag.

    Step 4: Save the file as filename.html and view it in a browser.

    The output is reasonably legible.
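    If you'd rather script that than hand-edit the file, below is a rough equivalent as a small C filter. It simply dumps whatever sits inside parentheses (PostScript string literals); backslash escapes are not interpreted, so expect output about as rough as the HTML trick above. The file and program names are just placeholders.

        /* pstext.c -- crude PostScript-to-text filter (sketch only).
         * Prints the contents of parenthesized string literals, one per line.
         * Tracks paren nesting so nested parens inside a string are kept,
         * but backslash escape sequences are passed through untouched.
         *
         * Build: cc -o pstext pstext.c
         * Use:   ./pstext < paper.ps > paper.txt
         */
        #include <stdio.h>

        int main(void)
        {
            int c, depth = 0;

            while ((c = getchar()) != EOF) {
                if (c == '(') {
                    if (depth++ == 0)
                        continue;       /* skip the outermost opening paren */
                } else if (c == ')') {
                    if (depth > 0 && --depth == 0) {
                        putchar('\n');  /* end of a string literal */
                        continue;
                    }
                }
                if (depth > 0)
                    putchar(c);
            }
            return 0;
        }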

  • The only innovation between 1990 and 2000 was at Microsoft? Wasn't there a little thing called a "browser" invented somewhere around there outside of Microsoft?

    Now, this is amusing.

    Pike was talking about innovation in operating system design, and I think we can all agree that a browser is not an integral part of any OS.

    Well, except for Microsoft, who argue explicitly that the browser is part of the OS. Now, you might claim that the argument that "the browser is the OS" is innovative, except that then it's not clear that Sun (with the HotJava browser, and the idea that Java was a platform) wasn't the first to make that claim.

    On the other hand, I have to agree with Pike that there has been relatively little innovation on the operating systems research front, or at least innovation that has had any commercial success. I'm not an expert on the topic, but it's very possible that things like COM, maligned as much as they may be around here, are about as innovative as anything else. (Or not, but that will be a separate post from me...)

  • by pkj ( 64294 ) on Tuesday June 06, 2000 @08:23AM (#1022065)
    Ok, first off. These are slides, which are in most cases pretty meaningless by themselves. I'm certain that the real meat of Mr. Pike's presentation is his talk that accompanies them. Unfortunately, that does not seem to be available, so I can only comment on what is presented.

    My biggest beef is that he treats Unix (and Linux) as if they were a static entity that never changes, but nothing could be further from the truth. Linux (and most Unices) have embraced the results of all new operating systems research. In fact, most research is actually done using Unix as a base, even though the technologies are by no means Unix-specific. SMP, threads, IPv6, journaling and global filesystems, real-time extensions, just to name a few.

    The point is that Unix is so modular that you can rip out entire pieces of it and replace them with something better. After all, at the core, what really is Unix? Do you define it as the V7 system call set? Or do you define it as the implementation by a specific vendor?

    What really gets my goat is the statement comparing MS software in 1990 to 2000. That's trivial because MS completely ignored the fruits of the research of the previous 20 years. It wasn't until the advent of NT that they even began to try to do things right. But even then they blew it by integrating the GUI into the core of their OS. Grrr...

    The statement of "twenty years ago, students would be exposed to many operating systems, each with good and bad points" is pretty much bunk for the same reason. Over the past 20 years, Unix has tried to amend the bad points, and incorporate much of the best. Is this a bad thing?

    Lambasting gcc and emacs in favour of Visual Studio? I'll stick with my current tools, thank-you-very-much! VS may look pretty, but what does it really do that I haven't been able to do with my current tools for the past 10 years? Ok, except lock me into one very specific platform that I cannot customize. And what's this got to do with operating system research in the first place?

    Statements like "Linux's [GUI] interface isn't even as good as Windows" are what really get my goat. Rob, what were you thinking when you wrote this? Surely you know that the only interaction with Linux is by system calls to the kernel. What is the GUI? Well, in most cases that would be a glob of X/Windows, and a desktop package running on top of that. But gee, take a look... People are developing all sorts of new interface paradigms that do not use X or look anything like what is currently being used. And where do they run? Yes! Linux and other Unix platforms.

    Yeah, sure, I'll agree that the number of OS research breakthroughs has been dwindling over the past decade, but that's to be expected. Much of the easy, obvious, and general stuff has been well researched and documented. What's left is the really hard esoteric stuff; the stuff that most people just do not have enough background to understand.

    Maybe Unix (or Windows, or the presentation to the user of each) has become like the automobile of the transportation industry. Most automobiles are pretty similar. Mostly they have four wheels and are powered by an internal combustion engine. In fact, in most general respects, they are remarkably similar to the machines produced at the turn of the century. Does this mean that they have reached the pinnacle of design? Or maybe it just means that the concept has reached a fundamentally usable point.

    Now, let's take the automobile analogy to the next level. Most people can learn to drive a car in about an hour. Most cars will run for years with little to no service. Can the same be said of computers? Of course not. Does the user really care that billions of dollars of research have gone into making the car 10% more efficient? Not really.

    Are computers as usable by the general population? Hardly. I'd make a strong case that what is under the hood is pretty much irrelevant at this point if most people can't figure out how to take the machine out of the garage.

    And this statement applies equally well regardless of whether you prefer Windows or Linux or anything else.

    -p.

    (sorry for the ramble... too busy even to be writing this...)

  • Linux's supposed crowning achievement -- GNOME

    Alright, I take massive issue with that. I won't pick at the possessive here, I'll just assume you meant "open source" or "free software" 's greatest achievement.

    But in the case of open source, I would say that the original BSD is the crowning achievement. Berkeley pretty much built what we have today as the internet. It was original and forward thinking -- not a copy. What of the internet itself? It pretty much sprung from the same community.

    What about Apache, and PERL? Apache grew out of NCSA, and PERL (sort of) grew out of sed and awk, but it's not as if they were knock-offs of other products.

    When it comes to free software, I would have to say that gcc takes the crown. Writing a C compiler isn't the most original thing in the world, but gcc is among the most standards-compliant compilers in existence, and as a result we enjoy a new level of portability. It was a rude surprise for me when I went from Linux / Solaris with gcc to Solaris / IRIX with stock system compilers. Suddenly my ANSI C code just plain broke.

    Linux the kernel, nice as it is, is certainly derivative. The desktops are most certainly derivative (and I'd say KDE is more of an achievement than GNOME). Apps like GIMP, KWord, and Evolution are undeniably free software knock-offs of existing, commercial packages. But there are definitely innovative pieces of free software as well. They don't spring up very often, but they are glorious when they do. Innovation in any field is a precious quantity, please don't disregard the innovation that is there.

    Sorry, I just couldn't let this one slip by...

    --Lenny
  • About the only thing the forthcoming new Amiga [amiga.com] has in common with the old "classic" Amiga is the name. If you read up on the Amiga and Tao [tao-group.com], you'll see the new Amiga is quite innovative - a generalised virtual machine and operating system that dynamically recompiles for the target architecture, using optimising compilers running within the virtual machine (currently supported languages are C, C++, Java, and assembler for the virtual machine).
    Sort of Transmeta-Crusoe-backwards. I also suspect the Tao Virtual Processor machine code will turn out to be quite similar to Crusoe native code, but that's just a hunch.

    You may have seen the recent slashdot articles connected to Tao, such as the one about heterogeneous multiprocessing CPU cores on a single die, with each core at least semi-automagically executing the parts of a Tao/Amiga application that it is best suited for.
  • Is there even one core product on the Microsoft line that is an original Microsoft innovation? How about the flight simulator? Maybe they did that first.

    The Microsoft Flight Simulator was purchased from Bruce Artwick's Sublogic Corp.

    So I guess that just leaves Bob (tm). Unless somebody else knows why they can't even claim to have innovated *that* piece of carp.
    --
  • by samantha ( 68231 ) on Tuesday June 06, 2000 @07:54AM (#1022073) Homepage
    An object oriented operating system? Give me a break! MS is too enamored of VB. They just barely got a little bit of object-based (not object-oriented) stuff into VB6, and that is really only a cover for ActiveX/COM stuff. Not object oriented at all. InProc COM as an attempt at an OO system? Sort of, but not really. InProc grew out of DLL stuff which I was using back in 1985 or so when IBM/Microsoft were playing with OS/2. You can use it as part of an OO OS but it is certainly not the whole enchilada, nor will it make it happen by itself.

    Innovators? Like having VB6 run the IDE, debugger, and program in a single thread, which means any real error will blow the entire misbegotten thing out of the water? Don't make me laugh. MS has mainly been working out the possibilities of DLLs and ActiveX plus some UI work. That is not exactly major innovation. Some of their component stuff is only beginning to catch up with things that have been part of the CORBA world for some time now. They finally, for instance, came out with a publish-and-subscribe MOM. About freakin' time.

    Feedback? When you have to pay to even get a bug report filed and dealt with? Never mind actual feedback on features and design. Those go nowhere but, if you're very lucky, into Microsoft's list of things to maybe someday do with no credit to you and to charge you big bucks for. BAH.

  • When I went to RPI [rpi.edu] to begin my undergraduate education in 1985, I wanted very much to study Computer Science. After all, the only conclusion I could make as a high school kid from Bell Labs Country was that CS graduates were the ones who wrote the software that people used on their PCs.

    Wrong.

    It took me until mid-1987 to realize it, and until 1990 to be able to articulate it, but Computer Science was a terrible field to study if you were interested in writing software that people actually used. Of course, research that came out of these environments gave us UNIX and a lot of the fundamental networking technologies that are the basis of our world today. But these breakthroughs were the exceptions.

    After 10 successful (and profitable) years in the consulting business, I firmly believe that the universities themselves are not the problem -- it's the classical CS pedagogy. There are ways to learn skills at universities that will help you produce usable software. But the best field of study is not likely to be CS for many of the reasons that Rob Pike suggested.

    A lot of schools, including Rensselaer, are coming up with Information Technology curricula that blend the core science and engineering courses with business school courses, humanities courses, and fundamental programming and application architecture. You may laugh, but these IT curricula stand a much better chance of graduating productive software developers who can think outside the box than even the most reformed CS programs do.
    --

    Dave Aiello

  • by Ralph Wiggam ( 22354 ) on Tuesday June 06, 2000 @09:21AM (#1022075) Homepage
    Just because someone wrote a paper 13 years ago and people have spent the time since then implementing it in the real world and then going on to improve it, doesn't make it stagnant technology. In 1890, what if people said "Alexander Graham Bell wrote that paper on the telephone 13 years ago, it's stagnant technology"? Here it is, 125 years after those initial papers and a lab still bearing Bell's name is doing cutting edge R&D in the same field. The goal is supposed to be that you learn something from one idea, and use that knowledge to pose questions that the initial person could never dream of (like fiber optics and cell phones). People who claim that anything besides punch card storage is "stagnant" don't have a good enough imagination.

    -B
  • by Lion-O ( 81320 ) on Tuesday June 06, 2000 @07:55AM (#1022082)
    And the answer is... Epoc32. Although I must admit that the Internet connectivity isn't as clear as you mentioned here, overall the OS comes very close to your demands.

    He's right? Then why didn't Microsoft come up with this instead of another company? (Epoc was there before WinCE, another "surprise"; see an earlier post.)

  • Rob Pike is a bit in his own world, and that has always given him a bit of an edge in the research world, but let's face it: Linux is just more of what's gone before?! Since when did UNIX involve putting graphics acceleration in the kernel? Since when did UNIX involve httpd acceleration in the kernel? What's the best way to accomplish those two things? What's that I hear? You'd need to do research to figure that out...

    Let's try a few more examples:

    • XML-based run-time UI loading in GNOME
    • Truly distributed file sharing using Gnutella
    • Robust HTTP acceleration through caching and load-sharing (Squid)
    • Adding alpha layers and antialiasing to X (XFree86)

    If these aren't valid research projects for any given CS student/researcher/hacker then I've obviously been in a different industry for the last 12 years.
  • Pike's claim was that nothing was coming out of research departments. Are all those examples from research departments, or are they commercial efforts? He didn't say commercial entities weren't innovating, only that research entities weren't making ripples in the ocean.

    --
    Marc A. Lepage (aka SEGV)
  • Pike's self-styled polemic seems to hit a nerve, while missing entirely on a number of key points (for example, he at once argues that (i) systems development can only happen in industry because academia cannot sustain long-term development, and (ii) systems development cannot happen in startups and industry because the time-to-market horizon is too short for long-term work). He may well be right on both points, but this doesn't seem to prove the propositions for which the respective points are cited.

    In any case, I think he says a number of exciting and salutary things. But the KEY thing seems to have been buried in the comments so far, in his suggestions of things to build:


    Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it's been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux's interface isn't even as good as Windows!)


    Indeed, the world is not devoid of new GUI ideas; it's just that we are all trying so hard to be Microsoft-killers that we spend all our energies trying to do a better version of Windows.

    Jef Raskin's recent book, "The Humane Interface," introduces many new and old ideas that deserve a great deal of attention.

    I think building a novel (not just better) GUI is probably the one area with the greatest degrees of freedom for researchers. Unlike the OS, it is not burdened by requisite standards (although there are de facto ones), there is a desperate need to improve on the status quo, and there is some reason to believe that the status quo's problems may be inherent.

    Jef suggests one way to do this. There are certainly many other ways.

    Build a novel GUI. Give a great Demo. These are the best words I took away from Rob's writing.
  • Being facetious... So is EVERYTHING we do as a species, isn't it?

    The whole point of everything we do, from throwing rocks at rabbits to e-Commerce, is about survival and procreation.

    A journaling file system is still a file system - nothing new there. 64 bits is an extension to the 32 bit deal we're used to. Are you saying that if it's not revolutionary, it isn't an innovation? Please, just about everything we have now can be traced backwards, down an evolutionary path, in tiny little increments of development.

    I think Mr. Pike is/was a revolutionary when the world changed really rapidly. I think this comes across in his statement. (If you have a handy PS reader for NT, please send it my way, I'll be glad to read the article -- truly innovative of Mr. Pike to send out his treatise on the lack of innovation in an old, and non-ubiquitous format)

    I'm sure Ben Franklin would be devastated by the lack of innovation shown in our current government - well, that's a poor point, as what HE had was probably much better than what we have now.

    There still are flashes of brilliance in the industry, they are just few, and if they cannot show immediate profitability, they are squelched by their funding managers. It's a pity.

    Where was Pike when the Web caught fire? Where was he when PalmOS and the Palm devices exploded? Where is he as cable internet touches more homes each day? Networking has become so huge a function of the OS as to be considered a "systems level" function. Yeah, the bright guys of yesterday came up with the routing algorithms back then, but IPv6 is still something that's looming, and implementing it in the backbones is DEFINITELY system level work.

    There's plenty of innovation happening in the world - I think the issue is that Mr. Pike isn't the one doing it - Where's OS9 from outer space anyway?

    Pike's hidden point seems to be that UNIX (as a concept) hasn't changed much over the years. So what? So a few guys in the 70's hit the nail on the head.

    Is distributed.net not innovative? Is it not systems programming?
  • Flame me, but I agree. When people decry Microsoft's lack of innovation, they forget, although not really major accomplishments, Microsoft's innovations here and there. Take the browser for instance: Customizable toolbars...auto-completion...images with alt tags that appear in popups. These aren't really great things to begin with, but yet were quietly copied in almost every current browser.

    I think some of the unix/open source world is sometimes stuck in the rut of "We're right/better so we don't HAVE to innovate". Take traditional file systems and security mechanisms. These are really feeling the strain when scaled up to a global network with thousands of users and terabytes of data.

    To fend off some flames - open source DOES lead to great innovation in many areas. Mozilla, Gnome, KDE, and a slew of other projects obviously are doing new and very cool things. We have to be careful not to think that "just" because open source is better, that means we don't have to aggressively innovate. Unfortunately for most users, ungraced by Stallman's philosophical ideals, a proprietary product that fulfills or exceeds their requirements is better than a "pure" and free one which does not.
  • Am I the only one who felt this was just an official way for some unrecognized researcher to whine? I mean, most of the points he makes are just silly.

    Now INRS(I'm no research scientist), but I don't understand the basis for these complaints. Here's what I gathered from his arguments:

    'There's no reason to innovate because everything out there is just a copy of something older. While many of the things out there are improvements of older things, that doesn't really matter, they're still copies. We need new operating systems, not for the sake of innovating, but for the novelty sake. Because, well, you know... things used to be that way. Back in the good ol' days. And I think they should be that way now too. Basically, I just want to see other things so I'm not so bored. If I can stay interested in a new OS or something, maybe I won't have to leave and get a job.'

    Is it just me?

  • by ucblockhead ( 63650 ) on Tuesday June 06, 2000 @07:57AM (#1022099) Homepage Journal
    Amiga was innovative. A new Amiga would not be. That's the very nature of innovation.

    We worship innovation far too much. We need to ask ourselves: is it a worthwhile innovation?

    (Though Amiga was, IMHO.)
  • by cascadefx ( 174894 ) <morlockhq@@@gmail...com> on Tuesday June 06, 2000 @07:59AM (#1022103) Journal
    Asking single-line questions doesn't really expand anyone's understanding of any subject.

    To answer your question however:
    Though I am not a "purist", as I make my living as a benevolent parasite (i.e. tech support/analyst) for Microsoft products, I really do enjoy and have a lot of respect for Linux.

    Mr. Pike has some good points, but I don't think he is completely right in every respect. First off, stable operating systems are somewhat new (Linux/FreeBSD being among them). For all its "innovation" I'd like to see Windows 2000 still run after a hard drive crash (a friend just recently told me how his hard drive crashed but BSD kept going with the processes that were still in memory... my friends, THAT is innovation).

    I would also say that the idea of customizability/infinite choice in an OS is an innovation that even Microsoft is picking up. The fact that you can tweak almost everything about an OS is relatively new (thanks to the Open Source model).

    Sure, we have a long way to go, but I think the very open nature of Open Source/Free Software encourages even more research (especially in areas of application) instead of less. You have to consider that Linux (et al) have spent their time getting the right things done first (like stability and versatility) and are now beginning to branch out.

    Yes, GNOME/KDE may be copies of preexisting things, but the wonderful thing is that they have the ability to be extended in a myriad of ways by anyone (hello, Eazel). These projects are perfect examples of people accomplishing feats that were previously dismissed. Now, with the education and experience gained from overcoming the hurdles to get these things running, they can extend those capabilities as far as their imaginations can take them. I, personally, can't wait to see the new developments in store for KDE and GNOME. I have a feeling that they are going to prove Mr. Pike wrong.
  • >I'll give you something to ponder: is there a better way to implement virtual memory other than spilling over into hard drive? You have to be able to index them and swap them in and out--on things called pages.

    Yes there is. Spill them to someone else's RAM across a fast network. A lot of the time they aren't using all their RAM, and it's a LOT faster to do that than write it to disk.

    >Give me a faster general purpose index (or for that matter, OS-specific purpose) than a B-tree.

    Easy: skip lists
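    For anyone who hasn't met them: a skip list is just a sorted linked list with extra "express lane" pointers at randomly chosen heights, which gives expected O(log n) lookups with far less code than a B-tree (though a B-tree still wins when nodes live on disk). A minimal search sketch; the struct layout and names are mine, not from any particular implementation:

        /* Minimal skip-list search sketch.  Each node carries an array of
         * forward pointers; level i skips over roughly 2^i nodes on average,
         * so a search walks from the top level down.  Insertion and deletion
         * (omitted) just splice pointers at a random height per node.
         */
        #include <stddef.h>

        #define MAX_LEVEL 16

        struct skipnode {
            int key;
            struct skipnode *forward[MAX_LEVEL]; /* forward[i]: next node at level i */
        };

        struct skiplist {
            int level;                /* highest level currently in use */
            struct skipnode head;     /* sentinel node; its key is unused */
        };

        /* Return the node holding key, or NULL if it is absent. */
        struct skipnode *skiplist_find(struct skiplist *sl, int key)
        {
            struct skipnode *x = &sl->head;

            for (int i = sl->level - 1; i >= 0; i--) {
                /* advance along level i until the next key would overshoot */
                while (x->forward[i] != NULL && x->forward[i]->key < key)
                    x = x->forward[i];
            }
            x = x->forward[0];
            return (x != NULL && x->key == key) ? x : NULL;
        }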

    >Until there is a new and radical type of hardware (other than storage, input, and output) there is little need to change what an OS does or how it does it radically. Rather, as mentioned way above, the techniques used now are optimized for a particular piece of hardware and/or software.

    You don't need new hardware in fact.

    As the relative performances of the various components change you get sudden points of inflexion where you discover that doing something entirely different suddenly is worth doing. The Internet being a burgeoning example...

    Another example: for a short while OS manufacturers wrote compression routines into the file system. Suddenly the disks went faster (phoom, the compression was actually slower again), so they took it out...

  • While he does raise a few good points, I think the manner in which he does so is more grandstanding than anything else. It actually distracts from the main message. Reminds me of the time at a Usenix about 15 years ago that he delivered a paper while dressed in a harem girl outfit.

    Yes, in some ways Linux is playing "catch up" -- it has to, it didn't magically spring into existence fully formed. But in other ways (eg, look at some of the details of the kernel architecture) it is leading edge. At least with respect to real-world software, as opposed to academic exercises like Version 8 Unix or Plan 9.

    As for MS innovation -- well, they've boldly copied the UI from such as Mac and NeXT, and they've boldly gone where no sane software designer has gone before -- dissolved the barrier between application and OS making such wonderful features as ILOVEYOU.vbs and a whole plague of viruses and BSODs possible. If that's innovation, I'll take playing catch-up, thanks.
  • by Animats ( 122034 ) on Tuesday June 06, 2000 @09:35AM (#1022112) Homepage
    The triumph of vanilla hardware has resulted in a world of system software that uses only generic hardware features. This limits OS architecture. Some examples:
    • Hardware and OS support for software components is weak. It ought to be almost as fast to call an object in another address space as one in the same space. It's not. There's some hardware support for this in Pentium and above machines ("call gates") but nobody uses it. And nobody uses the protection ring hardware. So vast amounts of middleware end up in the kernel or in the app, all able to crash more than themselves. This was done better in Multics.
    • All the UNIX variants still use the fork/exec model, which is stupid. It was done that way because in the original UNIX, forking was done by swapping out, then duplicating the process entry, memory being so limited that in-memory forking might deadlock. Win32 at least has this right (see the sketch after this list).
    • Concurrency locking is a mess. It shouldn't take a system call just to lock something, and it didn't in pre-PC machines. Only Java has locking in the language, and their concurrency model isn't airtight. It's slow, too. Things were better when we had P and V.
    • The window damage/redraw model was created back when machines didn't have enough memory to keep copies of obscured windows, or enough speed to blast-copy a full screen in one retrace time. It overcomplicates the window system and bends the whole window architecture out of shape, but we're still using it. Why not just let each app draw its own window in its own memory, then ask the window system to blit it to the screen with appropriate clipping? For that matter, you could have a sprite-type MMU in the graphics system and do windowing in hardware. 3D hardware should belong to the CPU, not the graphics board.
    • Copying should be free. This requires hardware support, along these lines: The code executes MOVE src dst length which takes one instruction cycle and starts up a background copying operation in a copy unit. Reads of dst are remapped to reads to src until the copy completes. Writes to src or dst cause a pipeline stall, as does running out of copy units. This is similar to the machinery that synchronizes other execution units in a superscalar CPU. The effect is that in-memory copying no longer takes time. This is a big win for message-passing systems, CORBA, DCOM, RPC, etc.
    • OSs should handle memory protection, CPU switching, and message passing. Everything else should be protected-mode middleware or applications. With a little hardware support, this doesn't require any speed penalty. (Sun built the hardware for a message-passing OS into the later SPARCs, but that group ended up producing the Java JVM, instead of a new Sun OS.)
    • The robustness of databases should be in OSs. Tandem did this years ago, but their hardware cost too much. Time to look at that one again.
    • Paging was invented when CPUs ran around 1 MIPS, and paged out to storage turning at around 10,000 RPM. Today, CPUs run around 1000 MIPS, and page out to storage turning at around 10,000 RPM. Give it up. No more page faults. The best a paging system can do is double the effective memory size. Better to get more RAM and fix the memory leaks.

    All these areas need a rethink. Too many decisions made in the early days of UNIX are now obsolete.
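    To make the fork/exec point above concrete for anyone who hasn't bumped into it: the traditional route duplicates the entire process only to throw the copy away at exec time, while a single "spawn" call (Win32's CreateProcess, or the later POSIX posix_spawn) creates the child directly. A quick sketch of the two styles, with error handling mostly trimmed:

        /* Two ways to start "ls -l": the classic fork/exec pair versus a
         * single spawn call.  Illustrative sketch only.
         */
        #include <spawn.h>
        #include <stdio.h>
        #include <sys/wait.h>
        #include <unistd.h>

        extern char **environ;

        int main(void)
        {
            char *argv[] = { "ls", "-l", NULL };
            pid_t pid;
            int status;

            /* Style 1: duplicate the whole process, then immediately replace
             * the child's image; the duplication is pure overhead here. */
            pid = fork();
            if (pid == 0) {
                execvp(argv[0], argv);
                _exit(127);                 /* only reached if exec failed */
            }
            waitpid(pid, &status, 0);

            /* Style 2: create the child directly, CreateProcess-style. */
            if (posix_spawnp(&pid, argv[0], NULL, NULL, argv, environ) != 0) {
                perror("posix_spawnp");
                return 1;
            }
            waitpid(pid, &status, 0);
            return 0;
        }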

  • by BoLean ( 41374 ) on Tuesday June 06, 2000 @08:02AM (#1022119) Homepage

    Academics for profit: The day that academics got into bed with big industry, innovation died. If a concept isn't economically profitable, they aren't likely to fund the research. The other part of the problem is that mostly the same questions keep getting asked, over and over.

    Take the road less traveled: What we need is to ask new questions. Like: What if someone used a transfer protocol as a storage medium? Instead of: What's the best UI? One question has been beaten to death and the other may lead to new innovations. New questions like: What if data storage was perpetually dynamic? Would the information be siphoned off when accessed, or copied as it passes by? Are there any advantages/disadvantages to this?

    But like I said, if you were to pose that question in modern-day academia as a research project they would scoff at you and you would likely be off on your own. Getting funding for a project that won't likely be profitable for the college/university isn't too likely to float.

    Teacher dilution: Not to mention the quality of CS professors. I actually had a course in software design taught by a guy who had written exactly one full-blown application (say, >5000 lines of code as a criterion): a warehouse program he wrote while he was in school. He didn't have a clue.

    Student dilution: Worse yet, since CS/CE has become a top-ten major, the quality of people going into these programs has gone from those who were truly interested in computer science to those looking to make >40K upon graduation. Why would these guys hang around in a graduate program for four extra years when they could be making the big bucks?

    Before you pick up the torch on my post: yes, I understand there are the rare few who really try to do innovative work. Keep chuggin'. Places like Slashdot and SourceForge are our meeting places, where we can hopefully share ideas.


  • Hired most of the core guys who worked on DCE...

    --LP

  • In the old days, every type of system had its own operating system, often several. When I had a PDP-11, it could run RT-11, RSX-11M/M+, IAS, RSTS and a large number of obscure commercial and research operating systems. Operating systems, for the most part, were written in assembler or obscure systems programming languages.

    Later, when the VAX and BSD UNIX became popular, companies stopped writing new operating systems for new hardware. Instead of writing a new operating system, they ported UNIX to the new hardware. That was much cheaper and safer than designing and writing a new operating system. Universities still did operating system research, but little of it was commercialized. UNIX and UNIX clones were taking over the world.

    The IBM compatible PC is now the generic hardware platform. What operating systems does it support?

    • MS-DOS
    • OS/2
    • Windows 95/98
    • Windows NT/2000
    • Linux
    • FreeBSD/OpenBSD/NetBSD
    • UNIX (SCO, Solaris, etc.)
    • BeOS
    • QNX

    Notice the large number of UNIX and UNIX-clone operating systems. Even the proprietary operating systems have been influenced by UNIX; for example, the new features introduced in MS-DOS 2.0 show a strong UNIX influence. From the command line, they all look a lot like UNIX. They all support C and something close to POSIX.

  • A lot of "systems research" is now going on in metacomputing (see the interview with Greg Lindahl, or the Legion Project at U.Va, http://legion.virginia.edu, or Globus, http://www.globus.org). These groups are attempting to develop wide-area distributed computing systems but are purposely avoiding the low level systems issues since they cannot guarantee them to be consistent across Solaris/IRIX/AIX/Linux/etc. This is real systems research, if you want to borrow someone else's CPU power, potentially someone whom you've never met or spoken with, you need a very flexible system.

    Before you can comment on a lack of "systems" research, you have to define what a system is. Most of us would probably agree that there is little innovation in low-level, nitty-gritty device driver systems research. But if you bought a new machine and all it came with were device drivers, would you consider that a complete system?

  • by ozone ( 91697 ) on Tuesday June 06, 2000 @09:38AM (#1022129) Homepage

    I couldn't agree more. I've been writing Win32 for three years and I've yet to see a proper object. MFC does a poor and inefficient job of abstracting the morass of Win32 API functions in an OO way. The only truly OO desktop OS API of which I'm aware is BeOS.

    There are significant innovations - The DLL loading mechanism is pretty cool and fairly efficient but a nightmare to maintain. Win2K improves it but you still don't really know which lib is used where. COM under it all is a solid binary object solution. It's got a bad rap from the bad implementations (eg OLE) that have been built on it. There are some great ideas but all the great ideas in the world won't help if they aren't fully thought out. The MS research group has some cool stuff going on on their website but the only thing I've seen make it mainstream is MS Agent. And who cares about that?

    It bugs me too that MS tries to sell VB as the way to build Win32 apps. You can't debug it. You don't know what all the ActiveX controls are doing and why they might generate an exception. Since VB5 you can't even fire an event on a separate thread. Doesn't sound like much but it makes multithreading and object orientation in apps that use VB much more difficult than they should be.

    But the worst thing is that whether or not what comes out of MS research is great stuff, the decision to build it into (or leave it out of) the OS is subsequently forced down the throats of all of us who don't have the luxury of running something else. MS doesn't need anybody's assurance that it's a good idea. they have their own agenda and features are not included or excluded solely on the basis of improving the lot of the user.

  • by jetson123 ( 13128 ) on Tuesday June 06, 2000 @09:38AM (#1022130)
    Pike is right in as much that Linux doesn't contain radical innovation. But neither does Windows NT. Both are really just incremental enhancement on tried-and-true OS technology. But between the two, Linux need not fear comparison.

    Pike is also right that the community needs to explore more unorthodox ideas and do research for the longer term. He is also right that startups suck energy from research.

    But when it comes down to it, a lot of his complaints are colored by his particular environment. AT&T used to get monopoly profits, so they were wealthy and didn't have to give a damn about dealing with the business side of things. And they didn't. Well, welcome to the real world. Most researchers have had to figure out for decades how to balance real world demands with a research program. Pike should perhaps learn this if he wants to stay in this business.

    Pike also used to work in the early days of systems research when there were a lot of low-hanging fruit. Well, systems research is a lot harder now because most of the easy stuff has been done. These days, any progress requires a lot of complex interdisciplinary skills and work: algorithms, systems, statistics, HCI, etc. You're lucky if you get out one good paper per year these days.

    Yes, today's systems research is tedious, hard, long term, and not well funded. But there is more work to be done than ever, and a lot of really interesting and important work. And, you know what, it actually is happening. Maybe not where Pike is working, but at lots of other places.

    If there is one complaint I have, it is that the open source community isn't more innovative. A lot of open source projects are merely clones of existing systems, often of existing open source systems. Open source developers could actually focus more on the future.

  • Before we start complaining about Pike bashing Linux or being pro-Microsoft, listen to what he has to say. He is giving Linux a new outlook. Instead of trying to be the best new thing, make it the best DIFFERENT thing. Basically, we have to try to look at operating systems, desktop computing, mobile computing, and everything else in a different light. We need to focus on the things that Linux is capable of, not the things that it can already do.
  • Linux and Unix have improved--but in incremental ways that don't introduce many new concepts to the user. That's Rob's point.
    I take his point, but I still don't know that it matters. When someone finds a need that a computer can fill, a program and/or OS will appear (eventually) to fill that need. If that need can be filled by a hacked linux kernel, then there's no point in writing a new OS. If there's no demand, there's no innovation.
    What new needs have manifested themselves? We have Be for multimedia, and Palm OS for portability. I haven't used Be often enough to comment, but I love my palm (so to speak) for many reasons, not least the fact that I've a device in my pocket with a spreadsheet, a bunch of books and copies of Elite and Wolfenstein. And lots more besides. To me, PalmOS is innovative.

    "I'll just repeat Slashdot dogma, instead of considering that a creator of Unix might know something about systems innovation".
    I was a bit wary of commenting on the words of someone who obviously knows more than I, but I stand by my non-comment. I didn't (and don't) want to explain why MS isn't innovative precisely because it's dogma; I'd just be wasting my breath. Or my fingers. I can trot out my take on this, but it's been said here a thousand times already.
    Just because it's on /. and it's anti-MS doesn't mean it's wrong.

    but the fact remains that most Windows users have these things today, and most Unix users don't.
    Which goes back to my point about demand. Do you think there'd be a Real player for linux if it had as many users as Windows? Or any of those other features you mentioned?
    They'll all be available soon; no-one doubts that. And they won't be considered innovative, but they'll increase the usability of the OS no end. Which will mean that although it'll still be Linux, it'll be a better Linux that fills the needs of its users.
  • by Alarmist ( 180744 ) on Tuesday June 06, 2000 @06:57AM (#1022143) Homepage
    This is akin to saying that there's no point in research and development for automobiles, because everything today is basically a copy of Cugnot's gun carriage of the 1770s.

    Why would anyone in a position of any prominence make such a ludicrous statement? There is always new work to be done, even if it is merely refining an old idea. The GPS we use these days is really a refined solution to the same problems that the backstaff and sextant tackled; however, it's much harder to screw up finding your location with a GPS than it is if you use a sextant.

    Is this man a shill? Is he trying to say, "No, there's no need to go behind that door. Ignore the guys back there who are developing all sorts of new wonders that will be used later on to exploit you."

  • by Jonathan ( 5011 ) on Tuesday June 06, 2000 @06:57AM (#1022149) Homepage
    Whenever someone claims that a field is essentially complete, they are nearly always wrong. Consider what people believed in the year 1900: Physics was considered practically a completed science -- and that was right before the explosion of modern physics.
  • ...being a recent RPI Computer & Systems Engineering grad ('99), and watching the IT major go through birthing pains, I can tell you that nobody will become a productive software developer because of that program. Rather, they could become a productive (anything) despite that program. The program is evolving (I hope) past the state in which it started... The curriculum looked to be straight out of the management school, with almost *no* CS or Eng classes. Some, but not nearly enough. Data Structures should still be a required course for something along this line. I found the initial curriculum severely lacking in technical content. IT shouldn't be a management degree with CS1 and a Java course; it should be more of an applied CS/CSE track...

    "IT Revolution: Myth or Reality" - two hours twice a week?!?! Time better spent elsewhere. This could be two or three two-hour sessions per semester - more of a school colloquium. Not a four-credit course...

    All the people I knew who were switching into the major were people who decided that engineering or CS was too tough for them, and they'd rather do management, but this seemed like an easier way to get a job (I still haven't figured out undergraduate management programs, but that's another story).

    If you can't pass a basic circuit class or Computer Science 2 in your sophomore year (many freshmen take it their first semester), I don't want you in an IT curriculum. Ever. This shouldn't be management, it should be more of what you described. Unfortunately, it isn't.

    Some of the best programmers I've seen have been engineers who learned to program from necessity - through the same classes at the same time as the CS majors, but they seemed to get a lot more out of it. The problem of applying programming to practical use is better addressed in software engineering than it is in that IT major...

    Just my $.05
  • by johnnyb ( 4816 ) <jonathan@bartlettpublishing.com> on Tuesday June 06, 2000 @08:53AM (#1022167) Homepage
    If you read the interviews from the GNOME guys, they admit to copying what's good about other people's designs. They say that first we need to get a good free software base that is consistent with the current market, and then we can start to innovate from that. So the current GNOME efforts have been just that - copying from others to get a good free software system to innovate from.
  • The problem with a lot of CS programs is that they teach Computer Science. What does that mean? It means that they teach the big, complex problems of fundamental computation and not the practical details of coding in modern languages. The honest truth of it is that you need both. There seems to be no middle ground right now between superficial programs that teach how to write a GUI in a certain language and leave you high and dry in the area of general knowledge of principles of programming and the theories that underlie the field, and Big Science programs that work from the assumption that everyone is going to spend the rest of their life writing compilers, designing languages, and proving problems NP-complete.
    The program that I am in right now leans towards the latter, and there are some advantages to that. Yeah, they won't teach me Visual Basic, but frankly, there are all kinds of books out there to teach you Visual Basic. I learned C on my own from a book; I figure I can do it again with most other languages. True, I didn't really understand C until I started writing projects in it, but I think that is really where the primary value of language-specific classes comes in. It isn't important to learn the concepts of a particular language in order to know it; there are very few conceptual programming models. What makes you learn a language is using it, and playing with it. So frankly, I'd rather learn the general principles behind various things in the computer field (general ideas about computation and complexity, models and abstractions, network concepts, etc.) and then learn what I need to know in the world on the fly than be locked into taking classes that teach me a language and not the underpinnings of the field.
    The problem that a lot of theory programs have, however, is that they do a poor job of making the study of theory relevant to the lives of students, and they let themselves get a little too out of date in their obsession with 'big ideas'. A widely used (and poorly written) book on computational theory written by a dean at my school makes it seem that the only reason to study theories of intractability is that someday your boss might ask you to solve an NP-complete problem efficiently, and you want to be able to show that it won't work. Yeah, a lot of people I know lie awake at night worrying that their next web page design is going to be for a guy who wants a web page that solves the satisfiability problem. At the same time, my entire school has not a single class that acknowledges the existence of a graphical interface. Yes, opening and closing windows on a desktop is not a timeless model of computing, but it is the dominant paradigm for user interfaces at the moment, and has been for a while. There is one class on 'Human/Computer Interaction', but it deals with speech recognition and optical recognition, because those are more 'algorithmically interesting.'
    Bottom line is only this: we need neither more trade schools, nor more ivory towers. Producing a lot of 'programmers' who only know how to throw up a webpage with blinking mouseovers or how to plug values into Visual Basic 6.0 isn't going to advance the field very much. At the same time, the new pressures and problems are in the real world, the world of business and commerce. Despite the resurgence of fundamental number theory as a job skill in light of renewed interest in cryptography, more theoreticians who don't know what the hell #include means aren't going to give much of a boost either. If anything, though, I think that a lot of the imbalance right now lies in the direction of 'Programmer in a Box/Get Rich Quick' programs that emphasize bare competency in a single area or language as being all you need to be a successful programmer. If only for depth of character, I think a little more is called for. In the long term, no single language or program can stay current; only the knowledge of the concepts and theories underlying the field can, but only if they seem relevant enough to keep people in the classes.
  • I think people here have under-estimated the degree to which true innovation (rather than better packaging) has been slowed in the field of operating systems.

    But Pike does make a very important point about the current limitations on possible innovation: most of them are the result of the need to interoperate with existing standards. What is perhaps not very clear from his argument is that the constraint of existing standards is universal. And it does indeed limit innovation, at least at a given level of system complexity.

    But how universal a constraint is this, really? I think the most telling example is that of biology. Life has been around for something like 3 billion years, and we now know that there has been exceedingly little innovation in some of its aspects; the basic principles (and most of the exact codes) of encoding genes into DNA, of transcribing them using RNA, and expressing them to do the real work of the organism are stunningly conserved. Almost no innovation there. And I could go on up the various levels of complexity and make similar arguments. But I won't, except to point out that in biology, innovation has almost always emerged in the form of additional (internal) complexity.

    Getting back to the world of operating systems, then, Pike is possibly quite correct that the lowest layers of the operating system architecture are now essentially fossilized, permanent, and profoundly resistant to change. But the argument that this means that innovation now has no place is, I believe, profoundly wrong. Innovation will just have to occur at higher levels of abstraction.

    So take one example: the idea of users and user authentication. This is a very basic concept, and one which at its heart seems to be very resistant to change. (But, interestingly, one where Plan Nine was conspicuously innovative; hmm...) But I think it should be very clear that there has been and will be lots of innovation in this aspect of operating systems. The problem, of course, is that users are no longer welded to the console of the machine, nor even to a local network including a particular machine, nor even to the same kind of machine on the same kind of network...and yet people do have this notion that they should be able to "log on" from anywhere with strong security and access to whatever resources they had both "there" and "here", wherever "here" is. As many slashdotters know, this gets pretty hairy pretty quickly, and there are many possible solutions that are being worked on. (And of course, the problem is not a completely novel one; the analogy is a bit rough, but the immune system is an interesting solution to a similar problem.)

    In any case, there is lots of room for innovation for certain kinds of problems, and very little room for others. I think Pike really does know this, and was worrying more about how we can get funding and support to the people who are trying to solve the problems where there is still lots of room for innovation. Academia does not seem to be the answer, nor do most corporate cultures. But what is the answer?

  • by Mr. Protocol ( 73424 ) on Tuesday June 06, 2000 @09:59AM (#1022195)
    I'm an unabashed Old Phart, from the CS generation previous to Rob Pike's. I read his diatribe some days ago, before it was publicized here, and it did ring a few bells. I know why he feels the way he does. I don't think the situation is as dire, or as permanent, as he paints it.

    At the time Rob came to the Labs, and for some years before, the CS scene was being painted in pretty broad strokes. Minicomputers allowed the software boys to explore uncharted territory for cheap. Oh, they explored it before on mainframes, but it's hard to get a whole off-brand OS started when you need access to $3 million worth of hardware to do it. But even before then, there were lots of things out there. Most operating systems were not only crap, they were deliberately obfuscatory crap, as Ted Nelson was fond of pointing out at the time. Reacting against that we had ITS, TENEX and MULTICS on the mainframes, and on the new minis, UNIX. On more experimental hardware we had things like CM*, C.MMP, and the Culler Fried Chicken system in Santa Barbara.

    The only one that survived is UNIX. It took the combined efforts of all of us just to keep ONE alternate OS alive.

    Today, almost all our efforts are, as Rob points out, dedicated to extending what's already there. Yet the actual effort expended overall is huge. The scene is a LOT larger than it was. /. reported a week or three ago on a whole new OS that I didn't even know existed. Then there's BeOS, and other older ones like Sprite. F'r'eaven's sake, you can buy BeOS in stores now. Do you know how long it took for just a book on UNIX to come out? And it was crap, at that.

    Still, the main effort of systems research today could be characterized as "embrace and politicize." I think that's normal. I've remarked in other forums that science in general is in that sort of rut now. Most of the really big inventions of the 20th century, apart from the Internet, occurred in the first 50 years of the century.

    Consider what we have to look forward to, what I think Rob wants to see:

    • We should not be inventing protocols. Our applications should negotiate them as needed.

    • TCP/IP is a point solution, good only across a certain range of bandwidths and error rates.

    • Devices should exist in a hierarchy of protocols. The protocols spoken by backbone routers should be profoundly different from the protocols spoken by doorbells, cars, and home computers.

    • Devices should not be given operating systems. They should discover and run one appropriate to their task, on their own.

    • No device should have to both run a GUI and compute (Plan 9 demonstrated the feasibility of this view).

    Broad strokes, people. Broad strokes.
  • by rde ( 17364 )
    I've got a few problems with this piece...

    1. He states that hardware is going on and on while software is stagnant, citing the fact that (amongst others) Netscape and X and Unix are still being used ten years later. But is Unix (or Linux) what it was ten years ago? Is Netscape? The fact that it's got the same name doesn't mean it's the same product.
    2. "Where is the innovation? Microsoft, mostly". 'nuff said.

    Basically, his argument seems to be that if we don't completely change our software every few years, we're being stagnant, which more or less flies in the face of the 'if it ain't broke' school of thought.
    Ten years ago, there was no instant messaging and no Napster. I use neither, but I'm impressed with the implementation of both, and with such things as Quake 3. I'd consider them innovative and elegant.

    I agree with the gist of his argument, but I reckon he's choosing bad examples that do nothing for his point.
  • by daVinci1980 ( 73174 ) on Tuesday June 06, 2000 @07:02AM (#1022202) Homepage
    At university, software researchers aren't particularly interested in the pettiness that occupies Microsoft. They don't care about writing usable GUIs, and they don't care about how some API behaves at the top level.

    At my university, our software researchers are interested in software problems such as the dining philosophers problem (threads | deadlock), the lying generals problem (TCP/IP | data security), and also software solutions to the travelling salesman problem (component design).
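    (As a concrete aside on the first of those problems -- a minimal sketch of my own, not anything from the parent post -- here is the textbook way the dining philosophers deadlock is avoided in C with pthreads: give the forks a global order and always pick up the lower-numbered one first, so a cycle of waiters can never form.)

        /* Dining philosophers without deadlock, via ordered lock acquisition.
           A toy sketch: N philosophers, N forks; each philosopher always locks
           the lower-numbered of its two forks first. */
        #include <pthread.h>
        #include <stdio.h>

        #define N 5
        #define MEALS 3

        static pthread_mutex_t fork_lock[N];

        static void *philosopher(void *arg) {
            long id = (long)arg;
            int left = (int)id, right = (int)((id + 1) % N);
            int first = left < right ? left : right;   /* lower-numbered fork first */
            int second = left < right ? right : left;
            for (int meal = 0; meal < MEALS; meal++) {
                pthread_mutex_lock(&fork_lock[first]);
                pthread_mutex_lock(&fork_lock[second]);
                printf("philosopher %ld eats (forks %d and %d)\n", id, first, second);
                pthread_mutex_unlock(&fork_lock[second]);
                pthread_mutex_unlock(&fork_lock[first]);
            }
            return NULL;
        }

        int main(void) {
            pthread_t t[N];
            for (int i = 0; i < N; i++) pthread_mutex_init(&fork_lock[i], NULL);
            for (long i = 0; i < N; i++) pthread_create(&t[i], NULL, philosopher, (void *)i);
            for (int i = 0; i < N; i++) pthread_join(t[i], NULL);
            return 0;
        }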

    None of them claims to be able to write a better OS than Microsoft; why would they care? And the problems that they're working on, Microsoft has no desire to solve.

    To say that software research is unnecessary is a very bigoted, egocentric view. To say that software research is irrelevant insofar as high-level OS design is concerned would be much more appropriate.
    --
    "A mind is a horrible thing to waste. But a mime...
    It feels wonderful wasting those fsckers."
  • Innovation died when it smacked into commercialism at 186,000 miles per second on the 'net. The autopsy revealed that it died after taking a massive overdose of patents and copyright law at the same time. Doctors report that he double-twitched to death on his bed after claiming the single-twitch death was patented by PreferredOne.

    Bad attempt at humor aside, the reason innovation died is that we can't! I can't link to another site without being sued, I can't talk bad about a product without being sued, I can't be controversial, can't talk in politically incorrect ways, can't have opinions, and can't hack together something obvious because Some Other Guy(tm)(r)(c) already patented it and I don't have the money to fight!

    Innovation is hampered every time there is a delay while patent searches are done, every time an idea is scrapped because someone else happened to patent the obvious, every time a meeting is held, and every time a press release is made; we are slowing down the rate of progress in this country. If we continue at this rate, we may well wind up reaching a social and legal impasse where no progress can be made because there are too many cooks in the kitchen.

    That's just my $0.02.

  • Rob Pike is right. Systems research is waning, both in the amount of research done and in its influence. New operating systems are hard to find (there are lots of experimental kernels, but not too many complete systems).

    Umm... check me if I'm wrong here, but PalmOS seems like a good candidate for most of the things you've described! Yeah, one could possibly make the argument that "it's not a *real* OS" but it's nonetheless clear that they put a lot of thought (read: research) into it!

    You turn on the computer and it works instantly.
    Yup.

    Whenever you feel like stopping you turn off the computer.
    Yup, this one too.

    You never have to save your files, you always work with up-to-date information
    Hmm, not sure (I don't actually own a Palm yet) but I think this is true.

    You never have to drag or resize a window
    Okay, PalmOS kinda breaks here with the whole multiple-window concept.

    My mom can store her weaving project ideas on it without any help
    I don't think anyone will argue that there is a lot of room for ease-of-use research!

    You never have to remember obscure names, everything is built out of a small set of simple blocks
    Have you seen "The Brain [thebrain.com]"? It's certainly not an ideal solution - but it's definitely some damn impressive research in this direction.

    You never think about "connecting to the internet", you just work with data that happens to be located somewhere else
    Now don't get me wrong - I am certainly no supporter of M$ - but I give credit where it's due. I won't claim they did a good job of it, but isn't that what Billy Boy was claiming they were trying to do when they pushed for IE integration?

    I think that there is some damn good research being done - even some of it by people we "may not like" such as MicroSloth. As an example, their new optical mouse is damn impressive - not dreadfully useful *yet*, but wait until someone figures out that this technology could be used to build an industrially ruggedized version!

    --
    Make Money on the 'Net [geocities.com]

  • Systems Software Research is Irrelevant

    Rob Pike
    Bell Labs
    Lucent Technologies
    rob@plan9.bell-labs.com
    Feb 21, 2000

    1 A Polemic This talk is a polemic that distills the pessimistic side of my feelings about systems research these days. I won't talk much about the optimistic side, since lots of others can do that for me; everyone's excited about the computer industry. I may therefore present a picture somewhat darker than reality. However, I think the situation is genuinely bad and requires action.

    2 Thesis Systems software research has become a sideline to the excitement in the computing industry. When did you last see an exciting non-commercial demo? Ironically, at a time when computing is almost the definition of innovation, research in both software and hardware at universities and much of industry is becoming insular, ossified, and irrelevant. There are many reasons, some avoidable, some endemic. There may be ways to improve the situation, but they will require a community-wide effort.

    3 Definitions Systems Software Research Is Irrelevant

    4 A Field in Decline [Chart: number of new operating systems presented at SOSP per year, 1979-1999] "Who needs new operating systems, anyway?" you ask. Maybe no one, but then that supports my thesis. "But now there are lots of papers in file systems, performance, security, web caching, etc.," you say. Yes, but is anyone outside the research field paying attention?

    5 Systems Research's Contribution to the Boom A high-end workstation, 1990 vs. 2000:
    Hardware: 33 MHz MIPS R3000, 32 megabytes of RAM, 10 Mbps Ethernet (1990) vs. 600 MHz Alpha or Pentium III, 512 megabytes of RAM, 100 Mbps Ethernet (2000).
    Software: Unix, X Windows, Emacs, TCP/IP (1990) vs. Unix, X Windows, Emacs, TCP/IP, Netscape (2000).
    Language: C, C++ (1990) vs. C, C++, Java, a little Perl (2000).
    Hardware has changed dramatically; software is stagnant.

    6 Where is the Innovation? Microsoft, mostly. Exercise: Compare 1990 Microsoft software with 2000. If you claim that's not innovation, but copying, I reply that Java is to C++ as Windows is to the Macintosh: an industrial response to an interesting but technically flawed piece of systems software. If systems research was relevant, we'd see new operating systems and new languages making inroads into the industry, the way we did in the '70s and '80s. Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software.

    7 Linux Innovation? New? No, it's just another copy of the same old stuff. OLD stuff. Compare program development on Linux with Microsoft Visual Studio or one of the IBM Java/web toolkits. Linux's success may indeed be the single strongest argument for my thesis: The excitement generated by a clone of a decades-old operating system demonstrates the void that the systems software research community has failed to fill. Besides, Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering).

    8 What is Systems Research these days? Web caches, web servers, file systems, network packet delays, all that stuff. Performance, peripherals, and applications, but not kernels or even user-level applications. Mostly, though, it's just a lot of measurement; a misinterpretation and misapplication of the scientific method. Too much phenomenology: invention has been replaced by observation. Today we see papers comparing interrupt latency on Linux vs. Windows. They may be interesting, they may even be relevant, but they aren't research. In a misguided attempt to seem scientific, there's too much measurement: performance minutiae and bad charts. By contrast, a new language or OS can make the machine feel different, give excitement, novelty. But today that's done by a cool web site or a higher CPU clock rate or some cute little device that should be a computer but isn't. The art is gone. But art is not science, and that's part of the point. Systems research cannot be just science; there must be engineering, design, and art.

    9 What Happened? A lot of things: the PC, Microsoft, the Web, standards, orthodoxy, change of scale, Unix, Linux, startups, Grandma.

    10 PC Hardware became cheap, and cheap hardware became good. Eventually, if it didn't run on a PC, it didn't matter, because the average, mean, median, and mode computer was a PC. Even into the 1980s, much systems work revolved around new architectures (RISC, iAPX/432, Lisp Machines). No more. A major source of interesting problems and, perhaps, interesting solutions is gone. Much systems work also revolved around making stuff work across architectures: portability. But when hardware's all the same, it's a non-issue. Plan 9 may be the most portable operating system in the world. We're about to do a new release, for the PC only. (For old time's sake, we'll include source for other architectures, but expect almost no one will use it.) And that's just the PC as hardware; as software, it's the same sort of story.

    11 Microsoft Enough has been said about this topic. (Although people will continue to say lots more.) Microsoft is an easy target, but it's a scapegoat, not the real source of difficulty. Details to follow.

    12 Web The web happened in the early 1990s and it surprised the computer science community as much as the commercial one. It then came to dominate much of the discussion, but not to much effect. Business controls it. (The web came from physicists and prospered in industry.) Bruce Lindsay of IBM: HDLC to HTTP/HTML; 3270s have been replaced by web browsers. (Compare with VisiCalc and the PC.) Research has contributed little, despite a huge flow of papers on caches, proxies, server architectures, etc.

    13 Standards To be a viable computer system, one must honor a huge list of large, and often changing, standards: TCP/IP, HTTP, HTML, XML, CORBA, Unicode, POSIX, NFS, SMB, MIME, POP, IMAP, X, ... A huge amount of work, but if you don't honor the standards you're marginalized. Estimate that 90-95% of the work in Plan 9 was directly or indirectly to honor externally imposed standards. At another level, instruction architectures, buses, etc. have the same influence. With so much externally imposed structure, there's little slop left for novelty. Plus, commercial companies that `own' standards, e.g. Microsoft, Cisco, deliberately make standards hard to comply with, to frustrate competition. Academia is a casualty.

    14 Orthodoxy Today's graduating PhDs use Unix, X, Emacs, and TeX. That's their world. It's often the only computing world they've ever used for technical work. Twenty years ago, a student would have been exposed to a wide variety of operating systems, all with good and bad points. New employees in our lab now bring their world with them, or expect it to be there when they arrive. That's reasonable, but there was a time when joining a new lab was a chance to explore new ways of working. Narrowness of experience leads to narrowness of imagination. The situation with languages is a little better (many curricula include exposure to functional languages, etc.), but there is also a language orthodoxy: C++ and Java. In science, we reserve our highest honors for those who prove we were wrong. But in computer science...

    15 Change of scale With so many external constraints, and so many things already done, much of the interesting work requires effort on a large scale. Many person-years are required to write a modern, realistic system. That is beyond the scope of most university departments. Also, the time scale is long: from design to final version can be five years. Again, that's beyond the scope of most grad students. This means that industry tends to do the big, defining projects (operating systems, infrastructure, etc.) and small research groups must find smaller things to work on. Three trends result: 1. Don't build, measure. (Phenomenology, not new things.) 2. Don't go for breadth, go for depth. (Microspecialization, not systems work.) 3. Take an existing thing and tweak it. I believe this is the main explanation of the SOSP curve.

    16 Unix New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture (and some do), the first thing to build is the Unix emulation layer. How can operating systems research be relevant when the resulting operating systems are all indistinguishable? There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn't believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding). A victim of its own success: portability led to ubiquity. That meant architecture didn't matter, so now there's only one. Linux is the hot new thing... but it's just another Unix.

    17 Linux: the Academic Microsoft Windows The holy trinity: Linux, gcc, and Netscape. Of course, it's just another orthodoxy. These have become icons not because of what they are, but because of what they are not: Microsoft. But technically, they're not that hot. And Microsoft has been working hard, and I claim that on many (not all) dimensions, their corresponding products are superior technically. And they continue to improve. Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence. Besides, systems research is doing little to advance the trinity.

    18 Startups Startups are the dominant competition for academia for ideas, funds, personnel, and students. (Others are Microsoft, big corporations, legions of free hackers, and the IETF.) In response, government-funded and especially corporate research is directed at very fast `return on investment'. This distorts the priorities: Research is bent towards what can make big money (an IPO) in a year. The horizon is too short for long-term work. (There go infrastructure and the problems of scale.) Funding sources (government, industry) respond to the same pressures, so there is a vicious circle. The metric of merit is wrong. Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur.

    19 Grandma Grandma's on line. This means that the industry is designing systems and services for ordinary people. The focus is on applications and devices, not on infrastructure and architecture, the domain of systems research. The cause is largely marketing, the result a proliferation of incompatible devices. You can't make money on software, only hardware, so design a niche gimmick, not a Big New Idea. Programmability (once the Big Idea in computing) has fallen by the wayside. Again, systems research loses out.

    20 Things to Do Startups are too focused on short time scales and practical results to try new things. Big corporations are too focused on existing priorities to try new things. Startups suck energy from research. But gold rushes leave ghost towns; be prepared to move in. Fiona's story: "Why do you use Plan 9?" Go back to thinking about and building systems. Narrowness is irrelevant; breadth is relevant: it's the essence of system. Work on how systems behave and work, not just how they compare. Concentrate on interfaces and architecture, not just engineering. Be courageous. Try different things; experiment. Try to give a cool demo. Funding bodies: fund more courageously, particularly long-term projects. Universities, in turn, should explore ways to let students contribute to long-term projects. Measure success by ideas, not just papers and money. Make the industry want your work.

    21 Things to Build There are lots of valid, useful, interesting things to do. I offer a small sample as evidence. If the field is moribund, it's not from a lack of possibilities. Only one GUI has ever been seriously tried, and its best ideas date from the 1970s. (In some ways, it's been getting worse; today the screen is covered with confusing little pictures.) Surely there are other possibilities. (Linux's interface isn't even as good as Windows!) There has been much talk about component architectures but only one true success: Unix pipes. It should be possible to build interactive and distributed applications from piece parts. The future is distributed computation, but the language community has done very little to address that possibility. The Web has dominated how systems present and use information: the model is forced interaction; the user must go get it. Let's go back to having the data come to the user instead. System administration remains a deeply difficult problem. Unglamorous, sure, but there's plenty of room to make a huge, even commercial, contribution.

    22 Conclusions The world has decided how it wants computers to be. The systems software research community influenced that decision somewhat, but very little, and now it is shut out of the discussion. It has reached the point where I doubt that a brilliant systems project would even be funded, and if funded, it wouldn't find the bodies to do the work. The odds of success were always low; now they're essentially zero. The community (universities, students, industry, funding bodies) must change its priorities. The community must accept and explore unorthodox ideas. The community must separate research from market capitalization.

  • ...in fact, one could argue that everything we're doing with computers today is old hat and has been since at least the '80s, depending on how far you want to nitpick. What innovation in the computer industry means to me is the process of making systems faster, more powerful, more reliable, more efficient, and easier to use. We obviously haven't stretched any of these characteristics to the limit yet in any operating system or in hardware, so that alone indicates to me that systems research is not dead.

    People seem to think that we need some radical new paradigm in the way we're doing things in order to indicate progress; Microsoft is all too eager to jump in with a spiffy new standard and a handful of TLAs to placate this crowd and keep us all on the frequent-upgrade track. This [symantec.com] is not good innovation (and while I'm talking about Microsoft, this [dgl.com] isn't either). Sometimes different [muppetlabs.com] and more complex [microsoft.com] doesn't beat tried-and-true [apache.org]. Can't innovation be combining yesterday's solutions with today's needs to make a new product? Why are people so willing to attach the label of innovation on things that are new but not better?

    I work in a Microsoft NT / IBM AS/400 / Linux environment. The AS/400 feels archaic, but does what we need it to. Linux feels archaic, but does what we need it to. Microsoft NT looks good.

    ---

  • OS/2 provides a real object-oriented API

    Calling OS/2 OO is like calling Linux OO. The WPS is OO, not OS/2. If you do something without going via the WPS, you're screwed; all your OO stuff is ratshit.

    OS/2 also doesn't have the DOS background that Windows98 does

    OS/2 IS DOS - a 32-bit DOS. That's what it was designed to be. The pretty interface didn't show up until after they realised they were dead in the water against Microsoft.

    more stable than Windows NT or OS/2, and easier to use than anything

    What a load of shit. It's not released yet. It's not being used in the Real World{tm}. How could you or anyone possibly make such a comparison?

    BTW, IMO, FWIW: 'ease of use' is inversely proportional to the user's IQ.
  • I take exception to the notion that Microsoft has led innovation in the area of systems research. Mr. Pike points to the MS-led innovations of 1990 through 2000.

    Okay, fine. I'll accept that MS's technology grew by leaps and bounds from 1990 to 1996, but in the words of Janet Jackson, "What have you done for me lately?".

    Windows98 certainly was no great improvement over Windows95, and Windows 2000 (thus far) appears to have only succeeded in compounding the rat's nest nature of NT security and network administration.

    On a side note, Microsoft keeps screaming "freedom to innovate", but they haven't had any real innovation since 1996.

    If I worked the way Microsoft works, my career would have peaked at age 23 (ready to retire anyway), and I would have spent the following years just "riding the wave" to a watery grave.

    Or is it just me that can't stand to see people, particularly high-profile people, taken in by M$'s propaganda?

    /Security and convenience are mutually exclusive./

  • Microsoft actually gets kind of a bad rap when it comes to innovation, and they don't really get pegged for where they really do screw up.

    I would say that Microsoft's goals for their software are very ambitious. Backward compatibility is given a lot of priority, which is what leads to the instability of Win/98. However, the miracle of Win/98 is that it works at all, considering the huge amount of hardware variability that it needs to work with.

    Give Microsoft credit for attempting to create an object-oriented operating system while trying to maintain compatibility with the past. If you look at the internals, there is a considerable amount of power in their object methods.

    Now the bad part: Microsoft is very ambitious in what they are trying to do. The trouble is that they have been poor at actually pulling it off. The unreliability of Windows is directly tied to the over-complexity of what they are trying to pull off. I think if Microsoft actually slowed down a little bit and put a little more thought into what they're doing, instead of just "growing" the software and ending up with a rat's nest, they would be a lot better off.

    On the other hand, an argument can be made that Microsoft has succeeded because they don't dry-dock their software forever. They get it out into the world ASAP so they can get the maximum amount of feedback (somewhat similar to "release early, release often"). Granted, releases have often been late (a la Linux, I might point out), but they get into the hands of beta testers very quickly. And no one listens to their customer base like Microsoft.

    I detest working with Microsoft's APIs because there is clearly so little engineering thought put into them. However, it's undeniable that there is a lot of energy and enthusiasm put into them, if you know what I mean. Microsoft is a very ambitious company when it comes to trying to include advanced technology in their products, with a friendly face. Far more than Apple, I might point out, who has taken 16 years to give us preemptive multitasking (technically, they still haven't, of course).


    --

  • I have a one word rebuttal for the respectable Mr. Pike, a word used in only one post [slashdot.org] so far, and in an offhand reference at that:

    • Transmeta

  • MS isn't much on actual innovation. They are good at recognizing companies that have innovative ideas. Take, for example, Windows. One of the biggest reasons Win95 was accepted was the "innovative" TCP/IP features: all that Dial-Up Networking jazz. Problem is, most of it was bought from Shiva Corp. and rebranded as a native MS product.

    You don't have to be smart to make smart products. You just have to recognize where your products suck, and be willing to throw some cash at the problem.
  • by riggwelter ( 84180 ) on Tuesday June 06, 2000 @07:08AM (#1022260) Homepage Journal
    ...merely sleeping.

    For innovation to occur ideas must be had. It may well be the case that a lot of recent innovations have happened in the commercial sector. That is simply because the people who had the ideas were working in that sector.


    There is nothing to stop people in the academic sector from having ideas. In fact, if anything, ideas in the university setting are more likely to be approved and pursued, as commercial value does not necessarily have to be proven.

    --

  • There's one piece of logic I have to disagree with very strongly. Comparing Microsoft in 2000 to Microsoft in 1990 is completely bogus. Try comparing Windows, in either 2000 or 1990, to Unix in 1990. Microsoft managed to "innovate" from a 1950s-era OS, theory-wise, up into the mid-'80s. Try again.

    Mr. Pike also doesn't seem to like backward compatibility, whether through protocols or through API interfaces. Maybe it'd be nice to throw everything out and start from scratch, but there is a huge cost to this as well. One could argue that backward compatibility is the difference between a theoretical success and a commercial one. This isn't to say that theoretical successes aren't important, just that they don't automatically turn into commercial successes.

    I do agree that OS research is pretty much dead, not because it's been killed by the horrid evil commercial companies, but because the problems have been _solved_. It's not the only such field, either: database design, numerical analysis, parsing and lexical analysis, and fundamental algorithms are also pretty much solved problems. Hey, we haven't seen a new sorting algorithm in a couple of decades; I wonder where all the innovation has gone? CPUs have gone from 1 MHz to 1 GHz, so why haven't sorting algorithms gone from O(n * log(n)) to O(n), or even O(log(n))?
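    (That last rhetorical question has a real answer, by the way: for comparison-based sorting the wall is provable, not a failure of imagination. Any comparison sort must distinguish all n! input orderings, which gives the standard lower bound, sketched here:)

        \[ T(n) \;\ge\; \log_2(n!) \;\ge\; \log_2\!\big((n/2)^{n/2}\big) \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2} \;=\; \Omega(n \log n). \]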

    Math disciplines die too. The (IIRC) fifth international conference on information theory back in the '50s (you remember, Claude Shannon and that gang) was canceled because no one really had anything new to say. The problem got solved; move on to a new problem.

    We're seeing this in OS theory as well. Remember the classic Tanenbaum vs. Torvalds debate? What Tanenbaum missed was that, although theoretically better than monolithic kernels, in practice microkernels didn't have any significant advantage (or, rather, had disadvantages to match their advantages, and the successful OSs would be "middle of the road" OSs with some features of both). Microkernels were the last gasp of the "Real Man" OS theorists. It fizzled.

    Rather than viewing this as a failure of research, I'd consider this a success. Congratulations, guys, you've solved the problem. Now go on and start trying to solve another one.
  • Point to ponder.

    "Everyone takes the limits of his own vision for the limits of the world." - Arthur Schopenhauer

  • I'm not resisting "non-UNIX" things for the sake of them not being UNIX. Your conclusion that I am promoting "doctrinaire orthodoxy" comes from only half-reading the signature.

    The point of the signature is that if you don't know the "orthodox," you are unlikely to do anything better than to merely replicate many of its features, badly.

    It is also pretty fair to say that:

    Any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp. -- Philip Greenspun

    I think there should be a richer set of systems software research going on; people should be trying out EROS. [eros-os.org] People should be trying out And [zendo.com] Hurd. [hex.net]

    There should be work going on to provide OS environments supporting:

    • Persistent data structures
    • Garbage collected operating environments
    • Filesystems with semantics going beyond the UNIX "bags of bytes."
    • Security management using capabilities (a toy sketch follows just below)
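    (To make that last bullet concrete: a toy sketch in C, my own and not any real OS interface, of the capability idea. A holder can only exercise the rights baked into the handle it was given; there is no ambient authority to fall back on.)

        /* Toy capability check: the "capability" pairs an object with rights,
           and an operation succeeds only if the handle carries that right. */
        #include <stdio.h>

        enum { CAP_READ = 1, CAP_WRITE = 2 };

        typedef struct { int object_id; unsigned rights; } capability;  /* hypothetical type */

        static int cap_write(capability c, const char *data) {
            if (!(c.rights & CAP_WRITE)) {
                printf("denied: capability for object %d lacks the write right\n", c.object_id);
                return -1;
            }
            printf("wrote \"%s\" to object %d\n", data, c.object_id);
            return 0;
        }

        int main(void) {
            capability read_only  = { 42, CAP_READ };              /* as handed out by a parent or kernel */
            capability read_write = { 42, CAP_READ | CAP_WRITE };
            cap_write(read_only, "hello");    /* refused: no ambient authority to fall back on */
            cap_write(read_write, "hello");   /* allowed */
            return 0;
        }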

    Some of them will surely fail, and that's OK. Some of them may succeed, and that's a good thing.

    One of the steps that lets people actually learn from getting into system design is to understand the systems that have come before.

    In that light, it is not doctrinaire orthodoxy to consider that...

  • by jabber ( 13196 ) on Tuesday June 06, 2000 @07:09AM (#1022264) Homepage
    BeOS

    You can't tell me that BeOS is not INNOVATIVE on the SYSTEMS RESEARCH level.

    Want another word? How's Crusoe? The Crusoe *systems software*, by what it does, is probably the most innovative thing out there right now.

    Innovation, let's not forget, is the application of existing inventions to solve new problems. Innovation is not invention.

    The wheel hasn't changed much in the last several thousand years either.
  • ...is trolling! C'mon, people, you've been on /. long enough, you should know a good troll when you see one!

  • Ah, if only I could moderate... but failing that I'll make the point Ted V makes in a different form and hope that gets moderated up. Hmm... that's about as good a lead-in as I could hope for.

    Ted says, "Science is a punctuated string of 'eureka!'s, spaced apart by periods of dull silence." Well, almost. Scientific discovery is a strange and interesting system. Major breakthroughs might come from seemingly nowhere. There might be a definite chain of events. Just as common, though, is for a breakthrough to be made but not recognized or appropriately utilized for some time. Generally the person making the "re-discovery" is credited with the creation of that thing. (This is the reason my opening paragraph was a good lead-in.)

    Quite often, and this seems to be especially true in the computer science arena, a discovery or strategy will be made and/or used by a small number of people (as few as one) and then inexplicably "incubate" for years before the combined "eureka" of a more significant group of people. Also, quite often, a set of seemingly unrelated discoveries are made that are only later combined to produce something truly interesting. (Transmeta's Crusoe, for all the pseudo-hype being thrown around here, is really just an example of this.)

    So, I ask you, where along this chain does "innovation" happen? I choose to believe that innovation (literally -- according to Merriam-Webster -- the introduction of something new or a new idea, method, or device) includes not only the creation of completely original ideas and methods but also the creation of new combinations of existing tools and/or components. I think it further includes the realization that an old idea has interesting implications and uses that weren't explored fully before.

    Using my definition/connotation of innovation, Pike is right that Microsoft has been very innovative. They have recognized good ideas and combined them with others to create new things (intentionally many times, and many times not). Others are also right that the companies and individuals who originated the "components" Microsoft combines are being innovative.

    As a conclusion, I ask the following: If innovations are often not recognized as such when they originally happen, and if much discovery takes place as the result of combining (or re-combining) seemingly insignificant, pre-existing things, can anyone claim to know whether innovation in any arena is really dead as opposed to just lurking?

  • But what did MS do with all that talent they hired? Cutler gave them NT but I don't think the infamous picture of him with the bottle of booze on his desk or his reputation for a terrible temper were exactly signs he was well utilized or happy. But maybe that is just his style.

    What did they do with the Mach stuff, including the built-in distributed object and object messaging work? Nothing, really. Only recently has DCOM (as COM+) managed to even address fundamental aspects of this, like distributed dictionary problems.

    TP expertise? Where the heck did that go at MS? Into MTS? Don't make me laugh.

    Compiler experts? Well, counting method completion and on-the-fly compilation snippets, how much innovation has been done there by MS? The C++ compiler is reasonable, but it is a series of incremental fix-ups rather than any real innovation that I can see. The link editor is straight out of the early '80s. The include-file handling is similarly primitive, and no, their precompiled headers if anything made things worse. Where are reasonable innovations like source code analyzers that produce multiple viewable and browsable takes on the structure of a system? The meager stuff that you get only once everything compiles, if you turn on the right options, is a bad joke.

    What else? MS Repository? A half-way useful idea implemented as a total toy. I spent a few months rewriting part of it to make it remotely useful. Most of my questions to MS met with shrugs or vague promises of adding some simple feature that should have been there from the beginning (like OLE DB support) some number of months down the road.

    XML stuff? MS has done some decent work on driving various XML models for different domains. But I am paranoid they want to own those domains' communication channels and own the XML tool space itself.

    UML support? The Visual Modeler is a badly scaled down Rational Rose clone built for MS by Rational. It is a toy for feeding the Repository toy. It is talked about being tied to development environments but only so far in very rudimentary ways.

    New programming paradigms? I hear rumors but I haven't seen anything real.

    So where is the beef? Did they buy up the talent to do something good with it or simply to neutralize possibly dangerous competition? In general MS ticks me off because all that wealth is being used for so very little to really advance the state of the programming art.

  • by itemp ( 133373 ) on Tuesday June 06, 2000 @07:09AM (#1022277)
    Microsoft hasn't innovated since it invented the browser (oops!), no, the office suite (oops again!), the GUI? Not hardly. The file server? Not that one either. Ah, e-mail? No. Transaction services? Nope. ... Ah, the word processor? Nope. The spreadsheet? Not! That would be VisiCalc. How about the presentation application? Oh yeah, Harvard Graphics. Media streamer? No, Real did it first. The mouse? No, they didn't do that either. The PDA? Not even! What about DOS? Oh yeah, bought that one too, didn't they. Is there even one core product in the Microsoft line that is an original Microsoft innovation? How about the flight simulator? Maybe they did that first. No, I think the Air Force and Atari beat them to it. So in exactly what way are they innovative, besides finding new ways to bastardize someone else's innovative ideas? And before you go claiming NT is some fresh new innovation, you might want to dig around a bit. I seem to recall that it is actually built on chunks of old VMS leftovers.
  • The only innovation between 1990 and 2000 was at Microsoft? Wasn't there a little thing called a "browser" invented somewhere around there outside of Microsoft?
  • The notion of "everything is a file" is something that Linux doesn't fully do.
    • Network interfaces are not files (a minimal illustration follows after this list)
    • While /proc provides access to pieces of processes as if they were files, that's largely read-only
    • Graphical device drivers don't provide a file-oriented abstraction
    And there are probably some other things here.
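    (Here is the minimal illustration promised above, assuming a Linux machine with an interface named eth0; the interface name and the whole sketch are mine, not the parent poster's. On Plan 9 you would read the interface out of the /net file tree; on Linux there is no file to open, so you go through a socket and an ioctl.)

        /* Fetch eth0's IPv4 address on Linux: not a file read, but a socket ioctl. */
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <sys/socket.h>
        #include <net/if.h>
        #include <netinet/in.h>
        #include <arpa/inet.h>

        int main(void) {
            struct ifreq ifr;
            int fd = socket(AF_INET, SOCK_DGRAM, 0);
            if (fd < 0) { perror("socket"); return 1; }
            memset(&ifr, 0, sizeof ifr);
            strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);   /* assumes the interface is called eth0 */
            if (ioctl(fd, SIOCGIFADDR, &ifr) == 0) {
                struct sockaddr_in *sin = (struct sockaddr_in *)&ifr.ifr_addr;
                printf("eth0 address: %s\n", inet_ntoa(sin->sin_addr));
            } else {
                perror("SIOCGIFADDR");
            }
            close(fd);
            return 0;
        }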

    I agree that introducing more and better asynchronous processing would be a Good Thing; that doesn't necessarily mean moving to a different OS, when there are such options as:

    • CORBA Messaging Service
    • Message Queueing systems like MQSeries
    • Isect
    I guess the point here is that if adding an additional interface to UNIX is an effective strategy, then that's likely to be more productive than starting from scratch.

    I disagree somewhat that C++ has provided substantial improvement (I could disagree more strongly, but will avoid flame wars on this!), as it has the problem that there is no ABI definition for it. I agree that moving to a different language, providing additional abstractions, is a direction towards innovation.

    The problem, as I see it, is that, if you use C++ (or C), what you've got is a hammer, and everything looks like a thumb.

    I'd rather see an attempt to build those "bigger and better applications" atop more dynamic languages, whether:

    • Lisp (or variants thereof)
    • Modula 3
    • ML (or variants thereof)
    I've been doing some CORBA programming, primarily targeted at the C mapping. It is a quite horrible mapping; I'm prototyping in Python, and finding that works generally acceptably, and the mapping is vastly cleaner to work with. I expect the same would be true for Lisp, Scheme, and ML mappings, all of which considerably simplify the frightening amount of effort surrounding memory management that both C and C++ thrust upon the gentle programmer.
  • OK, I'll admit 'innovation' sounds more exciting, but it seems to me that what the general public gets excited about is the refinement of 'new' technologies: MP3, voice over IP, desktop PCs with simple UIs... In general people want things that they can use simply; this is MS's big 'innovation': 'refinement' of existing technology.

    I hardly think that this means the art of research is gone; there will always be wild-eyed geniuses who create things that are truly new, just for the love of it (not the 'market' of it). This is the way it has always been.
    -
  • The thing is you have terribly low demands for what an OS should do (no offense intended, of course). If indeed you are going to ask for thirty-year-old Unix, then thirty-year-old Unix is what you are going to get, and that can very well be written in C (as has been proven several times over, now).

    Hint 1: Delphi, VB++ and so on are not what I consider high-level languages. Think Scheme, OCaml, SML/NJ, Mercury, Erlang, Dylan, etc. These are true high-level languages.

    Hint 2: If you see anything on your system that looks even remotely "binary", your system is 30-years old. Before theoretical computer scientists invented "semantics".

  • I'm not aware of any modern computer architecture that is not based on the good old Von Neumann model [google.com], which dates back to (at least) WWII. If you want to do something really innovative, think up a whole new model for computing machines that disregards all existing preconceptions of how a computer should work. Surely the Von Neumann architecture cannot be the only possible way to design a general purpose programmable computing device.
    "The axiom 'An honest man has nothing to fear from the police'
  • Your troll is inaccurate. Much of what I suggest has not been finished (and some is still in the discussion stage). Certainly the kernel-side httpd acceleration is totally untested, and may change in the future.

    Anti-aliasing in X is still a matter of discussion, so you're also wrong there.

    Linux is a hotbed of what I call grass-roots research. This is the kind of research that starts with "I need..." instead of "It would be cool to play with..."

    It's engineering research vs. pure research. Pike is a pure researcher and will never accept the former category. Oh well.
  • Crusoe isn't innovative; that idea has been around for ages -- they just happened to be the first to get it to work on silicon.

    The idea of nanotechnology has been around for ages; now, if we were to build a robot smaller than 1/100 mm, you wouldn't call that innovation? How about time travel? After all, we've all heard of the idea.

  • Yeah, the problem here is money; maybe you've heard of it? Ever wonder why some open source projects are classified as 501(c)(3) organizations? They are meant to be donated to. They don't make much money, if any at all. There's little research I can do on equipment that I already have. I don't need to discover the interesting electrical conductivity properties of a circa-1985 Mac floppy disk controller. Software projects are nothing compared to hardware projects. Oh, you found a new way to program an Asteroids clone; boy, you rox0r. Research in algorithms is where the big money lies, and if you're capable of doing it well and doing it for free, you should probably try a 12-step program.
  • (Comments about seemingly squandered talent elided...)

    So where is the beef? Did they buy up the talent to do something good with it or simply to neutralize possibly dangerous competition? In general MS ticks me off because all that wealth is being used for so very little to really advance the state of the programming art.

    I think there's likely a bit of both.

    If Microsoft could turn the high-powered talent into products to make Gates a trillionaire, that would be a "good thing" (from his perspective, at least). And if Microsoft can keep others from benefiting from that talent, thus preserving the "hegemony," that is worth a lot too, at least to them.

  • by Anonymous Coward
    As an ex-Amiga user, I find that many of the "innovations" of Microsoft are only innovations with respect to their own previous product line. I'm sure many Mac and UNIX people feel the same way. (Believe it or not, many Amiga people simply assumed versions of MS Windows prior to 95 had pre-emptive multitasking, simply because it was such a trivial feature of a usable system in their eyes, the Amiga having had it since 1985. Many were thus very surprised when MS touted it as a big new feature of W95; similarly with the multiple clipboards of Office 2000, the Amiga having had them since 1985 too...)

    Anyway, my point is Microsoft does not innovate. It brings other people's innovations into the mainstream. It is a "close follower", not an "innovator", no matter what their PR dept. say.

    Also, I'd hardly say systems research is dead. EROS [eros-os.org] is a very promising pure-capability GPLed OS, AtheOS was recently mentioned on Slashdot, exokernels are still in development, and the Tao [tao-group.com]/Amiga [amiga.com] Environment is a new and ground-breaking system.

    If you ask me, Mr. Pike is either hopelessly out of touch or just spouting the MS line for money...
  • In a word, that's where systems research is. Even if we accept that MS operating systems are the best available for single-CPU machines (which IMVHO is totally laughable), SMP machines are questionable, and DSM and cluster machines are much better off running something else.

    Take a look at K42 [ibm.com], for instance. It is the next generation of a system called Tornado, which beat systems like IRIX not only in scalability but also in raw performance.

    You think 32 processors make your program run 32 times faster? Programs not specifically designed for scalability may typically run 5 times faster, and if you're not careful with cache coherence, it could be 100 times slower than on a single CPU! (See the false-sharing sketch at the end of this comment.)

    There are plenty of system-level questions left to answer.
    --
    Patrick Doyle
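
    P.S. To make the cache-coherence point concrete, here is a minimal false-sharing sketch. This is my own hypothetical example, not taken from K42 or Tornado; it assumes gcc, pthreads, a multi-CPU box, and a 64-byte cache line. Two threads each bump their own counter. When the counters sit in the same cache line, every store by one CPU invalidates that line in the other CPU's cache, so the "same line" run typically takes several times longer than the padded run even though the threads never touch each other's data.

    /* false_share.c -- build with: gcc -O2 -pthread false_share.c */
    #include <pthread.h>
    #include <stdio.h>
    #include <sys/time.h>

    #define ITERS 100000000L

    /* Two counters forced into one cache line: updates ping-pong the line between CPUs. */
    static struct { volatile long a; volatile long b; } same_line __attribute__((aligned(64)));

    /* The same two counters, padded onto separate lines (assumes 64-byte lines). */
    static struct { volatile long a; char pad[64]; volatile long b; } split_line;

    static void *same_a(void *x)  { for (long i = 0; i < ITERS; i++) same_line.a++;  return x; }
    static void *same_b(void *x)  { for (long i = 0; i < ITERS; i++) same_line.b++;  return x; }
    static void *split_a(void *x) { for (long i = 0; i < ITERS; i++) split_line.a++; return x; }
    static void *split_b(void *x) { for (long i = 0; i < ITERS; i++) split_line.b++; return x; }

    static double now(void)
    {
        struct timeval tv;
        gettimeofday(&tv, NULL);
        return tv.tv_sec + tv.tv_usec / 1e6;
    }

    static void run(const char *label, void *(*f)(void *), void *(*g)(void *))
    {
        pthread_t t1, t2;
        double t0 = now();
        pthread_create(&t1, NULL, f, NULL);
        pthread_create(&t2, NULL, g, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("%-36s %.2f s\n", label, now() - t0);
    }

    int main(void)
    {
        run("counters in the same cache line:", same_a, same_b);
        run("counters on separate cache lines:", split_a, split_b);
        return 0;
    }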

  • Before Einstein people thought there was nothing more to be learned about physics. The future of computing will be with quantum or biological computers (probably quantum in my opinion) and these will create the need for totally new kinds of software systems. Perhaps software for silicon is old but we won't be using it for much longer.
  • Something got mistyped, and so some links got messed up. (So I'll make the list bigger!)

    Things people should be trying out include:

    Several of these are pretty UNIX-like, albeit taking some extra "twists," while others are distinctly not like UNIX.

    Even if you look at these, and go back to a UNIX-like system, there is benefit to seeing the extra abstractions they offer.

  • Pike's been around for a while, as in version 0.1 of AT&T Unix circa 1970. So, if he hasn't seen any real innovations, he'd pretty much be an authority on the subject.

    As for your comments wrt "the standard tools", I'd tend to agree with Rob. My usage of Emacs, TeX, and the GNU C compiler hasn't changed in 10 years. However, even I have to give MS a few points for the Visual XXX environments -- they may have been acquired, but they've been packaged nicely. Try getting Emacs to manage a Java code project the way Visual J++ does (w/o manually editing and crafting Makefile.am's, etc)!

    I realize that Mr. Pike has a ton more experience than I do with Unix, but that still doesn't invalidate my basic premise. The reason that developers still use tools like Emacs, TeX and Unix is that they work. In fact, novelty in this area would actually be a bad thing, in that it would make it impossible for all of the Unix codgers to successfully leverage their hard-earned skills. For example, the reason that you use Emacs, gcc, and TeX the same way that you always have is that no one has been able to show you tools that are a substantial improvement. It is a good thing that I can take C code written 10 years ago and still get it to work on my fancy new hardware. It is a good thing that I can still typeset TeX documents written 10 years ago.

    However, nowadays Emacs does a lot more than it used to do. For one thing, you weren't using it to surf the web in 1990. You probably also weren't using it to edit DocBook SGML documents (and if you were it wasn't nearly as easy as it is now). TeX has made improvements as well. LyX didn't exist 10 years ago, for example, and LaTeX wasn't nearly what it is today. The fact that these technologies are built on the successes of yesterday is a good thing, it means that they have a solid base on which to grow.

    The comparison of Emacs to Visual J++ proves my point completely. Microsoft has spent a ton of money on their Visual development tools and yet when they are done they end up with something that is basically a fancy text editor with a system for automatically creating dependency files. I doubt that Mr. Pike would regard that as innovation either. After all, we have had fancy text editors and dependency generators for quite some time. For the most part Visual J++ is just the same old same old (repackaged attractively). If this is innovation at all it is innovation of a very gradual nature, much the same way that LyX is innovative when compared with writing raw TeX in ed.

    I did not mean to state in my original post that only old software was good. New software is quite often better than old software. After all, the new architects are able to learn from the mistakes of their predecessors. My point was that novelty is only useful inasmuch as it helps the end user. Novelty should not be an end in itself.

    Unfortunately Mr. Pike's pet operating system Plan 9 is not generally as useful as Windows (or even Linux), and he bemoans the fact that people aren't nearly as interested in novelty as they used to be. Well, I would be interested in Plan 9's novelty if it ran the host of powerful software that Linux does and was as easy to keep updated as Debian GNU/Linux, but it doesn't.

  • I love the nature of this forum... You can only make broad sweeping generalizations to make a point.

    Network Virtual Memory: I am pretty sure those pages are just pages and they are just stored somewhere else. No systems work to be done, just a driver to fool the system into flushing to the NIC as opposed to the HDC.

    Skip lists: fast if you know how big the list is going to be. Otherwise, planning for a list of size N can cause major overhead nightmares and bog things down, whereas B-trees are well equipped for this. And, unless you index every node from the first node, you still get about the same theoretical performance out of a skip list as you do with a B-tree (see the search sketch at the end of this comment).

    And yes, the Internet is a burgeoning example of doing things differently. But no systems (operating systems, relational database management systems, etc.) work fundamentally differently because you can surf the web. The fact is, surfing the web is a wonderful user interface for a system that already exists.

    And yes, the Apple compression routines, MS-DOS DoubleSpace, Stacker, etc. were wonderful experiments to give (cough) vast amounts of space on limited resources. They were all slower than not using compression. The fact that they lived in the OS was a consequence of limited hardware performance, not hardware inadequacy.

    As pointed out later in this thread, new, radical types of hardware are emerging that may (I stress may) make us rethink the OS from the ground up. The fact is, nowadays, you can generally already get linear-or-better performance on problems that aren't NP-hard. There is little theoretical ground left to cover with the current idea of a computer and systems engineering. The work has moved into the realm of algorithms and away from system design (i.e., what data structure shall I use to model attribute x of a system, rather than implementing a novel approach).
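
    To put the skip-list comparison in concrete terms, here is a hypothetical search routine in C, roughly following Pugh's original description (my sketch, not from any particular implementation). Note the fixed MAX_LEVEL: that is the "know how big the list is going to be" guess baked into the structure, whereas a B-tree simply splits nodes and grows. The descent itself is the same O(log n) ladder a B-tree walks, only balanced probabilistically instead of by node splits.

    #include <stddef.h>

    #define MAX_LEVEL 16   /* sized for roughly 2^16 elements -- the up-front guess */

    struct sl_node {
        int key;
        struct sl_node *forward[MAX_LEVEL];  /* forward[i] = next node at level i */
    };

    struct sl_list {
        int level;                 /* highest level currently in use */
        struct sl_node *header;    /* dummy head node carrying MAX_LEVEL pointers */
    };

    /* Return the node holding 'key', or NULL if it is not in the list. */
    struct sl_node *sl_search(const struct sl_list *list, int key)
    {
        struct sl_node *x = list->header;
        for (int i = list->level - 1; i >= 0; i--)        /* start at the top level */
            while (x->forward[i] && x->forward[i]->key < key)
                x = x->forward[i];                        /* run right, then drop down */
        x = x->forward[0];                                /* candidate at the bottom level */
        return (x && x->key == key) ? x : NULL;
    }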

  • Weren't physicists saying in the early 1900s that we'd "just about" figured out all of physics? That physics research was done? Haven't we heard the same thing about mathematics? It's complete except for a "few outstanding problems".

    The only issue is... The solution to those problems opens whole new areas in their field. Science isn't a gradual progression of greater knowledge. Science is a punctuated string of "eureka!"s, spaced apart by periods of dull silence.

    Why should computer science research be any different?

    -Ted
  • Check out all these OS projects out there. Yeah, a lot of them are *nix based but some of them are clean sheet designs. http://www.cs.arizona.edu/people/bridges/os/distributed.html
  • by The Pim ( 140414 ) on Tuesday June 06, 2000 @07:28AM (#1022351)
    > But is Unix (or Linux) what it was ten years ago?

    Linux and Unix have improved--but in incremental ways that don't introduce many new concepts to the user. That's Rob's point.

    > Is Netscape?

    No, Netscape sucks more now. With Mozilla we will hopefully finally have some progress. Jamie Zawinski has said he still uses a personally hacked Netscape 3.

    > "Where is the innovation? Microsoft, mostly". 'nuff said.

    "I'll just repeat Slashdot dogma, instead of considering that a creator of Unix might know something about systems innovation".

    > Basically, his argument seems to be that if we don't completely change our software every few years, we're being stagnant

    No, he's saying that we haven't appreciably changed our software at all.

    Whereas, when I look around me at the Windows machines, I see integrated mail + contact list + calendar; good multimedia including streaming; input methods that allow entering Asian-language text into any application; a debugger that lets you fix code on the fly. You can make academic claims about how this all originated somewhere else, or is merely composed of pieces that exist on Unix, but the fact remains that most Windows users have these things today, and most Unix users don't.

    Now, to me a lot of these things still suck, but I can see that to many users, they are great improvements.
  • by Zigg ( 64962 ) on Tuesday June 06, 2000 @07:21AM (#1022369)

    He does have a bit of a point that Linux has mostly been about copying others.

    I'll drink to that. Linux's supposed crowning achievement -- GNOME -- is very nice, but deep down it's just a clone. It just looks prettier than what it's cloning. I got all excited about Evolution [helixcode.com] a few months ago; then I went and looked at it when Helix [helixcode.com] did the PR. I'm thinking, "wow, look, Helix is porting Outlook to GNOME." Not exactly exciting or innovative.

    I wondered in a thread back when the Beanie Awards were announced -- where's the category for best new thing in the open source world? All I see are reimplementations. Then again, I guess that's what GNU [gnu.org] always did -- take existing stuff, rewrite it, and bloat it.

  • by jd ( 1658 ) <imipak@ y a hoo.com> on Tuesday June 06, 2000 @07:21AM (#1022373) Homepage Journal
    Systems Research is a very vague term, and I would argue that it exists in Linux as much as anywhere.

    Firstly, what is a system? It is a definable entity, with a definable boundary.

    Using this (loose but sufficient) definition, let us look at Linux and see how it fits in. First, of course, there's CODA. This allows you to have many computers as part of a single system, in a far more flexible way than, say, NFS. How to get such a hybrid, networked system to work in any kind of efficient way? That's the research part.

    Then, there's Beowulf. How to modify network drivers (and, if necessary, protocols) to create an efficient, high-performance distributed virtual machine? Again, sounds like research to me.

    Is there more? Yes! Linux is the ideal platform to experiment with newer technologies, both where you have one physical machine and many processors, or where you have many machines networked. Because of the kernel's modular structure, you can add and remove different modules on-the-fly, testing ideas in a way that would be impossible under Windows.

    Is there anything new that Linux has added, beyond the model? (Which it just borrowed from the FSF, anyway!) Yes! The aforementioned CODA is a good example. Hot-loading and hot-unloading of kernel modules is another (a minimal module sketch follows at the end of this comment). Linux's IPv6 stack was one of the very first for any OS, first appearing for the 2.0.20 kernel as an extra patch.

    Virtual consoles, one of the delights of many a Linux user, don't exist for DOS, Windows, or many flavours of Unix.

    ReiserFS, a completely tree-based FS, is unique in its architecture. No other FS works on that principle. Oh, did I mention it was for Linux?

    I'm currently working on IGMPv3, for Linux, which will be one of the first v3 implementations for a mainstream Unix. (I know of a few others, but not many. I doubt you'll see v3 supported in Solaris anytime in the next few weeks.)

    What about other areas of Systems Research, though? Networking is dead, surely! Tell that to the people working on robust Anycasting, PIMv2, queue shaping (such as CBQ, RED, ECN, RSVP, etc.), any of the experimental IP protocols (IPv7 - IPv9, I think), robust MobileIP (even when ISPs are, themselves, mobile), etc.

    Non-networked stuff? System interfaces are a vital part of a system, and there's plenty of work going on there, with VR, zero-button mice, eye-tracking, speech synthesis/recognition, OCR, etc.

    Internally, there's work on automatic parallelizing of code, using Critical Path Analysis. Neural nets and genetic algorithms get a fair amount of attention, as do Artificial Life-forms. Then, there's always Quantum Computing, Optronics (such as the purely optical router), Chaos Computing (based on non-linear systems), low-temperature circuits (such as the Crusoe, which also deserves a mention as an intelligently adapting circuit, for its ability to handle non-native instruction sets natively), etc.

    WRT kernels, we now have exokernels, microkernels, monolithic kernels and distributed kernels.

    Now, you tell me that ExoPC is old-hat stuff!

    Lastly, but by no means least, SETI@Home, distributed.net and Cosm are moving Systems Research forwards in a way that no other projects in the history of computing have even dreamed of. And this is supposed to be the end of the road?

    (Only if "road" means a bumpy cart track, in a rickety wagon pushed by Ivory Tower professors and paid for by Big Businesses eager to NOT get results before their cash-cows were milked dry.)

  • by Ed Avis ( 5917 ) <ed@membled.com> on Tuesday June 06, 2000 @07:22AM (#1022376) Homepage
    Looking at the sixth slide, I don't see that hardware has advanced very much more than software. So what if a high-end workstation now has Fast Ethernet instead of Ethernet? That's hardly a major paradigm shift. Ditto increasing the amount of memory or upping the clock speed.

    Hardware is just the same as in 1990, but faster. Software is just the same as in 1990, but slower.
  • by David A. Madore ( 30444 ) on Tuesday June 06, 2000 @02:24PM (#1022392) Homepage

    Again, I must go into my usual rant: is there any reason why we insist on still writing operating systems in low-level programming languages like C (or C++, which I insist on considering a low-level language)? It may be that low-level languages are better suited for writing the low-level stuff like the bootstrap code and the immediate hardware drivers (but even then, I don't see why a high-level language couldn't be extended with the appropriate functions to do the task), but some things in modern operating systems are definitely archaic.

    Why is it, for example, that we still maintain the memory/filesystem dichotomy? It is about as absurd as requiring that a programmer handle the mainboard-level cache by hand. Why is it that whenever a program wants to save data, it must painstakingly convert it into a binary representation? (For a taste of how blurry that line can already get, see the mmap sketch at the end of this comment.) Why is it that subroutines and programs still have to be distinguished? Why is it so painful to have reflexivity (e.g. for user-mode Linux you have to recompile a whole new kernel; the one already in place is not reflexive / reentrant in that sense)? Why is it that for security measures we rely on hardware control (aka MMUs) rather than formal invariance proofs? Why is it that our processors are so big and bloated and distribution / (asymmetric) multiprocessing / clustering is so far behind?

    For more information about what a high-level operating system might be, I refer to the Hurd (which is high-level, but at the cost of an abstraction inversion, because it is still written in C; notice that the Hurd Really Runs), to Erlang [erlang.org], which deserves to be better known because it's really impressive, and to the all-ambitious Tunes [tunes.org] project.
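
    For what it's worth, even crusty old POSIX lets you blur the memory/filesystem line a little with mmap(); the sketch below (my own hypothetical example, nothing to do with the Hurd, Erlang, or Tunes) stores a string into a mapped page instead of serializing it through write(). The rant still stands, of course: you have to set the mapping up by hand, and that is exactly the bookkeeping a genuinely high-level system would make disappear.

    /* mmap_note.c -- treat a file as ordinary memory (plain POSIX, no hand-rolled format) */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("notes.dat", O_RDWR | O_CREAT, 0644);
        if (fd < 0) { perror("open"); return 1; }
        if (ftruncate(fd, 4096) < 0) { perror("ftruncate"); return 1; }

        /* Map one page of the file; ordinary stores through 'p' land in the file. */
        char *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (p == MAP_FAILED) { perror("mmap"); return 1; }

        strcpy(p, "persistent data, with no explicit write() call");
        msync(p, 4096, MS_SYNC);   /* flush the dirty page back to notes.dat */

        munmap(p, 4096);
        close(fd);
        return 0;
    }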

  • by john_many_jars ( 157772 ) on Tuesday June 06, 2000 @07:35AM (#1022401) Homepage
    Systems work : Computers :: Locomotion : Cars.

    In other words, cars are not being developed to maneuver in three independent dimensions. The engine of a car (the CPU of a computer) is continually being refined. The console and driving instruments (the UI of an OS) are continually being refined. The fact that the car travels on land (the fact that an OS coordinates the hardware) is not being changed.

    Real innovations in computing are not coming at the systems level, but at the hardware and UI level. Case in point: POSIX.1. There have been no drastic changes in the "standard" no OS adheres to in almost 10 years. The reason? No real research into what a system must do more of to work better.

    Look at databases: there are 4 critical components to a database (the ACID properties). The real research in databases today is not in implementing these properties, but in OODBMSes, or in RDBMS multi-dimensional queries, speed optimizations, etc. No real systems work.

    Most of these properties of systems can only be improved on trivially, and any new innovation will be found to be damn close to the POSIX.1 spec for OSes, the ACID properties for a DB, etc.

    I'll give you something to ponder: is there a better way to implement virtual memory other than spilling over onto the hard drive? You have to be able to index chunks of memory and swap them in and out--on things called pages.

    Give me a faster general-purpose index (or, for that matter, an OS-specific one) than a B-tree. (A node-search sketch follows at the end of this comment.)

    Until there is a new and radical type of hardware (other than storage, input, and output), there is little need to change what an OS does or how it does it radically. Rather, as mentioned way above, the techniques used now are optimized for a particular piece of hardware and/or software.
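
    To put the B-tree challenge in concrete terms, here is a hypothetical node-search sketch (mine, not from any particular OS or DBMS) showing why B-trees fit the paged world described above so well: a node is sized to the page the VM and disk already hand you, so every page fetch cuts the search space by a factor of ORDER rather than the factor of two a binary tree gives you.

    #include <stddef.h>

    #define ORDER 128                      /* keys per node, picked so a node fits in one page */

    struct bt_node {
        int nkeys;                         /* keys actually in use, <= ORDER */
        int keys[ORDER];                   /* sorted keys */
        struct bt_node *child[ORDER + 1];  /* child[i] covers keys between keys[i-1] and keys[i]; NULL in a leaf */
    };

    /* Descend from the root; each node visited costs one page read. */
    struct bt_node *bt_search(struct bt_node *node, int key, int *pos)
    {
        while (node) {
            int lo = 0, hi = node->nkeys;
            while (lo < hi) {              /* binary search within the in-memory page */
                int mid = (lo + hi) / 2;
                if (node->keys[mid] < key)
                    lo = mid + 1;
                else
                    hi = mid;
            }
            if (lo < node->nkeys && node->keys[lo] == key) {
                *pos = lo;                 /* found: return the node and slot index */
                return node;
            }
            node = node->child[lo];        /* not here: follow the covering child */
        }
        return NULL;                       /* key not present */
    }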

  • Linux is not a particularly good example of innovation; while there are some interesting bits of social innovation, there isn't all that much that isn't either a replication of what already existed, or a "tuning" of functionality.

    ReiserFS may be pretty cool stuff, but it hasn't led to really new things. There is the promise that it may allow constructing data structures reasonably efficiently via "hordes of tiny files," but nobody is really using that yet, and the "research" side of that is already reasonably well understood.

    For there to be real research out of something like ReiserFS would require that people start studying different ways of constructing (say) DBMSes by using the abstractions provided by the new FS.

    It isn't really systems research for someone to construct a Linux emulation system to run atop EROS [eros-os.org]; what would be innovative would be to see what kinds of cool things (things that may have nothing to do with UNIX as we know it) can be done with it.

    The problem that he doesn't comment on, which seems to be an important flip side to the notion that Microsoft is a source of innovation, is that, during the 1990s, Microsoft did an impressive job of buying up top researchers, virtually closing down major systems software research groups:

    • Hiring David Cutler and other VMS folk eliminated much of Digital's OS efforts
    • Hiring Mach folk, notably Rich Rashid, essentially eliminated CMU and IBM's Mach-related OS efforts
    • Hiring TP folk like Jim Gray [microsoft.com], author of the wonderful book Transaction Processing: Concepts and Techniques [mkp.com], pulls considerable transactional expertise inside the Microsoft Hegemony
    • Similar "pulls" have taken place with databases ( Paul Larson [microsoft.com]), compilers (folks who worked on AST Toolkit [microsoft.com]), amongst others

    If people started doing some substantial work on exploring how to powerfully connect applications together using CORBA, that could represent some new work; unfortunately, the tools are still maturing, and the mappings to C and C++ kind of suck, at least for the purposes of generating dynamic applications.

    Remember, Pike's criticisms aren't based on some vague notion that Linux is useless or bad; they are based on the notion that it's not particularly innovative, from a systems software research perspective.

    If 90% of your effort represents dealing with the same old ordinary UNIX stuff, which would be largely familiar to a UNIX hacker of the 1970s, then whatever you're doing can't be more than 10% innovative. Note his comment that around 90% of the effort in Plan 9, which was one of the more innovative systems of the last decade, went into honoring external standards. That's a problem.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...