The End of Unix?

XiRho asks: "Unix has lived well beyond the era in which it was born (the era of the minicomputer) and has survived and thrived in the era of the personal computer, but now many people believe that we will soon see the transition from that era into the age of the distributed/networked system. In that case, has the Slashdot community at large ever considered what the future is? Will Unix finally die off, will it adapt as it did before, or will Unix find a way to remain the same trustworthy system it always has been? And if Unix does come to an end, what does the Slashdot community feel will be its successor?"

Unix may never become big on the desktop, but that battle is still being fought and it probably won't be over for a few years. However, we shouldn't forget where the great strength of Unix lies: the network. Unix -runs- the Internet. I don't think any other operating system can say that. The Internet started on Unix, the Internet was built on Unix, and unless something better comes along (and that implies that we don't have "better" yet) the Internet will die running Unix.

Of course Unix, like any other modern OS, must change over time to accommodate new technologies and methodologies, but I see Unix being more able to adapt in today's fast-changing Information Technology world than other operating systems based on monolithic kernels.

What do you think? Am I missing something? Is there a Unix killer in the works that I might have missed?

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Unix is sort of like Christianity. It's an idea whose time has come and gone. Sure, it was useful for a while, but we're to the point now where it has served its purpose and now should be gently handed its hat and shown the door, because it is no longer needed. My guess is that within 20 years, Unix (and Christianity) will have virtually no adherents except for the rabid fundamentalist types who rave on street corners and who you constantly worry about blowing up buildings or buses. The rest of the world will have switched operating systems.

    Now, switched to what? That's a good question. Windows seems to be attractive at the current time, but let's face it: The type of slobbering invalids attracted to Windows are generally not the people that you would expect to pass their genes onto a new generation. By and large, these people will have electrocuted themselves in the shower or killed themselves in freak blender accidents before they have the opportunity to reproduce.

    So Windows and Unix are both out. So where are we at, then? Well, unless I miss my guess, I foresee a future where both BeOS and Hurd have huge followings. This, of course, is notwithstanding the fact that these operating systems are currently being used only by sociopaths, homosexuals, and extremely fat people. That's okay, though... I envision a point in the future where BeOS stops just being an OS for the fatties and queers and becomes powerful and useful enough for the rest of us. Heck, who knows? In 20 years time we might be having the BeOS/Hurd flamewars instead of the Windows/Linux flamewars!

    At any rate, we can agree on one thing: We live in interesting times. Let's see how this all pans out.
  • by Anonymous Coward
    > multi-user/group kludgework.

    Brother, I would like to introduce you to the concept of 'the enterprise'. Or even 'the internet'. Your grasp of what is important in large-scale computing seems to be relevant to a thing called 'ms-dos'. Here in the real world, we like the idea of 'sharing' and 'interoperability'. 'Privilege systems' and 'user rights' are nice things, too, but I guess you're perfect enough to run everything as 'root', or as we like to call it, 'firehose-mode'.

    Okay, enough cuteness. Moral: you're an idiot.
    But that's okay; most of us are idiots. Your particular fuck-up is to assume that everyone uses computers just like you. Which is a stupid thing to believe, about computers or any other human endeavor.

    I'll let you re-think that post, and point out more of your logical flaws later.

    ciao!!
  • by Anonymous Coward
    The Amiga [amiga.com] OS is hardly dead, it's still being actively developed, in two main streams:

    1. a next-gen distributed architecture based on the Tao [tao-group.com] VM (think of it as a language-agnostic generalised VM a bit like Java, but that can run on real hardware too).

    2. the "classic" Amiga OS was extended to PowerPC with WarpOS [haage-partner.com] (no relation to OS/2) microkernel. This allowed the user community to use more modern hardware, such as G3 accelerators, and 3D gfx cards.

    The Amiga OS design, in the form of AROS [aros.org] - the "Amiga Research OS", which recently received blessing from Amiga itself, also lives on.

    For more amiga info, go to www.amiga.org [amiga.org]

  • Lots of places run life support equipment on Windows. I won't name my company while I'm under a de facto NDA.

    Tektronix sells Logic Analyzers with Windows 95 embedded into them. That's a pretty "out there" environment where the OS takes a beating and survives.

    Here's the clue: They aren't encouraging (or allowing, in many cases) "Hackers" to add all kinds of kludges and DLL hell. Nope.

    Run in an embedded setting, where the hardware is completely captured, specified, and validated, it's not that risky a proposition.

    Not that it matters in this discussion. Too many zealots in these parts for any semblance of reason.
  • by Anonymous Coward
    No, the 8.3 backwards compat is NOT a good idea, because wildcards will accidentally match the 8.3 versions on NT and delete far more files than you intended. VERY BAD.
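    A quick sketch of the hazard (the file names and the alias pairing here are invented for illustration; real 8.3 aliases are generated by the filesystem):

        # Long filenames get auto-generated 8.3 aliases like "IMPORT~1.TXT".
        # A wildcard matched against BOTH names can hit files you never meant.
        from fnmatch import fnmatch

        files = {
            # long name             -> plausible auto-generated 8.3 alias
            "important-report.txt": "IMPORT~1.TXT",
            "imports.txt":          "IMPORTS.TXT",
            "temp~1.txt":           "TEMP~1.TXT",
        }

        pattern = "*~1.txt"  # user believes this only matches explicit "~1" names

        for long_name, alias in files.items():
            via_long = fnmatch(long_name, pattern)
            via_alias = fnmatch(alias.lower(), pattern)
            if via_long or via_alias:
                print("would delete", long_name,
                      "(matched via", "long name)" if via_long else "8.3 alias)")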
  • by Anonymous Coward
    > Yep. And we're all here making our comments. Not on USENET, where there are no moderators.

    Well, on the moderated groups, the S/N ratio is usually very good. e.g., compare soc.culture.japan with soc.culture.japan.moderated.

    > I download and archive select Usenet groups, and... umm... a lot of them are a total loss.

    Well, this was true from day one. Nothing has changed. More crap, sure, but more content too.

    > The Linux groups have some of the highest traffic, but also the largest volume of clueless noise.

    We're all clueless about something. And that's what the Linux groups are for. To ask questions. Not to post braggings about how you're tunneling IP packets over email to get around the worst firewalls. And they're good at this too. When I have a question, I'll search the groups; failing that, I'll search on deja.com to search the stuff that expired off my server; failing *that*, I'll post. And I usually get the answer I need within a few minutes to days. And that thread gets snarfed by deja and other news archivers to help someone else later on down the line.

    > The real action has shifted to private forums like /. and listservers.

    Yeah, but the problem with everyone migrating from a common forum (USENET) to Slashdot and freshmeat and kernel.org and redhat, the mailing lists, etc., is the fragmentation. You've lost the world audience and the single searchable source for answers. I can't know that my question was answered on some obscure mailing list I never knew I could subscribe to. Or worse, you can only discuss whatever the officially sanctioned topic(s) of the day are. (see /.)

    > The bulk of the Usenet traffic these days is purely binary attachments. The most popular NNTP clients on Windows machines strip off and throw away all 'text' content, keeping only the attachments. NewsBin, Pluckitt, etc.

    And these programs are good at doing graphics. When I want to download pr0n or sailor moon images, I'll use GUI news readers; otherwise, it's good ol' trn for the bulk of my news reading. I usually hang out on rec.arts.anime.misc. Most non-binary groups with a lot of traffic, excepting the flame or controversial-topic groups, have maintained a decent S/N ratio for the last 10 years and more.

    IMO, USENET is better than ever because there are more people using it. Most of the noise lands in the abandoned groups, or the flame groups anyway, so who cares?

  • by Anonymous Coward
    > The Internet started on UNIX...

    Excuse me, but the Internet (ARPAnet back then) started about the same time as UNIX did, and it was a long time before there was a UNIX connected to the Internet. The earliest systems on the Internet were TENEX, OS/MVT, an SDS Sigma 7, a PDP-1, Multics, etc. Bell Labs had only just dropped out of the Multics project, and Ken Thompson was just starting development of UNIX. UNIX wasn't on the ARPAnet until the Univ. of Illinois did a UNIX implementation, several years later.
  • by Anonymous Coward
    > I would love to see some of the best coders and operating systems people put together a new OS from scratch using the latest techniques. Ideally this would create an ultra stable and very modular system.

    It's already been done. Check out either BeOS or QNX. Both are top notch, extremely stable (especially QNX), blindingly fast, and don't carry all of the extra baggage/bloat that legacy OSes have.
  • More reliable? I don't know what logic you're using for that, but umm... no. just no.

    Let's see: the JVM gets compiled by a C/C++ compiler, most likely, then runs on a native operating system. So you've effectively added a layer of complexity to the system where problems can occur, compared to C/C++ on said native OS.

    As for Java's usefulness, it's not a bad language, but let's be honest with our evangelism.
    ----------------------------
  • A few thoughts on the future of Operating Systems. It is a truism that small, compact, specialised Operating Systems will perform those specialised tasks MUCH faster than a large, generic OS.

    It is ALSO a truism that you don't get something for nothing. There's always a trade-off. With networked computers, your trade-off is in delays: delays within the network, transmitting the information, but also delays within the computer, which now has a real-time device to constantly monitor for potential traffic.

    It follows that setting up a network of streamlined machines, each dedicated to a narrow range of tasks, will out-perform a single, monolithic system IF AND ONLY IF the savings from the parallelising is greater than the penalty imposed by the network (a back-of-the-envelope sketch follows at the end of this comment).

    The future of Operating Systems, as I see it, will consist of intelligent distribution of tasks over a wide-area network, and where any program is parallelized at run-time by the primary OS. However, I also predict that 99.9% of all software will be kept local and run serially.

    It would seem to follow that Unix will be extended to support run-time parallelization and fairly sophisticated task/net load-balancing.

    It would also seem to follow that there will be a blurring between =PROGRAMS= and the network, NOT the machines and the network, OR the OS and the network. Why should there be? If it's not visible to the user, it won't last.
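    A back-of-the-envelope check of that IF AND ONLY IF, with invented numbers and an idealized model (work splits evenly; each distribution pays a fixed network cost):

        def distributed_time(total_work_s, n_machines, net_overhead_s):
            """Idealized: a perfect even split, plus a fixed network penalty."""
            return total_work_s / n_machines + net_overhead_s

        total = 100.0  # seconds of compute on the single monolithic system
        for n in (2, 10, 50):
            for overhead in (1.0, 30.0, 120.0):
                t = distributed_time(total, n, overhead)
                print(n, "machines,", overhead, "s overhead ->",
                      round(t, 1), "s:", "wins" if t < total else "loses")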

  • I have often heard 'Unix is a Legacy system, old technology from the 60's and 70's.'

    This is patently false. What Unix _is_, is a system of tools which have been refined for 30+ years. AFAIK, 'Legacy' should be used to describe something which is no longer developed or supported. At least this is the general use of the word.

    New doesn't mean better, and in many cases, it can mean untested or unproven. Unix is proven technology. But above all it's a proven philosophy.

    Call Unix whatever you like. Text-book Sys V, BSD, or clones like Linux. The underlying philosophy is largely the same. THIS is why Unix has been around for decades (besides technical merits).

    In the philosophical sense, I doubt Unix is going to die, ever.
  • Comment removed based on user account deletion
  • > There is no such thing as "Iron-clad" validation, but validation tied to something physical (retinal scans?) will come pretty close to this in less than 10 years, IMHO.

    Everyone always seems to think that retinal scans are a decent method of validation.

    However, all they will do is provide a key. Furthermore, it's much more secure to remember your key than to write it down somewhere (think: decapitation). Until we have mind-reading technology, I'll stick with passwords, thanks.

    Hamish

  • You can put down your weapons there, friend. I've been a registered Be developer since January 1998. I have used every release of BeOS x86. And, as far as I know, Be has *always* promised to add multi-user support "any day now." I wouldn't hold your breath.

    -jwb

  • Unlike many of the alternatives to UNIX mentioned by other posters, Vapour [rook.com.au] is an OS design that totally breaks with the UNIX concept - it has no kernel, no processes, no filesystem. Memory protection is implemented at the language level rather than the OS level. Persistent virtual memory (similar in some respects to EROS' design) replaces the traditional filesystem. All parts of the system may be modified at runtime. It's still very early in development, but the design is fairly comprehensive at this stage.
  • While I happen to think that all the Unices are probably superior platforms for this type of application, don't forget about some of IBM's offerings: AS/400, OS/390, and so forth.

    Big Blue is active in the development of Apache, and I would be surprised if they weren't sneaking parts of Apache into their other OSes, which have anemic TCP/IP suites, to say the least.

    Just a thought.

  • Well "we" may have fought the powers that be, but on the whole "we" (whoever that is) don't matter much. For large coporations, and ISPs and for those trying to provide services of one kind or another (ASPs, if you like the jargon), centralisation is the only way to go.

    The more stuff you have on the user's machine the more your support costs, as it breaks, needs to be replaced, etc, etc. When you look inside large companies you see a greater and greater tendency towards web-based and other centralised systems.

    Simon
  • I don't think UNIX will die anytime soon!

    The reason is simple: both the commercial UNIX variants and the "freeware" FreeBSD and Linux variants have extended the life of UNIX far beyond what was possible in the past.

    The UNIX of 2000 can do multiprocessing, multithreading, SMP, powerful networking, and even massively parallel processing via Beowulf clusters. And engineers are already pushing UNIX to support 64-bit processors like the Compaq (née Digital) Alpha and the upcoming Intel Itanium CPUs.

    In short, I just don't see any replacements for UNIX in the near future. The technology in Windows 2000 may be a major competitor, but it won't be a UNIX replacement.
  • Given:
    1 - Engineers / Computer People need a command line interface
    2 - Regardless of the virtual reality smell-o-vision interface, there will be a command line at the kernel of a real (READ: non-consumer) OS.
    3 - The idea of a filesystem hierarchy isn't going anywhere.

    We should be able to plop down at any box and know that binaries are in /usr/bin and ls lists files. People make things similar to what they like. Witness Linux. Linus liked the UNIX he used, so he cloned it. Even BeOS (AFAIK) has a UNIX-ish filesystem hierarchy.
  • Superior Uptime- Name one non-Unix based OS that can beat a unix based OS for number of hours of uptime.

    MVS, VMS. That's two.

    There are others, such as the OS on Tandem NonStop systems.

    > 1. UNIX kills at networking
    > Always has, always will.

    UNIX didn't have good network support until BSD UNIX. I used to run V7 UNIX on a PDP-11; the only networking that it supported was UUCP.

    > 2. UNIX has superior reliability
    > Therefore, UNIX was coded for rock-solid stability.

    Don't make me laugh. UNIX was written at a research laboratory as a research project. Error recovery in the UNIX kernel was limited to calling panic(). Running out of memory or disk space did bad things to the stability of the system. The file system tended to self-destruct if power failed or the system crashed. UNIX applications often didn't check for errors. If they did check for errors, they didn't attempt to recover; they just called exit(). Kernel device drivers assumed that the hardware was in perfect operating order. DEC used to burn in VAX systems with UNIX because it was such a good system hardware diagnostic. Any flakey hardware would crash the system. VMS would run just fine on many systems with multiple hardware problems and glitches.

  • Um... Novell, SCO, Digital... ?
    Never mind the scores of 'unix-like' embedded RTOSes?
    And I know at least one vendor (my former employer, VenturCom (at vci.com)) had an embeddable version of Novell's UnixWare.
    Unix and Unix-influenced OSes have always been out there in every market. The Unix die-back was a die-back of the workstation, and a little bit in the lightweight server area.


    --Parity
  • I've looked through the comments, and seen a couple of what I regard as misconceptions:

    1) that UNIX will be killed of by Windows or BeOS or ...

    I think that if something is to replace UNIX, it will have to be something _significantly_ better than UNIX. I don't know of anything else currently out there that can claim this (although I will admit to not having looked at Plan9 and Hurd).

    2) that UNIX is immortal because it is somehow 'above' operating system paradigms

    This is clearly wrong, as the whole UNIX model is based on a hierarchy of files, where a file is a stream of bytes/characters. As a consequence of this, most of the tools that make UNIX so powerful are text-stream based.

    I see room for a new sort of operating system that works with a much higher level of abstraction throughout, including type management and garbage collection as part of the OS (relegating streams of text to where you really just want to store text information), high-level messages between components allowing easy distribution, different views of the file system (or maybe not even so strongly file-system based), and maybe taking some component ideas from Apple's failed OpenDoc to end monolithic applications.

    However, I don't think that there is anything out there that comes even close to this, and there won't be for a while, and even if there is, it will take years to catch on.

    UNIX will only be overtaken when the problems of the high level of complexity required by what people want to do need a higher level of abstraction than the basic model of UNIX can provide, and even then, UNIX will take a very long time to actually die - it will just go the same way as DOS, which is still widely used but generally regarded as inferior by most people.
  • This sounds like the Condor system at the University of Wisconsin, Madison. I have only read a little about it but as I understand, it works as you describe. Check out their homepage at http://www.cs.wisc.edu/condor [wisc.edu].
  • Amiga has died so many times, nobody takes the report of Amiga's death seriously anymore.

    True. But unfortunately, nobody takes reports of the Amiga's rebirth seriously anymore either.

    Two more weeks! BIG!


    ---
  • > I really had an interesting talk with one of my professors a couple of days ago and pretty much found that all the major universities are using Windows-type development models for their CS programs.

    That isn't even close to true. In fact it is a pretty silly statement given that it would be virtually impossible to get all of the major universities to agree on anything let alone something that specific.

    I think you would further find that even in universities that have adopted an MS-centric curriculum, it is not pervasive throughout their entire CS program. Any university that purports to offer a well-balanced and rounded background for its students would ill serve them by making its program so focussed on a single company's technology. That is the sort of thing that lower-end institutions such as trade schools and community colleges do, not major universities.

  • > Observe: MS-DOS was, IIRC, originally intended to be a 'simple' Unix-like OS for the PC

    Actually, MS-DOS was originally intended to be a clone of CP/M targeted at the 8086/8088 instead of the 8080/Z80. About the only UNIX-like feature it included was that, starting at 2.0, it incorporated hierarchical subdirectories (unlike CP/M, which had a flat file structure); the original 1.0 and 1.1 versions of MS-DOS used a flat file system like CP/M.

    CP/M, in turn, was originally intended to be a simplified subset clone of RSTS/11, a DEC minicomputer OS for the PDP-11 family, except targeted at the 8080/Z80-based microcomputers of the early hobbyist computer era. RSTS and RSX were the DEC predecessors to VMS. VMS, as we know, was primarily architected for many years by the same person who went to Microsoft to design the NT kernel. All of the MS hype aside, NT is largely a reinvention of Micro-VMS with the Windows GUI and the MS-DOS command interpreter grafted on, and without a lot of the stability and scalability that made a lot of people like VMS (I wasn't one of them, mind you). Yes, for the inevitable Microsoft apologists, that is an oversimplified view.

    > Yes, each of these took a little different take on Unix, and tried to re-invent the wheel: but the influence of Unix cannot be safely ignored.

    UNIX only influenced those other products by being a competitor that they were trying to respond to. Notice that virtually all of the proprietary minicomputer OSes are dead or at least lifeless zombies, despite all of the years of predictions of the doom of UNIX (for mostly the same reasons people predict the demise of Linux). NT/W2K is, in my opinion, the last great hurrah for proprietary OSes. UNIX on the other hand has been much more equipped to change with the times and adapt to new and different purposes. The fundamental difference is that it is built with a different philosophy, one that small is beautiful, and that simple tools which do one thing, and do it well and can be put together to solve more complex problems is a better way to do things than the huge, integrated, monolithic monsters that the proprietary OS world puts out.

  • I would basically agree with most of your assessment that traditional micro OSes (including NT) were designed with a pretty limited vision of their place.

    The AS/400 is probably about the only proprietary-OS mini that hasn't died off, and I think that is partially because it comes from IBM, but also because it never really tried to compete directly with the other minis (despite IBM's intentions). It basically found its own wholly separate niche. While the AS/400 may be bigger and more profitable than Sun, of course Sun is only one of many UNIX midrange vendors. They are nowhere near half of that market, as there is still HP/UX, AIX, Compaq Tru64, SGI Irix, etc. At any rate, while the AS/400 is probably going to continue along as it has been for the foreseeable future, I don't see it suddenly starting to increase its market share or move outside its current niche markets.

    Compaq probably makes more on VMS than they'd like, but it is obviously a legacy system that is getting slowly replaced. They aren't pushing it to new customers and their customers have mostly targeted other platforms as their future directions. I'd classify VMS as one of the 'lifeless zombies'. It is dead, but that doesn't stop it from shuffling about a bit.

    You are right that UNIX hasn't won the midrange wars yet, but a lot of the competitors have dropped out, leaving NT/W2K as the last 'great white hope' of the proprietary OS.

  • > You're right that the AS/400 is primarily a niche product. However, IBM is trying to change that, and AS/400s running more 'normal' tasks like web servers and Lotus Domino are appearing more and more. Its market is expanding.

    Its market should be expanding, but I don't see that happening very much, and I live in an area that is about as conducive to the AS/400 as there is (the plant where they are built is only about 200 miles from here, and there are a lot of stodgy insurance and financial companies around). Mostly the newer features they are adding to the AS/400 are just slowing down the defection rate in existing AS/400 shops.

    > Still, most shops get into the AS/400 because of some line-of-business application that only runs there, or they're a true-blue IBM customer.

    Bingo. Usually it is something like an accounting system, and often the AS/400 is used only for a single purpose. A lot of the true-blue IBM shops (tons of those around here) use AS/400's, but these shops often have IBM mainframes as their main back-end system, and often RS/6000's in the middle tier as well.

    > The other tasks are largely an afterthought.

    Yeah, and mainly only in smaller shops where they don't have the financial means to support multiple platforms, but they want to add stuff like email, groupware, etc.

    > I don't think even IBM would consider the 400 a head-to-head competitor to Sun and DECpaq -- that's why they have the RS/6000s.

    Well, I think that a few people on the AS/400 team at IBM think that the AS/400 is a direct competitor to Sun, Compaq, etc. Most of the other IBMers I've met have a bit more realistic viewpoint of where the AS/400 is, and where it is going.

  • It's pretty surprising you are running into so much new AS/400 stuff out there; it could just be a fluke or something -- maybe you are getting those assignments because you've heard of AS/400's or something. :-)

    As I said, I live in the AS/400's back yard, and I'm seeing the opposite happen. Most of the AS/400 shops are hedging their bets by implementing other platforms (AIX/RS6k, other *nix, or Wintel) beside the AS/400 for other purposes. Only the most hardcore AS/400 people are putting stuff like web servers, groupware and email on AS/400's. However, few are making serious moves toward abandoning the AS/400 for the stuff it is doing today, so the AS/400 really isn't losing ground too quickly either. It's kinda like where Netware seems to be these days: it isn't making many new converts, but I don't see anyone actually ripping it out either. Most of the people who were talking about it have backed off or at least slowed down their timetables to do so.

  • I can guess, based on what I've seen on the net about Apple's OS X, but it appears to be an evolution of UNIX (BSD 4, right?)
    (I will state at this point that I'm an HTML and graphics guy; I am not a programmer in any sense of the term, and have no desire to be one in the sense of writing big applications and poring over code for bugs. I will leave that to the experts! :)
    However, I see Apple's OS X as a *possible* evolution of what UNIX can be: it's got the friendly GUI on top and the creamy fudge of BSD on the bottom for those who want to delve into it. From what I hear, you can code against the BSD layer and it works well. I could be wrong.
    In the same vein, with Linux, UNIX ain't going anywhere! More and more people every day are installing and using and learning it, and there will be more in the future. The open source concept will live on, and I don't think UNIX is going anywhere: it does what it does too well to be replaced with anything else!

    Pope
  • The death of unix is a false prediction that's been around forever. With the birth of linux & *BSD, I would say that it's experiencing its largest surge in popularity in a good 10 years.

    I see unix adapting in a few needed ways (and I think it will be led by open source software):

    1. Adaptation of the security model; filesystem ACLs are something that all IT personnel appreciate. In today's enterprise they have become almost necessary for file server environments and are venturing way beyond that. As I tell people @ work, NT has a pretty good security MODEL, but it's not the best implementation. I'm not saying stick NT's model into UNIX, but do what unix and OS software have always done: improve upon current implementations and develop new ones. This covers not just ACLs but many other things.

    2. Fragmentation... Always a hot topic! Linux is starting to see fragmentation whether we like it or not, and this is where I see companies like RedHat as good things. Take KDE and GNOME, for example: they are both wonderful desktops, and the developers are working towards common standards for drag and drop, etc. This is the kind of thing I like to see, but unfortunately the effort hasn't produced obtainable results... yet. This is where a standards body would help us all out considerably, because in the end it means more apps and functionality for everyone in the game.

    This is just a short version of my $0.02....

    -Aaron Dokey
    Gainful Employee of Technology
  • There are many things that could be done to improve the speed, scalability, security and reliability of UNIX systems (including Linux and *BSD). Some of these might require substantial kernel rewrites but they don't require a brand new operating system. Sun pulled this off when they switched from SunOS to Solaris.
  • I don't think Unix is an operating system. It is a family of operating systems.

    Look at GNU/Linux. It isn't really Unix, but it is really close. Aren't there some kind of legal problems with the naming? And besides, GNU and Linux were actually two separate operating systems in development that eventually merged, neither really Unix, but now we are all happy.

    Now Apple is putting Unix inside of Mac OS X, though I would say the opposite: Apple is putting MacOS inside of Unix. Apple is known as an innovator in the software industry, one who has seen far. But they have been standing upon the shoulders of a giant named UNIX.

    And if you stare at DOS hard enough, it starts looking like UNIX (anyone know "TYPE FILE.TXT | MORE" ?) but I won't go there ;)

    My point is that you won't be able to look at a new OS and say, hey! that's not Unix (unless you are explicit like GNU). You will just kind of see it and think, hmmm, it's different but it's cool and UNIX.

    Who was the guy with the quote from Henry Spencer as his .sig, "He who don't get UNIX will make one that looks like it, only worse."? I know I am way off, but there may be some truth to the notion that there is something very pure and natural about UNIX, an elegant OS.
  • I don't want to start a flamewar, but what exactly is supposed to replace huge reliable Unix systems to run corporate databases, web servers etc.?!

    While I'm unsure about Unix' (Linux') future on the desktop, I'm very confident about the professional part of the computer world.
  • I was under the impression that the number of *people* using Linux was over 10 million, which would lead to a much larger number for the total # of linux boxen - probably a few times that, given the number of people with multiple boxes, the number used as webservers, newsservers, etc., and the number at businesses... but UNIX is everywhere - as I type this on AIX from a building that amazingly has mostly AIX boxes and Thinkpads (one guess). There are a fair number of linux boxes around the site...

    Might have to add 2-5 for the # of S/390s running linux, too ;-)
  • Unix will not 'die'; neither will DOS, or MacOS, or Windows -- any technology that is innovative and gains enough mindshare lives on in other technologies. As long as a philosophy, concept or feature is liked enough, people will re-implement it under different systems... Do you forget that Linux was a Unix re-write?

    Sadly, it seems that things aren't evolving as fast as they could. Linux *is* a Unix rewrite, and it's still not that much better than its foundation. NeXTStep re-implemented Unix a decade ago and it did a much better job --so much so, that its successor Mac OS X has still a big technological edge...

    I wonder if the Linux community can look ahead far enough to stop worrying about backwards compatibility with older Unices and start innovating in the fundamentals of the OS. Security, administration, configuration, maintenance, documentation and quality of service are hopelessly crufty and kludgy in most Unices, and Linux too.

    There was a point that Linux needed to emulate its siblings, to remain relevant and useful. But in the next few years it's quite possible that Linux will become the dominant Unix-clone. Compatibility and tail-light chasing be damned --we need to innovate.


    engineers never lie; we just approximate the truth.
  • What Unix _is_, is a system of tools which have been refined for 30+ years....New doesn't mean better, and in many cases, it can mean untested or unproven.
    I sometimes think that Unix is to operating systems what Levi's jeans are to clothing. A pair of Levi's hasn't changed much since the late 1800s, because they're a highly optimal solution for covering your lower body from the elements. A few minor differences and improvements, yes - like removing the crotch rivet [urbanlegends.com] - but a pair of today's 501s would be readily recognizable by Levi Strauss.

    Fundamentally, Unix hasn't changed much since sockets and TCP/IP support were added to 4.2 BSD, because it's a highly optimal solution for running a bunch of different stuff at the same time on one machine. Yes, a few neat modifications like loadable kernel modules have come along, but today's Unix would still be largely familiar to a programmer from 15 years ago. Can the same be said of Windows, or MacOS?

  • I doubt NT was supposed to be a UNIX killer for you hardcore users. People used to 100-day uptimes and the whole multi-user thing could never adapt to NT. As for more casual users, well, NT succeeded. In that market, unix IS dead.
    PS. The fact that it's still kicking around ASCII shows how dated parts of it are becoming. All major OSes (i.e. NT and BeOS) have moved on to Unicode.
  • I'm fully expecting that when IT managers are looking to do their next round of upgrades, Linux will be picked more often than W2K. IT people are tired of Redmond. Tired of the security problems of a bloated, closed-source OS, tired of service packs, tired of having to run around rebooting servers that forget to malloc their gig of RAM. W2K will be just another round of expensive, kludgely fixes and poor performance.

    >>> Yes, people ARE tired of the security problems, instability, and price. HOWEVER, they couldn't care less about it being open source; with X and the current state of desktops, Linux is just as bloated as windows; MS releases service packs every few months, Linux releases a new kernel rev every few weeks, and core apps (believe it or not, X and KDE and GNOME and their attendant libs are part of the OS to most people) are updated every few weeks. Finally, W2K does not perform poorly. It takes up space, but if you've got 128 megs of RAM, it is just as fast as KDE and a hair faster than GNOME on my system. And IE is nowhere near as bloated as Netscape. Yes, W2K is pretty bad, but don't lie about it. And Linux isn't exactly the greatest OS ever made either.
  • The BSD family are firmly embedded in many vendors' networking infrastructure - both packet and TDM (telephony). Once it's there it will be there for some time. The penetration is increasing, as new entrants in these markets use FreeBSD, OpenBSD, and the like as convenient, stable, open platforms for their networking products. (It's particularly suited for packet routing, since BSDs are where the software was developed in the first place, and the BSD interfaces now serve as the default interchange language for exchanging software sources.)

    Mainframes running UTS (mainframe-compatible clones of SVR2 and SVR4) now handle mission-critical functions for many large companies: all the Baby Bells, for instance, do their long distance billing data capture on it, and run their where-are-all-the-wires databases on it. (If it ever went down, all the long distance calls would be free until it was back, which is why uptimes in years are mandatory.) Brokerages support their trading with it (even more $/second if it ever went away). Web sites run on it. (Apache has been there for a while.) And so on. And of course they fixed the Unix clock-rollover bug long ago, so they shouldn't have as many hiccups a few decades down the road when it finally rolls.
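    For anyone who hasn't run into the clock-rollover bug: a signed 32-bit time_t counts seconds from 1970-01-01 UTC and overflows in 2038. A minimal sketch of the arithmetic:

        from datetime import datetime, timezone

        max_time_t = 2**31 - 1  # largest value a signed 32-bit counter holds
        print(datetime.fromtimestamp(max_time_t, tz=timezone.utc))
        # -> 2038-01-19 03:14:07+00:00, after which a 32-bit time_t wraps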

    Semiconductor design is done with tools that run on Unixes. Some have been ported to Windows to try to take advantage of the cheaper crunch - but not many, and there's little demand for them, since they can't be easily combined into a design flow with scripts. Some of them are now being ported to Linux to achieve the same cost savings. This is easy. (For many, it's just copying the source tree and running "make"; for some it's a little tweaking.) And on Linux you DO have the scripting tools, plug-and-play with Unix networks, and a familiar environment. So this IS being accepted - nay, demanded - by major ASIC design operations.

    Billion-dollar companies in trillion-dollar industries are depending on hundreds of large applications written to run on unix. If they ever DO port them to something else, any bets on whether it will be something new, or another flavor of Unix?

    (And right now Linux qualifies as a flavor of Unix for this discussion. Windows, NT, and OS2, of course, do not. What a pity.)
  • Supposedly that's in 5, which will be coming out before the month's end.
  • UNIX is really just a way of organizing an operating system, rather than an OS itself. The essential features of UNIX (user-process-shell-kernel architecture) will probably be in use as long as there is a command-line anywhere on the earth.

    Since it's so much more versatile than an all-in-one GUI approach (like win9x but not really like the original NT architecture) it's unlikely to die out as a concept even if every single currently existing UNIX maker goes out of business.

    Eric

    Want to work at Transmeta? Hedgefund.net? Priceline?

  • I use Linux these days. It is hardly the same Unix that I learned on in the mid-80's. And that wasn't the same Unix that was developed in the 70's. And it doesn't matter. I will switch OS's again. I don't know when. But as long as there are enough people who want what the Unix view of the world provides, irrespective of the name of the particular implementation, it will continue to exist.

    I can run the same scripts under Linux, AIX or FreeBSD, for the most part, with very few portability problems. For me, Unix is a set of tools and a lot of leverage. It is the idea that I should be able to carry my tools and data with me until they are no longer necessary, rather than having the programs that process them made artificially obsolete.
  • Sorry.. I only got the lightweight corporate throwaways back in those days (89-91). Didn't see too much Novell or Digital, but SCO should have been up there.. Perhaps BSDi.. Naw, not BSDi.

    Embedded UnixWare?!? What time frame?? I didn't hear of such a beastie until 94-95! Granted, that was when I was trying the embedded microcontroller thing, so I probably was just out of touch.
  • I'm reminded of a quote I heard in a CS lecture:
    "We don't know what the programming language of the future will be, but it'll be called Fortran."
    This isn't to say that Unix will become a niche technology as Fortran is, but somehow there will always be a Unix, no matter how it actually works.

  • Nothing short of an asteroid collision will prevent unix from surviving.

    Sometimes I get the feeling that my Linux box will still be accumulating uptime even after the cockroaches have inherited the Earth...

    numb :)
  • Everyone has the same basic problem with their OS: Configuration nightmares.

    Microsoft invested enormous amounts of capital in self-configuration for Windows. As a result, despite complaints to the contrary, Windows _is_ less costly/risky to install/reconfigure with novel hardware. Aside from the encyclopedic knowledge base required, the logic behind this sort of autoconfiguration is fairly impressive. It approaches some of the best induction algorithms ever fielded. But it still is a pain to install new hardware/software components.

    What this means is that the OS of the future is going to have to focus a lot more on dynamic autoreconfiguration, with relatively sophisticated "truth maintenance" induction algorithms that draw on, and are sensitive to changes in, an enormous rule base supplied via the net by the vendors of hardware and software components. Further, OS vendors are probably going to have some sort of minimal "boot up" network configuration, similar to the dumb video modes used to get your better video drivers installed.

    The next stage beyond that is a very high level interface specification language that can describe the hardware and software to the OS which will dynamically re/generate the driver/libraries for its particular configuration (with appropriate roll-back safeguards). Such systems will eventually even have Hotspot-like dynamic optimizations built in so they can generate encached code on the fly to the spec of the high level rules based on the patterns of usage and other dynamic information. Much of the nasty code that goes into autoconfiguration and driver installation will be annealed via dynamic compilation and inference and eventually hardened and optimized -- always sensitive to changes in the high level rules and descriptions.
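    To make the shape of that idea concrete, here is a toy sketch (every device ID and driver name below is invented; a real rule base would be vendor-supplied and vastly larger):

        # Match hardware against a vendor-supplied rule base; fall back to a
        # dumb "boot up" configuration when nothing applies, like the minimal
        # video modes mentioned above.
        RULE_BASE = {
            # (vendor_id, device_id): driver/library to generate and load
            (0x10DE, 0x0020): "nv_riva_tnt",
            (0x121A, 0x0005): "voodoo3",
        }

        FALLBACK = "vga_dumb_16color"  # safe mode while the real driver installs

        def pick_driver(vendor_id, device_id):
            return RULE_BASE.get((vendor_id, device_id), FALLBACK)

        print(pick_driver(0x10DE, 0x0020))  # known card -> tuned driver
        print(pick_driver(0xDEAD, 0xBEEF))  # unknown card -> safe boot-up mode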

  • This is a very strong point on matters that will drive the market for the next several years.

    I work for a software company that is moving away from the "shrinkwrap" package we've been selling to an ASP/Portal model, and you've stated exactly why.

    Instead of supporting a million installations of the software all over the world, on 8 different operating systems on seemingly infinite different system configurations, we will only have one installation, and we will have physical access to it. If something goes wrong, we can get up and go fix it.

    Most of today's browsers aren't exactly "thin," but the thin-client metaphor fits -- they may be bloated, but they're ubiquitous. Various unices of various weights can fill just about every niche out there.

    While QNX is not exactly Unix, it's growing on a branch pretty close to the tree, and it's the underlying system on all of those i-openers everyone seems to be stocking up on these days. It's also embedded in thousands of things today, and has a toehold that Linux has not yet achieved... but it is still an example of "a" unix running the appliances. It was almost even the base of the next Amiga OS.

    Web-based systems are only going to grow in strength and number for the next several years, and the myriad of Unices and their offspring will be morphed to fit into just about every niche.

  • There have been various attempts at distributed operating systems in the past; some, such as amoeba [cs.vu.nl] and plan 9 [toronto.edu], are actually usable, to a certain point. Unix is here to stay--it's not going to disappear until someone makes the next "big step" in computing, and maybe not even then. There are a number of projects out there that are quite interesting. On the one hand, you have OSes like QNX which were designed to be entirely distributable (for lack of a better word) from the ground up. One of K&R (I can't remember which one) once said that one of the places where unix failed to take the "everything is a file" concept to its logical conclusion was networking. (In plan9, pretty much everything, including networked stuff, is a file.) I don't think that distributed OSes will kill unix, but that unix will eventually become a distributed OS. For example, GNU/HURD [gnu.org] (which is getting along very nicely BTW), while not an attempt at a distributed OS, is designed in such a way that it will be easy to transform into a distributed system.
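    The "everything is a file" idea in miniature - a minimal sketch for a Unix system (paths may vary by installation):

        import os

        # The same open/read syscalls work on a regular file and on a device;
        # Plan 9 extends the same treatment to the network.
        for path in ("/etc/hosts", "/dev/urandom"):
            try:
                fd = os.open(path, os.O_RDONLY)
                data = os.read(fd, 16)
                os.close(fd)
                print(path, "->", len(data), "bytes read")
            except OSError as e:
                print(path, "->", e)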
  • People now believe that birds are the descendants of dinosaurs. Stranger things could happen

    ;^)
  • > I remember when Unix was a terminal-based OS. X-Windows and the Athena project seemed like a totally new world and a new way of doing things. Sure, it was still Unix, but it wasn't the same Unix.

    Although I am sure that someone did something with that newfangled thing called curses. With curses and the like you can create a windowing system on a unix environment quite easily. This was a natural outgrowth of programming.

    > GNU has likewise changed what was Unix, and, despite its acronymic denial, has become Unix. But not the Unix from before.

    Other than costing a lot and having some little quirks that are annoying to people who are using linux now, how is there a terrible difference?

    > The next Unix will not be today's Unix. But it will be Unix!

    The unix that I use at home for the most basic things probably has not changed terribly from what a person in earlier times thought of. Although I do rely on various graphical input methods, I could, say, take the base install for debian and have it pass as unix. True, there are some extra bells and whistles, but essentially they remain the same.
  • > Actually, I think only fucking morons like you would say "So far, so good!" if you jumped from the 30th story of a building.

    Actually, isn't it really meaningless what would be said? Considering that you are going to go splat anyway?

    What I really think is that you would have to have hard evidence that unix was in fact dying. You would also have to make an intellectual leap and define the exact moment that unix "jumped" and started doing something stupid that was largely irreversible and untreatable. The analogy is flawed and crappy.

    You could say that unix running only on big powerful servers is dead. However, something called linux came along, and the definition of a "server" converged largely with what people now use for their standard computing tasks. I would wager that if your machine is really, really, really good at playing the latest computer games, then it is a good choice to be a server for something relatively normal.
  • > If you mean distributed computing or spreading a task among multiple computers that are linked together, you probably mean Beowulf.

    No, I mean having, say, an application that could request data from a server and do some little thing. Now take that little chore and replicate it across thousands of clients that could transparently work on all of the machines on a network. Everyone from the secretary to the CEO of the company could have one of these little things on their desktop. Now say I want to figure out something really, very complex. How about how many times people have complained about product XYZ, and how that correlated to the stock prices over the last 50 years. That could be done on some mainframe with a high rate of failure or requiring special attention. However, if you distribute the task to, say, 30,000 clients to work on in their spare time, I would dare say that an answer could easily be found within the hour. All the clients would have to do is solve their portion of the problem and return the result to the server they are responsible to, which then correlates the final information.
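    A minimal sketch of that scheme, with local worker processes standing in for the 30,000 desktops (the chunked complaints-per-year data is invented for illustration):

        from multiprocessing import Pool

        def solve_chunk(chunk):
            """One client's share: tally complaints for one year's records."""
            year, complaints = chunk
            return year, sum(complaints)

        # 50 years of made-up complaint records about "product XYZ"
        work = [(1950 + y, [y % 3, y % 5, y % 7]) for y in range(50)]

        if __name__ == "__main__":
            with Pool(processes=8) as clients:  # stand-in for the desktop fleet
                results = dict(clients.map(solve_chunk, work))
            # the "server" correlates the returned pieces
            print("total complaints over", len(results), "years:",
                  sum(results.values()))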
  • > Of course not. People have been predicting the death of Unix almost as long as they've been predicting the death of mainframes. And a year or so ago, IBM released a mainframe (don't remember the model) which became the best-selling mainframe ever in initial 6-month sales.

    The stats? I would be interested in which one this is, how many people bought it, price, etc.

    > I predict that 20 years from now, we'll still be hearing how Unix is dying and almost extinct, prolly by the same people who will still be saying the same thing about mainframes.

    I really had an interesting talk with one of my professors a couple of days ago and pretty much found that all the major universities are using Windows-type development models for their CS programs. Essentially I was faced with a rather unpleasant concept: basically, I could be forced into buying a new machine just to do standard coding.

    What people are saying is that, as a percentage of everyone using the medium, its share is decreasing because more and more people are entering the fray. I am sure that if you were to look at all the computers out there, the figure for unix has gone down from the best of times for unix. You can also say that for mainframes. Generally people do what is best for them, and choices start splintering.
  • I find this idea laughable; we made this transition long ago... look at how many people use their PC solely to surf and as a really heavy typewriter.

    What you will see in the future is more things like Napster, Quake 3, etc. Things which take advantage of the network, but do so by using the power of the PC and not relying on something like a web server so much. I can definitely see the web shrinking in the near future. Imagine how fast, efficient, manageable, customizable, etc., etc., eBay could become if it was a simple client application with the logo graphics cached, connecting to a distributed server farm, tracking your auctions and pulling data straight from the eBay databases? Think of how efficient Slashdot could be as a distributed client application, relying on the PC to do a lot of the computation like sorting, getting the slashbox data, etc., etc...

    Esperandi
    And hopefully it will be fueled by adware. Programmers get paid, users get to use it for free, and the babies who don't realize that advertisement is subsidizing a life they wouldn't be willing to pay for get a few more sharp kicks to the crotch.
  • I can see it now...

    (The scene - two cutting-edge technologists talking about their favoured operating systems circa 2000 BCE)

    Gilgamesh (Bronze caster from the city state of Ur) : I tell you Nergal, this technology you're using, it's going to die out and nobody will use it any more ...

    Nergal (Basket maker from the city state of Akkad): Ahh, away with you, everybody uses this technique of making baskets. Besides all the manuals are written in Akkadian, people will still be using Akkadian 5000 years from now!

    Hahaha only joking, but hey....

    Actually I heard the US military were looking at devising language / symbols to put on their high level nuclear waste bunkers, so when humans or whatever stumble across high level dumps 10,000 years from now they'll know what's inside is dangerous and should be treated with caution. When you consider that the history of human writing only goes back about 5000 years this is some task. Can anybody help me track down some of the literature/ urls/ research about this project? Many thanks.

  • Unix is an operating system. Its purpose is to operate the hardware inside your computer, and to provide a programmatic and generalized interface to that hardware's capabilities. As long as Unix continues to operate popular hardware and provide an interface that programmers like, Unix cannot die.

    This touches on a point that is relevant to most people here: that's why GNU is a unix-like system. At first RMS, what with his love for all things lisp, had thought about making the free OS he was planning a really big lisp environment. But he realized that in order to be a general-purpose system, it would still need to be built on top of a general-purpose OS; so he chose unix. And that's why we have our wonderful unix-like system with Emacs (= a really big lisp environment) running on top of it.

    The best way, imho, to make your wonderful ubercool environment is to build it on top of (a subset of) unix. That way you can let unix take care of the mundane things (device drivers and whatnot) that you don't want to, and be quite portable.

  • If the move to network/distributed computers happens (which I really can't see at the moment, being in a country where you still pay for net connections by the minute), it's an advantage, not a threat, to Unix systems.
    Microsoft still controls a lot of desktop machines, but the networking code in Windows 98 is so broken that people might consider upgrading to Linux or *BSD if they were doing more networking.
    Embedded devices, another part of this move, are another big chance for Unix-like systems (primarily Linux and PicoBSD) - I think Linux is in use in more embedded devices than Windows CE already.
  • VSTa [vsta.org] is a plan9ish system that is under development (not much at the moment) and released under the GPL. Take a look at the homepage. I think that kind of system is the future in distributed computing.

    /Erik

  • Unix is not just an operating system. If it was just a kernel then maybe it could die. Or if it was just made by one company then it would die.

    Really, anyone can make a Unix within a year.

    Unix is a set of command line applications.
    Is everyone going to suddenly realise that grep and wc were kind of silly and should be gotten rid of?

    Unix is organised filesystems.
    People are going to decide that placing libraries at random is better?

    Unix is standards.
    More than just the posix standard, there are tons of standards. Even if they change, they'll still be Unix at heart.

    Unix is a philosophy.
    A pretty good one. Pipes are good. Shell scripts are good. And small programs are less buggy.

    Maybe someday people will say, "I only want to deal with the files in my /home dir. I want a graphical user interface. I don't want to type anything anymore" But still inside the wrappings it will be Unix.
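    A minimal sketch of that philosophy in action - composing two small programs with a pipe, the classic `grep ... | wc -l` pattern (assumes a Unix system with grep and wc on the PATH):

        import subprocess

        # grep root /etc/passwd | wc -l
        grep = subprocess.Popen(["grep", "root"], stdin=open("/etc/passwd"),
                                stdout=subprocess.PIPE)
        wc = subprocess.Popen(["wc", "-l"], stdin=grep.stdout,
                              stdout=subprocess.PIPE)
        grep.stdout.close()  # so wc sees EOF when grep finishes
        print(wc.communicate()[0].decode().strip(), "matching lines")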

  • The beauty of Unix and its derivatives (i.e. Linux) is their high level of customizability. If you have a device dedicated to a specific application, and you want a dedicated OS, you could write your own, or you could start with the unix/linux kernel source and customize from there. That is in fact what many embedded device makers do. After all, who would want to run life support equipment on windows :)
  • ...would love to see some of the best coders and operating systems people put together a new OS from scratch using the latest techniques.

    It's been done: Plan 9 [lucent.com]

  • Unix is any OS that I can bring to a screeching halt by typing kill -9 1 at a root prompt.
  • Empirical evidence, I think, speaks for itself. 30 years in the running and no sign of 'death' yet.

    A man jumped from the top of a 30-story building. Around the 10th floor, a person called out to him, "Hey, how's it going?", to which he replied, "So far, so good!"

  • Before you can say that Unix will die or be replaced, you really need to define Unix, which is quite difficult. Is it a multitasking kernel that treats everything as a file or a process? Is it Red Hat 6.1 running XFree86 4.0 and Gnome 1.1? Is it Sendmail? Is it 200 cryptic configuration files? Is it ls, cat, and rm? Is it a way of life? Is it a philosophy?

    Is NetBSD Unix? Is Linux? Solaris? AIX? Minix? GNU?

    Sure, an OS can be "certified" Unix or certified "POSIX compliant", but that isn't an end in itself. Unix (however you define it) has evolved through the years, as everyone has already pointed out, but it is also modular (I can replace a proprietary ls with GNU ls), and portable.

    Where will Unix be in 10 years? I don't know. But I know where it won't be: lost & forgotten.

  • > Some people said USENET would die.

    Yep. And we're all here making our comments. Not on USENET, where there are no moderators.

    I download and archive select Usenet groups, and... umm... a lot of them are a total loss. The Linux groups have some of the highest traffic, but also the largest volume of clueless noise.

    The real action has shifted to private forums like /. and listservers.

    The bulk of the Usenet traffic these days is purely Binary attachments. The most popular NNTP clients on Windows machines strip off and throw away all 'text' content, keeping only the attachments. NewsBin, Pluckitt, etc.

  • Well, think about it: how many of us have more than one PC in our home? How many of us would rather have a single PC with a handful of "front-end" appliances scattered in convenient places?

    Centralization of data is what we're moving to. This is why things like Hotmail are so successful -- easy access to e-mail regardless of where you're connected. I just bought a couple of i-openers (see the Slashdot article a few days ago) that I'm converting to cheap X terminals.

    Some attempts to do this on a large scale (WinTerms and "Thin" clients) haven't really caught on, mainly because of the still high cost of the clients and the fact that it's not cost effective to switch. But who's to say that won't change?

    Which leaves us with the question: what type of platform do we want supporting the network devices we connect? I don't think I've seen NT or any "recent" OS compete favorably on these grounds, which leaves us with OSes like Unix, still being developed as fast as the technology itself.

    Of course, that's not to say that other operating systems won't possibly step up and smack Unix out of the arena entirely, but we can't bet on that...
  • > Iron-clad authentication/validation, everywhere.
    There is no such thing as "iron-clad" validation, but validation tied to something physical (retinal scans?) will come pretty close to this in less than 10 years, IMHO.
    > Portability of user authentication and of services to other (untrusted) networks.
    You can never have authentication portability to untrusted networks, because, by definition, they are untrusted. OTOH, it should be possible to set up (with existing tools) a globally distributed LDAP or NIS+ tree to accomplish the reasonable part of what you're trying to do here.

    --
  • by Anonymous Coward on Wednesday March 15, 2000 @11:45AM (#1200466)
    to ask "will unix come to an end" is meaningless. let me explain. if we look at unix in the context of normal operating system space and time, we notice singularities at the "beginning" and at the "end." i have been working on this problem for many years now and am now ready (coincidentally) to offer you my solution to this age-old problem of computer science!

    to better describe my concept of unix, i have developed a new mathematics which defines unix in terms of "super-time." our beloved unix is embedded in this supertime, which contains no singularities! thus, unix itself is without bound in time and space... it simply always is.

    thank you.
  • by drix ( 4602 ) on Wednesday March 15, 2000 @05:02PM (#1200467) Homepage
    What do you mean by Unix? I can't really think of an all-encompassing definition. I'm not sure about everything here, but in my limited knowledge of Unix heritage and composition, it occurred to me that a lot of things qualify in one way or another as Unix. POSIX compliant? Nope - NT is POSIX-compliant. Based on original Bell Labs code? Not even Linux qualifies. Containing Unix code? Fine, but don't forget to throw in MacOS X and BeOS as well. Command line based? No way - DOS et al aren't Unix. Unix grew from being an operating system at Bell Labs, as in "Unix, the descendant of Multics," to standing for a whole slew of operating systems. Much like our DNA differs from primates' by only about 3%, there are only a handful of things that separate a Windows machine from a Unix box. For the most part, they're a lot alike. Both have the same fundamental architecture - kernel, processes, etc. To me Unix means a really stable, secure multiuser OS that is remotely manageable and based on Open Standards. But I'm just pulling that out of my ass. It means different things to different people. Novell meets that description too. NT, in some ways, does too. So I guess my answer would be that everything qualifies, in one form or another, as being part Unix. Can anyone more clearly define what, exactly, Unix is?

    --
  • by fishbowl ( 7759 ) on Wednesday March 15, 2000 @06:29PM (#1200468)
    >NT evolves more
    > toward Unix every day.

    In the world I live in, it's years between NT releases and months between service packs.

    Not that I wouldn't love to see this daily evolution of course...

  • Again it is proven that people are subject to media sensationalism. Every newspaper, every news program, even every talk show tries to get one's attention with the threat of the worst-case scenario.

    It's like when Apple was having financial troubles: "Don't buy Apple, what if they go out of business?" Who cares if they have $2 billion in cash reserves? Media sensationalism.

    And Amiga. Amiga has died so many times that nobody takes the report of Amiga's death seriously anymore. Perhaps that's because nobody defined "dying": new products are becoming available, a new AmigaOS came out 5 months ago, and so on. So what is "dead"?

    Now slashdot: anyone who has spent any time in the industry knows that Unix is the most dominant force holding the whole computing world together, so why pretend to take a question like this seriously? Media sensationalism. That's all.

    Until I see a headline like, "Unix is dead!", followed by "a young man named Unix was gunned down", I'll stick to reality.

    Now: a more apt question is: Is Windows dying? I have some compelling ideas about THAT...

  • by Christopher Thomas ( 11717 ) on Wednesday March 15, 2000 @12:04PM (#1200470)
    Network computing, where one can log in anywhere and have one's environment and files preserved and run applications without caring where they're hosted?

    Sounds a lot like any university's internal LAN. Which probably runs Unix (Solaris at my university).

    Unix was already designed to address many of the issues that come up with network computing. I only see a few things that need to be fine-tuned:

    • Iron-clad authentication/validation, everywhere.
      Portability of user accounts across the entire network, checking of permissions/licenses for running applications, etc. This is already pretty much here; you just have to know what you're doing to set it up. The challenge is to make sure that all applications on your system work for all users, that everyone can do what they're supposed to and reach what they need, and that nobody can do anything they shouldn't.

    • Portability of user authentication and of services to other (untrusted) networks.
      If you are a validated user at one university, you would ideally be able to log into another university with guest privileges, and have it recognize you as a specific user ("foo at bar university"). Similarly, I'd like a validated user on my personal LAN to be able to access someone else's service while keeping an individual identity, or, from another network, to access their "home" network's services with their full privileges. The idea is that identities and permissions carry over robustly and securely across heterogeneous and possibly untrusted networks.

    • Support for running applications on a distributed virtual computer.
      This ties into the whole "the network is the computer" idea. If I could just use the collective computing power of all of my hypothetical LAN terminals to distribute tasks, I might not need a central server at all (assuming that my tasks have low communications overhead). Similarly, it would be nice to be able to farm off tasks to another "friendly" network (a crude sketch follows this comment). Protocols and support for this are in development, but would need to be standardized for true "network computing" to come into its own.


    Again - by and large, these are capabilities that already exist, or exist at least in part. They continue to be developed - and chances are, that development is happening under Unix.
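    As a crude sketch of that last point: a surprising amount of "farm the work out over the network" can already be faked with stock tools. The hostnames and the render command below are hypothetical, and it assumes ssh keys are already set up -- a sketch, not a real job scheduler:

        # push one chunk of the job to each machine, then wait for all of them
        for host in node1 node2 node3; do
            ssh "$host" "render chunk-$host" &
        done
        wait   # returns once every remote job has finished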
  • by jetson123 ( 13128 ) on Wednesday March 15, 2000 @12:56PM (#1200471)
    The UNIX of today is already very different from the UNIX of 20 years ago. What UNIX has managed to do like no other system, however, is to preserve logical and functional continuity.

    Programs I wrote for UNIX 10 years ago still work just fine and take full advantage of the larger memory and faster processors. That is not at all true of systems like Windows.

    UNIX is also a particular approach to writing kernels, based around a monolithic, fairly simple privileged core. UNIX kernels are also pretty closely tied to the C language. That's very different from microkernels, Windows, or other systems. And UNIX is a set of conventions for where and how to represent system configuration information, command line programs, etc.

    That kind of continuity can't last forever, and eventually people will start using systems that can't realistically be called "UNIX" anymore.

    Sooner or later, kernels will have to be written differently and in languages other than C, the file system will change into some different kind of database, etc. People will also build new replacements for system configuration information, the init/login sequence, etc.

    I suspect, however, that the transition from UNIX to its successors will be fairly gradual, and that any successors will continue to offer reasonable POSIX implementations for a long time to come.

    Plan 9 and Mach are both examples of successors to UNIX that really have some good continuity with it, and that's probably what's in store both commercially and in the open source community.

  • by moebius_4d ( 26199 ) on Wednesday March 15, 2000 @11:52AM (#1200472) Journal
    The thing is, the core unix functionality doesn't specify interfaces, unlike some other program loaders/OSs one could name. The layering of X on top of command-line unix is an excellent example of this. If future interfaces require voice recognition, passive sensor-triggered operation, or whatever, they can be added as a layer.


    Since a unix core install can be made very small, suitable for embedded systems, I don't see a reason to throw away a perfectly good model with a well-understood API and many thousands of man-years of refinement.


    Unix is being used successfully on systems from mainframes to wearables, from super graphics boxes to the tiniest pinhead sized embedded systems. What design criteria do you envision that would contraindicate the use of Unix?

  • by SEWilco ( 27983 ) on Wednesday March 15, 2000 @01:08PM (#1200473) Journal
    Indeed, Unix has adapted and evolved so much it would have difficulty running on the original configuration. The embedded Unix/Linux versions would fare best under its severe memory restrictions and slow external storage.

    Montgomery's short "An Introduction to Unix" [unix-wizards.com] points at the Unix system family tree [unix-wizards.com].

    That 1997 document does not mention Linux, which grew out of the POSIX definition, System V, NetBSD, and GNU tools (developed on many Unix flavors). The Unix History [faqs.org] segment of the Unix FAQ does mention Linux briefly.

  • by weave ( 48069 ) on Thursday March 16, 2000 @03:26AM (#1200474) Journal
    > This kind of FUD gets "5"? An obvious rush of mindwashed unixers moderated this.

    Well, I was a bit surprised by the 5, but it wasn't meant to be a troll either. Damn it, I want Windows to live up to the hype. I'm in a huge NT shop at a large college where desktop security is important. Most of the pieces to make my life easier are there, but I see no light at the end of the tunnel.

    Take applications for Windows. Damn, they suck. Not the apps, but the design. Microsoft could readily fix this if they got their heads out of their asses and realized that the world isn't about one person/one computer with full control.

    How can they fix it? By taking their already existing labeling standards for apps and strengthening them so new apps at least behave properly. If an app doesn't follow the rules, it's not "compatible with Windows 2000." But then that would break all of THEIR apps too...

    Example:

    • A running program should never expect to be able to scribble into HKLM. User settings should go into HKCU where they belong.
    • A program should never EXPECT keys in HKCU to exist. If they don't, a reasonable default should be assumed and/or taken from the .default tree.
    • A program should NEVER expect to have write access anywhere other than %TEMP% and a place under full control of the user to point to as needed (like a home directory, removable media, etc...). Failures to write data due to permission problems should be handled gracefully and not abend the program.
    • A program should never require that "more software" be installed to make it function. E.g., if you turn on app sharing in NetMeeting, it needs to "install additional software on your computer" and then fails if the user doesn't have admin rights, causing additional config hassles for sys admins.
    • All installation programs should have unattended and scripted installs and, damn it, not require a reboot. Office 2000 is a prime example: it supports unattended installs, but demands a reboot, and after the reboot, if it doesn't find network drives mapped the way it expects (like a user home dir), it fails horribly for reasons unknown (to me). I'm installing the app for my users. User settings shouldn't be set up during an install, damn it...

    All of my bitching about NT/2000 comes from actual experience trying to make it work as advertised. When Microsoft's own fucking applications don't follow good design standards, how can they expect others to do so?

    Do you realize how long it took me to get that piece of shit IEAK to work properly? First of all, the IEAK book spends about 250 pages talking about how wonderful it is and all you can do with it, and then, when it finally gets to how to implement policies, it spends exactly 1.5 pages on what all 200 settings do. During an unattended install, it throws shitloads of stuff into RunOnce keys, but heaven help you if, before the next reboot, you or another program invokes rundll32, because rundll32 triggers all RunOnce keys to be processed immediately, even the ones installed during THIS boot instance. That one killed me for a long time. Then, during user logon, you have to ensure loadwc.exe can run, and *IT* uses rundll32 to do a lot of the customizations with the installed policy settings. But, get this, rundll32 won't run if it can't (for some unknown reason) have write access to the RunOnce key. But allowing users r/w access to that key violates a KB article that calls giving users r/w access to it a huge security risk. So I have to give admin rights to the very users whose computers I want to policy-control?! Oh yeah, that makes a lot of sense... :( And none of this is documented in the IEAK docs. No, just hundreds of pages of marketing fluff. I'm a sys admin. I buy an administration kit to get technical details, not marketing hype.

    None of this actual implementation stuff ever gets mentioned in the press. All the grand claims of how Microsoft makes administration tasks easier are taken at their word, and I wonder if anyone actually tries to use these features. When I hit problems, Dejanews and AltaVista searches for similar things usually turn up nuttin on these issues.

    OK, I'm ranting as usual. I really, really hate platform bigots, UNIX and NT or whatever. I go for the best tool for the job. I just really get tired of doing careful research, picking a Microsoft solution as the best tool, and then finding out I was an idiot for actually trying to implement the solution and expecting it to actually work.

    How many times do I have to be abused before I learn my lesson? :-(

    Yes, Linux and UNIX programs have their problems too. I couldn't believe the hassles I had to go through to get Amanda to work with my DG/UX box. sendsize silently kept failing to calc disk sizes because the fork/exec of runtar was screwed up. But you know, fixing it wasn't a big deal. I had the sources right there, went through the code, and found the problem as it relates to my OS. I fixed it and will send the patches back to the authors.

    The difference is, when something breaks in an open source OS, I can always fix it myself. When something breaks in a proprietary OS, I'm shit out of luck and can only hope that the next version fixes it and the upgrade costs are not too prohibitive for my installed base of thousands of desktops...

  • by Hard_Code ( 49548 ) on Wednesday March 15, 2000 @12:27PM (#1200475)
    The previous era of computing was hardware-limited. That constraint limited the domain of data that could be worked with. We are no longer hardware-limited. We are overflowing with data, of myriad types and dimensions. The next era in computing will be filtering data and adapting it to the person... not adapting the person and the data to the limited hardware.

    Already we are seeing the notion of a strictly one-dimensional hierarchical file system becoming archaic. Having a system of files is useless if you are so overrun with data that, amongst the plethora of files, nothing is meaningful. With the network-as-computer approaching, I believe we will shift to systems of "resources". We are already seeing this with distributed computing. URLs locate abstract "resources", whether they are on a local file system or over the net, whether they are static data or an executable service or agent. We have to move to an /associative/ resource database, where resources are categorized not by arbitrary position on disk, but by their very attributes and data characteristics. We are reaching a new level of abstraction.

    That being said, a lot of Unix is based on the good old file system metaphor. At the time, addressing everything as a file was as novel as addressing everything as a resource (think system components as CORBA objects - check out AllianceOS). For the above reasons I think we need to graduate to a more abstract, associative model. Also, very simple security structures like ACLs are showing their age... they do not scale up well. We are seeing new security models, like capability-based systems, where each entity in a system, be it user or program, has a set of "capabilities" which it can use to interact with other pieces. A higher level of abstraction. For Unix to stay in places requiring these features (associative data storage/filtering on the desktop, new security paradigms on large networks and network OSes), it will have to change.

    Where things like this don't matter one bit (like the mainframe), Unix will continue to reign supreme.
  • by devphil ( 51341 ) on Wednesday March 15, 2000 @12:20PM (#1200476) Homepage
    This theory:

    http://home.xnet.com/~raven/Sysadmin/UNIX.html

    says it very well. "People are confusing dying with age," and that brief article has a good idea why Unix will still be around for a long time.

    (It was written shortly after Lose95 was released. :-)
  • by Junks Jerzey ( 54586 ) on Wednesday March 15, 2000 @02:18PM (#1200477)
    UNIX was poised to die out in the late 1980s. It was floundering on the professional desktop, on workstations--back when "workstation" meant something other than a PC--from Sun and the now-defunct Apollo. The general UNIXy environment and X Windows just felt so clunky next to something lighter-weight; it looked like some form of personal computer was a good alternative.

    Unfortunately, as personal computers became more complex, they also became more unreliable. If, when Windows 3.0 crashed, you could have just hit a button and been back to work in three seconds, no one would have cared. But you had to sit through an unbearable two-minute boot sequence. Networking was messy. Arcane INI files were just as bad as anything from the seventies. As reliability dropped, UNIX began looking more and more attractive. It was still butt ugly, but at least it worked.

    By now, we should have something better. We've had an additional ten years to deal with the problem. We should have something very small and very stable and very easy to take care of. A computer should be able to reboot as fast as a calculator. We shouldn't have to deal with driver issues and such as much as we do. But it didn't happen. PCs got faster and more varied, but nothing improved in the reliability or simplicity department. And now, to our horror, UNIX is looking like the simpler alternative. No one would have believed it.
  • by G27 Radio ( 78394 ) on Wednesday March 15, 2000 @12:01PM (#1200478)

    I don't think it'll "die" exactly. It may eventually evolve into something that bears little resemblance to what it is today, have a different name, have absolutely no code in common with what we know today, but it'll be evolution.

    Licenses that make the source available for reuse make this more likely than ever. The open source movement is giving Unix a lot of strength. I'm fairly confident that people will still be typing "ls" to see their files 30 years from now (assuming the keyboard isn't dead by then :)

    numb
  • by slashdot-terminal ( 83882 ) on Wednesday March 15, 2000 @12:13PM (#1200479) Homepage
    > There's no reason to "convert" most of our existing Internet/networking infrastructure to anything else in the foreseeable future. I agree with the prediction that things will end up moving more towards centralized computer resources, and lesser-equipped but ubiquitous terminals to access those resources, but Unix will still be there in some fashion.

    I don't see the likelihood of this. All you really have to do is increase the client's ability to work properly and increase its capabilities. For some things (say, tactical nuclear weapons simulations) you may need mainframes, but this is not the norm, nor very supportable. Applications are mostly written for PC-type platforms, considering how much Microsoft has spent convincing people of this.

    > Who's to say Unix won't be the OS that drives the appliances?

    But appliances are just so... well, boring. What would be nice is a large mainframe that you could optionally use for massive backups of the target machine (say, copying the entire image of the client in case of power failure and such) while still allowing the client to have responsibilities.

    Personally, I don't want some rather fiendish god controlling my computing resources at one particular point. Much better would be an application API that worked like distributed.net, allowing a complex process to be broken down into many smaller parts and run on any number of client machines that could be increased and decreased at will. Add to this the possibility of "relay points" where the data could be copied for a particular portion of the network, in case some machine failed or sent corrupt data.

  • by Orville ( 104680 ) on Wednesday March 15, 2000 @11:57AM (#1200480) Journal
  • I suppose this question could be thought of in a different way: what if AT&T had patented UNIX and kept it completely proprietary? Chances are, the world of computing would be substantially different.

    The trick of UNIX: it has always been available and highly adaptable to different environments. While this was changing in the 1980s (the UNIX wars), RMS, Linus, and all of those open source programmers have ensured that UNIX in some variant will always be in use.

    If you look at recent corporate inroads, such as IBM-Intel, Philips-TiVo, etc., the market for UNIX-like solutions is actually growing!

  • by whyde ( 123448 ) on Wednesday March 15, 2000 @12:53PM (#1200481)

    In the Epilogue of their book, The UNIX Programming Environment, Brian Kernighan and Rob Pike were not so arrogant as to think UNIX would live forever, but did have this to say:

    "The principles on which UNIX is based--simplicity of structure, the lack of disproportionate means, building on existing programs rather than recreating, programmability of the command interpreter, a tree-structured file system, and so on--are therefore spreading and displacing the ideas in the monolithic systems that preceded it. The UNIX system can't last forever, but systems that hope to supersede it will have to incorporate many of its fundamental ideas."

    So long as this statement holds true, I'll gladly work with any future system which provides this set of core ideas.

  • by grue23 ( 158136 ) on Wednesday March 15, 2000 @11:48AM (#1200482)
    On the subject of distributed OSes, /.ers might find it interesting to take a look at the Plan 9 OS that was (still is?) in development at Bell Labs several years ago. The FAQ is here [bell-labs.com].

    Plan 9 is UNIX-like, but it treats all system objects as files, including objects that exist across the network. Because of this, it is very easy to distribute the OS across several machines in a way that is completely transparent to the user. We set Plan 9 up like this in the OS lab at my college a couple of years back; it's very odd.

    It probably isn't an OS that will catch on by itself, but it's an example of a way in which an OS can be distributed with a reasonable degree of transparency.
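    For a taste of the same idea without installing Plan 9, Linux's /proc does a rough imitation on a box you probably already have (these paths are standard procfs, not Plan 9):

        cat /proc/loadavg           # system load, read like any text file
        ls /proc/self               # the current process's state, exposed as a directory
        grep ^Name /proc/1/status   # inspect another process by reading its "files"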

  • by wmaheriv ( 160149 ) on Wednesday March 15, 2000 @11:49AM (#1200483) Homepage

    Unix will adapt and grow- it always has.

    The Unix that we use now has little in common with the Unix of Thompson and Ritchie. It has been in a state of continual evolution and will remain thus until ^we^ stop caring about it.

    Unix has transitioned from PDP-machine language to portable C, moved from minicomputers to microcomputers (and even to mainframes and PDAs), acquired thousands of tools and roles never dreamed of by its creators. The Unix user today has a choice of command line shells, a choice of GUIs, a choice of vendors, even a choice of fundamental architecture as the file systems and such have evolved quite differently amongst the different Unices.

    I think we'll see changes in the coming years, but no greater change than we've seen in the last 30. Unix will continue to evolve until the Unix of our children is as unrecognisable to us as the Unix of our fathers. New hardware and new markets only create new opportunities for Unix to grow; they do ^not^ ring its death-knell.


    ~wmaheriv
  • by JonKatz molested me. ( 163067 ) on Wednesday March 15, 2000 @08:36PM (#1200484) Homepage

    To make an observation: VMS is not dead. It is still being used in more places than you would realize, and still generates a lot of revenue for DEC, AKA Compaq.

    I think I ran into that at work once... and it pissed me off. Let me explain... although most of our HTTP and DB stuff is served from IBM Big Iron, we do have a few Sun E450s and a couple of Sun E1Ks. The Suns are loaded with Solaris 7, as is to be expected, and most of the operators (myself included) use CDE. So I'm helping someone write some tapes the other day, and the machine had CDE, and so dumb old me (thinking I was in Solaris) started pounding out Unix commands. After nothing worked, I looked at the title bar on the Xterm. It wasn't an Xterm, but a "DECTerm". I looked under the table, and sure enough, there was a DEC Alpha box. Ah.

    The point of all that ranting is, yes, I agree, DEC still makes money.

    I don't usually go over to that side of the server room, so I looked around a bit after finding the DEC machine, and what I found scared me. A bunch of 10-year-old DEC daisy-wheel printers and reel-tape machines. I had no idea the company was using such old crap. What's even more scary is that the tape machines are still needed: some of our clients refuse to rewrite their media distribution software to accept anything besides large tapes. The more "up-to-date" clients get the same information by FTP, but these old fogeys -- and some of these are household names -- are using completely rotted software.

    My horror was complete when I discovered something I never thought anyone from my generation would ever see in use: an NEC Astra machine, from the mid-seventies! Complete with a proprietary (read: not PC-compatible in any fashion) OS, loaded from eight-inch floppies. The machine was used right up to when it died, on January 1, 2000. (70s hardware and software aren't quite as Y2K-compliant as today's ;-)

    I ran and hid behind the IBM fridge racks and haven't been over to that corner since.

    I guess I started ranting again. Let me try and make a point out of all that... uh, VMS still pisses me off, and DEC can go to hell. :-D

  • by Anonymous Coward on Wednesday March 15, 2000 @11:45AM (#1200485)
    The Unix of today is very different from the Unix first produced in the 1970s. All things either evolve or die. Unix will evolve in some way or another. Just like mainframes, Unix's popularity might wane (I think it already did), but it will come back (I think it already is, as Linux and FreeBSD).

    Look at it this way: Be *had* to put a level of Unix compatibility in BeOS, MacOS is now a variant of Unix, and NT evolves more toward Unix every day.

    On the other hand Unix/Linux must be lost to the user in the sense that Unix/Linux at the command line or Xlib level just isn't for the user.
  • by Robin Hood ( 1507 ) on Wednesday March 15, 2000 @12:49PM (#1200486) Homepage
    One potential successor to Unix is EROS [eros-os.org], the Extremely Reliable Operating System. It's at a "hackers only" stage right now, as there is a marked shortage of drivers. BUT if you "long for the early days of [Linux], when men were men and wrote their own device drivers" :-), well, here's your chance. Start another operating system going!

    EROS is hard to describe. It's capability-based and has orthogonal persistence -- and if that doesn't mean anything to you, I'm not going to be able to explain it much better. Check out the EROS project site [eros-os.org] and read the documentation. One thing this means that I can explain, though, is this: "snapshots" are taken of the current state of the system every five minutes. If the power goes out, the system is later restored to the last good snapshot. So you could have a text editor window open, never save your file, PULL THE PLUG on your computer and then plug it back in. Within 30 seconds (or however long your BIOS POST takes to complete), your text editor window is back on the screen, and you've lost no more than five minutes of work.

    EROS [eros-os.org] is cool. I think it has potential to be the Next Big Thing. Check it out, download it (it's GPL'ed), play with it. Have fun.
    -----
    The real meaning of the GNU GPL:

  • by A nonymous Coward ( 7548 ) on Wednesday March 15, 2000 @12:21PM (#1200487)
    Fire has lived well beyond the era in which it was born (the era of stone) and has survived and thrived in the era of bronze, but now, many people believe that soon we will see the transition from that era into the age of iron. In that case, has the Slashdot community at large ever considered what the future is? Will Fire finally die off, will it adapt as it did before, or will Fire find a way to remain the same trustworthy system it always has been? And if Fire will come to an end, what does the Slashdot community feel will be its successor?

    --
  • by technos ( 73414 ) on Wednesday March 15, 2000 @12:01PM (#1200488) Homepage Journal
    I don't know what will become of Unix, and whoever says they do is not only a fool but a liar as well.

    It was only a few years ago that I was mourning the apparent *nix recession; the only game in town was Xenix/AT&T (and a wee bit of Sun, but not in my neck of the woods), and their products were both languishing and confined to minis. Linux and the *BSDs were infants, not worth a mention outside of academia. Now it has come full circle. People are using *nix [gasp] ON THEIR DESKTOP! I can run *nix on everything from my multi-million-dollar IBM to my $100 garage-sale throwaway. And it is adapting again. Embedded Unix? I would have laughed my ass off if someone had suggested running Unix on a microcontroller only a few years ago...

    I kind of suspect that *nix is just too adaptable to die, but to say whether or not it will be beaten back onto the mainframes by PalmOS-running PDAs in a decade is impossible.
  • by Jeffrey Baker ( 6191 ) on Wednesday March 15, 2000 @12:06PM (#1200489)
    Unix is an operating system. Its purpose is to operate the hardware inside your computer, and to provide a programmatic and generalized interface to that hardware's capabilities. As long as Unix continues to operate popular hardware and provide an interface that programmers like, Unix cannot die.

    The advent of distributed, collaborative, pure-hype^H^H^H^Hjava, Active System Blaster 2000 will not make Unix obsolete. Even a revolutionary, fully distributed and autonomous network object system still needs to send bytes over the wire, still needs to access system memory, and still needs to accept input and create output. These are the things that Unix provides. This is why Unix will always be around.

    I suppose that a newer operating system could supplant Unix. However, I don't see any in the near future. Be has a bright future, because it has networking and other nice capabilities. But Unix has the trump card over BeOS: the idea of users. In a distributed network environment, the user concept becomes much more important. Information, files, interface configuration, and many other things are associated with a person. Those things must be secured from other people, and the other people must be secured from them. Therefore any OS that wishes to supplant Unix will need to supply the same kind of protection for users' information.
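    That user concept is plain old Unix ownership and permissions doing the work. A minimal sketch (the user names and file are made up):

        chmod 640 notes.txt                # owner: read/write; group: read; others: nothing
        ls -l notes.txt                    # -rw-r----- 1 alice staff ... notes.txt
        su bob -c 'cat ~alice/notes.txt'   # "Permission denied" unless bob is in alice's group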

    Cheers,
    jwb

  • by Squiggle ( 8721 ) on Wednesday March 15, 2000 @11:45AM (#1200490)
    > Of course Unix, like any other modern OS, must change over time to accommodate new technologies and methodologies, but I see Unix being more able to adapt in todays fast changing Information Technology world than other operating systems based on monolithic kernels.

    That's funny, I thought that Unix was based on a monolithic kernel... silly semantics

    I would love to see some of the best coders and operating systems people put together a new OS from scratch using the latest techniques. Ideally this would create an ultra stable and very modular system. I would happily give up some extra CPU cycles for increased modularity and the ability to easily swap in and out OS components so that I could customize my OS to the task at hand. I find it rather ridiculous that I run the same OS when I am playing games, running a web server or working with Photoshop (etc). Rather than having a generically-good OS I would prefer a highly optimized OS for the task(s) at hand.

    How often do I run a game, Photoshop, a compiler, and a web server concurrently on my home box? Give me adaptability and modularity or give me death!

  • by jabber ( 13196 ) on Wednesday March 15, 2000 @12:33PM (#1200491) Homepage
    Unix is not a product, it is a set of evolving ideas. As such, it is not going anywhere but up.

    It's just as easy to ask: Is this the end of silverware, or the end of fire, or the end of any old thing that's proven to work... Is genetic engineering the end of agriculture? Is organ transplantation the end of death? Is The Bomb the end of War?

    Yeah, there are micro-kernel-based OSes out there like QNX, NeXT, and Hurd... But they're still Unix. The NEW OS X from Apple is more Unix than its predecessor. NT is more Unix than Win95. There are new approaches like BeOS.

    If one defines Unix in a very constrained way then Unix has been dead for a long time. When AT&T first released System V, and allowed it to mutate, Unix died and was reborn in a variety of ways. Umm, that was what? 1976?

    If one defines Unix broadly, as a set of concepts, a layered architecture, levels of abstraction, sets of small uni-purpose tools working together, APIs and things 'grep-like' then guess what? Unix will live forever.

    It's a pointless question, not because it's bad, but because it's completely subjective.
  • by seebs ( 15766 ) on Wednesday March 15, 2000 @11:32AM (#1200492) Homepage
    Unix is too broad a family of systems to "die". It's not like AmigaDOS, or VMS, where there's just one Unix and it can "fall behind". Unix will be replaced, but it'll be replaced by more Unix.
  • by Mucky Pup ( 21317 ) on Wednesday March 15, 2000 @12:45PM (#1200493)
    I agree. The obituaries for all other OSes will more than likely be typed up on a Unix box in ed.

  • by Ryan Taylor ( 32647 ) on Wednesday March 15, 2000 @12:19PM (#1200494)
    > That's funny, I thought that Unix was based on a monolithic kernel... silly semantics

    It is, and the original comment didn't suggest otherwise. Read again:
    > I see Unix being more able to adapt in todays fast changing Information Technology world than other operating systems based on monolithic kernels.

    "Other operating systems based on" implies that Unix is one of a group of "operating systems based on..." and that there are others.


    But that's not what I really wanted to comment on.

    > I would love to see some of the best coders and operating systems people put together a new OS from scratch using the latest techniques.

    Hrm... read: http://www.gnu.org/software/hurd/hurd.html [gnu.org]
    > Ideally this would create an ultra stable
    Read: http://www.eros-os.org/ [eros-os.org]
    > working with Photoshop (etc).
    Read: http://www.be.com/ [be.com]

    Alternatives are out there. You just haven't found them.


    -rt
    ======
    Now, I think it would be GOOD to buy FIVE or SIX STUDEBAKERS
    and CRUISE for ARTIFICIAL FLAVORING!!

  • by weave ( 48069 ) on Wednesday March 15, 2000 @12:59PM (#1200495) Journal

    I have seen the future. The future is filled with operating systems that demand that their system binary directories be writable to all, else they fail.

    I have seen the future. It has an operating system whose applications, even those written by the OS authors, can ignore the TEMP environment variable, scribble temporary files wherever they want, and fail to operate if they cannot do as they wish.
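    For contrast, honoring the temp-directory convention costs about one line on the Unix side (a sketch; mktemp is standard on modern systems):

        tmpfile=$(mktemp "${TMPDIR:-/tmp}/scratch.XXXXXX")   # honor TMPDIR, fall back to /tmp
        echo "temporary data" > "$tmpfile"
        rm -f "$tmpfile"                                     # clean up; no reboot required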

    I have seen the future. The future is filled with continued support for legacy drive letters and 8.3 file names with rename.ini kludges during installs.

    I have seen the future. The future is an operating system where you have to shell out serious dollars for third-party utilities to get around security deficiencies in the design of the OS. After all, why fix that pesky virus problem when so many anti-virus companies would go under without that revenue stream?

    I have seen the future. It has operating systems whose file systems don't support the concept of deleting a file and having it not actually disappear until the last process holding it open exits. Supporting that would remove the need to put dynamic link libraries into a temporary space and have them "installed" during a reboot. Reboots are good. They clear up sloppy OS design problems.
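    For anyone who hasn't seen that delete-while-open behavior, four lines of POSIX shell demonstrate it:

        echo "still here" > scratch.txt
        exec 3< scratch.txt    # open the file on descriptor 3
        rm scratch.txt         # the name is gone; the inode lives on while fd 3 is open
        cat <&3                # prints "still here"; space is freed only when fd 3 closes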

    I have seen the future. The future is filled with grand marketing schemes like "Administration Kits" that promise all kinds of abilities to deploy corporate policy restrictions to users, yet neglect to mention that these policies are applied by a program that has to write to an area of the OS that was previously recommended to be R/O due to the security problems caused if it is R/W, thereby making it impossible for the scheme to work as advertised for any user who does not have full permissions to their workstation.

    The future, my friends, is about image and not function. UNIX is ugly. It's doomed.

    Or in other words, resistance is futile. At least that is what they want us to believe... :)

  • by Fas Attarac ( 163334 ) on Wednesday March 15, 2000 @11:38AM (#1200496)
    There's no reason to "convert" most of our existing Internet/networking infrastructure to anything else in the foreseeable future. I agree with the prediction that things will end up moving more towards centralized computer resources, and lesser-equipped but ubiquitous terminals to access those resources, but Unix will still be there in some fashion.

    Who's to say Unix won't be the OS that drives the appliances?
