Microsoft's Vision For Future Operating Systems
Bender writes: "The Systems and Networking group at Microsoft Research has a fascinating article that details what sorts of things they believe may be important in Operating Systems of the 21st century."
Could This Be Condensed Into... (Score:2, Funny)
Re:Could This Be Condensed Into... (Score:2, Insightful)
The SunRay server even lets you know when it needs more resources: Netscape starts taking 5 minutes to launch.
It takes some effort just to get Win98SE to see Win2K on the same damn LAN.
/.'s MS icon has never seemed more appropriate... (Score:4, Funny)
Re:/.'s MS icon has never seemed more appropriate. (Score:2)
Encryption is useless.
Your intellectual property will be added to our own.
Prepare to be assimilated.
anti borg/M$ tactics (Score:2, Funny)
Re:/.'s MS icon has never seemed more appropriate. (Score:2, Funny)
While I realize they never say one OS, I am sure that is what they mean. Could this be used in court?
JUDGE: How would you justify your claim that M$'s sole reason for operating is to drive your business out of operation?
PLAINTIFF: Well, their website says "there should be only one system," and I am pretty sure they do not mean mine.
JUDGE: And what is your goal?
PLAINTIFF: To drive M$ out of business.
JUDGE: So it's OK for you but not for them?
PLAINTIFF: Yeah, but I have no chance in hell.
M$: What makes the PLAINTIFF so sure we can?
BAILIFF: Your honor, your shipment of MS WORLD DOMINATION boxes has been delivered.
Re:Microsoft to Remove Twin Towers (Score:2, Funny)
Qua
no ms bashing (Score:2, Funny)
Distributed Blue Screen of Death? (Score:2, Funny)
Goals (Score:2, Funny)
Worldwide scalability - Every town has a garbage dump and it gets bigger every time you dump on it
Fault-tolerance - We've tolerated enough faults
Self-tuning - Everyone needs an MTU of 1500 on dialup
Self-configuration - Icons for every desktop
Security [sic] - we'll try it just once
Resource controls - we've reduced the number of easter eggs in this one
Under goals they left out. . . (Score:2)
KFG
They really do want us all!!! (Score:3, Funny)
Translated: "Microsoft will take over every machine you put your filthy little hands on. Nyah!"
And it gets worse... "The administrator inserts a Millennium installation DVD disk into one of the machines and the system propagates across the network. After evaluating the network topology and hardware resources, Millennium might suggest that one of the more powerful machines (a "server") be moved to a different network location for best performance."
Translated: "Windows Millenium will infest your entire network whether you like it or not. Then, it will hunt out the Linux machines and demand that it be installed on those as well."
Now if those aren't goals of a company that plans on taking over the universe, I don't know what are....
Re:They really do want us all!!! (Score:2)
MS Lawyer: We see from our records that you purchased one (1) copy of Millennium, yet you have a network of over five hundred (500) machines running Millennium. You have 24 hours to pay us for those licenses you are using but did not pay for. And to show you that we are serious, you have 12 hours.
Re:They really do want us all!!! (Score:2)
Re:They really do want us all!!! (Score:2)
Old Article (i.e. 1997) and More Recent Work (Score:2, Informative)
RE: http://www.computer.org/proceedings/hotos/7834/78
If you would like to find out more articles related to this one check out this page at ResearchIndex:
http://citeseer.nj.nec.com/21325.html
Cheers,
-ben
Old article (Score:3, Informative)
Re:Old article (Score:3, Informative)
As to why it's not mentioned, this article pre-dates the need for C#. Sun's suit against Microsoft wasn't that advanced in early-mid '97.
Re:Old article (Score:2)
but plan9 is open source already!
ack (Score:3, Interesting)
so they want to turn their entire user-base into an application? (bear with me)
it seems the only way you could have this level of hands-off "usability" would be to have complete control of all aspects of the hardware and environments your software is running under
this seems like a huge step in the wrong direction. if we move to a level of abstraction devoid of details, how can we possibly innovate and improve?
_f
Re:ack (Score:3, Insightful)
Re:ack (Score:2)
Re:ack (Score:2)
Re:ack (Score:2)
Nope. It's entirely possible to implement such a system using only the resources that are explicitly granted to it. It has been done many times, and several of those projects are referenced in the paper's bibliography.
By implementing above the abstraction, instead of reimplementing the abstraction endlessly. This has always been the case. At one time every programmer had to program down to the bare metal, jiggling interrupts and managing the movement of data between memory and backing store, writing their own schedulers, etc. Then we came up with abstractions like drivers and filesystems and operating systems with demand-paged virtual memory, so they don't have to do that any more (unless they want to). All that the Millennium authors are suggesting is that we consider treating things like location the same way we've gotten used to considering something like virtual memory or scheduling - as something that we don't have to worry about because it's taken care of for us by the system. This frees people to innovate, instead of requiring that they perform the same tedious chores for every non-trivial application they write. If I don't have to write yet another scheduler and yet another memory manager and yet another messaging layer, that leaves me more time to focus on the real higher-level problem I'm trying to solve. It's a good thing.
Re:ack (Score:2)
Re:ack (Score:3, Informative)
Oh, bullshit. The paper specifically talks about data remaining in the system until it's no longer needed or referenced; Freenet drops data whenever it feels like it. The paper talks about making storage hierarchies invisible; Freenet as currently implemented is totally non-transparent, requiring a user to explicitly download a file in the Freenet app and then operate on it in another app. Freenet's primary goal is to obscure the identities of requesters and responders - information that will most definitely be necessary to keep a general-purpose distributed OS secure.
There's nothing wrong with the fact that Freenet is this way. Those are its design goals, and they're valid ones. It's just not at all related to what the Millennium folks lay out. Continuing to push Freenet as the solution to every problem when it's really only a solution for an extremely narrow range of problems just makes Freenet advocates look like fools.
define irony (Score:3, Funny)
Microsoft [Unit]
One Microsoft Way
Redmond WA
("My way or the highway"-reminiscent)
Developers - stop bashing and start coding (Score:5, Insightful)
One big problem Linux development will face is the notion that devs are playing catch-up with MS with projects like Mono. (We blast Microsoft for its claim that it is an innovator, but has there been much innovation in Linux kernel development lately?) Instead of trying to build a Windows clone, we should build up a system that addresses computing in a way that MS systems don't.
The Main Linux Innovation... (Score:2)
Trollicious Postings A La Carte (Score:2)
Let's see
Re:Trollicious Postings A La Carte (Score:2)
I think some Mono developers would disagree with that.
"Linux is not trying to be a Windows clone, instead it is a rather successful Unix clone."
Well, "Linux" isn't trying to be anything. But the current trend is towards getting Windows users to like Linux by offering Windows-esque functionality. As others have said, what about GNOME, KDE, etc.? What is Ximian trying to do with its desktop environment?
"An operating system that addresses computing in a way that MSFT's don't? Do you mean like SE Linux [nsa.gov] or RTLinux [rtlinux.org]?"
Yes, I do, although I was thinking more along the lines of more general-purpose development. These are projects to fill specific niches.
no innovation in either system (Score:2)
Re:Developers - stop bashing and start coding (Score:2)
It certainly seems as if Windows is trying to reach the same goal that the Mac OS is, and that Linux is trying to reach the same goal that the Windows OS is...
Re:Developers - stop bashing and start coding (Score:2)
You might want to re-read your post. I can't judge your intentions but I think you hit it on the head more than you may have wanted to.
However, the goal previously referred to as
(Or World Domination, and Hammurabi has prior art to that.)
Re:Developers - stop bashing and start coding (Score:2)
I'm saying that we should not be rejecting ideas because they come from Microsoft, and this may serve as an inspiration for new open-source technology. This could serve as a springboard for future development of Linux.
Of course, many of you, being established developers, would probably be opposed to adding intelligence into an OS, preferring to work at a lower level. That's fine, but I'd like to think that development doesn't need to stand still. I think that the option of a "smarter" OS is a positive development.
Latency is a killer (Score:3, Insightful)
Some applications can be distributed, sure, but there will always be a need for interactive applications to run locally, on local data.
Re:Latency is a killer (Score:3, Insightful)
Storage space is increasing at a rate disproportionate to bandwidth. We can make use of that by being proactive with bandwidth use and trying to do some pre-fetching, like processors do now.
Say for instance there are a few guys on your block who really love to read Slashdot. Now say that Slashdot is distributed. You go and download a page. Instead of downloading just that one page, you might pre-fetch a few more while you read it (like any links on the page, especially if other /.ers have clicked 'em).
In addition, if one of those other /.ers on your block decided to read up on the latest nerd news, his local machine wouldn't have to go very far to get it, thus reducing the latency. You'd end up with your block being its very own Slashdot server, with the people who access it the most storing most of the content.
Take it even further and imagine your block as a little 'group' that's trying to grab all kinds of information it thinks its users will like and shuffling it to those most likely to want to see it.
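Just to make the idea concrete, here's a toy sketch of that kind of popularity-driven pre-fetching (the class, thresholds, and URLs are all made up by me, not from the paper or any real system):

```python
# Toy neighborhood cache: pre-fetch pages the block's users click on often.
# Purely illustrative -- fetch() just fakes a wide-area download.
from collections import Counter

class BlockCache:
    def __init__(self, prefetch_threshold=2):
        self.store = {}            # url -> content cached somewhere on the block
        self.clicks = Counter()    # how often neighbors followed each link
        self.prefetch_threshold = prefetch_threshold

    def fetch(self, url):
        """Stand-in for going out over the wide-area network."""
        return "<html>contents of %s</html>" % url

    def get(self, url, links_on_page=()):
        # Serve from the block if a neighbor already pulled it down.
        if url not in self.store:
            self.store[url] = self.fetch(url)
        # Record interest in the links this page points at.
        self.clicks.update(links_on_page)
        # Pre-fetch anything the block seems to care about.
        for link, count in self.clicks.items():
            if count >= self.prefetch_threshold and link not in self.store:
                self.store[link] = self.fetch(link)
        return self.store[url]

cache = BlockCache()
cache.get("slashdot.org/index", links_on_page=["story/1", "story/2"])
cache.get("slashdot.org/index", links_on_page=["story/1"])  # story/1 gets pre-fetched here
```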
Let's say you'd really like Max Payne if the bad guys weren't so damn dumb. How do you make them more realistic with the limited processing power you have locally? How about using some cycles from your neighbor's computer to make the AI think and learn better? Beyond that, how about analyzing data on the play of all the owners of Max Payne and making general improvements to the AI?
Well, granted you could end up with AI that's pretty much unbeatable but you could dumb that down and just be left with an AI that can do something different every time!
Now I'm gonna go out on a limb here and say that there are one or two small barriers to this happening tomorrow. But I think it could be cool. Maybe. I'm glad somebody smarter than I am is researching it.
And oh yeah, the speed of light is getting kinda fuzzier every day. For instance, shine some light through some cesium atoms under the right conditions and it'll come out the other side faster than the speed of light. Who knows what'll be around in another 30 years?
Re:Latency is a killer (Score:2)
This is very similar to what content distribution networks (e.g. Akamai) do already, and more general/sophisticated mechanisms are on the way. Stay tuned.
Re:Latency is a killer (Score:3, Insightful)
To put this into context: in fiber, light goes about 200 km per millisecond, so it can go from the UK to Canada and back again in roughly 40 ms of pure propagation delay, although ping times are often closer to 100 ms due to delays in routers and suchlike (there's probably no reason those extra delays can't be made arbitrarily small).
>Some applications can be distributed, sure, but there will always be a need for interactive
>applications to run locally, on local data.
Yeah, sure. Where local means anywhere within a thousand klicks or so.
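For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope calculation (the distances are rough great-circle figures from memory, so treat them as approximate):

```python
# Propagation delay only -- no router, queuing, or serialization delay.
C_VACUUM_KM_PER_S = 299792        # speed of light in vacuum
REFRACTIVE_INDEX = 1.47           # typical for optical fiber
v_fiber = C_VACUUM_KM_PER_S / REFRACTIVE_INDEX   # ~204,000 km/s, i.e. ~200 km per ms

def round_trip_ms(one_way_km):
    return 2 * one_way_km / v_fiber * 1000

print(round_trip_ms(3700))   # London to St. John's, ~3700 km: roughly 36 ms
print(round_trip_ms(5700))   # London to Toronto, ~5700 km: roughly 56 ms
```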
Re:Latency is a killer (Score:2)
Sure. But if people around the world are going to be using the same data, that will cause problems -- because the world is larger than a thousand km.
I'm not saying that these issues can't be handled, but the idea of having everything run off of a computer in Redmond is unworkable; it would take hundreds if not thousands of servers strategically placed around the world. (In sharp contrast to the WWW, where a website *can* run off of a single server in one location.)
Re:Latency is a killer (Score:2)
Re:Latency is a killer (Score:2)
>demanding applications.
I'll notice 12 ms in the least demanding audio production environment. Video is even worse affected by jitter in the time domain.
Re:Latency is a killer (Score:2)
Well, it seemed to me like the article's point was to remove such decisions from the programmer's concern. So maybe Quake XXI will need better performance than 50ms. The system they're talking about would recognize this and run the processes in question locally. They termed this "Introspection." Quoting the article:
In my view, they see all future applications needing to be somewhat network-aware, and believe that client/server interaction could be greatly optimized and abstracted through a system like the one they describe. It doesn't preclude local execution; it just decides when it's appropriate. The thinking is that the system can often make these decisions better, since it will have the benefit of looking back on previous usage records. You can test all you want, but the system will be aware of how it's actually being used.
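If I had to guess at the mechanics, it would be something like this crude sketch (the latency budgets, speedups, and host names are all my own invention, not anything from the paper):

```python
# Crude latency-aware placement: run a task remotely only when the job is
# heavy enough that the remote speedup beats the round-trip cost.
# All numbers and host names are hypothetical.
HOSTS = {
    "local":          {"rtt_ms": 0,  "cpu_speedup": 1.0},
    "big-box-lan":    {"rtt_ms": 2,  "cpu_speedup": 4.0},
    "far-datacenter": {"rtt_ms": 80, "cpu_speedup": 16.0},
}

def place(task_cpu_ms, latency_budget_ms, observed_cpu_ms=None):
    """Pick the host with the lowest expected completion time within budget.

    observed_cpu_ms lets past usage records ("introspection") override the
    caller's estimate, which is the part the paper seems most excited about.
    """
    cpu_ms = observed_cpu_ms if observed_cpu_ms is not None else task_cpu_ms
    best = None
    for name, h in HOSTS.items():
        total = h["rtt_ms"] + cpu_ms / h["cpu_speedup"]
        if total <= latency_budget_ms and (best is None or total < best[1]):
            best = (name, total)
    return best[0] if best else "local"   # nothing fits the budget: stay local

print(place(task_cpu_ms=10, latency_budget_ms=50))      # a Quake-ish frame: stays on the LAN
print(place(task_cpu_ms=5000, latency_budget_ms=2000))  # a big batch job: ships to the far machine
```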
Future M$ advertisement:
"We already know where you want to go today."
Re:Latency is a killer (Score:3, Insightful)
While it's nice to see people recognizing the importance of latency, you're way off base. First, I can assure you that the authors are not unaware of the problems of latency. They might not have spent a lot of time delving into technical details in that blue-sky paper, but I've met two of them and they are very cognizant of these issues.
On a more technical note, I ask you to consider how the problems of latency might be avoided or reduced. Someone else already beat me to the mention of trading storage for latency, caching data in multiple locations. I know a little bit about that, but I don't think the point needs to be belabored. Also, there's the flexibility gained by moving computation nearer to data (or other resources) via mobile code. Sometimes that can be a really big win, as various Java applets have demonstrated.
There will always be some cases where latency continues to plague us, no matter what tricks we throw at it, but those will be relatively few. It's just like managing caches, or memory access in NUMA architectures; some people will get down and dirty to wrestle with the details and wring out the last drop of performance, but the vast majority won't need to care because functionally the memory all acts the same. Right now, everyone has to care very much about location on a network, not just for performance but for functional reasons as well. If we abstract some of that away and do a reasonable job of implementing the abstraction, we won't see so many people implementing crappy messaging layers with broken security just because the structure of the system forces them into it even though it's not their forte. Some people will still "step behind the curtain" but they'll be few and far between.
It won't happen (Score:2, Funny)
This new generation of OS's have no idea what love means, they should be ashamed of themselves.
Duh. (Score:2, Funny)
One OS to rule them all (Score:2)
It seems odd that they haven't noticed the trend for computing devices to change size, shape, and function. Postulating a single universal "Millennium" system seems exactly backward to me. I'd rather see the research done more on the Jini model [sun.com], where many disparate devices intercommunicate. Surely that is more open to scaling and fault tolerance than the idea of one monolithic OS to meet all needs.
Microsoft appears to be becoming like the old Soviet union, where everyone has to buy in to the official ideology rather than venture out in new directions.
You missed the point. (Score:2)
I know a fellow who used to work in MS Research, and he keeps in touch with his old buddies. He has been talking about this project for some time now, and assures me that the intent is not to have a single monolithic system, but rather to have many disparate devices appear as a monolithic system.
Differences of hardware cease to matter to the user. Need more power? Buy another box and plug it into the network; you're done. Hire another employee? Plug in a relatively wimpy box; if they need to do anything heavy, their code can snag some cycles from the guy who's at lunch, or from the big box o' processors in the basement. No problem.
Um, this isn't new... (Score:2)
OS research has been pursuing these goals for years. There's nothing there that's really very interesting or new. It sounds like they've just browsed the web for a little while and summarized what the various projects are striving for.
One project that's come pretty far is Mosix [mosix.org] (I think they're planning to integrate bits into Linux 2.5, but I'm not sure). Then of course there's Plan 9 and Inferno from the fine folks who brought you Unix [bell-labs.com]. And let's not forget Tanenbaum [educ.umu.se]'s Amoeba [cs.vu.nl].
Operating systems should go away. (Score:3, Insightful)
Manage memory
Manage CPU time (schedule processes)
Manage access to hardware
And that's what an operating system *kernel* does.
Operating systems do not need to:
provide compilers, web browsers, colossal text editors (MS Word and emacs included)
inform users of the *really* important reasons they need to upgrade *now*
do GUI shit.
If you use a computer, you want it to do what you want. Most of the time, you want it to help you manage information. Most users don't even know that their computer *has* an operating system. Most users know that it's a really useful typewriter with an 'undo' facility.
What OS does your fridge run? your car? your microwave oven? your alarm system?
Those are all von Neumann machines, running operating systems.
A computer in a home/business environment should be useful, usable and reliable.
Get this. It's important. The people who buy computers couldn't give a flying fuck about the OS. Some want 'applications'. Those people are called 'IT managers'. Most want information. They are called 'people'.
I do *not* want my dishwasher to stop with a message of "Oops in module handle_detergent. Please run ksymoops and report to lkml". I don't want my television to go blue with advice to 'set CRASHDEBUG' for some purpose.
If you know that you are running an operating system, you are either an OS hacker, or the OS hackers have failed to protect you from their work.
Re:Operating systems should go away. (Score:2)
If you know that you are driving a car, either you're an automobile engineer, or the automobile engineers have failed to protect you from their work.
Or, if you know that you are writing characters, you are either an scribe, or the scribes have failed to protect you from their work.
A computer is a massively flexible tool. People have to learn the interfaces on VCRs, microwaves, and the rest of modern appliances. The computer is a much more complex device, so it's going to have a much more complex interface, called an operating system.
If people just wanted to browse the web and do email, WebTV would have gone over better. But people want to play computer games, and write documents and keep their budget and their family tree on the computer, and a million other things. If people wanted a non-OS interface, what happened to Microsoft Bob?
I do *not* want my dishwasher to stop with a message of "Oops in module handle_detergent. Please run ksymoops and report to lkml".
And I do *not* want my car to stop working and start spitting out smoke. But you know what? It happens. If you're perfect enough to write the bug-less OS, then go ahead and write it.
Re:I think we agree (Score:2)
I think you miss my point. There are basically OS-less systems - WebTV, MS Bob. Why haven't people flocked to them?
you get a washing machine. They are both von Neumann architecture machines.
A von Neumann architecture is a CPU connected up to memory and IO, IIRC. (There are non-von Neumann computers.) A washing machine isn't a von Neumann architecture machine.
They don't want a prolonged diatribe about how "It wouldn't have broken if the QuuxBar 9.3.4 patch had been installed..."
And I don't want a prolonged diatribe about how I should have checked the oil in my car and replaced it. That does not negate my responsibility to do so, nor does it suddenly create a solution that doesn't need me to check my oil.
Re:Operating systems should go away. (Score:2)
Interesting, because, IMO, the prototypical non-OS is MS-DOS and friends. You type in the name of a program and it runs, and you don't touch the OS again. Win95+ made the OS much bigger and intrusive. Saying that Win95 is good is hardly an argument that OS's should go away.
> Characters are hardly complex interactive systems with a simple interface.
Language is a complex interactive system, but characters are not a simple interface. We spend years in school learning how to write; we come in mostly competent in speaking. Writing is often quite distant from speaking; spelling often has little to nothing to do with sound (cf. Chinese for an extreme example.)
> with Windows, even if it doesn't ship with a particular driver, is much easier to bring to a fully operational state.
When did this become a Windows vs. Linux thing? Depending on your hardware, either can be hard or impossible to bring to a fully operational state.
Re:Operating systems should go away. (Score:2)
And don't try to say that things like filesystem and networking are covered under "access to hardware". Unless you're accustomed to accessing your disk/network at the sector/packet level, the OS is taking care of a lot more than "access to the hardware".
I'm not disagreeing with your basic point that we need to better distinguish between 'the OS' and 'the applications running on the OS', but your view of a super-simple OS is unrealistic. We expect operating systems to provide a ton of functionality, quite a bit of which is not covered in your list.
I'd even go so far as to say that if you started developing a Windows competitor today (and no, Linux doesn't count in its present form) and left out something like native multimedia support, you'd be crazy. Things like JPEG decompression libraries and MPEG support belong in the OS's runtime library so that you don't have fifty programs running around with the same code embedded in them. Likewise with rich audio libraries, 3D APIs, and so on...
Re:Operating systems should go away. (Score:2)
You seem to be making several different and to my mind, unrelated points here. In other words you've lumped everything you hate about operating systems into one rant.
First you distinguish between the OS kernel and all of the other user-interface features that we typically call the "OS". Okay, that's fine. It's just terminological futzing but whatever makes you happy.
Next you start to rant: If you use a computer, you want it to do what you want. Most of the time, you want it to help you manage information. Most users don't even know that their computer *has* an operating system. Most users know that it's a really useful typewriter with an 'undo' facility.
This has nothing to do with your earlier terminology and it is just plain wrong. Most users can tell the difference between DOS, Windows 3.1, Windows 95 and KDE with Linux. Whenever you upgrade a user's OS you'll hear them say things like "Where did that menu item go?", "These icons are different," and so forth. If you want to claim that what they are complaining about is the "shell" and not the "OS" -- whatever -- more terminological futzing.
What OS does your fridge run? your car? your microwave oven? your alarm system?
These systems have user interfaces that are substantially simpler than a computer. Computers do thousands of things and even relatively naive users want to do BOTH MP3 ripping AND email AND web surfing. So right off the bat you're talking about three interfaces that need to be somewhat consistent and you need a way to move information between them. That's where the OS comes in! It is the common substrate that they build upon. A toaster doesn't need any such thing.
If you know that you are running an operating system, you are either an OS hacker, or the OS hackers have failed to protect you from their work.
Call it an OS. Call it a shell. Call it a framework. Call it a runtime. Call it a VM. Whatever you call it, desktop computers need it to manage the flow of information between applications and toasters do not. So what's your point???
Slashdot should sue... (Score:2)
Web Service
A little-known web site suddenly achieves popularity, perhaps with a link from Cool Site of the DaySM or a mention in a prominent news story. Word of mouth spreads, and soon the web site's servers are overwhelmed. Or rather, would have been overwhelmed except that heuristics in the Millennium system had noticed the new link and already started replicating the site for increased availability. Monitored traffic increases confirm the situation and soon the site's data has been "pre-cached" across the Internet. As the site's usage drops over the following weeks, Millennium reallocates resources to meet new demands.
I just can't seem to understand WHY they didn't mention the slashdot effect in this paper!! I can remember CSOTD back in 94-95, but I must admit that I haven't looked at it in years - do they still get a lot of traffic?
They also described Freenet. :-) (Score:2)
Does Microsoft possess even a single creative/original soul in their entire organization?
Re:They also described Freenet. :-) (Score:2)
There's Dave Cutler, who wrote WNT and later disciplined his wayward child back into Win2k. There's also Charles Simonyi [microsoft.com] who has some really interesting ideas. The fact that Bill "BASIC" Gates now has Simonyi's job title is, however, not encouraging. (and the "hot news" on that page used to mention something about IP going into a production mode, now it does not, that is also not encouraging).
So they do have some real innovators, but rest assured, they do deal with them as soon as they find them.
This quote stuck out in my head (Score:2)
I wonder if this phrase will have a different meaning if the MS monopoly continues for the next few years?
"Millenium" == Serial Experiments: Lain (Score:2)
Lain: Navi, connect to the Wired.
(scratch that...)
Joe User: PC, connect to the MS.
Automatic propagation (Score:2)
Software automatically propagates across the network, installing itself on new machines as necessary. Nice idea for making sure patches and updates are applied.
But can we say "designed from the ground up to propagate malicious worms," kids? I knew we could. You think Nimda was bad? This system'll make that look like a walk in the park on a sunny day.
Please read the paper before posting. It's short. (Score:4, Insightful)
That said, I'll comment on the paper itself. They have a point, somewhat understated, which is basically, "Yeah, this may be crazy, but it's worth looking into, isn't it?" One obvious response is that it sure seems to be What Microsoft Wants in terms of a homogenized global system that Microsoft controls. Though such a thing is never specifically said, it is called the "Millennium" system, and the ME in Windows ME stands for "Millennium Edition" (side note, it just occurred to me that "Windows ME" could be said with the same tone, inflection, and connotation as "Fuck me!" as an expression of dismay -- "Go Windows yourself!").
Well, who knows, but their idea of a transparent large-scale network that is self-managing as they've described is an interesting one, and there are some things that would be appropriate in such a system. That said, here's several reasons why I think such a system will not happen in the near future:
1. Too much resistance. This *is* a crazy idea, and even if it could be made to work, most people are used to the idea of "my" computer, "my" data, and everything happening physically *here*, inside this little box under my desk. This will take a long time to get over. Perhaps a gentle transition would help, with more and more things gradually shifting to the Big Network.
2. Games. Games demand very low latency - nobody enjoys playing Quake with network lag, let alone system lag. All computations for games and other time-sensitive applications would have to be done pretty much within the physical computer you are using; otherwise the latencies are too great and the game would be unplayable and chunky. Imagine if your 50ms ping time also figured into the video processing!
3. Security. It seems silly to assume people would *want* to walk up to a random machine somewhere and have all their documents streamed to it over the Big Network. For one thing, who knows whether the terminal is secure, or if it's got secret programs installed in it to capture your keystrokes? Using a publicly accessible terminal to get to your private data is a bad idea. Also, critical machines (computers that run public infrastructure, banking systems, military systems, etc.) should obviously not be any part of this kind of transparent system, for the obvious security reasons.
4. Where we work. Telecommuting is, for all the cheerleading, not very common at all. When people do regular business-like work (i.e. office workers writing reports, having meetings, doing whatever) they will want to have everything in the same place, and do it in big chunks at a time. Face-to-face communication is also very important to the way business is usually done, though this may change as people get more used to the idea of telecommuting. Being able to "walk up to a computer anywhere" and do work is pointless, because the vast majority of people are not going to WANT to be walking through the mall, window shopping, and decide they need to do some work, so go sit down at a public terminal and start working. (Never mind the security issues, mentioned above.)
5. Monoculture. If we think a Windows monoculture is bad now (and we do -- at least, I do), imagine what happens when every computer in the world is now running this system! On the other hand, if such a system was designed so that anyone could implement their own version of it, then you avoid some monoculture issues, but because you have to have interoperability between the systems, you essentially end up with what we have now -- the Internet, made of multiple differing systems that can still communicate using a common protocol, except the protocol would extend beyond data transfer and into things like distributed processing.
If you've managed to read this far, congratulations! I can recommend a decent novel that incidentally covers this topic (it is not the main focus of the plot, but does figure into it): Permutation City, by Greg Egan. A very good novel with lots of interesting ideas, but it does feature a worldwide network in which you can basically bid on processing power to draw from the global network, so your programs might be running anywhere in the world, but are running securely so that a computer doesn't really know what it's doing, it just executes commands. It doesn't go into much technical detail (like how they manage to have computers execute encrypted code without decrypting it), but it's relevant nonetheless.
Nothing new here, move along... (Score:3, Insightful)
There is absolutely nothing being said in this paper that hasn't already been discussed countless times in universities and research labs around the country. This paper is little more than a vision statement along the lines of the phrase that Microsoft has used for much of its lifespan: "a computer on every desk and in every home". It doesn't say anything that people haven't already thought of.
What's more relevant and interesting about this paper is that there are probably no organizations on the planet capable of developing a system like this on their own. It's going to have to be collaborative. Despite the me-tooism of Microsoft researchers in acknowledging the directions being taken by others, the Microsoft coders in the trenches won't be capable of developing something like this to be stable, reliable, and secure.
This may mean it won't happen in the way envisioned in this paper. Microsoft will have to wait until someone comes along and figures out how to actually do something new and viable - not just talk about it - just as Tim Berners-Lee et al. created the web. Then they'll try to embrace and extend it, if they can.
Re:Nothing new here, move along... (Score:2)
Let me second that. People have been trying to implement what Microsoft's vision statement outlines for many years. But these systems have always turned into a complex tarpit of interfaces, resource management, and code. Microsoft might be able to throw enough programmers at the problem to get something working, but it will fail because of its complexity--after all, real people will have to program it.
There are some genuinely new ideas needed to make this vision happen, and Microsoft doesn't seem to know any more about that than anybody else. I suspect once someone figures it out, it won't take legions of programmers to do it.
Re:Please read the paper before posting. It's shor (Score:2)
Re:Please read the paper before posting. It's shor (Score:2)
Umm... ya, you got psychological problems. Wishing physical harm on someone just because he's ruthless and successful in his business practice? I think you should follow this link [psychologyinfo.com].
Re:Please read the paper before posting. It's shor (Score:2)
He is not just ruthless, he is also a criminal. His organization is criminal: he has perjured himself, his underlings have perjured themselves and tampered with evidence, and he and his cohorts have intimidated witnesses in a federal trial. All of these acts are criminal, and by all rights he should be in jail right now.
Please do not try to minimize the criminal acts committed by MS and their top brass by bringing up their success. Of course they are successful: they resorted to crimes while everybody else was playing within the bounds of the law. It's an unfair advantage and one that our legal system ought to rectify.
BTW, although it's not a crime, I have not read one interview or statement from any MS executive which did not contain at least one lie. These people are pathological liars.
Is it right to wish criminals death? Well, maybe not, but I do wish they would serve their time in jail.
Re:Please read the paper before posting. It's shor (Score:2)
good freaking god... you are such a god damned loser for even saying that in jest.
The next generation of application.... (Score:4, Funny)
What!? You thought I could come up with a witty way to make fun of that statement?! I'm not a magician!
Herd of Gorillas, trying to make a better Gorilla. (Score:2)
Leave it to gorillas to invent a super-gorilla, when what nature (the client) wanted was a human.
Emulation of the past in bigger and better ways is not the shining future I had hoped for, folks. After all, I don't really want a computer, I want machinery that does my work and makes my life more comfortable, preferably without my having to train it or tell it what to do. I don't want robot slaves that act like humans, I want a thermostat. I want an operating system that helps my computers be devices that help me in my life, not the other way around.
Fascinating... (Score:2)
- They seem to completely overlook practical problems such as driver issues, concentrating on application development, even though driver issues are a good chunk of what made NT (and other Windows flavours) so crashy.
- They also completely overlook interoperability problems. The article is placed in an imaginary world where every computer on the Net runs MS Millennium. That's just so, so typical. And the worst part is, I'm almost certain those guys saw nothing wrong with writing that. It's just the way they seem to think (we get to do whatever we see fit, and fsck the competitors).
- More interestingly, they also overlook the problem of revenue sources. I mean, if the OS is 'everywhere', how does MS earn money off it? The underlying assumption that computer users owe money to Microsoft no matter what kind of disturbs me. Though I admit it could be me overreacting, too, with them being the Microsoft we all know and love and everything.
- And, of course, they once again assume that there are exactly two types of developers and nothing else: Microsoft developers, who get to write system-level things, and the rest of the world, who get to write applications using the tools provided by MS (note how the 'high-level' languages they mention are all available as MS products -- completely overlooking such wonderful abstraction tools as the Python programming language, among others).
Yes, this is truly fascinating, because, from a theoretical point of view, they got it right, and yet their vision is certainly not something anyone in their right mind would like to see become real. Thanks for posting that article,
Microsoft still promoting CSotD??? (Score:2)
A little-known web site suddenly achieves popularity, perhaps with a link from Cool Site of the DaySM
Hello, Microsoft. Welcome to the year 2001. That type of site died back in 1997. CSotD was the first and best, but the copycats ruined the genre. I even had a web site selected for CSotD back in 1995. Really ticked off my ISP (tyrell.net - gone now).
Re:Microsoft still promoting CSotD??? (Score:2)
However the page is copyrighted 2001.
Interesting but... (Score:2, Interesting)
The authors, however, have conveniently omitted the question of whether future OSes should be cross-compatible. Since so much fuss is being made about having a distributed OS across heterogeneous networks and heterogeneous machines, wouldn't it be worth an effort to also try incorporating some kind of support for other OSes? For instance, Millennium could implement support for ext2fs by itself, to make Linux partitions visible either on the same machine or across a network. The Linux kernel team has already done its bit about compatibility with co-existing operating systems.
What is needed is a set of common services that all operating system developers, irrespective of what gods they worship, can pledge to provide.
Is this too much to ask from the M$ guys?
They're after our Slashdot effect! (Score:2)
A little-known web site suddenly achieves popularity, perhaps with a link from Cool Site of the DaySM or a mention in a prominent news story. Word of mouth spreads, and soon the web site's servers are overwhelmed. Or rather, would have been overwhelmed except that heuristics in the Millennium system had noticed the new link and already started replicating the site for increased availability. Monitored traffic increases confirm the situation and soon the site's data has been "pre-cached" across the Internet. As the site's usage drops over the following weeks, Millennium reallocates resources to meet new demands.
It's a trick! They're out to undermine the very /. effect on which we thrive!
-- MarkusQ
Interesting Stuff (Score:3, Insightful)
Unfortunately, because it comes from Microsoft, most /.'ers will prefer to scream and whine about it, attempt to twist it to demonstrate their own particular MS issue, or make more jokes that are usually weak at best.
Pity, because if this had appeared elsewhere without any MS connection folks would be talking about it in a positive way, taking the discussion someplace interesting. Instead most are just blinded by the name MS and have once again congregated for the ritual stoning.
Anyway, /.-correctness aside there are a couple of points that the paper glosses over (amongst many) that I find particularly interesting:
The first is the concept of stateless storage - files are there as long as you need them, then eventually wither away when no longer referenced or required. This seems to me a particularly utopian view, as I regularly realize either that I'm missing a note I want from long ago (too-aggressive purging) or that I've got so much material on something that it's becoming burdensome. I entirely fail to imagine how this sort of winnowing could be automated. Agents to help me organize, tag, and prioritize, yes, but without my interaction it strikes me as likely to be about as reliable as a computer consistently distinguishing pr0n images from everything else.
The next is the internal intelligence of a system. This has been an area of much research for many years. The current-state information should be almost all available from within the system and with a few supplied metrics (costs, resources, constraints, priorities) "intelligent" decisions should be possible to make. Surprisingly there seems to be little of this actually available on the market already, at least not much available for general server/desktop management (that I've heard of.)
Finally, the lack of references to directory services and to the role PKI/encryption would play in this future scenario is interesting. Clearly these will be key elements in the ubiquitous, seamless environment the authors are talking about, yet their mention is notably absent. Is this a reflection of MS recognizing these as areas of strategic importance in which it doesn't yet have a firm foundation and doesn't wish to draw attention to, or do the authors think they will be so well established by the time they're envisioning that explicit reference isn't necessary? Either way it's an interesting omission.
Is this really all that bad? (Score:3, Insightful)
These guys were really looking forward to the future. And I don't think the standard MS bashing applies. Not everyone who works in Redmond behaves like MS's business unit.
I'm sure for the most part, the coders are great people. It's the businessmen upstairs who we should really have a beef with.
Seriously folks, can't you see that indiscriminate MS hatred is no different from other forms of bigotry like racism and homophobia? MS does put out some quality products. I'm told their games group is very good (Age of Empires) and their input devices are top notch.
Captain_Frisk... wishing everyone would think before flaming.
Re:Is this really all that bad? (Score:2)
Age of Empires was written by Ensemble Studios; Microsoft was just the publisher. IMHO your example only demonstrates the barriers to entry for smaller companies in that they are forced to band with a larger distributor in order to get their product into market.
neat... (Score:2)
The abstraction of data and computational location is cool. They're not saying that we should blindly start distributing our data across network devices without any attention to latency, reliability of links, security, etc. Ever heard of 'quality of service'? Or authentication? Or authorization? Or resource limits?
In the case of computation, sometimes you can break a program up into blocks that take a long time to execute; if it takes much longer to execute the code than it does to move the code across the network to a faster/less loaded CPU, then it makes sense to do it. On the other hand, if the computation will take only a little time, or if the result is required ASAP, you wouldn't want to move it. If it's unknown, let the user pick a default or let the system make a good guess based on what the code looks like. And they're not saying that we should send our data to MS to be worked on, or even to someone down the block - maybe you have some of your own computers lying around that don't get used much. The goal here is to turn your private LAN into a cluster that only acts as a cluster when it makes sense to do so.
In the case of storage distribution, they're not saying that others on the net should be able to use your storage space without your permission, or that you should have to store anything on anyone else's storage space. Let's first consider three cases: a swap file, a master's thesis document, and an mp3 file. You would want to keep your swap file on your local drive; the swap manager would request this type of low-latency storage from the file system. You'd want your thesis document copied to every available storage device you could find (maybe encrypted and signed to ensure that it's secure); you'd tell your word processor to save it with that quality of service. You wouldn't likely care to encrypt your mp3 files, but you don't need to keep them on your drive when there's lots of space available elsewhere on the network (think next-generation storage area network). You wouldn't want to store the mp3 too far from your network, but as long as it came back at more than the bit rate the song plays at, you likely wouldn't care too much (unless your friends often download mp3s from you).
If some device on the network runs out of space, it could shuffle stuff around. It might make sense to elect a storage manager system on your network and replicate your file allocation table/inode table/whatever around to each box on the network, so that if the distributed file system server (really just something that keeps track of locations) goes down, something else could come up in its place. I mean, I haven't really thought about this for too long - I'm sure there'd be some problems, but nothing that can't be fixed during the design stage.
Self tuning is also cool. It'd be great for all of those sites that get slashdotted. It makes sense to do expensive things on a website (server side) to provide more features when there's light load; when there's heavy load, it makes better sense to hold off on those expensive features and concentrate on the content instead. This might mean auto-tuning apache's caching and stuff, or automatically re-indexing a database to better serve the kind of requests that are popular. Some of this means lots of application programmer work - like what features to sacrifice under heavy load, but others like automatically indexing can be done with varying degrees of administration.
It's not all evil, and some of it is really cool. The idea is that we should be ABLE to make the most use of the resources available, and not be limited by things like physical location.
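To make the storage example a bit more concrete, here's a rough sketch of what per-file quality-of-service hints might look like (the class, field names, and policies are mine, not anything from the paper):

```python
# Rough sketch of per-file storage QoS hints, mapping the swap/thesis/mp3
# examples above onto placement policies. Everything here is hypothetical.
from dataclasses import dataclass

@dataclass
class StorageQoS:
    max_latency_ms: float   # how quickly reads must complete
    min_replicas: int       # how many copies to try to keep alive
    encrypt: bool           # encrypt before the data leaves this machine
    may_leave_lan: bool     # allowed to migrate off the local network

POLICIES = {
    "swap_file":      StorageQoS(max_latency_ms=1,   min_replicas=1, encrypt=False, may_leave_lan=False),
    "masters_thesis": StorageQoS(max_latency_ms=500, min_replicas=5, encrypt=True,  may_leave_lan=True),
    "mp3":            StorageQoS(max_latency_ms=100, min_replicas=2, encrypt=False, may_leave_lan=True),
}

def choose_locations(qos, devices):
    """Pick devices that satisfy the hint; 'devices' maps names to read latency in ms."""
    ok = [name for name, latency in devices.items()
          if latency <= qos.max_latency_ms and (qos.may_leave_lan or name.endswith("-lan"))]
    return ok[:qos.min_replicas] or ["local-disk-lan"]   # fall back to the local disk

devices = {"local-disk-lan": 0.5, "neighbor-box-lan": 5, "far-archive": 60}
for kind, qos in POLICIES.items():
    print(kind, "->", choose_locations(qos, devices))
```

One could imagine the swap manager, the word processor, and the mp3 player each asking the file system for a different policy like this, instead of every application hard-coding a location.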
Microsoft's Cairo (Score:2)
This one has articles on Cairo and Copeland, so I'm glad I kept it.
But it's at work now, and I'm not. I didn't realize when I noticed the magazine today under my box of microwave popcorn that re-reading the article would be timely until seeing this thread.
Someone explain to me.. (Score:2)
..why Microsoft's Acquisitions department is making technical decisions about the future of Operating Systems?
Mozart (Score:2)
Abstraction is great. (Score:2)
Has anyone been following the trends of viruses over the past decade? As computers become more user-friendly, we allow more dumb people onto the Internet. Each new, more abstract version of Windows brings teeming waves of imbeciles onto the Internet's sandy shore. Beached, they lie idle in the bile-soaked sand, stinking up the coast. We can clean up the mess by requiring that people take a simple test before they are allowed to use their computers. It could be part of the license agreement. We could call it the Garbage Prohibited License.
The dumb people can have a separate dumb internet using a proprietary protocol developed by Microsoft.
Fascinating. Yes, fascinating! (Score:2)
But wait--are they? The question that comes to mind for me is WHY are they experimenting here? On the one hand, there's the standard MS approach--anything to make a buck, and gain market share. The Borg approach, in other words: Rewrite the definition of the OS or the internet, until you own it all.
But then you see this statement:
"We do not harbor the conceit that it will be possible to be fully successful in such an endeavor, but we do feel that the time is right for radical experimentation."
The first part sounds like honest programmers, and the second part sounds like geeks. Could it be that (gasp!) MS has some good people working for them? Some people who really _do_ want to push the envelope a bit, regardless of the corporation's intent?
At any rate, I find it interesting and slightly ironic that this is coming from the company who first made >90% of the population aware of (or care about) what their OS actually was.
Oops! Almost forgot (Score:2)
I can see it now: "Microsoft: The fattest thin client ever created!"
It's nice to see MS is not changing... (Score:2)
This reminds me of MS Press's book Code Complete. All the way through it they harp on stability and design and usability, and then they go off and release some of the buggiest code this side of my first ZX81 programs.
MS is doomed folks.
This is MOSIX (Score:2)
Distributed computing? Automatically deciding if a program should run locally or on a remote machine? Fault-tolerance? Dynamic load-balancing? Resource controls? Near-infinite scalability?
Sorry Microsoft, but you're the one playing catch-up here. Linux already has, working, today, 98% of your vision.
It's called MOSIX [mosix.org].
Frankly it's the most jaw-dropping bit of Linux development I've ever seen. On a local network, create your own supercomputer out of idle workstations. Across the internet - well, .NET should go hang its head in shame. As a programmer, all you have to do is write ordinary, threaded applications, and magically benefit from the processing power of tens, hundreds, even thousands of machines. MOSIX does all the hard stuff.
Truly an amazing piece of work.
This is UCLA LOCUS (Score:2)
As for the automated network administration thing, AppleTalk networks did that from day one. That approach didn't scale (too much broadcasting), and the security was lousy (a more fundamental problem with plug-in and go).
an astounding lack of vision (Score:2)
I think it's unlikely that Microsoft will do better with Millennium than an open source operating system that already exists: Plan9 from Bell Labs [bell-labs.com]. Plan9 already supports location independence, aggressive abstraction, introspection, and all the other stuff that is in Microsoft's vision (Inferno, which the paper cites, is somewhat based on Plan9). The limitations Plan9 has (and it has many) are, I think, intrinsic to this vision, and I doubt traditional operating system designers are equipped to deal with them--otherwise, they would have already done so over the last few decades. And nothing in Microsoft's paper suggests that Microsoft is straying outside this well-grazed field.
Altogether, it looks like Microsoft is going to do what they always do: they are 10-20 years behind the curve, and they are working on another unimaginative, outdated operating system.
Normalization (Score:2)
There, you have the future of computing.
How long will it take? Depends on how many glasses of wine the engineers the world over drink between now and then.
Re:stupid people will require stupid OS's (Score:2)
Re:stupid people will require stupid OS's (Score:3, Insightful)
In response to a lengthy description of a drastically different computing paradigm, jimarndt responds that it should be 'easy to use'.
I'll bet jimarndt didn't read the article at all.
What Microsoft Research is suggesting is a network computing model unlike anything I've previously heard of.
Read this (from the article):
A user purchases and installs a new personal computer or workstation. The hardware vendor has done a good job with the cables and connectors, so plugging the system together is easy. The user plugs the power and network cables into the wall and flips the power switch. From the moment that a boot ROM, or perhaps a boot loader on disk, downloads code from the network, the new machine joins the Millennium system. The user has full access to Millennium with no human-managed configuration activities required. Millennium evaluates the hardware resources of the new computing device that it has acquired and starts to shift computations and data in response.
Wow! I'm all for a web-services model. I like the idea of blurring the line between an application and a really useful website. But this is suggesting something much, much more! Who controls the "Millennium" system? Obviously, they've got a certain company in mind.
Think about privacy on a system like this!
Think about the potential security risk on a system like this (sure, security is on their list of goals, but that doesn't really make the problem go away!). In the "A new network" example, the researchers say "The administrator inserts a Millennium installation DVD disk into one of the machines and the system propagates across the network." Imagine that I insert that DVD, only I'm not the administrator and the DVD isn't the system update. Think it won't happen? As we saw during Code Red, the Windows Update servers weren't even properly protected! Hackers would have a field day if the Millennium system were ever made a reality.
I think a much better paradigm for the "new millenium" (how long does it stay new?
Microsoft wants the end-users of the world to put all their eggs in one basket, and I don't think that's a good idea.
Re: (Score:2)
Re:To be absolutely honest (Score:2)
.technomancer
Re:Plan 9 (Score:2)
(BTW, they do give credit to Inferno, an embedded little brother of Plan 9.)
Re:BSOD (Score:2, Informative)
I've seen it blue-screen straight out of the box from Dell and Gateway.
I've seen Windows 2000 machines not only blue-screen, but reboot themselves when Windows Media Player or QuickTime starts to play video clips.
I've seen Windows 2000 blue-screen when a USB keyboard is plugged in.
It's better than NT or 9x, but it still sucks.
Re:BSOD (Score:2)
So if the computer crashes, taking everything you had open with it, it's okay as long as it reboots itself instead of making you do it yourself?
Re:one thing missing (Score:2)
Actually, I think that's a pretty impressive statement coming out of Microsoft.
Re:one thing missing (Score:2)
More like Plan9 (Score:2)
We haven't had any news about Plan9 in quite a while. Could someone in the know shed some light as to what's happening with it now?