A New Kind of OS 393
trader writes "OSWeekly.com discusses the possibility of futuristic OSes, with both negatives and positives. From the article: 'Imagine, if you will, a world where your ideas and perhaps even your own creative works became part of the OS of tomorrow. Consider the obvious advantages to an operating system that actually morphed and adapted to the needs of the users instead of the other way around. Not only is there no such OS like this, the very idea goes against much of what we are currently seeing in the current OS options in the market.'"
It's like nothing we've seen .. since Linux (Score:5, Insightful)
Consider the obvious advantages to an operating system that actually morphed and adapted to the needs of the users instead of the other way around. Not only is there no such OS like this, the very idea goes against much of what we are currently seeing in the current OS options in the market.
I don't know about the parent, but when I build a kernel I don't just default to everything. I build for what I'll need. If that changes significantly then I'll do another with different options and settings.
While it may seem novel to "morph" to what's currently needed, it's not really so revolutionary an idea. Operating systems once cleared unused libraries out of memory (rather unlike the way Windows behaves, loading 385 MB of junk you just might need during a session) and dynamically adjusted the amount of processor priority and time (Priority and Run Burst) each task was assigned, depending upon system load, etc. Some things appear to have gone backward as we've grown more dependent on ooh, shiny! features, whistles and bells.
Maybe like NASA digging up how they once did the Apollo Moon missions, to relearn, it's time for some of the people who do operating systems today to look back at how we did things 20-30 years ago.
It's been done (sort of) (Score:5, Insightful)
As a Mac user who has to interact with PCs quite often at work, I find this not only not helpful, but completely obnoxious. I realize this is probably due to MS's fairly awful learning algorithm, but I think the lesson here is that it's going to take a long, long, long time before anything like this can make its way to the desktop without pissing off 50% of the users. Or more.
Tedious... (Score:5, Insightful)
I'm not sure 'bout that (Score:3, Insightful)
Trying to fix someone's computer with an adapted OS would be a real pain, and asking for help via email would be next to impossible, because your options could be in a different place.
Even today's OS adaptability can be unnerving. I get used to using something from the top N programs on the Start Menu (sorry, no Linux on the work computer), but when it gets bumped off because Windows thinks I used something else more often, I'm confused for a few seconds, just enough to be annoyed.
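The complaint above is easy to reproduce with a simple launch-count ranking. This is only a guess at the mechanism (the real Windows algorithm is undocumented); a short burst of use for one program reorders the whole list:

```python
from collections import Counter

def top_n(launches, n=3):
    """Return the n most-launched programs, most frequent first."""
    return [prog for prog, _ in Counter(launches).most_common(n)]

history = ["mail", "mail", "editor", "editor", "browser"]
print(top_n(history))   # ['mail', 'editor', 'browser']

history += ["browser", "browser"]   # a short burst of browser use...
print(top_n(history))   # ['browser', 'mail', 'editor'] -- the list reorders
```

Two extra launches and everything the user has memorized by position is wrong.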
So my guess is that this "new kind of OS" won't succeed because of support hassles and confusing the user. But it'd be darn cool if those problems could be fixed.
Re:It's like nothing we've seen .. since Linux (Score:3, Insightful)
Good ideas (Score:5, Insightful)
If those questions had answers, someone would already be writing the "OS of the future." Sadly, at least in present and near-future technological terms, those questions don't have answers, and so they'll remain in the world of hand-waving prognostications about some techno-utopia.
Nothing to see here, move along. (Score:5, Insightful)
This article sounds like those articles from 1990 about the house of 2015, you know, the ones talking about how saying "light" will turn the light on, and how you will check and reply to your video e-mails from the big-screen TV in your living room... just like Back to the Future II.
My point is, I don't think you'll really see, or even want, a self-deciding or self-modifying OS, even if the idea sounds cool. Mod me down for this if you want, but I think this whole article is just some nearly-worthless futuristic rambling. Even if it's got some interesting ideas, don't pay attention.
Adaptation algorithm = boon for Spy agencies (Score:2, Insightful)
Adaptive OSes would be one step better, since breaking into your specific "morphing" would reveal more intimate data about the way you think and the importance you place on specific topics, based on the way you prioritize your email message accesses, etc. To some degree this is possible by cross-referencing cookie data between big corporate sites who just love one another. But adaptation potentially makes it much easier.
I'll bet that people will be clamoring to include morphing (if it ever exists) in web 2.0 type applications. I don't really understand this excitement. Your data is only as secure as the trust you place in the system admins of your site. No contract ever really guarantees they won't give in to law enforcement agencies who want to know what color underwear you like.
Interface, not OS (Score:4, Insightful)
Sure, when people talk about OS X they are often referring to its interface (Aqua), but an interface does NOT have to be integral to the OS.
Linux / X-Windows are the obvious example on Slashdot.
Re:Turning the computer inside out (Score:1, Insightful)
Basically, you have to have a system that can do both documents and applications well. Of course I believe that applications should still run on an orthogonally persistent fine-grained object database underneath, provided and managed by the system, but there is still the notion of a task that runs and completes.
Re:It's like nothing we've seen .. since Linux (Score:5, Insightful)
You know, as a programmer, I get really tired of people suggesting ways to program computers "without doing any coding". That's where BAD things come from. That's where "dynamically hiding menu items" comes from, so you never know where things are. That's where "visual programming" comes from, so you're staring at a screen full of boxes and lines with little to no organizational structure.
No. If you're gonna program a computer, learn how to program. The CS field as a whole apologizes for the fact that computers are hard. They are complex machines. Unfortunately it is not always easy to get them to work the way they should, or the way you want them to. But that's life. If you're not willing to learn how to program, you should be willing to learn how to use what other people have programmed, or learn how to write specs and make intelligent suggestions to the community. But this bullshit about "intelligently adapting the OS to a user's needs" is just asking for trouble. It's asking for "programming" without actually asking for any "design" or "specifications". It will end up being crap.
The fact is, making something "user friendly" means making the front-end simpler -- and thus making the back-end more complicated. But this complexity eventually compounds and compounds until the end user can't understand what's happening and gets confused. In the end, we learn that computers are easier to use if you understand the back-end, and that can only happen if you use a minimum of metaphor. That is, a straightforward system that is obvious and transparent.
The mistake that Windows and many GUI systems have made is in trying to HIDE the system in metaphor. It always backfires, because although a transparent system may be harder to learn, it is far, far easier to deal with once the learning curve has been climbed. And since we've discovered that even the simplest metaphoric GUI requires "training", well.. you may as well train the end user how it actually WORKS instead of trying to hide it from them in a bubble of "interface".
Of course, that's just MHO. Though I believe Neal Stephenson [cryptonomicon.com] agrees with me.
(My apologies to the parent. My comments aren't really directed at you, per se, I just get tired of people suggesting that computer programming should be effortless. Computer using should be easy, but programming is programming, if you know what I mean.)
What hogwash (Score:5, Insightful)
Re:It's like nothing we've seen .. since Linux (Score:3, Insightful)
I have seen some of these things come and go. For example, I remember when VB6 came out and there was a lot of talk about how it would be the end of C++: why ever write an actual Win32-based application when it is easier to just crank something out in VB in a shorter time?
At the time, I remember some Windows C++ guys who I worked with being all like, "I guess I will have to find another career because I really don't want to be a VB programmer".
Well, it didn't happen.
This kind of statement, that there will be some new revolutionary thing where computers can do things they didn't do before without having to be programmed: if you can really do it, then more power to you, but my guess is that it just won't be possible.
Re:It's like nothing we've seen .. since Linux (Score:5, Insightful)
Things have obviously changed quite a bit; you don't have to be a programmer to get WYSIWYG editing and print output anymore. It may not seem like it from here, but there are probably a lot of functions that most people consider "programming" that will fall into the same category at some unspecified point in the future. All programming really does is interface with the machine at a slightly more complex level than the average user does. We're just talking about improving the interface to the point where some things which now require "programming" will simply be "using" instead... and programmers will move on to more complicated arenas.
Macros or mail filters or Netflix's recommendation system are all ways that average users basically program computers today without any hardcore CS education. Ten or twenty years ago, they would have required such a background to accomplish the same tasks, but no one really considers it "programming" today; there is no reason that many other functions that we currently think of as programming won't become similarly easy or transparent.
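A mail filter really is a tiny program: an ordered list of condition-to-action rules the user writes without thinking of it as code. A minimal sketch (the field names, addresses, and folder names are all invented for illustration):

```python
def apply_rules(message, rules):
    """Return the folder of the first matching rule, else 'inbox'.

    message: dict of header fields; rules: (field, substring, folder) triples.
    """
    for field, substring, folder in rules:
        if substring.lower() in message.get(field, "").lower():
            return folder
    return "inbox"

rules = [
    ("from", "boss@example.com", "urgent"),     # hypothetical addresses
    ("subject", "newsletter", "reading"),
]

print(apply_rules({"from": "Boss <boss@example.com>", "subject": "Q3"}, rules))
# urgent
print(apply_rules({"from": "list@example.net", "subject": "Weekly Newsletter"}, rules))
# reading
```

A user building this in a mail client's rules dialog is specifying exactly the triples above; only the syntax has been hidden.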
There will always be the wizards responsible for writing the code that puts those things into place, and so that's where I agree with you--if you want to be a coder, go learn to code. In that sense, programming will always be programming, but I think the common definition of the word is a necessarily moving target.
Re:Good ideas (Score:4, Insightful)
You may as well talk about the OS of the future which just has a single button in the middle of the screen that says "do what I want". The gap between intention and action is bad enough, trying to model future intention based on past action is just asking for trouble.
Re:What hogwash (Score:5, Insightful)
I certainly did not mean to imply that there's anything wrong with a GUI. But there IS something wrong with dynamically hiding parts of a GUI based on some unspecified learning algorithm.
Do you understand what I mean?
Computers should be transparent and obvious, THAT is what makes them easier to use, not artificially messing with the interface to pretend the "hard parts" don't exist. But that doesn't mean we shouldn't be able to use the mouse to interact with them. It just has to be designed well -- meaning everything accessible in a logical manner, whether it is through the keyboard or the mouse.
Re:What hogwash (Score:3, Insightful)
Fine. A command line interface was what we had "in the beginning" because it was all the hardware could support.
But you know what? When you're working that closely with a system, you learn it better! No, typing "mv *.txt ../textDocuments" won't teach you a whit about x86 assembly, but it will get you thinking about directory structure in a way that explorer.exe prevents you from doing. Using a text editor and a typesetting program like LaTeX can help you format well-structured documents with an ease that winword.exe will never be able to match.
I do not nostalgically pine for CLIs. But on my Powerbook, the two most used programs are Terminal.app and Vim.app -- and ls, find, and grep get me through my chores quicker than graphical interfaces do.
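The kind of chore meant here (find files, grep their contents) is also a few lines in any scripting language. A sketch mimicking a crude find + grep; everything runs against a throwaway temp directory, so no real paths are assumed:

```python
import pathlib
import tempfile

def grep_files(root, pattern, glob="*.txt"):
    """List files under root (recursively) whose contents contain pattern."""
    return sorted(str(p) for p in pathlib.Path(root).rglob(glob)
                  if pattern in p.read_text())

# demo on a disposable directory
with tempfile.TemporaryDirectory() as d:
    pathlib.Path(d, "notes.txt").write_text("TODO: fix the build")
    pathlib.Path(d, "done.txt").write_text("all clear")
    hits = grep_files(d, "TODO")
    print([pathlib.Path(h).name for h in hits])  # ['notes.txt']
```

The shell one-liner `grep -l TODO *.txt` does the same job in one line, which is the parent's point about chores.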
Re:It's like nothing we've seen .. since Linux (Score:1, Insightful)
I am a professional programmer, I write code just like anyone else, but I have to say that things like visual programming languages have their place. Pure Data or Max/MSP, for instance, for writing programs dealing with digital audio synthesis, are fucking brilliant. VVVV for doing 3D graphics pixel shader pipelines and controlling them with MIDI...
I suppose if I want to tinker at making an FM synthesizer in Max/MSP that must be a bad thing, because I didn't whip out C++ and fucking code everything from scratch. Wow, I might even get results in an hour instead of a day. Wow, I might use this as a prototype to later write a VSTi in C++.
I suppose someone who can't write programs in a conventional programming language should be forbidden from having something easy to write programs in then too? They'd better bite the bullet and go take CS if they wanna create anything...
Bullshit.
Here's How That Works (Score:5, Insightful)
Re:Adaptation algorithm = boon for Spy agencies (Score:1, Insightful)
Re:It's like nothing we've seen .. since Linux (Score:2, Insightful)
yeah. ok. if you don't like the fact that people expect programmers to be the people programming, maybe you should be in a different field.
The problems with this (Score:2, Insightful)
Now I'm not saying that this isn't an improvement. However, I am saying that as the number of options pertaining to a particular decision grows, it becomes harder for us to choose rationally. More options means more evaluation the user must perform to sort through those choices. In the real world we can do this simply through reflex and learning. We've evolved over thousands and millions of years and individually learn from experience. We start off with parents guiding us, a default factory human setting if you will. In the real world our senses have also evolved to detect not everything, but what is relevant. We are omnivores who tend to eat a lot of fruit, so we have good color vision to tell at a glance what is tasty and what is not. We do not hunt at night, so we have poor night vision. The sense of touch is much more acute in our fingers than our toes, the better to use tools.
The problem is that when we make adaptive computer systems, we must also be sure the options are intuitive. Just look at the Unix shell for a somewhat non-intuitive example; many aspects of Windows are worse, since we can't even see into much of its inner workings.
Revolutions make good PR slogans but generally bad development models. The way forward to the "OS of the future" is to keep developing what we have now (my own bias is Debian/Ubuntu Linux), and perhaps shift the focus a bit toward both the backend of how the computer runs the code and the backend of how the user interacts with the computer. While a bell or whistle may be a few wavy lines highlighting some widget on a screen, if it helps you find that widget when you need it then it has probably earned its keep.
Re:What hogwash (Score:2, Insightful)
"mv *.txt
means move all text documents in current folder to the textDocuments folder which is contained within the parent folder of the current folder...
How is it that my Windows GUI drag-and-drop doesn't allow me to understand this? Wait, that's right... it does.
CLI has its place, but in my experience I've been able to do a great number of things that a Linux guru can do in the CLI with my Windows GUI. Perhaps it's a matter of what technology you grew up using?
Metaphors aren't all bad (Score:3, Insightful)
While I agree completely with the point you're making, the first sentence of this paragraph seems to contradict it. Unless you just mean that when people *commonly speak of* making things "user friendly", they're talking about hiding things for the sake of simplicity while actually adding complexity. But that's certainly not the way it has to be, which seems to be the overarching point you're making.
People (users and programmers) seem to think that making something user-friendly involves hiding complexity behind some kind of "wizard" or cheesy metaphor, because the legacy systems underlying most computers in the world are irreversibly complex kludges full of inconsistencies and weird hacks. When you have a very complex system with lots of little rules and exceptions to those rules, it's damn near impossible to make that "user friendly" without hiding it all under the rug.
Instead, the better approach - though I understand why it's largely impractical for reasons of backwards compatibility - is just to have the underlying system less complex to begin with, and then let the users look "directly" at that - whether that be via GUI or CLI doesn't matter. Have the underlying system operate according to the smallest number of rules that can be consistently adhered to while still achieving the necessary functionality, and then present the operation of those rules to the user in the clearest and most consistent manner possible. Maybe attach a metaphor to each rule (e.g. the "directories are like folders, disks are like filing cabinets" analogy works well), but make sure that there's a 1:1 correlation between metaphorical objects and the "real" (logical) objects dealt with by the computer.
For an example of how NOT to do things... One of my biggest complaints about Apple's declining UI standards (which used to be top-notch) has been the way that iPhoto organizes its albums (or at least used to... I've been told this is different in newer versions, but haven't confirmed that). When you create an album in iPhoto, and move your photos into there, it's not actually creating a folder on the HD and moving or even copying your photos into that. The iPhoto album grouping is contained entirely in iPhoto. This means that if I want to send someone an album, I can't just zip some folder in my iPhoto library and send it... I've got to go into iPhoto and "export" those photos. (This has the further fault, aside from not accurately mapping metaphor onto what's really happening with the data, of reinforcing the false notion that files are "in" programs. I get so tired of users telling me, when asked where such-and-such file is, that it's "in Word". I'd go so far as to abolish all Open dialogs if it'd make users realize there's a structure on their disk organizing their files, and that files don't live inside of programs).
So yeah. Metaphors aren't bad, so long as the metaphor accurately maps on to what's really happening - i.e. so long as the system is transparent. If you've got a very complex system underneath, making it transparent is going to make it user-unfriendly. But hiding what's really going on will also make it user-unfriendly, for the reasons you stated in the quoted paragraph. So your only solution is to have the underlying system itself be simple, but flexible and thus powerful, and then present that simplicity transparently to the user. Then your users won't have much to learn, and once they've learned those few rules, the entire possible realm of functionality in the system will be at their fingertips.
Re:It's like nothing we've seen .. since Linux (Score:5, Insightful)
Indeed. In fact, Microsoft developed the very feature this article is describing, and they named it 'Clippy'. The rest, as they say, is history.
Re:Thank You!!!! (Score:2, Insightful)
- want to use a USB device.
- use a standards compliant web browser (although I think Firefox is still available for Windows 95).
- want to network securely.
- want to leave your PC on for more than 6 hours.
- install a wireless network interface.
- want to run a secure operating system.
- realise you're running an OS which is as old as Sega's Daytona USA.
Hey, I'm sure it's working for you, but it simply won't cut it for me.
Re:Other users? (Score:5, Insightful)
sort of how your tivo starts to think you're gay because your girlfriend keeps recording oprah?
TFA was completely worthless. besides the whole "big brother" strawman the author sets up, there are so many other issues that are simply not addressed. he uses a silly example of having the computer learn that you don't like to be bothered with emails while working on a video editing project except for "critical emails". well, how does the computer "learn" this behavior? if you don't check your mail when you edit video, you're not likely to find the "critical" email. thus, the computer doesn't understand that an email from "bob my client" is somehow more important than an email from "my nigerian ancestor who is also a prince." if you DO check your email during your video editing session, i suppose the computer would think that you like to be bothered with your emails while you're working on video.
then you have to factor in the complexities of whether or not editing video is in the same importance category as photo retouching. and is that also as important as writing a letter? i think i'd rather my computer let me be the judge of whether or not an email is important to me and when. besides, there's no easy way for the computer to know if i'm doing "entertainment work" (in my case, farking a photo) or "work work" (retouching photographs for publication).
also, as anyone who's used any sort of "learning technology" like voice recognition or hwr can tell you, there's a long and frustrating process to getting the software to work even passably well. so i guess for the first six months or so of your new system you'll have your computer making all sorts of bad assumptions about your workflow and deciding to hide or highlight certain functions in your apps. while working within a traditional WIMP metaphor might not be the theoretically most efficient way to get work done, it's at least generally consistent. which, in turn, probably makes it the most efficient.
if i need a tool, i want it to be where i left it. i don't need my full set of hex keys as often as i need my cordless drill, but i sure don't need any magic gnomes running around hiding all my hex keys and replacing them with my drill (which i already have a place for).
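The cold-start problem this comment describes shows up in even the simplest learner. A toy sender-scorer (purely hypothetical; no shipping system is claimed to work this way) that ranks senders by how often their mail gets opened has, by construction, nothing to say about the one sender who matters but has no history yet:

```python
from collections import defaultdict

class SenderScorer:
    """Score senders by the fraction of their messages the user opened."""

    def __init__(self):
        self.opened = defaultdict(int)
        self.seen = defaultdict(int)

    def record(self, sender, was_opened):
        self.seen[sender] += 1
        if was_opened:
            self.opened[sender] += 1

    def score(self, sender):
        if self.seen[sender] == 0:
            return 0.0  # cold start: no history means no priority
        return self.opened[sender] / self.seen[sender]

s = SenderScorer()
s.record("newsletter@example.com", False)
s.record("newsletter@example.com", False)
# the critical client has never mailed before, so he scores zero:
print(s.score("bob-the-client@example.com"))  # 0.0
```

Until the user opens mail from the client a few times, the "learned" priority is indistinguishable from spam, which is exactly the six-months-of-bad-assumptions period described above.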
Re:What hogwash (Score:2, Insightful)
Re:It's like nothing we've seen .. since Linux (Score:3, Insightful)
Most programmers don't think everyone should know how to program, and I don't think this was the point the OP was making.
Many programmers believe that if someone wants to program then they should learn how to program. Sounds pretty reasonable to me.
The hard part of learning programming is not learning syntax - the hard part is learning to decompose tasks, spot edge-cases, handle bugs and generally think through problems in a logical, structured, consistent way.
"Making programming easier" so far in computing history has chiefly consisted of abstracting the syntax and trying to prevent inexperienced users from doing anything too stupid. Unfortunately unless you already know task decomposition, how to handle bugs and how to tackle various types of problem all the syntax-simplification in the world won't matter a damn.
It's like trying to make "writing stirring and emotive prose" easier by fiddling with the rules of spelling - sure, your essay might be spelled correctly, but that won't make it interesting, well-thought-out or persuasive.
Or think about it like making/maintaining cars - you can make it easier and easier for people to make their own parts, but you still won't end up with the utopian ideal of everyone driving customised electric supercars that get a thousand miles on one charge.
What you'll get is a bunch of people wobbling along at 50 in piece-of-shit rustbuckets that run on leaded petrol, with bits dropping off left and right.
After a while, when trained mechanics and car-designers have seen enough cars spontaneously disintegrating at 50 miles per hour, injuring or maiming the occupants in the process, they might just start to wonder if that whole "empowering users to make their own cars" thing was a bit of a stupid idea. If maybe, just maybe, while users are inordinately proud of their hideous homebrew best-practice-bereft lash-ups... just maybe they'd actually be happier and safer driving nice, boring cars designed by people who actually know what they're doing. Or at least, if there was some sort of accreditation process necessary before people started letting their entire businesses rely on said junk-piles.
The hard part of car design is not making the components - it's knowing how to design a car.
The hard part of programming is not writing the syntax - it's knowing how to design the program.
Simplifying the process of designing a program has almost nothing to do with simplifying the syntax.
It's not elitism, it's just someone who knows what they're talking about and isn't communicating the reasoning behind it to you.
Actually, I suspect many programmers would love it if people left the programming to them. It'd wipe out a whole generation of half-arsed Excel Macro/MS Access/VB/VBA/VBScript abortions that we then have to take on, fix/re-write from scratch and maintain for you. I know I would.
All in all, it's not that programmers think everyone should be forced to learn to program (although I do personally believe they should teach it a bit more in schools, especially these days). However, many computing and IT professionals (hell, I'd be willing to bet many professionals in almost every professional field) do believe that if you want to do the job, you should learn how to do it first.
This isn't elitist, any more than "if a job's worth doing, it's worth doing right".
And frankly any opposition to that, far from being anti-elitist, sounds more like being pro-incompetence.
Also, and incidentally (as is traditional on
Re:What hogwash (Score:3, Insightful)
What, apart from linking to the Stephenson essay about the command line, describing it as agreeing with your stance, you mean?
And saying the GUI hides the system in metaphor implies you prefer direct access to "the system" without a GUI... meaning... what? Well, unless you intend people to use "the system" with little electromagnetic tweezers to flip bits inside the hardware, I think assuming you meant the CLI was a fair guess.
Anyway your entire rant was completely misplaced.
I get really tired of people suggesting ways to program computers "without doing any coding".
Grandparent didn't say he wanted to program computers without doing any coding. He said he wanted an OS which loaded appropriate stuff according to his changing needs without doing any coding. Yeah, I'll agree completely that if you want to program a computer, then STFU and code. Little draggy-droppy things as some sort of replacement for coding is a bit ridiculous. But unless you consider all computer use to be "programming" a computer, then surely you can accept there is a vast swathe of things the computer should be able to do without writing code.
Re:It's like nothing we've seen .. since Linux (Score:3, Insightful)
I get sick of authors that think everyone should know how to read and write.
Actually, no I don't.. that was sarcasm.
Programming isn't like learning to maintain your own car. It's a general-purpose ability to express particular thoughts in a structured way such that one of the most powerful general-purpose tools in the world can be applied to them. It's worth learning for EVERYBODY. You may not realize it, just as 2000 years ago people may not have realized the value of an entire society that could read and write.
Speaking as a programmer... I don't want programmers to be the scribes of the 20th century. We should not be gatekeepers to this powerful system.
-Laxitive
Re:It's like nothing we've seen .. since Linux (Score:3, Insightful)
Part of the idea of a personal computer is that the average user can "program" the computer to do tasks. If I want to reduce a set of tasks that would typically require me to perform many repetitious things on the computer, to something that the computer can do for me all at once, should I have to A) wait for some company to produce software for me to buy that does this, or B) go to school and learn software engineering (your apparent solution), or C) have an easily-approachable means of communicating to the computer what I want done (scripting language, etc)?
Programming doesn't have to mean being a computer programmer by trade, rather, being able to instruct the computer to do something it isn't pre-set to do. There is a place between people who use their computer as a glorified TV and people who are engineers. These people should be able to get more out of their computer without going the full route of being a computer professional.
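Option C above, in practice: a short script for a repetitious chore, here batch-renaming files. The directory, prefix, and filenames are invented for the example, and the demo runs entirely in a throwaway temp directory:

```python
import os
import tempfile

def add_prefix(directory, prefix, extension):
    """Rename every file with the given extension to prefix + old name."""
    renamed = []
    for name in sorted(os.listdir(directory)):
        if name.endswith(extension) and not name.startswith(prefix):
            new_name = prefix + name
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, new_name))
            renamed.append(new_name)
    return renamed

# demo on a disposable directory
with tempfile.TemporaryDirectory() as d:
    for n in ("a.txt", "b.txt", "c.jpg"):
        open(os.path.join(d, n), "w").close()
    print(add_prefix(d, "holiday_", ".txt"))
    # ['holiday_a.txt', 'holiday_b.txt']
```

Fifteen lines of scripting replaces a click-rename-click loop, which is exactly the middle ground between glorified-TV users and engineers that the comment argues for.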