A Closed Off System?
AnarkiNet wonders: "In an age of malware that installs itself via browsers, rootkits that install themselves from audio CDs, and loads of other shady things happening on your computer, would a 'Closed OS' be successful? The idea is an operating system (open or closed source) which allows no third-party software to be installed, ever. Yes, not even your own coded programs would run unless they existed in the OS-maker-managed database of programs that could be installed. Some people might be aghast at this idea, but I feel that it could be highly useful in, for example, a corporate setting, where there would be no need for a secretary to have anything on his/her computer other than the programs available from the OS-maker. For now, let's not worry about whether people can 'get around' the system. If each program that made up the collection of allowed programs was 'up to scratch' and had 'everything you need', would you really have an issue with being unable to install a different program that did the same thing?"
Wouldn't a live CD do this? (Score:5, Insightful)
code isolation (Score:5, Insightful)
Question moot. (Score:4, Insightful)
"If each program that made up the collection of allowed programs was 'up to scratch' and had 'everything you need',"
Considering that is impossible, the question is pretty much moot, isn't it? I am always going to find new needs, and chances are I'm going to need a new piece of software to meet them. Even if an OS shipped with "everything", new things are invented all the time. Maintaining a "Closed OS" to allow for new things would be difficult, and keeping it relatively up to date even more so... but then it wouldn't really be closed if new stuff kept getting added to it...
Same thinking? (Score:2, Insightful)
No. - Re:Wouldn't a live CD do this? (Score:5, Insightful)
console? (Score:5, Insightful)
Have had it for almost 30 years! (Score:5, Insightful)
as a software developer... (Score:3, Insightful)
As a component of a larger, networked system, which had parts where I could install and run the software I was developing, then yes, no problem. But alone, by itself, no, it would be completely useless.
Of course, there are still some interesting questions about this theoretical beast. Is it scriptable? I often have quick one-off tasks that are best done with a quick script. If I can't run one-off scripts, then it's not "up-to-scratch" and doesn't have "everything I need", and if I can, then it's not a completely closed, locked-down system. The only way around that, even in theory, is to have an infinite number of monkeys providing you with all the scripts you could ever need in advance, and even then, there'd probably be some difficulty finding the script you need right now from that infinite number of scripts. (Not to mention the costs of the infinibyte drives needed to store all those scripts.)
Bottom line, I think the notion of a machine that does "everything I need" is about as realistic as those old concepts of an irresistible force or an immovable object. Nice for creating logical paradoxes, but completely silly otherwise.
Re:Smith-Corona to the rescue! (Score:3, Insightful)
Re:On the subject of the CD Rootkit... (Score:3, Insightful)
Re:Vista + 'DRM' Hardware (Score:1, Insightful)
*groan* (Score:5, Insightful)
Oh, for fuck's sake! Don't give them any more ideas.
The extra cost of technology staff and the risk of a shittastrophe are nothing compared to abysmal employee morale. If you don't let 'em stroke off for a few minutes a couple of times an hour by going to ebay or playing snood you're going to end up with a resentful staff. And they'll produce awful, crappy work for you.
Re:No. - Re:Wouldn't a live CD do this? (Score:3, Insightful)
Yes, but you had to go out of your way in order to achieve this, right? That is, it's not something that happened because something you downloaded off the net did away with the "protection" MS had originally installed on the machine. (Besides, as far as I know, only the bootloader needs to be on the blessed list.)
Of course, everything is fallible. And besides, if every single executable code had to be signed and verified, how expensive in terms of CPU time would that be?
Hypothetical question: "lusers" as decoys (Score:5, Insightful)
If we (hypothetically) closed off the "stupid user" vulnerabilities that are the major attack vectors right now, wouldn't the malware authors instead just concentrate on other, more technical, avenues of attack?
Here's my thought: maybe having systems vulnerable to idiot users is actually a good thing for the informational ecosystem as a whole. They're more than just the canaries in the coal mine (although they serve that function, too), they provide a steady stream of marks for the virus/trojan/malware writers and phishing-scheme authors of the world.
If these people weren't able to basically throw themselves on the swords of their own stupidity on a regular basis, couldn't this just lead to smarter malware, which affected more of us (not just the stupid/ignorant)?
Malware authors are inherently lazy and opportunistic. While there are still lots of "the monkey told me to click it so I did" people around, and ways to exploit this idiocy, that's what they're going to do. They're not going to mess around with esoteric buffer overflows to steal your information, when they can just send out some fake PayPal emails and watch the data roll in.
Given the choice, I'd rather have the primary attack vectors be ones that rely on user stupidity, rather than technical flaws, because 0-day technical flaws are too 'egalitarian,' attacking both the clueless user and the experienced person without warning. Personally, anything that keeps the collective attention of the Russian Mafia focused on people too dumb to check the URL line in IE before typing in their bank account information is a good thing in my book.
I know this isn't a very nice sentiment to hold, but if there was some hypothetical way to remove user stupidity as a vulnerability (not possible, so this is all just a mind game), maybe we'd be better off not implementing it?
I'm not suggesting that we shouldn't attempt to educate people on good computing practices, but if people are too lazy or uninterested to become educated, maybe in their laziness they can do the rest of us a favor by acting as the collective decoys?
Re:No. - Re:Wouldn't a live CD do this? (Score:2, Insightful)
Yes the bootloader only needs to be on the blessed list, but in the absence of a blessed bootloader which allows arbitrary code to execute...
To your last point, signing and verifying every executable is not a heavy CPU tax. The real issue is the granularity, and whether you can prevent any executable that intentionally or unintentionally allows arbitrary external code to be executed from getting blessed.
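On the CPU-cost point, a per-launch hash check really is cheap. Here is a minimal sketch of the "blessed list" idea in Python; the path and digest are invented for illustration, and a real OS would verify a vendor signature over the table and enforce the check in the loader, not in a userland wrapper like this:

```python
import hashlib
import subprocess
import sys

# Hypothetical allowlist: in a real system this table would be
# cryptographically signed by the OS vendor; here it is a plain dict
# mapping paths to known-good SHA-256 digests (placeholder values).
BLESSED = {
    "/usr/bin/editor": "d2c1e0ff...",  # invented digest for illustration
}

def sha256_of(path: str) -> str:
    """Hash a file incrementally so large binaries don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def run_if_blessed(path: str) -> None:
    """Launch the binary only if its digest matches the blessed list."""
    expected = BLESSED.get(path)
    if expected is None or sha256_of(path) != expected:
        sys.exit(f"refusing to run unblessed binary: {path}")
    subprocess.run([path], check=True)
```

The hashing cost is one sequential read of the file at launch, which is small next to the I/O of loading the binary anyway; the hard part, as noted above, is that blessing the interpreter blesses everything the interpreter will run.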
Not on my PC (Score:3, Insightful)
The concept is also flawed. Just because something isn't an executable doesn't mean it can't contain instructions that tell your computer to do something. Word macro viruses are a great example of this kind of problem: it's just a simple word-processing document, but it can also be a virus.
This is not the answer to computer security.
Re:Treacherous Computing (Score:3, Insightful)
I've done some freelance computer work for people who don't know all the technical stuff about computers. This normally relates to spyware/malware/viruses/etc. The great majority of the spyware and malware is self-installed: downloading cutesy screensavers or cursors or backgrounds that come with all manner of desktop search tools and search bars. When an Athlon 64 3800+ with 2 GB of RAM and 10,000 RPM SATA drives in a RAID array is slowed to a crawl because of too much crap (with antivirus and antispyware software installed), something is wrong.
I've even seen half the spyware-removal programs show up as spyware themselves in AdAware!
We're getting to a point where security FROM the user is almost if not more important than security FOR the user.
Re:*groan* (Score:2, Insightful)
Me: I would leave the internal network detached from the Internet and remove all external sources of input except the keyboard/mouse, and put the OS on something read-only. Nothing gets in, nothing gets out. Works for work, not for play.
-uso.
Re:Seems to be a matter of reading 'man fstab' ... (Score:3, Insightful)
or more likely:
Re:No. - Re:Wouldn't a live CD do this? (Score:3, Insightful)
No, the question is not a joke: what would such an OS do with ActiveX and Java? OK, they support digital signatures, so let's believe such a system would work.
And JavaScript? It's clearly executable, but would it be blocked? Who would use such a computer when 50% of websites aren't viewable without JS? Not to mention sites that exist only in the form of one SWF file...
On a server, JS would not be needed, but usually one needs customization in the form of scripts and so on. If the admin could self-sign programs on a second machine (and were careful to do that only with programs he wrote himself and is sure contain no malware), that could work.
Re:*groan* (Score:3, Insightful)
At least not where I live. Do you have internet terminals for employees at the gas station?
Are the guys at the foundry revolting because they can't browse eBay while waiting for the steel to cool?
Soft in the belly workers need to wise up.
Re:On the subject of the CD Rootkit... (Score:3, Insightful)
So your argument is basically that because trust can be misplaced, there's no point in having a trust system? Let's remove the classification system because the joint chiefs could be al-Qaeda members. Let's remove all digital signatures because the signing key might have been compromised. The point is whom to trust, and also to look out for misuse of the word "trust". For example, TCPA software is less trustworthy from a computer-security point of view because it can't be audited. By contrast, I trust the Debian signed packages, in the sense that they're official packages of the software and not trojaned versions. Of course, the maintainers of a package may install a trojan, but that's a lot less likely. Trust rarely works in absolutes; degrees of trust are the norm. Being in that database would be one of those levels, though honestly I don't see the big problem in corporate environments. Don't give users admin rights, and install only serious software downloaded or bought from official sites. I think the actual number of rootkit cases where those procedures are followed is almost zero.
Re:Wouldn't a live CD do this? (Score:3, Insightful)
As other people point out, this is not perfectly secure, because it doesn't prevent the device from loading software remotely and running it. However, it does reduce the scope for damage considerably: while you can't prevent data from being lost or corrupted, the real time sink in recovering from a subverted system is bringing the system back up to a state where you can trust it. You can reboot the system back to its original state, then plug in your updated virus/spyware scanner and run it.
From a computer science perspective, you can't really "close" a system completely, any more than you can have an organism that runs without RNA: the fact that instructions and data are the same thing is, at a deep level, part of our very concept of what a computer is. Take a file in a fairly complex format, say Microsoft ".doc". What is that file but a program executed by the Word doc interpreter to create a visual representation of a document? What is Word but a specialized compiler/interpreter for such programs? Thus, most of the non-trivial programs on an operating system are, in a sense, virtual machines. If any of these virtual machines has any kind of flaw, then a malicious programmer can get it to do things the user does not want.
If you're making a virtual machine, you want to limit what programs running on that machine can do on the underlying computer. Sun realized that when they sandboxed Java applets. The problem is that this is too restrictive to be popular with users. So you end up letting the user grant programs access to different resources. At that point, any vision of iron-clad security is gone, a victim of social engineering.