A Closed Off System?

AnarkiNet wonders: "In an age of malware which installs itself via browsers, rootkits installing themselves from audio CDs, and loads of other shady things happening on your computer, would a 'Closed OS' be successful? The idea is an operating system (open or closed source) which allows no third-party software to be installed, ever. Yes, not even your own coded programs would run unless they existed in the OS-maker-managed database of programs that could be installed. Some people might be aghast at this idea, but I feel that it could be highly useful, for example, in the corporate setting, where there would be no need for a secretary to have anything on his/her computer other than the programs available from the OS-maker. For now, let's not worry about whether people can 'get around' the system. If each program that made up the collection of allowed programs was 'up to scratch' and had 'everything you need', would you really have an issue with being unable to install a different program that did the same thing?"
This discussion has been archived. No new comments can be posted.

  • by amanda-backup ( 982340 ) on Tuesday July 11, 2006 @10:52PM (#15703090) Homepage
    Doesn't a live OS CD such as Knoppix achieve this goal? These are usually built for "everything you need" for a particular purpose. You can still access and create data on disks on that system, but you never corrupt the programs themselves. If all the applications being used are web based, then things are even simpler - simply boot up with Knoppix, open Firefox and you are ready to go.
  • code isolation (Score:5, Insightful)

    by TheSHAD0W ( 258774 ) on Tuesday July 11, 2006 @10:56PM (#15703107) Homepage
    This would be "mostly secure", but unless strict data-space separation were in use, it might still be vulnerable to a buffer overflow or similar attack that would allow arbitrary code provided as data to be executed. The attacker would use this opportunity to establish a "beachhead", modifying whatever integrity-checking system the OS is using to allow it to continue to exist.
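The overflow scenario above can be sketched in miniature. This is a toy model in Python with a made-up "stack" layout (real overflows involve machine code and calling conventions); it only shows how an unchecked copy of attacker-supplied data can clobber an adjacent control value:

```python
# Toy model: an 8-byte buffer sits next to an 8-byte "return address"
# slot on a fake stack. An unchecked copy of attacker data overwrites
# the slot -- the "arbitrary code provided as data" entry point.
stack = bytearray(16)
BUF, RET = slice(0, 8), slice(8, 16)   # buffer region, return-address region

stack[RET] = (0x1000).to_bytes(8, "little")   # legitimate return address

# 16 bytes of "data" for an 8-byte buffer: filler plus a fake address.
attacker_data = b"A" * 8 + (0xBAD).to_bytes(8, "little")

# The bug: should have been a bounded copy into stack[BUF] only.
stack[0:len(attacker_data)] = attacker_data

hijacked = int.from_bytes(stack[RET], "little")
print(hex(hijacked))   # control flow now points at the attacker's value
```

Strict data-space separation (e.g. a non-writable code region and a non-executable data region) is exactly what denies the attacker that last step.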
  • Question moot. (Score:4, Insightful)

    by The MAZZTer ( 911996 ) <.moc.liamg. .ta. .tzzagem.> on Tuesday July 11, 2006 @10:59PM (#15703117) Homepage

    "If each program that made up the collection of allowed programs was 'up to scratch' and had 'everything you need',"

    Considering that is impossible, the question is pretty much moot, isn't it? I am always going to find more needs for things, and chances are I'm going to need a new piece of software. Even if an OS shipped with "everything", new things are invented all the time. Maintaining a "Closed OS" to allow for new things would be difficult, and keeping it relatively up to date even more so... but then it wouldn't really be closed if new stuff kept getting added to it...

  • Same thinking? (Score:2, Insightful)

    by JayTech ( 935793 ) on Tuesday July 11, 2006 @11:02PM (#15703128)
    Isn't this the same exact thinking behind the TCPA planned by Microsoft & Co., where only "licensed" software would be allowed to run? Doesn't sound like a bright idea to me; in fact it sounds pretty scary.
  • by jdogalt ( 961241 ) on Tuesday July 11, 2006 @11:02PM (#15703129) Journal
    No. LiveCDs do offer read-only system images. But they do nothing whatsoever to prevent other programs from being run, e.g. programs downloaded from the net, or autorun (or manually run) from CD. LiveCDs give you the benefit that each reboot resets you to a known state. That is quite different from an OS which only allows programs from a blessed whitelist to execute. One scenario might be the discovery of a way to remotely log into the system. In the LiveCD case, the attacker can now run whatever program they want, and likely regain entry in an identical fashion should the system be rebooted. What the author of this post is interested in is a system that would not let an attacker with remote login execute any code not on the blessed whitelist. Now mind you, the idea that such a system would be 'invulnerable' is ludicrous. The XBox seems the quintessential example of a system which tried to achieve this design goal. My XBox currently runs ssh, freevo, and any executable I want, proving it is difficult to achieve a successful implementation of such a design. -jdog
  • console? (Score:5, Insightful)

    by minus_273 ( 174041 ) <{aaaaa} {at} {SPAM.yahoo.com}> on Tuesday July 11, 2006 @11:04PM (#15703136) Journal
    Anyone else think this sounds a lot like the Xbox 360? Encryption keys and all.
  • by JoeCommodore ( 567479 ) <larry@portcommodore.com> on Tuesday July 11, 2006 @11:06PM (#15703141) Homepage
    Let's see: the Commodore PET, Apple II, and TRS-80 were pretty much "can't touch this OS without a hammer" type computers.
  • by Xtifr ( 1323 ) on Tuesday July 11, 2006 @11:21PM (#15703192) Homepage
    ...I would have to say no. At least not by itself. It's pretty hard to develop software if you can't install and test the software you're developing somewhere! ;)

    As a component of a larger, networked system, which had parts where I could install and run the software I was developing, then yes, no problem. But alone, by itself, no, it would be completely useless.

    Of course, there are still some interesting questions about this theoretical beast. Is it scriptable? I often have quick one-off tasks that are best done with a quick script. If I can't run one-off scripts, then it's not "up-to-scratch" and doesn't have "everything I need", and if I can, then it's not a completely closed, locked-down system. The only way around that, even in theory, is to have an infinite number of monkeys providing you with all the scripts you could ever need in advance, and even then, there'd probably be some difficulty finding the script you need right now from that infinite number of scripts. (Not to mention the costs of the infinibyte drives needed to store all those scripts.)

    Bottom line, I think the notion of a machine that does "everything I need" is about as realistic as those old concepts of an irresistible force or an immovable object. Nice for creating logical paradoxes, but completely silly otherwise.
  • by bcat24 ( 914105 ) on Tuesday July 11, 2006 @11:22PM (#15703198) Homepage Journal
    But there are some people who use a computer for nothing more than word processing, web browsing, and email. A "closed off" setup might work for them.
  • by bersl2 ( 689221 ) on Tuesday July 11, 2006 @11:23PM (#15703203) Journal
    The whole shitstorm over "Trusted Computing" and this are essentially the same topic, and the issue is who has control over the access control list, the user-administrator or some other party. The feature can be used for good or evil, for lawfulness or chaos, just as with any other tool.
  • by Anonymous Coward on Tuesday July 11, 2006 @11:50PM (#15703273)
    I expect you'll be busier than you think signing software once you get what you've wished for.
  • *groan* (Score:5, Insightful)

    by voice_of_all_reason ( 926702 ) on Tuesday July 11, 2006 @11:53PM (#15703280)
    ...it could be highly useful for example in the corporate setting...


    Oh, for fuck's sake! Don't give them any more ideas.

    The extra cost of technology staff and the risk of a shittastrophe are nothing compared to abysmal employee morale. If you don't let 'em stroke off for a few minutes a couple of times an hour by going to ebay or playing snood you're going to end up with a resentful staff. And they'll produce awful, crappy work for you.
  • by secolactico ( 519805 ) on Wednesday July 12, 2006 @12:02AM (#15703303) Journal
    The XBox seems the quintessential example of a system which tried to achieve this design goal. My XBox currently runs ssh, freevo, and any executable I want, proving it is difficult to achieve a successful implementation of such a design

    Yes, but you had to go out of your way in order to achieve this, right? That is, it's not something that happened because something you downloaded off the net did away with the "protection" MS had installed originally in the machine. (Besides, as far as I know, only the bootloader needs to be on the blessed list.)

    Of course, everything is fallible. And besides, if every piece of executable code had to be signed and verified, how expensive in terms of CPU time would that be?
  • Speaking as a user who understands my computer reasonably well and doesn't click on stuff just because animated characters tell me to: would this be a good thing?

    If we (hypothetically) closed off the "stupid user" vulnerabilities that are the major attack vectors right now, wouldn't the malware authors instead just concentrate on other, more technical, avenues of attack?

    Here's my thought: maybe having systems vulnerable to idiot users is actually a good thing for the informational ecosystem as a whole. They're more than just the canaries in the coal mine (although they serve that function, too), they provide a steady stream of marks for the virus/trojan/malware writers and phishing-scheme authors of the world.

    If these people weren't able to basically throw themselves on the swords of their own stupidity on a regular basis, couldn't this just lead to smarter malware, which affected more of us (not just the stupid/ignorant)?

    Malware authors are inherently lazy and opportunistic. While there are still lots of "the monkey told me to click it so I did" people around, and ways to exploit this idiocy, that's what they're going to do. They're not going to mess around with esoteric buffer overflows to steal your information, when they can just send out some fake PayPal emails and watch the data roll in.

    Given the choice, I'd rather have the primary attack vectors be ones that rely on user stupidity, rather than technical flaws, because 0-day technical flaws are too 'egalitarian,' attacking both the clueless user and the experienced person without warning. Personally, anything that keeps the collective attention of the Russian Mafia focused on people too dumb to check the URL line in IE before typing in their bank account information is a good thing in my book.

    I know this isn't a very nice sentiment to hold, but if there was some hypothetical way to remove user stupidity as a vulnerability (not possible, so this is all just a mind game), maybe we'd be better off not implementing it?

    I'm not suggesting that we shouldn't attempt to educate people on good computing practices, but if people are too lazy or disinterested to become educated, maybe in their laziness they can do the rest of us a favor by acting as the collective decoys?
  • by jdogalt ( 961241 ) on Wednesday July 12, 2006 @12:22AM (#15703359) Journal
    "Out of my way" is as vague a phrase as "should". Yes I had to follow some instructions, but technically I'm also following instructions when I dial my phone.

    Yes the bootloader only needs to be on the blessed list, but in the absence of a blessed bootloader which allows arbitrary code to execute...

    To your last point, signing and verifying every executable is not a heavy CPU tax. The real issue is the granularity, and whether you can prevent any executable which intentionally or unintentionally allows arbitrary external code to be executed from getting blessed.
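For a rough sense of that cost, here is a sketch of a hash-based whitelist check in Python. The paths and the fake 4 MB "executable" are made up for illustration; a real system would verify cryptographic signatures on a trusted manifest, not look digests up in a hard-coded dict:

```python
import hashlib
import time

def sha256_of(data: bytes) -> str:
    """Digest used as the whitelist fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blessed list: path -> expected SHA-256 digest.
binary = b"\x7fELF" + b"\x00" * (4 * 1024 * 1024)   # fake 4 MB executable
whitelist = {"/usr/bin/demo": sha256_of(binary)}

def verify(path: str, data: bytes) -> bool:
    """Allow execution only if the file's digest matches the whitelist."""
    expected = whitelist.get(path)
    return expected is not None and sha256_of(data) == expected

start = time.perf_counter()
allowed = verify("/usr/bin/demo", binary)
elapsed = time.perf_counter() - start

print(allowed)                           # digest matches the blessed entry
print(verify("/usr/bin/evil", binary))   # not on the list, refused
print(f"{elapsed:.4f}s to hash 4 MB")    # milliseconds on modern hardware
```

Hashing a few megabytes at exec time is cheap; as the comment says, the hard part is deciding what deserves to be on the list, not checking it.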
  • Not on my PC (Score:3, Insightful)

    by egarland ( 120202 ) on Wednesday July 12, 2006 @12:32AM (#15703386)
    I have no problem with this setup if the computer is my cell phone. My PDA could be set up to only run signed apps; that wouldn't bother me much. But my PC isn't really a PC without the ability to accomplish arbitrary tasks.

    The concept is also flawed. Just because something isn't an executable doesn't mean it can't contain instructions that tell your computer to do something. Word macro viruses are a great example of this kind of problem. It's just a simple word processing document... but it can also be a virus. The .mp3 and .jpg buffer overrun bugs are great examples of this too. A format that doesn't even include programmability can be used to induce your computer to do something against your will.

    This is not the answer to computer security.
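The macro point can be made concrete with a completely made-up toy document format whose naive viewer hands embedded macro text straight to an interpreter (this is not how Word macros actually work; it just shows "data" becoming instructions):

```python
# A made-up format: plain text, except lines starting with "%MACRO%"
# whose remainder the viewer evaluates. The file is "just a document",
# yet viewing it runs code -- the macro-virus problem in miniature.
MACRO = "%MACRO%"

def view(document: str) -> list[str]:
    rendered = []
    for line in document.splitlines():
        if line.startswith(MACRO):
            # The dangerous step: document content interpreted as code.
            rendered.append(str(eval(line[len(MACRO):])))
        else:
            rendered.append(line)
    return rendered

benign = "Quarterly report\nRevenue up 5%"
print(view(benign))

sneaky = "Quarterly report\n%MACRO% 6 * 7"   # stands in for a real payload
print(view(sneaky))   # merely "viewing" the document executed something
```

A whitelist that blesses the viewer binary does nothing here: the viewer is unchanged, and the hostile instructions ride in as data.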
  • by MaverickUW ( 177871 ) on Wednesday July 12, 2006 @01:55AM (#15703601)
    I hate to say this, but while the idea of security from the user, instead of for the user, sounds insane, it's probably very needed and very valid.

    I've done some freelance computer work for people who don't know all the technical stuff about computers. This normally relates to spyware/malware/viruses/etc. The vast majority of the spyware and malware is self-installed: downloading cutesy screensavers or cursors or backgrounds that come with all manner of desktop search tools and search bars. When you have an Athlon 64 3800+ with 2 GB of RAM and 10,000 RPM SATA drives in a RAID array slowed to a crawl because of too much crap (with antivirus and antispyware software installed), something is wrong.

    I've even seen half the spyware removing programs that show up as spyware themselves in AdAware!

    We're getting to a point where security FROM the user is almost if not more important than security FOR the user.
  • Re:*groan* (Score:2, Insightful)

    by dosius ( 230542 ) <bridget@buric.co> on Wednesday July 12, 2006 @02:11AM (#15703645) Journal
    Employer: That's not what I fucking hired them for, they're here to work for me.

    Me: I would leave the internal network detached from the Internet and remove all external sources of input except the keyboard/mouse, and put the OS on something read-only. Nothing gets in, nothing gets out. Works for work, not for play.

    -uso.
  • by mattyrobinson69 ( 751521 ) on Wednesday July 12, 2006 @03:05AM (#15703762)
    Although you can work around this:

        /lib/ld-linux.so.2 /noexec/mounted/partition/escalate_to_root

    or more likely:

        /lib/ld-linux.so.2 /usr/local/bin/ksolitaire
  • by Anonymous Coward on Wednesday July 12, 2006 @04:29AM (#15703948)
    What is an executable?

    No, the question is not a joke: what would such an OS do with ActiveX and Java? OK, they support digital signatures, so let's believe such a system would work.

    And JavaScript? It's clearly executable, but would it be blocked? Who would use such a computer when 50% of websites are not viewable without JS? Not to mention sites that only exist in the form of one SWF file...

    On a server, JS would not be needed, but usually one needs customization in terms of scripts and so on. If the admin could self-sign programs on a second machine (and would be careful enough to do that only with programs he wrote himself and where he is sure that no malware is included), that could work.

  • Re:*groan* (Score:3, Insightful)

    by DrSkwid ( 118965 ) on Wednesday July 12, 2006 @05:29AM (#15704032) Journal
    Access to the internet is NOT an entitlement at work.

    At least not where I live. Do you have internet terminals for employees at the gas station?

    Are the guys at the foundry revolting because they can't browse eBay while waiting for the steel to cool?

    Soft in the belly workers need to wise up.
  • by Kjella ( 173770 ) on Wednesday July 12, 2006 @07:15AM (#15704201) Homepage
    Think about this: if that database included the infamous Sony rootkit as "allowed" because Sony put pressure on whoever maintains it, doesn't that render the whole thing pointless?

    So your argument is basically that because trust can be misplaced, there's no point in having a trust system? Let's remove the classification system because the joint chiefs could be al-Qaeda members. Let's remove all digital signatures because the signing key might have been compromised. The point is whom to trust; also, look out for misuse of the word "trust". For example, TCPA software is less trustable from a computer security point of view because it can't be audited. I trust the Debian signed packages, in the sense that they're official packages of the software and not trojaned versions. Of course, the maintainers of a package may install a trojan, but that's a lot less likely. Trust rarely works in absolutes; degrees of trust are the norm. Being in that database would be one of these levels, though honestly I don't see the big problem in corporate environments. Don't give users admin rights, and install only serious software downloaded or bought from official sites. I think the actual number of rootkit cases where those procedures are followed is almost zero.

  • by hey! ( 33014 ) on Wednesday July 12, 2006 @07:24AM (#15704217) Homepage Journal
    I don't know if they're still sold this way, but firewalls used to be computers that booted off a live CD. In a way this is even more secure than the flash memory used in consumer devices, because presumably there's a way to remotely flash those units.

    As other people point out, this is not perfectly secure, because it doesn't prevent the device from loading software remotely and running it. However, it does reduce the scope for damage considerably: while you can't prevent data from being lost or corrupted, the real time sink in recovering from a subverted system is bringing the system back up to a state where you can trust it. You can reboot the system back to its original state, then plug in your updated virus/spyware scanner and run it.

    From a computer science perspective, you can't really "close" a system completely, any more than you can have an organism that runs without RNA: the fact that instructions and data are the same thing is, at a deep level, part of our very concept of what a computer is. Take a file in a fairly complex format, say Microsoft ".doc". What is that file but a program executed by the Word doc interpreter to create a visual representation of a document? What is Word but a specialized compiler/interpreter for such programs? Thus, most of the non-trivial programs on an operating system are, in a sense, virtual machines. If any of these virtual machines has any kind of flaw, then a malicious programmer can get them to do things the user does not want.

    If you're making a virtual machine, you want to limit what programs running on that machine do on the underlying computer. Sun realized that when they sandboxed Java applets. The problem is that this is too restrictive to be popular with users. So you end up letting the user grant programs access to different resources. At that point, any vision of iron-clad security is gone, a victim of social engineering.
