A Closed Off System?

AnarkiNet wonders: "In an age of malware that installs itself via browsers, rootkits that install themselves from audio CDs, and loads of other shady things happening on your computer, would a 'Closed OS' be successful? The idea is an operating system (open or closed source) which allows no third-party software to be installed, ever. Yes, not even your own coded programs would run unless they existed in the OS-maker-managed database of programs that could be installed. Some people might be aghast at this idea, but I feel it could be highly useful, for example in a corporate setting, where a secretary would need nothing on his/her computer other than the programs available from the OS-maker. For now, let's not worry about whether people can 'get around' the system. If each program in the collection of allowed programs was 'up to scratch' and had 'everything you need', would you really have an issue with being unable to install a different program that did the same thing?"
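The "OS-maker-managed database" the submitter describes is, at bottom, a whitelist consulted before anything executes. A minimal user-space sketch of the idea in shell (file names are invented, and a wrapper script is of course not a real enforcement point; a real closed OS would check the list in the kernel's exec path):

```shell
# Sketch of the submission's "blessed database" as a hash whitelist.
# Illustrative only: names are made up, and a wrapper script cannot
# actually enforce anything -- that would have to happen in the kernel.
set -eu

# A "blessed" program, with its hash recorded in the whitelist.
printf 'echo hello from a blessed program\n' > blessed.sh
sha256sum blessed.sh | awk '{print $1}' > whitelist.txt

# An unapproved program.
printf 'echo malware would run here\n' > rogue.sh

# Run a script only if its hash appears on the whitelist.
run_if_blessed() {
    hash=$(sha256sum "$1" | awk '{print $1}')
    if grep -qx "$hash" whitelist.txt; then
        sh "$1"
    else
        echo "DENIED: $1 is not on the whitelist"
    fi
}

run_if_blessed blessed.sh   # prints the hello line
run_if_blessed rogue.sh     # prints the DENIED line
```

The same shape applies whether the list holds hashes, signatures, or package names; the hard part is making the check mandatory rather than advisory.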
This discussion has been archived. No new comments can be posted.

  • by amanda-backup ( 982340 ) on Tuesday July 11, 2006 @10:52PM (#15703090) Homepage
    Doesn't a live OS CD such as Knoppix achieve this goal? These are usually built to include "everything you need" for a particular purpose. You can still access and create data on disks on that system, but you never corrupt the programs themselves. If all the applications being used are web-based, then things are even simpler: just boot up with Knoppix, open Firefox, and you are ready to go.
    • You can install software on the live CD. It only exists till the RAM is wiped (i.e. until a restart), but it runs just fine.
    • by jdogalt ( 961241 ) on Tuesday July 11, 2006 @11:02PM (#15703129) Journal
      No. LiveCDs do offer read-only system images, but they do nothing whatsoever to prevent other programs from being run, i.e. programs downloaded from the net, or autorun (or manually run) from CD. LiveCDs give you the benefit that each reboot resets you to a known state. That is quite different from an OS which only allows programs from a blessed whitelist to execute. One scenario might be the discovery of a way to remotely log into the system. In the LiveCD case, the attacker can now run whatever program they want, and likely regain entry in an identical fashion should the system be rebooted. What the author of this post is interested in is a system that would not let an attacker with remote login execute any code not on the blessed whitelist. Now mind you, the idea that such a system would be 'invulnerable' is ludicrous. The XBox seems the quintessential example of a system which tried to achieve this design goal. My XBox currently runs ssh, freevo, and any executable I want, proving it is difficult to achieve a successful implementation of such a design. -jdog
      • The XBox seems the quintessential example of a system which tried to achieve this design goal. My XBox currently runs ssh, freevo, and any executable I want, proving it is difficult to achieve a successful implementation of such a design

        Yes, but you had to go out of your way in order to achieve this, right? That is, it's not something that happened because something you downloaded off the net did away with the "protection" MS had installed originally in the machine. (Besides, as far as I know, only th…
        • "Out of my way" is as vague a phrase as "should". Yes I had to follow some instructions, but technically I'm also following instructions when I dial my phone.

          Yes the bootloader only needs to be on the blessed list, but in the absence of a blessed bootloader which allows arbitrary code to execute...

          To your last point, signing and verifying every executable is not a heavy CPU tax. The real issue is the granularity, and whether you can prevent any executable which intentionally or unintentionally allows arbitrary…
          • by Anonymous Coward
            What is an executable?

            No, the question is not a joke: what would such an OS do with ActiveX and Java? OK, they support digital signatures, so let's believe such a system would work.

            And JavaScript? It's clearly executable, but would it be blocked? Who would use such a computer when 50% of websites are not viewable without JS? Not to mention sites that only exist in the form of one SWF file...

            On a server, JS would not be needed, but usually one needs customization in terms of scripts and so on. If the admin cou…
      • by PaulBu ( 473180 ) on Wednesday July 12, 2006 @02:24AM (#15703673) Homepage
        ... pay particular attention to the noexec flag -- yes, one has been able to configure a generic U**x system to refuse to execute anything off "other media" (including home directories) for, what, like 20 years... ;-)

        Amazing what those guys back then thought of, isn't it?

        Paul B.

    • I don't know if they're still sold this way, but firewalls used to be computers that booted off a live CD. In a way this is even more secure than the flash memory used in consumer devices, because presumably there's a way to remotely reflash those units, whereas a CD can't be rewritten.

      As other people point out, this is not perfectly secure, because it doesn't prevent the device from loading software remotely and running it. However, it does reduce the scope for damage considerably: while you can't prevent data from being lost or corrupted…
  • by Bin_jammin ( 684517 ) <> on Tuesday July 11, 2006 @10:54PM (#15703097)
    fun you must be to think up questions like that.
  • Windows Group Policy (Score:5, Interesting)

    by Ececheira ( 86172 ) on Tuesday July 11, 2006 @10:56PM (#15703104)
    Windows has long been able to do this via Group Policy. You can specify that only programs signed with specified Authenticode keys can be run, effectively locking the system. Since all OS files are signed by Microsoft, and anything a corporation would need could be signed, a corporation that wanted a locked-down box would just specify the allowed keys and block everything else.

    It'd be a huge nuisance but it's possible today.
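    The same Authenticode-style gate can be mocked up on any Unix for illustration, with openssl standing in for the signing infrastructure (key and file names are invented; a wrapper script is obviously not where real enforcement would live):

    ```shell
    # Sketch of run-only-signed-binaries, using openssl in place of
    # Authenticode. Illustrative only; all names are made up.
    set -eu

    # The "vendor" key pair. On a locked-down machine only the public
    # key would be present.
    openssl genrsa -out vendor.key 2048 2>/dev/null
    openssl rsa -in vendor.key -pubout -out vendor.pub 2>/dev/null

    # An approved program, signed by the vendor key.
    printf 'echo approved-program-output\n' > app.sh
    openssl dgst -sha256 -sign vendor.key -out app.sig app.sh

    # Refuse to run anything whose signature does not verify.
    run_if_signed() {
        if openssl dgst -sha256 -verify vendor.pub \
               -signature "$2" "$1" >/dev/null 2>&1; then
            sh "$1"
        else
            echo "REFUSED: bad or missing signature for $1"
        fi
    }

    run_if_signed app.sh app.sig          # signature checks out, so it runs
    printf 'echo tampered\n' >> app.sh    # modify the file after signing
    run_if_signed app.sh app.sig          # now refused
    ```

    Note that the verification cost per launch is small; as the thread discusses, the real problems are granularity and who holds the key.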
    • Windows fails both the "up-to-scratch" and the "everything you need" tests! But yes, I agree, it can be locked down, as can most other modern OSes (all of which also fail those two critical criteria--I'm not Windows-bashing here).
      • I'm not sure you can say that as far as the corporate world goes. By default, Windows and related programs are everything you need, because that is what 90% of corporate environments are based on. That is not to say that nothing else is better or has useful features that Windows lacks, but simply that you can easily have everything you need to run a fully successful office on a Windows, or even a purely Microsoft, box.
      • Re:not quite! (Score:2, Interesting)

        by Goaway ( 82658 )
        I agree, it can be locked down, as can most other modern OSes

        Oh, so how exactly do you lock down Linux so that only signed software can be run?
        • Re:not quite! (Score:5, Informative)

          by ocelotbob ( 173602 ) <> on Wednesday July 12, 2006 @03:44AM (#15703851) Homepage
          SELinux policies. You can configure SELinux to deny, by default, execution of any file that isn't on an approved list of executables, and also to ensure that only trusted persons have access to change those files.
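          In SELinux policy-language style, that default-deny arrangement might look roughly like this (the type names are invented for illustration and are not from the reference policy):

          ```
          # Ordinary users may execute only files labeled approved_exec_t.
          allow user_t approved_exec_t:file { read getattr execute execute_no_trans };

          # Assert at policy build time that nothing else is executable for them.
          neverallow user_t ~approved_exec_t:file execute;

          # Only the admin domain may move files into the approved type.
          allow sysadm_t approved_exec_t:file { relabelfrom relabelto };
          ```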
  • I'd use it (Score:4, Interesting)

    by Wizarth ( 785742 ) on Tuesday July 11, 2006 @10:56PM (#15703105) Homepage
    For office use, a linux distro (such as Debian or Ubuntu) which allowed you to specify the repositories, and not allow modification of the list, would work just fine, in general.

    System admins would only allow updates from the official repository, with a local repository for mirroring/caching and business-specific software packages.

    I use something like this for my relatives. Give them Linux, don't give them root, make all updates/installations go through me.

    Then print out a poster for my door "setup.exe will not run on your system" ...
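    The repository lockdown described above comes down to a couple of small config fragments (the hostname is a placeholder for an internal mirror; this is a sketch, not a complete hardening recipe):

    ```
    # /etc/apt/sources.list -- replace the stock entries with the internal
    # mirror, so apt can only ever see packages the admins publish.
    deb http://apt.internal.example/debian stable main
    deb http://apt.internal.example/local stable main

    # And keep non-root users from editing the list:
    #   chown root:root /etc/apt/sources.list
    #   chmod 644 /etc/apt/sources.list
    ```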
  • by GhaleonStrife ( 916215 ) on Tuesday July 11, 2006 @10:56PM (#15703106)
    Think about this: if that database included the infamous Sony rootkit as "allowed", due to Sony pressuring whoever maintains it, doesn't that render the whole thing pointless?
    • The whole shitstorm over "Trusted Computing" and this are essentially the same topic, and the issue is who has control over the access control list, the user-administrator or some other party. The feature can be used for good or evil, for lawfulness or chaos, just as with any other tool.
    • Think about this: if that database included the infamous Sony rootkit as "allowed", due to Sony pressuring whoever maintains it, doesn't that render the whole thing pointless?

      So your argument is basically that because trust can be misplaced, there's no point in having a trust system? Let's remove the classification system because the joint chiefs could be al-Qaeda members. Let's remove all digital signatures because the signing key might have been compromised. The point is who to trust, and also look ou…
  • code isolation (Score:5, Insightful)

    by TheSHAD0W ( 258774 ) on Tuesday July 11, 2006 @10:56PM (#15703107) Homepage
    This would be "mostly secure", but unless strict data-space separation were used, it might still be vulnerable to a buffer overflow or similar attack that would allow arbitrary code, provided as data, to be executed. The attacker would use this opportunity to establish a "beachhead", modifying whatever integrity-checking system the OS is using so as to persist.
    • Obviously there will be some kind of attack, no matter what the system. I think the question is mostly dealing with malware and trojans, stuff that doesn't try to break it, but relies on user stupidity.
      • relies on user stupidity.

        That's the crux of it. So why don't people try instead to employ people with a brain? That might save costs beyond all the trojan issues, etc. If businesses ask for stupid monkeys, they get monkeys.
      • Speaking as a user who understands their computer reasonably well and doesn't click on stuff just because animated characters tell me to, would this be a good thing?

        If we (hypothetically) closed off the "stupid user" vulnerabilities that are the major attack vectors right now, wouldn't the malware authors instead just concentrate on other, more technical, avenues of attack?

        Here's my thought: maybe having systems vulnerable to idiot users is actually a good thing for the informational ecosystem as a whole. They're more than just the canaries in the coal mine (although they serve that function, too), they provide a steady stream of marks for the virus/trojan/malware writers and phishing-scheme authors of the world.

        If these people weren't able to basically throw themselves on the swords of their own stupidity on a regular basis, couldn't this just lead to smarter malware, which affected more of us (not just the stupid/ignorant)?

        Malware authors are inherently lazy and opportunistic. While there are still lots of "the monkey told me to click it so I did" people around, and ways to exploit this idiocy, that's what they're going to do. They're not going to mess around with esoteric buffer overflows to steal your information, when they can just send out some fake PayPal emails and watch the data roll in.

        Given the choice, I'd rather have the primary attack vectors be ones that rely on user stupidity, rather than technical flaws, because 0-day technical flaws are too 'egalitarian,' attacking both the clueless user and the experienced person without warning. Personally, anything that keeps the collective attention of the Russian Mafia focused on people too dumb to check the URL line in IE before typing in their bank account information is a good thing in my book.

        I know this isn't a very nice sentiment to hold, but if there was some hypothetical way to remove user stupidity as a vulnerability (not possible, so this is all just a mind game), maybe we'd be better off not implementing it?

        I'm not suggesting that we shouldn't attempt to educate people on good computing practices, but if people are too lazy or disinterested to become educated, maybe in their laziness they can do the rest of us a favor by acting as the collective decoys?
        • You could say the same thing about locking your doors at night making burglars smarter, because they can't just walk right in.

          There's a certain level of difficulty at which it stops being easy and profitable enough to be a malware producer, and if we could simply bring everyone up to that level, I think we'd all be better off. Sure, some of them would stay in business, just as some criminals have no issue kicking down doors and smashing windows, but a lot can be accomplished by eliminating so-called…
  • Question moot. (Score:4, Insightful)

    by The MAZZTer ( 911996 ) <<megazzt> <at> <>> on Tuesday July 11, 2006 @10:59PM (#15703117) Homepage

    "If each program that made up the collection of allowed programs was 'up to scratch' and had 'everything you need',"

    Considering that is impossible, the question is pretty much moot, isn't it? I am always going to find more needs for things, and chances are I'm going to need a new piece of software. Even if an OS shipped with "everything", new things are invented all the time. Maintaining a "Closed OS" to allow for new things would be difficult, and keeping it relatively up to date even more so... but then it wouldn't really be closed if new stuff kept getting added to it...

    • Sure, for you... but what about your publicly deployed kiosk... or your call center desktop, or whatever. There are definitely plenty of applications for such a deployment; I think it's just that this is already achievable using read-only partitions/live CDs/etc.
    • A business could certainly define "what it needs" and then actively regulate and manage additional needs. It's just a matter of enforcing the basic rules and not caving to the whims of vocal and pushy users. The real challenge is making the hard decision as to what the company really needs, and sticking to it.
  • by jZnat ( 793348 ) * on Tuesday July 11, 2006 @11:00PM (#15703119) Homepage Journal
    This is exactly what Microsoft would like to do with Treacherous Computing, although the issue would cover things like security from the user rather than for the user.
    • I hate to say this, but while the idea of security from the user, instead of for the user, sounds insane, it's probably very much needed and very valid.

      I've done some freelance computer work for people who don't know all the technical stuff about computers. This normally relates to spyware/malware/virii/etc. The great majority of the spyware and malware is self-installed: downloading cutesy screensavers or cursors or backgrounds that come with all manner of desktop search tools and search bars. When you have an Athlon…
      • I have to disagree.

        Security FROM the user is fine, until *I* can't turn it off and run the programs that I wrote. And no, I'm not going to spend $50,000 to buy a signed network management package that includes a haphazard partial-implementation of the one feature I actually need.

        But there's the rub: if you can turn it off, or bypass it via a click-through warning, most people will - especially when the screensaver site gives them instructions on how to do it.

        Linux distributions already sign packages with…
  • by Onan ( 25162 ) on Tuesday July 11, 2006 @11:01PM (#15703125)

    Yeah, turns out somebody was doing this for kind of a while. Called them "typewriters" or somesuch.

    Really, much of the value of a computer lies in the fact that it's an extremely versatile device. Choosing to discard all that, and believe that you can know ahead of time every single thing you will ever want to accomplish with it, seems like a pretty bad deal.

    • But there are some people who use a computer for nothing more than word processing, web browsing, and email. A "closed off" setup might work for them.
      • But there are some people who use a computer for nothing more than word processing, web browsing, and email

        anyone remember the I-Opener? that was a closed (QNX) turnkey, just-does-this-and-no-more system.

        I don't think the company lasted long, though. too many people (myself included) bought the boxes for $100 and hacked them to get Linux and Win95 on them. ahh..

        but the idea was kind of OK, for some people. and there was NO way to get viruses or problems when you weren't even running a real multiuser o/s

        • anyone remember the I-opener ? that was a closed (qnx) turnkey just-does-this-and-no-more system.

          Well, throw a WiFi chip into it, shrink it to 1/4 of the size (1/8 of the volume), as allowed by tech now, and I would not mind carrying such a beast around! ;-) I guess they used to be called 'Palms', or some such, in the earlier days...

          Seriously, a no-nonsense portable connected device - what can be wrong with it?

          Paul B.
  • OS X (Score:4, Interesting)

    by mattjb0010 ( 724744 ) on Tuesday July 11, 2006 @11:01PM (#15703126) Homepage
    already does this. See here [], under "Application Access: You Decide". You can set up another user account for yourself (not just for children) which would be protected. I'm pretty sure Windows has similar things (not sure if you need 3rd-party software to do this), and as mentioned, there are live CDs of Linux/BSD/etc.
    • OS X's Application Controls aren't anything close to being "secure": they're implemented at the Finder level rather than the OS level and can be bypassed by any convenient scripting environment (AppleScript, MS Office, etc).
  • Same thinking? (Score:2, Insightful)

    by JayTech ( 935793 )
    Isn't this the exact same thinking behind the TCPA planned by Microsoft & Co., where only "licensed" software would be allowed to run? Doesn't sound like a bright idea to me; in fact it sounds pretty scary.
    • It depends on who controls the keys.

      If the vendor controls the keys, yes, it is scary. If I do, no, it is not.
  • by nuxx ( 10153 ) on Tuesday July 11, 2006 @11:02PM (#15703130) Homepage
    Huh. Imagine that... Something which can be done by having a Microsoft OS set to run only signed binaries while running on top of a 'trusted computing platform'.

    As I've said before, this would be a huge boon to IT departments all over the place. I'd love to be able to lock users to running a signed OS and only the apps we specifically approve and sign. This would lock out all unapproved software *and* malware. If the OS is secure enough that there's no way around this, it'll be ideal.

    Oh, and of course, as long as such trusted computing stuffs can be turned off for users who purchase the hardware and don't wish to use it, it's a win-win all around.
    • As I've said before, this would be a huge boon to IT departments all over the place. I'd love to be able to lock users to running a signed OS and only the apps we specifically approve and sign.

      Why not simply not give users admin? Am I missing something?

      It's been a while since I used Windows, but I can remember working at places where we had to phone IT to get stuff installed because we did not have admin. Is my memory at fault?

  • If you're going to consider limiting users that much, why not simply disable web access, or CD players, USB ports, etc.? I think ultimately there are several ways to keep a machine safe from intrusion, but for most of us it's a compromise: functionality vs. security. If you want to tilt towards security, in-house systems, disabling ActiveX controls, Java, admin access, etc. are all effective to a certain degree, but, much like your concept, sound extremely limiting. I mean, secretaries don't need any software…
  • console? (Score:5, Insightful)

    by minus_273 ( 174041 ) <> on Tuesday July 11, 2006 @11:04PM (#15703136) Journal
    Anyone else think this sounds a lot like the xbox 360? encryption keys and all.
    • If you want what the poster suggests, you'd pretty much have an XBox/PS2/etc. with a keyboard.

      One of the many, MANY hazards with this would be having to buy a supported printer, supported network card, etc., as third-party software (and thereby hardware) is excluded by definition.

      As another poster has mentioned, wouldn't a LiveCD suffice?

  • Just about every office I've worked at so far has a certain number of menial computer jobs that are unique to the job setting. And many of these menial jobs have been passed off to the secretary. And many times I've been asked to come up with a little push-button application, macro, script, batch file, or something, just to make the job easier.

    And as a software developer, there's just no way a completely closed system is going to work for me....
  • by JoeCommodore ( 567479 ) <> on Tuesday July 11, 2006 @11:06PM (#15703141) Homepage
    Let's see: the Commodore PET, Apple II, and TRS-80 were pretty much "can't touch this OS without a hammer" type computers.
    • Let's see: the Commodore PET, Apple II, and TRS-80 were pretty much "can't touch this OS without a hammer" type computers.

      Well, yes.... but the problem with the Apple ][ was that this was the sort of behaviour Woz encouraged. There was an entire industry dedicated to producing hardware devices that provided functionality that the OS would otherwise not allow.

      On a more serious note, this was definitely a concession to the fact that the processors of the day just weren't able to perform many specialized tasks, w…

    • Let's see: the Commodore PET, Apple II, and TRS-80 were pretty much "can't touch this OS without a hammer" type computers.

      Oh yeah? After booting Apple DOS 3.3 type the following at the AppleSoft BASIC prompt:

      POKE 47616, 96
      Now you can't read or write to a disk. Now that's malware!

      Free karma if you can name what routine I disabled.

    • I don't know how you got modded insightful for your comment; I have an Apple ][e sitting on my desk and there is absolutely nothing bulletproof about it. In fact, the hardware is designed to load and run software stored on a diskette immediately after the system is loaded. Since the system is stored in ROM, there is no way to change this behavior -- you call that secure by default? The software being loaded can do ANYTHING to the system at will. Nearly all DOS virii were spread this way.

      Secondly, once…

  • I don't quite get the point... If all apps have to be signed before install, then you have a point of attack: intercept communications, fake checksums, attack the OS provider's server... It wouldn't be much more secure than anything else.

    Wouldn't it make more sense to go back to the live CD concept? You pick everything you need and then make a bootable CD out of that. We did that 10 years ago; it was a lot of work, but it worked great. I'm sure over the years people have written better scripts than the hacks we did…
    • > if the box got hacked, they still couldn't do anything with it

      except run any program they choose and have it run until the next boot, on a high-profile site with plenty of uptime.

  • by FreeMath ( 230584 ) on Tuesday July 11, 2006 @11:08PM (#15703148) Homepage Journal
    You mean like a Mac?
  • ....limit a machine to only outgoing traffic? That would let you use an office suite and send (but not receive) email.

    Downside: you'd have to use a CD or flash drive to transfer documents on/off the machine. You couldn't receive email on the machine.

    Upside: The only security risk would be by direct access.

    Actually, the most secure machines probably aren't even password-protected. If the machine isn't attached to anything but a power cord, and the machine itself is inaccessible, then…
    • What happens when a connection that you initiated results in you getting infected with malware that initiates connections rather than listening for connections?

      For example:
      - LiveJournal ads recently had problems with an advertiser setting their ad to some malware.
      - MySpace videos very recently had problems with videos containing malware.
  • by Xtifr ( 1323 ) on Tuesday July 11, 2006 @11:21PM (#15703192) Homepage
    ...I would have to say no. At least not by itself. It's pretty hard to develop software if you can't install and test the software you're developing somewhere! ;)

    As a component of a larger, networked system, which had parts where I could install and run the software I was developing, then yes, no problem. But alone, by itself, no, it would be completely useless.

    Of course, there are still some interesting questions about this theoretical beast. Is it scriptable? I often have quick one-off tasks that are best done with a quick script. If I can't run one-off scripts, then it's not "up-to-scratch" and doesn't have "everything I need"; and if I can, then it's not a completely closed, locked-down system. The only way around that, even in theory, is to have an infinite number of monkeys providing you with all the scripts you could ever need in advance, and even then, there'd probably be some difficulty finding the script you need right now among that infinite number of scripts. (Not to mention the cost of the infinibyte drives needed to store all those scripts.)

    Bottom line: I think the notion of a machine that does "everything I need" is about as realistic as those old concepts of an irresistible force or an immovable object. Nice for creating logical paradoxes, but completely silly otherwise.
  • It's a good idea, only it already exists. Kinda.

    Take any Windows, Linux, or OS X system, and lock it down till it's just a kiosk.

    There you go!

    This is also doable with a Windows 98 installation onto a CD. Knoppix comes to mind for Linux. I've also tried setting up a kiosk-like graphical OS to go onto a CompactFlash card that acts as an IDE device. I needed newer apps on it too many times.

    See, a FIXED OS needs to be configured separately for each system, since no one's requirements are the same as anyone else's. QNX, Windows…
  • You know, all the products in the supermarket are really distracting. What I crave, as a product of modern USian culture and educational systems, is less choice. Why should I have to decide what to do? Surely someone could pick all the useful things for me. Maybe there could be some kind of vote, where we could all just agree to use what everyone thought was best. That would be a perfect world, with no cutthroat competition or need to worry about the future. Shouldn't I be free from worry and uncertainty…
  • more of a diskless system.

    You would have the OS installed on a flash memory drive. Either it's in the system (embedded-like) or it's a plug-in card like an SD stick. Read-only, though. You have memory that you can use as program-running space. You can save data to an external medium like a flash drive.

    Lastly, you would run applications from a second flash drive.

    Think of a Linux-on-CD kind of system (or other OS) with no hard drive, where you save your data on a flash drive. All programs are on the CD. You can…

  • Unless your system is 100% proven for all inputs (of the input classes you are using), there is the possibility that an attacker can feed an input for which your program's state machine does not halt (and, instead, goes into other states, perhaps escalating privileges or otherwise doing anything).

    So this means you either have completely disconnected systems, or you only run software verified with tools like Spin [], which can prove correctness.
  • *groan* (Score:5, Insightful)

    by voice_of_all_reason ( 926702 ) on Tuesday July 11, 2006 @11:53PM (#15703280)

    ...could be highly useful for example in the corporate setting...

    Oh, for fuck's sake! Don't give them any more ideas.

    The extra cost of technology staff and the risk of a shittastrophe are nothing compared to abysmal employee morale. If you don't let 'em stroke off for a few minutes a couple of times an hour by going to ebay or playing snood you're going to end up with a resentful staff. And they'll produce awful, crappy work for you.
    • Re:*groan* (Score:2, Insightful)

      by dosius ( 230542 )
      Employer: That's not what I fucking hired them for, they're here to work for me.

      Me: I would leave the internal network detached from the Internet and remove all external sources of input except the keyboard/mouse, and put the OS on something read-only. Nothing gets in, nothing gets out. Works for work, not for play.

    • Re:*groan* (Score:3, Insightful)

      by DrSkwid ( 118965 )
      Access to the internet is NOT an entitlement at work.

      At least not where I live. Do you have internet terminals for employees at the gas station?

      Are the guys at the foundry revolting because they can't browse eBay while waiting for the steel to cool?

      Soft-in-the-belly workers need to wise up.
    • "stroke off for a few minutes a couple of times an hour" ?!!!

      So 2-3 minutes, twice every hour, for eight hours. That's 32-48 minutes of jerkin' it a day: a serious personal problem, and definitely not something I'd want my fellow employees doing during work.

  • I think it's fine (in some situations) for some central authority to be the one who decides what can be run on a given computer.

    What I don't get is why the "OS-maker" would be that authority. Look at just who happens to be the OS-maker with the greatest market share, and ask yourself: should someone with that reputation for [in]security be the one in charge? They practically invented the concept of browsers that automatically install malware and media insertion that installs rootkits.


  • by ka9dgx ( 72702 )
    Isn't this exactly what the X-box is? A closed, locked down system... which totally prevents the execution of third party applications. []

    Of course, it's not secure if anything running anywhere has the ability to modify the system files.


  • Not on my PC (Score:3, Insightful)

    by egarland ( 120202 ) on Wednesday July 12, 2006 @12:32AM (#15703386)
    I have no problem with this setup if the computer is my cell phone. My PDA could be set up to only run signed apps; that wouldn't bother me much. But my PC isn't really a PC without the ability to accomplish arbitrary tasks.

    The concept is also flawed. Just because something isn't an executable doesn't mean it can't contain instructions that tell your computer to do something. Word macro viruses are a great example of this kind of problem: it's just a simple word-processing document... but it can also be a virus. The .mp3 and .jpg buffer-overrun bugs are great examples of this too. A format that doesn't even include programmability can be used to induce your computer to do something against your will.

    This is not the answer to computer security.
  • Do you understand the Secretary's job? I mean really understand it, the official and unofficial parts. Do you understand it enough better than she understands it so that you can build a computer that does all of the things she needs and wants it to do? And don't forget, it needs to do everything her boss decides she needs to do with it.

    I'm not -that- smart and I'll bet that you aren't either.

    There are places where a closed OS works. Think wireless router or Internet appliance. But the desktop? Not so much.
  • by S3D ( 745318 ) on Wednesday July 12, 2006 @01:32AM (#15703552)
    Symbian OS from v9.1 is very close to being a "Closed OS" (pun intended). If an application uses any "capability" (for example, the camera API), anything beyond the most basic functions, it must be signed, i.e. endorsed by a "test house" which holds a license from Symbian itself. Third-party applications are still possible, but only from certified developers. So if Symbian v9.1 is any success, there will probably be more closed OSes in the future.
  • There's nothing special about application signing. Mounting your existing read-write partitions (and any other mounts) no-execute is equivalent to declaring that all existing applications are signed and no others are, and it would solve this problem.

    Application signing can be compromised just as much as the above. If done properly it does give an extra layer of protection.

    You might say that one difference is that application signing can be done remotely so that the owner of the computer loses control but that's no differe

  • Hmmm...with Linux, the only places that regular users can write to anyway are their home directory and /tmp. They need write access to those areas to be able to save stuff. Unfortunately, we probably can't stop them from creating or downloading executables to those areas.

    However, mount(8) has a great option - "noexec" - that can be used to prevent files from any partition being executed. If you put restricted users' home directories in /nxhome (no execute home) and mount /nxhome and /tmp as "noexec", that would p
  • You can mount filesystems with the noexec flag, which will prevent files from being executed. Have user directories mounted like that, and just have executables where users can't write to.
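    As a sketch of the setup described above (the /nxhome layout and device names here are hypothetical placeholders), the relevant /etc/fstab entries might look like:

    ```shell
    # /etc/fstab fragment (hypothetical devices): mount users' writable
    # areas with noexec so dropped binaries cannot be run directly;
    # nosuid and nodev close the related setuid and device-node holes.
    /dev/sda3   /nxhome   ext3    defaults,noexec,nosuid,nodev   0 2
    tmpfs       /tmp      tmpfs   defaults,noexec,nosuid,nodev   0 0
    ```

    Worth noting: on Linux, noexec historically could be sidestepped by invoking the loader directly (e.g. /lib/ld-linux.so.2 ./binary), and scripts can still be fed to an interpreter (sh script.sh), so this raises the bar rather than closing the door.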
  • There will always be loopholes in every system.

    To (mis)quote Morpheus, "It's a system, and like every system, it has rules. Some of those rules can be bent; others can be broken."

    No matter how tight you try to make it, the malware writers will always find a way around it. They may use scripting systems (even this hypothetical closed system would need some sort of scripting capability), or they may find a way to circumvent the lockout mechanism, or any number of other unpredictable ways to get in.

    Complete se
  • Why not a list of programs you control? Why does some third party have to decide? Your secretary example demonstrates the need for this, as your OS vendor might decide (and rightfully so) that HL2 is a valid program which can be run. So really, it has to be up to your needs otherwise it is pointless. Furthermore, we already have software which can be used to implement this [].

  • What you want is a system that will only run cryptographically signed binaries.

    However -- like anything else, the devil is in the details, or particularly, in one detail: who controls which apps the OS will run. If it is an OS vendor, that vendor will see that control as a source of revenue, or worse: a way of gaining strategic control over its users (i.e. stay with us on the upgrade path or bad things will happen) or vendors.

    Really the owner of the computer should decide who to delegate the job of deciding wh
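    A minimal sketch of such an enforcement shim (my own illustration, not any vendor's design; HMAC-SHA256 with a shared key stands in for a real RSA/ECDSA signature scheme, and names like run_if_blessed are hypothetical):

    ```python
    import hashlib
    import hmac
    import os
    import subprocess

    # Assumption: this key is provisioned out of band by whoever controls
    # the whitelist. A real scheme would use asymmetric signatures so the
    # verifying machine holds only a public key.
    SIGNING_KEY = b"admin-provisioned secret"

    def sign(path: str) -> str:
        """Compute the blessing digest for a binary."""
        with open(path, "rb") as f:
            return hmac.new(SIGNING_KEY, f.read(), hashlib.sha256).hexdigest()

    def run_if_blessed(path: str, whitelist: dict) -> subprocess.CompletedProcess:
        """Execute path only if its digest matches the whitelist entry."""
        expected = whitelist.get(os.path.basename(path))
        if expected is None or not hmac.compare_digest(expected, sign(path)):
            raise PermissionError(f"{path}: not on the blessed whitelist")
        return subprocess.run([path], check=True)
    ```

    The question raised above, i.e. who holds SIGNING_KEY and decides what goes into the whitelist, is exactly where the control (and the revenue opportunity) lives.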
  • by Wubby ( 56755 )
    The word that comes to mind for me is "payola". The only thing you will get for download is software that is sponsored. Pay the "vendor" the right price and they will certainly "certify" your app. And if it's all proprietary, I doubt anyone but the software developer will REALLY know what's in the code. It's an idea that just has too many exploitable flaws to be "A Good Thing(tm)".

    Ok, "payola" is not the right word, but it's what comes to mind. A sponsored work that is not presented as such. It would
  • There's the infamous workaround for noexec mounts... but does a binary really need to be +x if it is only called (legitimately) by other executables? I don't really have anywhere I can test this theory at the moment...
  • We already have this: Citrix and Windows terminal servers. The users get only the applications published by the admin, and can only save the data allowed by the admin in the shares designated by the admin. It's certainly not perfect from a security perspective, but it does more or less what the OP asked.

  • This already exists in mobile phones. Some phones provide a JVM that lets you run code in a sandbox, but their bootloaders check an RSA signature before executing the operating system, and the operating system checks signatures on the Java classes before giving them privileges. It's how the phone companies get away with charging you an arm and a leg for ring tones and wallpapers.
  • Yeah, that's great, until the company goes out of business, or (for a really fun thought) loses the encryption keys (due to fire, flood, terrorist bombing or BOFH...) Now you're forever stuck with whatever their last release was, and you may not ever get new software.
