Zero Install: The Future of Linux on the Desktop?
SiegeX writes "Zero Install, which is a part of the ROX desktop environment, is not just a new packaging system; it's a whole new way of thinking, one that I believe is exactly what Linux needs to become a serious contender for Joe User's desktop. Zero Install uses NFS both to run *and* install apps from. The apps are all self-contained in their own directory: binaries, docs, source code and all. Once an app has been downloaded, it's kept in a cache from that point on to minimize delay. The beauty becomes apparent when Zero Install is combined with ROX, which runs an application when you click on the directory it was installed to. Deleting an application along with all its miscellaneous files is as simple as removing the directory it's contained in. This method of partitioning applications into their own directories also makes installing multiple versions of any application trivial. This is something even the greatest of technophobes could understand and use with ease."
Someone should tell Apple (Score:5, Funny)
Could there be a difference here? (Score:5, Interesting)
Could there be a difference here? Hopefully they are not putting code into virus-writable directories, as often happens on Apple.
Moderator trolls. (Score:5, Insightful)
It is easier for moderators to mark things as a troll than to accept an OBVIOUS fact, since it flies in the face of the religion surrounding the infallibility of Apple.
But for a critical thinker who uses a personal OS X machine, (especially who has installed a fair amount of software):
Go to your Applications directory and run ls -la to see just how many apps are owned by the primary user instead of root. Then see whether the primary user also happens to be a member of the admin group, which has write access to all the files there owned by root/admin. This also applies to the Applications directory itself.
On my powerbook, taking installation defaults, over 95% of the apps installed in the Applications directory are writable by the primary user.
This seems inexcusable from a virus security perspective.
On Linux, 0% of my apps are writable by the primary user.
Partial solution (Score:4, Insightful)
However, it still seems that the folders created there are owned by you, so this is a rather imperfect solution.
Fast user switching is theoretically a better one, but not on my 12" PBook with 1024x768 resolution due to an almost Dock-class UI design failure.
*) A Panther feature. In Jaguar you're forced to use a terminal in this case.
Re:Moderator trolls. (Score:5, Insightful)
On my powerbook, taking installation defaults, over 95% of the apps installed in the Applications directory are writable by the primary user.
This seems inexcusable from a virus security perspective.
That sounds reasonable, until you remember that such places are writable only after the user authenticates. This means entering the administrator password, allowing installer X or operation Y in the Finder to go ahead and write to that directory. I don't see how that's any less secure than what most moderately experienced Linux users do - ./configure ; make ; sudo make install
Re:Could there be a difference here? (Score:5, Informative)
I install Mozilla on OS X. Rather than using an install program, it says I should just drag the icon into a directory, such as the Applications directory, making it writable by any unprivileged program the user may execute.
This is not unlike what I have seen for other Mac programs as well.
When I try to take the initiative and protect these directories, the programs often stop working, because the program writes to these directories when run by the user.
I am told by admins that program directories on OS X should generally be made user-writable.
When I do a default install of Eclipse on OS X, it places the user workspace underneath the Eclipse program directory, which does not naturally lead to program directories that are safe from virus writes. When I run it on Linux, the workspace is separated from the program directory, living off my home directory, and the Eclipse files install nicely into a protected area.
Re:Could there be a difference here? (Score:5, Interesting)
When I do a default install of Eclipse on OS X, it places the user workspace underneath the Eclipse program directory, which does not naturally lead to program directories that are safe from virus writes. When I run it on Linux, the workspace is separated from the program directory, living off my home directory, and the Eclipse files install nicely into a protected area.
Eclipse 3.0M8 is what you're looking for. It lets you choose the workspace location without fuss.
Re:Someone should tell Apple (Score:5, Insightful)
Don't bitch to Steve (Score:5, Insightful)
If MS Office can be a drag and drop install, almost anything can.
Re:Don't bitch to Steve (Score:3, Insightful)
You think people write installers for fun? They usually write them because they don't have a choice, because the OS lacks some piece of functionality or other that lets the system adapt dynamically.
Besides, Mac-style drag-and-drop installs have their own problems: they don't get updated properly and they don't verify or deal with dependencies on install; they just dump the mess into the user's lap.
Re:Someone should tell Apple (Score:5, Interesting)
Seriously, as an Apple user, I'm glad to see a Linux desktop system copying the MacOS instead of Windows. I've felt for some time that it is a huge mistake for KDE and GNOME to try so hard to make themselves look like Windows when, in OS X, there is a much better example of a Unix-based desktop. Why waste your time copying less than the best?
Yeah, yeah, user familiarity, etc. Look, folks, I guarantee you that if all you've ever used is Windows, if you sit down at a good OS X machine, it will take you about half an hour to get used to the differences and be up to speed -- and after that you'll be discovering new and better ways to do things and saying, "That's so cool! Why didn't Microsoft ever think of that?" If a Linux desktop can have some nifty non-Windows features too (and I really don't care if the developers rip them off from Apple or come up with them on their own) it will do a lot more to enhance Linux desktop growth than just coming up with a system that's "like Windows, only not exactly."
Next response I anticipate: "Yeah, well, if Mac OS X is so much better, how come it hasn't beat Windows in the marketplace?" The answer, of course, is that there is a lot of mindless anti-Apple prejudice, and regrettably I don't expect that to change any time soon. But anti-Linux prejudice is much milder, I think. A good Linux desktop with Mac OS X's best features (and maybe some of its own) especially if it were backed by IBM, could be the best shot at breaking the Windows stranglehold on the corporate desktop.
Re:Someone should tell Apple (Score:5, Interesting)
True.
and after that you'll be discovering new and better ways to do things and saying, "That's so cool! Why didn't Microsoft ever think of that?"
Uh, I had the opposite response: "Why in the f*&k did they do it that way?!" Keyboard control is practically nonexistent. And Finder blows... damn, I hate the way it works. It doesn't let you see where you are in the filesystem very well. It's just awkward and slow to use.
The killer feature Apple has is that they have a GUI for everything in the system and it hides a lot of complex stuff so the end-user doesn't have to worry about every little detail (of course sometimes that backfires when I simply can't do something because it's hiding the details).
Back on topic...
I've found ROX is very similar to Finder. I hate it also.
The ROX "all in one directory" is the exact same concept as Apple bundles. I can't believe how many people don't even know what they are. I guess that's what happens when you hide all the details. Anyway, the bundle concept is pretty cool. Just copy/move a directory to install your application.
Re:Someone should tell Apple (Score:3, Insightful)
I think Linux would win a lot more converts if KDE and GNOME were less like Windows. Especially in regards to Lindows, I think it will eventually end up making Linux a generic Windows in the eyes of potential users. Just go into any $1-or-less store and you'll see what I mean: a great deal of the packaging resembles name-brand packaging found in grocery stores. Sure you might get some lo
Re:Someone should tell Apple (Score:4, Interesting)
And yet at every point in this history, the Mac was struggling against the dual prejudice of "M4XZ 5UX0RZ" from the script-kiddie brigade (and don't underestimate these people; even now, I suspect that Aunt Tillie is likely to go to her 17-year-old nephew for computer buying advice) and "Apple makes toys" from the suits. And even now, that the Mac OS has an industrial-strength Unix base and the hardware actually offers better price/performance than any comparable brand-name PC at all but the lowest of the low end, you still hear variants of these tired old prejudices trotted out every day.
My advice for the Linux desktop developers is: please, please, rip off everything you can from the OS X desktop. But don't ever admit to anyone where you got the idea. We Apple geeks will know regardless, and sit back in our half-bitter, half-proud glory. We've got plenty of practice. [1/2 g]
Re:Someone should tell Apple (Score:5, Insightful)
Re:Someone should tell Apple (Score:4, Insightful)
The big reason was that enough of the details of the Windows operating system were available that people could actually do something. Apple wanted to be the single source for most of the software as well.
From Apple's point of view, users were basically consumers of both hardware and software products.
With the PC and many other competing systems, the barriers to entry were much lower -- you could also be a producer of software as well.
Everyone I know who was into the Macintosh was strictly a user, with little idea of how the software worked and no inclination to learn to write their own software. Everyone with an interest in writing software was using other computers and operating systems.
With OSX, I think there is finally room for the technically savvy users to do something more with their Macintosh systems than to just run programs from Apple and other software vendors.
Re:Someone should tell Apple (Score:3, Interesting)
I can second that as well as the parent. I've only met a few people who had any interest in developing software on the Mac, several people who were interested in developing multimedia on the Mac, and countless many people who just want a c
Re:Someone should tell Apple (Score:3, Informative)
Just that will do wonders to equalize the eMac's slower CPU. And if you're comparing the eMac to a Celeron, the CPU ain't even slower.
Another thing to consider is the fact that Apple uses high-quality tubes in their CRT's. This is
Re:You should get out more (Score:3, Insightful)
Which is basically true. I suppose most people could probably even manage to use Bash to launch programs if they could still run their other programs just as well.
Re:You should get out more (Score:4, Offtopic)
My wife's great-grandfather (81 yrs. old) just "got a dell dude" for under $700. For that price he got a 2.2GHz P4 with 256MB memory, 40GB 7,200 RPM HDD, 10/100 nic, winmodem, an OK Intel "extreme" graphics controller, 17" monitor, keyboard, scroll mouse and a scanner. Apple does not offer anything in the _average_ price range of today's computers.
I just built my own computer for a little under $500 by just buying the parts I needed. AMD 2500+ w/fan, 512 MB PC2700, 120GB 7,200 RPM 8MB-cache HDD, 64 MB GeForce 3 Ti 500, DVD+-R/RW drive, new case and new KT600-based mobo, all for under $500. I was able to shop around to get the best price, a feature not available from Apple.
What does Apple offer for under $700 that can perform just as well? Nothing. IMO, a good performing Mac does not cost less than $1,200 or so.
Apple put themselves into a niche market based on price and they seem happy there.
I thought about getting an eMac before I purchased parts to build a new computer. However, in the end, it came down to getting the best value for my money. I would have had to spend $300 more for the eMac and had a computer that was considerably slower than what I could get in a PeeCee for $500.
Re:You should get out more (Score:4, Insightful)
I'm sure most people you've met haven't paid for Microsoft Office, which merely strengthens my point. Most people pirate their software. That's why $500 PCs seem like such a bargain; people aren't paying for the software they install on it.
What's funnier is all the people, including yourself, trying to tell me about OpenOffice (as if I didn't already know). Face facts, people. The average person installs a pirated copy of Microsoft Office. Stop fooling yourself because you aren't fooling anybody else. You and I both know that 99% of those $500 PCs are going to be running $2000+ worth of pirated software within the week.
The point, as always, is that with a Mac you are fully legit from the start. With the $500 PC, unless you run Linux and OpenOffice (which by all reliable sources is less than 1% of the desktop market) then you're up for at least another $500 to get the basic necessities of software. Anybody who mentions "OpenOffice" and "Linux" is ignoring the reality of the situation.
Re:Someone should tell Apple (Score:3, Informative)
All the various bits and pieces of a well-formed Mac app go into a 'package' (not an RPM). See it here [apple.com]. A double-click on that folder launches the app, deleting the folder removes the app, and updates need only replace the affected files within the app.
When developers take the time to use this structure, it works really really well. Unfo
Re:Someone should tell Apple (Score:4, Interesting)
The only shared libraries are the ones built into the OS. (The OS was on ROM, so version incompatibilties didn't really exist like they would with the disk based OS's we use today).
There were special files placed in each directory (!boot and !run).
!boot was executed when the file manager loaded the directory, and was responsible for replacing the application's default 'dir' icon with one that represented the application's function.
!run was executed when the user launched the program by double clicking on the folder.
It was possible to open the directory rather than execute the application; IIRC the Shift key was used in conjunction with a mouse click.
Disclaimer: It's about 15 years since I used the above system, so details may be inaccurate.
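ROX carries the same idea forward: its AppDirs are launched through an executable called AppRun inside the directory, playing roughly the role !run played above. A minimal sketch (the directory name and echoed message are made up for illustration):

```shell
# Build a minimal ROX-style application directory.
mkdir -p MyApp
cat > MyApp/AppRun <<'EOF'
#!/bin/sh
# A real AppRun usually execs the app's actual binary, found relative
# to its own location; here a stub stands in for the application.
echo "Hello from MyApp"
EOF
chmod +x MyApp/AppRun

# "Launching" the app is just running the directory's AppRun script,
# which is what the file manager does when you click the directory:
./MyApp/AppRun
```

Removing the application is then exactly the `rm -r MyApp` the summary describes.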
Re:Someone should tell Apple (Score:3, Interesting)
Application specific files and user preferences all live in the program's directory. However, 3rd party libraries and so on tend to be stored in the system folders, which IMO gives the best of both worlds - applications can share common code, but application specific code and data are easily removed, and not strewn about all over the place (libraries tend to be small on
waste? (Score:3, Insightful)
Isn't this already being done with apt-get? I just think Linux needs a more user-friendly updating service. I hate to say it, but Windows is much better at taking completely computer-stupid people and letting them screw up their own PCs, instead of having to call a family member to do it for them.
Re:waste? (Score:5, Informative)
Consider most people do have broadband now (Score:5, Insightful)
Re:waste? (Score:3, Interesting)
Sounds a lot like Java Web Start to me.
Re:waste? (Score:5, Interesting)
All my binaries are statically compiled for the downloaders... and I NEVER get a complaint that my apps don't work. In fact, I get more comments on how my Linux app binaries work every time, no matter what, which is a stark difference from most of the other Linux stuff out there.
The typical response is "your binaries work every time... why can't other OSS developers do that?"
Re:static or what? (Score:5, Informative)
Statically linked binaries include all the libs and dependencies, along with the binaries used to actually 'run' the program, in one fat package. Depending on what you're packaging, this can add a shitload of weight to the package.
Dynamically linked binaries expect your system to already contain their dependencies. They have the benefit of giving you a small, tight package, but they don't always work right away; i.e., you, the user, have to hunt down the packages it needs, or apt or rpm has to handle that for you.
It's a trade-off either route you choose. Statically linked binaries add bloat but usually work great without user or system intervention. Dynamically linked binaries are smaller and bloat-free but depend on you, your package manager or something else to make sure it works.
The typical stance of developers has been to build good packages that are small and dynamically linked. After all, what's the point of having 20 copies of a common system library that you may have had since your OS install? That's just bloat. Ultimately, the best developers, in my opinion, give you the choice when you go to download. Click here for static, here for dynamic.
Random Thoughts on Libraries (Score:3, Informative)
In older versions of Windows, this led to some really hard to track down flakey behavior. Suppose one application used a non standard version of some system li
Grammer Trolling my own post (Score:3, Funny)
I even previewed, but it's an easy one to miss.
You only have to download once (Score:4, Interesting)
Re:I thought we wanted people to reuse code? (Score:4, Informative)
The apps are all self-contained in their own directory; binaries, docs, source code and all. * * * This method of partitioning applications into their own directories also makes installing multiple versions of any application trivial.
What happened to the idea that we wanted programmers and users to share libraries and code? To solve rather than avoid dependency problems?
Applications are self-contained in that everything from a single package is in a single directory, rather than being spread over /usr/bin, /usr/share, etc. They can still depend on other packages.
Without Zero Install, this means that although installing an individual package is quite easy, you may then have to install dependent packages in a similar fashion. With Zero Install, you get automatic dependency resolution and freedom from install scripts.
Potential for unpublishing apps? (Score:5, Interesting)
Slashdot has previously covered Rox here [slashdot.org].
But one thing I wonder about Zero Install: what if you launch an application, it needs a piece that you don't have cached, and the server hosting it is down? Is it possible for a maintainer to unpublish an application?
Re:Potential for unpublishing apps? (Score:4, Informative)
Zero Install can download from mirrors, peer-to-peer, etc, provided it gets the master index with the GPG signature from the main server.
If you want to get the master index from a backup server, you need manual intervention (root needs to indicate that the backup server can be trusted).
However, since the signature part is small (about 1K), a single trusted backup site (debian.org?) could easily host every index in the world. The rest of the data can come through peer-to-peer, etc.
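As a rough sketch of this "tiny trusted index, untrusted bulk data" idea, the following uses a checksum file standing in for the GPG-signed index (file names are invented for illustration; real Zero Install uses GPG signatures, not bare checksums):

```shell
# The master server publishes only a small trusted index; here a
# checksum file plays the role of the ~1K GPG-signed index.
echo "pretend application payload" > app.tar
sha256sum app.tar > index.sha256   # tiny: this is all the trusted host serves

# The payload itself can come from any untrusted mirror or peer; the
# client accepts and caches it only if it matches the trusted index.
sha256sum -c index.sha256 && echo "payload verified, safe to cache"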
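As a rough sketch of this "tiny trusted index, untrusted bulk data" idea, the following uses a checksum file standing in for the GPG-signed index (file names are invented for illustration; real Zero Install uses GPG signatures, not bare checksums):

```shell
# The master server publishes only a small trusted index; here a
# checksum file plays the role of the ~1K GPG-signed index.
echo "pretend application payload" > app.tar
sha256sum app.tar > index.sha256   # tiny: this is all the trusted host serves

# The payload itself can come from any untrusted mirror or peer; the
# client accepts and caches it only if it matches the trusted index.
sha256sum -c index.sha256 && echo "payload verified, safe to cache"
```

A tampered mirror copy would fail the check, which is why only the index host needs to be trusted.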
This sounds perfect... (Score:5, Interesting)
Hopefully, this takes off in more of the 'newbie oriented' distros so that we can say "Just type cp
I still would like to know how they plan on fixing library dependencies, but
Re:This sounds perfect... (Score:4, Insightful)
Re:This sounds perfect... (Score:4, Insightful)
Are you guys seriously claiming that first
a) finding right foo for your system,
b) downloading foo,
c) knowing where it went,
d) knowing where to drag it,
e) actually dragging it
is simpler than cryptic
a) "install foo"?
Even if you automate all those steps and make a piece of software with just a list of available applications and a place to drag them, you still need to know how to start that, and it's not radically different from or easier than how it's already done in most cases (e.g. a checkbox in front of the name and an install button somewhere).
Re:This sounds perfect... (Score:5, Informative)
a) finding right foo for your system,
b) downloading foo,
c) knowing where it went,
d) knowing where to drag it,
e) actually dragging it
is simpler than cryptic
a) "install foo"?
[ b) enter root password and hope it doesn't mess up your system ]
a) You don't have to find the right version for your system; just run the application. It will try to access the appropriate binary, and that will get cached.
b) Downloading happens automatically, just like viewing a web page through a web cache.
c) It goes in the cache directory, whose structure mirrors the URI scheme (eg /var/cache/zero-inst/gimp.org/bin/gimp). You shouldn't care, though, any more than you care about the structure of squid's web cache.
d) Drag it where you want it. On your desktop, panel, start menu, etc.
e) You could just run it where it is, without creating a short-cut at all.
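A small sketch of the cache layout being described, using a scratch directory in place of /var/cache/zero-inst and a stub standing in for the real gimp binary:

```shell
# Simulate the Zero Install cache: the directory tree under the cache
# root mirrors the URI of the site the software was fetched from.
CACHE=./zero-inst-cache   # stands in for /var/cache/zero-inst
mkdir -p "$CACHE/gimp.org/bin"
printf '#!/bin/sh\necho "running gimp (stub)"\n' > "$CACHE/gimp.org/bin/gimp"
chmod +x "$CACHE/gimp.org/bin/gimp"

# "Running" the cached app is just executing it out of the cache tree:
"$CACHE/gimp.org/bin/gimp"
```

In the real system the first access triggers the download transparently; from then on the cached copy is what runs.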
Piggyback on which P2P network? (Score:4, Insightful)
Why can't we list repositories on a P2P network, and let a user connect to this network to constantly update their repositories, in the same way that eMule works?
I've tried eMule. I don't want to have to sit in a queue and wait for 1,759 other people to get something just because I told the file manager to start an app. What improvements would you make to the architecture of the network? No, BitTorrent doesn't scale well for small (<10 MB) files.
Besides, P2P doesn't work well for residential or university dorm users who can't take incoming connections.
Re:This sounds perfect... (Score:4, Informative)
Sorry to spoil your arguments, but each of your starting assumptions is wrong ;-)
Just a bad idea taken to its extreme.
If you could point me to the places on the site where you got your information, I'll try to fix them / make it clearer. Thanks.
This is why... (Score:5, Insightful)
If you can make more money, do harm. (Score:3, Funny)
The full name is "Windows Registry Copy Protection and OS Degradation Scheme". It's part of the "Treat all customers like criminals because some are criminals" Initiative.
It's not quite that simple (Score:5, Informative)
Actually, it hasn't. Ask any Mac pro: applications started making "library" files that went into the System folder (or worse, programs like Norton Utilities insisted on putting libraries into the Extensions folder, which was not what Apple told developers it was for). Apple caved in, and 9.x started sprouting "Application Support" folders, a "Libraries" folder, etc. Developers just couldn't wrap their brains around the single-file, applications-don't-mess-with-the-system-folder model. Oftentimes, commercial programs would blatantly disregard Apple's filesystem guidelines. Extensions often had such weird names that Cassidy & Greene developed an extension manager with a database of all the known files so you could figure out what the hell stuff was.
While you tout OS X as better than Linux or Windows, as an experienced long-time Mac user I saw OS X as a step down from the old Mac OS with regard to filesystem simplicity. Applications now install stuff into zillions of different places. Virtually none of their installers ask if you want to install just for your user (i.e. using your Library, Application etc. folders) or install system-wide (a few - VERY few - do). Application installers that have no business needing my password ask for it; why does Acrobat Reader need sudo to install itself into Applications? Answer: it doesn't, but it's probably saving some prefs file somewhere it shouldn't.
Even worse... you can install packages using a "package system", but Apple will be damned if they'll give you a way to UNINSTALL a package, system or otherwise. Want to remove all the localization crap you forgot to turn off during system install? You have to download a third-party app to remove almost a gigabyte of files from your system, instead of just going into a "Software" panel and clicking remove. Windows has had this for years, its only flaw being that it calls the developer's uninstall program, which oftentimes doesn't work, especially if you've deleted the app folder but nothing else.
Another side effect of the multiple-files problem is added complexity; the number of files in the filesystem has ballooned enormously, because instead of an application being one big file with a resource fork, it's now at least 3 folders, and oftentimes hundreds (or even thousands) of files. Moving an application used to be easy: you moved one big file, and the Finder just did a straight copy very efficiently. Now it has to copy hundreds of small files, so it takes forever (and amusingly, copying just a bunch of raw non-app files takes about 5 times longer in the Finder than it does via cp or ditto).
Don't get too uppity about not having a registry. OS X uses a number of preference files, and even though they've changed to XML and the like, users are seeing the same problems they saw with OS 9: corrupt preference files causing odd behavior. Remove the naughty pref file, and things start working again. There are now third-party utils that specialize in checking these prefs; if they can do it, why can't it be part of the bootup process?
Oh, and lastly- Apple has made it even more difficult to make a boot disk for your mac to do disk maintenance. It used to be you just copied over your system folder, removed all the extensions, control panels, prefs, etc you knew you didn't need. Now? You need some stupid shareware program to do it, and half of 'em still haven't been updated for 10.3.
Re:It's not quite that simple (Score:4, Interesting)
Exactly. And you have no way of knowing what it's doing with that password. If you're hooked up to the Net, chances are it's (then or later) being cached somewhere inside apple.com, too. Do you know of a way to convince me otherwise? If not, a sensible person would just assume that the password is now known to Apple.
Similarly, when I first got my Powerbook, it had to be sent in for repairs after about a week. (The screen wouldn't come to life.) They wanted my password, of course, and I gave it to them. No problem, I thought; when I got it back, I'd just change my password.
Lotta good that did me. Yeah, I can use my new password when I log in. But nearly everything in the system that asks me for my password will only accept the original one. I've found a few places that packages cache the password, and changed those. It lasts for a while, then one day it wants my original password again. I've found it necessary to keep a record of all the passwords I've used, because I generally have to try them one at a time until I find which one works with a given app.
Your password is cached all over the place by OSX packages, so the only sensible approach is to assume that it's public knowledge, at least to Apple insiders.
This is one reason that I'd never use OSX for any sensitive applications. I have to assume, from the way it handles passwords, that OSX systems are open to anyone at Apple, and to anyone able to bribe the right people at Apple, and to any intruder who knows where they are cached on my machine.
I'd like to be proved wrong. But a mere assertion that I shouldn't worry my little head about it won't convince me. I want proof.
So far, I've never seen Linux software playing fast and loose with passwords like this. I mean, Mozilla will cache passwords for you, but it asks; you can say "No", and it apparently honors that. And it doesn't ask for local passwords, only those demanded by web sites.
Also, there are some linux apps that ask you for the root password because they need to run something as root. But you never have to give the root password. You can always kill the app and start it again under sudo. Then it won't know the password, and won't ask for it because it already has the right permissions, so you know the password couldn't be cached.
Re:It's not quite that simple (Score:4, Insightful)
As for asking for the original password, that is because of Keychain. That one is encrypted with your original password.
As for apple.com caching the password... Well, it is quite simple to prove/disprove that: put the OS X machine behind a firewall, and log any attempts to connect to a machine in apple.com network.
Re:This is why... (Score:5, Insightful)
Classic Mac OS used the resource fork for storing associated files, but still had an OS-wide location for preference files (MacHD:System:Preferences). Sure, no registry, but frankly, LOTS of OSes don't have a registry.
Mac OS X has bundles, which resemble the AppDirs (that ROX uses) a great deal, but OS X got them from NeXT, not OS 9, and NeXT got the idea from RISC OS, which is the OS that ROX is trying to emulate in the first place. Mac OS emulation is the farthest thing from Thomas's mind, I assure you.
The really interesting technology isn't the AppDirs anyway; it's Zero Install, which lets you view the internet as a file system from which you can directly run applications.
Re:This is why... (Score:3, Informative)
System 7 on up to 9.2 wound up with all kinds of scat dribbled throughout the various subdirectories of the System Folder, as well as associated files in the application folder -- but only some applications were to blame -- many classic apps are self-enclosed and install with a drag'n'drop. OS X improves the ratio somewhat with its bundles; an even greater proportion of apps are install-friendly. Some developers still
No, The future is thin clients (Score:3, Interesting)
This aids in system management, resource control, data security, platform independence,
It *is* the future... (and ironically the past... remember VT100s and 3270s?) as it is the right way to do computing.
Re:No, The future is thin clients (Score:4, Insightful)
I *do* remember the good old days of VT100s, and they worked great; the thing that displaced VT100s in our research group was *Macintosh* --- those wascally little SEs and the occasional MacII had such nice software onboard, they were a delight to use. The Macs were in turn partially displaced by DEC RISC machines, which cost more but brought a lot of horsepower to the desktop.
We used to use a Beowulf in our current project, but the blasted Pentia got so fast there was no point. Our real-time processor now relaxes on a single machine.
It's not so hard to imagine the pendulum swinging back to thin clients (perhaps in the guise of wireless PDAs, or in a more sinister form via
Re:No, The future is thin clients (Score:4, Interesting)
As has been pointed out, the primary weakness of a network-based thin-client system is that there exists a single point of failure. So I would propose the following for a corporate computer network:
Two mainframe systems, in physically remote locations, each completely capable of handling the corporate network. The mainframes will serve as the core of the network, but the thin clients will be slightly more than just monitors.
A powerful thin client (I suppose thin client might not be the proper term) is what is needed to handle the reality of an iffy network. The thin client needs to be able to function independently in the short term. It needs to be able to hold all of its currently-in-use software and data in memory, and be engineered to make an emergency dump to some local non-volatile memory in case of a power failure. The key benefits to this "thick client" setup are: a) because it is not an independent PC with the ability to boot and load software on its own, it is not a candidate for theft; b) all data is preserved automatically at a central location except in the case of an emergency, and even then it is recoverable; c) software updates only have to be performed in one place to be deployed company-wide; d) maintenance is simpler, since the thin clients can in theory be made without moving parts (i.e. hard drives) if they use solid-state memory for the gig or two of non-volatile emergency storage they will need. They will be more expensive than a true thin client, but I rather suspect that in bulk the economics would work out, and certainly maintenance costs would provide more than enough incentive.
Re:No, The future is thin clients (Score:3, Interesting)
Well duh... (Score:5, Interesting)
Not like this step hasn't been taken in the past by multiple other software solutions
Re:Well duh... (Score:4, Insightful)
Why would you want to do that?
On my Fedora box, if I upgrade glibc to fix a bug, I want *all* my applications to benefit.
Oh, and disk space is not the reason for having shared libraries -- memory usage is.
Re:Well duh... (Score:5, Insightful)
Granted, once the last application requiring the old library has been updated to use the new one, the old library should be eliminated; but when a major upgrade breaking backwards compatibility happens, most people do not want to wait days or months for the application they have been using to be upgraded. They usually want to be able to continue to do the work that they need to do.
Then again, I could be wrong. Perhaps most other people are happy to sit around on their thumbs.
-Rusty
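The parallel-versions scenario described above is essentially what library sonames already provide on a normal system: both major versions sit side by side in the same directory, and each binary keeps resolving against the soname it was linked with. A minimal sketch (the library names and version numbers are made up for illustration):

```sh
# Two incompatible major versions of a library coexisting in one
# directory, using the usual soname symlink convention.
mkdir -p libs
touch libs/libfoo.so.2.0.1 libs/libfoo.so.3.0.0
# Each soname symlink points at its own real file, so applications
# linked against libfoo.so.2 keep working after libfoo.so.3 arrives.
ln -sf libfoo.so.2.0.1 libs/libfoo.so.2
ln -sf libfoo.so.3.0.0 libs/libfoo.so.3
```

Nothing forces the old version out until the last application that needs it is rebuilt, which is exactly the behaviour the parent is asking for.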
Re:Well duh... (Score:4, Insightful)
I propose we set aside a location on the system to hold subdirectories each dedicated to a single software package. Let's call it /opt.
This is just a silly statement (Score:3)
Oh, sorry, you *only* use a GUI and so click on the application. Well not everyone solely uses a GUI or wants to go searching through dozens of application directories for the specific binary which runs an application.
Re:This is just a silly statement (Score:3, Informative)
Re:Well duh... (Score:4, Informative)
This kind of proposal about scrapping the current directory structure has been discussed ad nauseam on the Filesystem Hierarchy Standard [pathname.com] mailing lists. Here is the Standard Rebuttal against scrapping /bin and /usr/bin:
You can't have your cake and eat it too. Some have suggested the use of symbolic links in
The Standard Rebuttal against scrapping /lib:
Another argument involves the use of 32-bit vs 64-bit libraries. Best practice seems to be making copies of the most recently installed libs in
Rebuttals for getting rid of /usr (i.e., having a One (Partition) Size Fits All approach):
#3 is especially common in large enterprise and government environments. If you've ever talked to someone who admins 1,000 desktops for their department, you'll know what I mean.
On the mailing lists, the use of /package (or /pkg) also has been discussed ad nauseam. Keep in mind that the filesystem hierarchy is designed so that non-local (commercial) packages don't step all over each other when installing. Local (enterprise) software installation can happen wherever the hell you want it to, as long as it doesn't have to play nice with COTS software.
Executive summary: you can run whatever directory structure you want -- I won't stop you. Just expect to hear lots of complaints from your developers and sysadmins. The reason things are the way they are is partially due to industry inertia, but mostly due to the fact that they just work better that way. If you don't like it, go contribute [sourceforge.net].
Re:Well duh... (Score:3, Insightful)
A good idea, here's why... (Score:5, Funny)
Reminds me of an old joke...
Microsoft: Where do you want to go today?
Linux: Where do you want to go tomorrow?
BSD (in this case, OS X): Are you guys coming or what?!?
Arrrggg....Joe dont care ! (Score:4, Insightful)
While I appreciate the poster's enthusiasm, this is not a panacea for Joe User putting Linux on the desktop. What it is, in my opinion, is a scale of compatibility with both hardware and software. I mean, Joe User (or Joe Six Pack) only cares if he can do what he needs to with the apps he wants to, NOT what someone else tells him is a better application. He wants to play his games, surf the web, doodle with his digital pictures and balance his checkbook. Tell me of any GOOD applications the average computer illiterate could use to do his checkbook, edit his pictures, etc. that are as brainless as developers make them for Windows/Mac? ZIP. There are GREAT apps for doing all those things, but in general they are for much more sophisticated users. When Joe can go to CompUSA and buy anything he wants (games, tools, etc.) that will run on Linux and has some support number he can call when he breaks shit, THEN Joe will use Linux on the desktop.
Solitaire and the Sims both work in Linux (Score:3, Insightful)
And those are the world's most popular games. Games are not the major issue. The major issue is being able to download your porn, being able to surf the web, being able to burn pirated software, movies and DVDs, being able to get on AIM or some IM client, and occasionally use a word processor.
This is what 99% of internet users do. They don't run some esoteric application by Microsoft; 99% of people don't use all the features of Word or Office. Most of them wouldn't know the difference between Word Perfect,
Yes (Score:5, Informative)
This is also what Microsoft is trying (Score:5, Informative)
Delete the directory and the app is gone.
This is here now, and although
Microsoft had it and lost it. (Score:5, Insightful)
Then Microsoft got smart (too smart for their own good) and decided it was more "efficient" to use shared libraries and that all such libraries should be kept in the %SYSTEMROOT% folder. This meant that applications stored files in one directory, libraries in the system directory and configuration files who knows where. That's better, isn't it?
After that Microsoft decided that it was too "troublesome" to have all of these separate configuration text files. They got smart here too (again too smart for their own good) and decided that it would be so much "better" to have all the settings in a single monolithic and monumentally fragile registry. (Watch out Gnome)
After all that, installing and removing applications became a nightmare. So they decided that it would be best to have a package management system that managed all installations and removals. They established standards that required the proper use of this package management system for the application to be "Windows certified". Unfortunately for them, the package management system isn't so great, especially when it comes to the registry, and while many vendors do obey the "Microsoft standard", many do not. In fact, the worst offender for not properly using the package management system, and thereby polluting PCs with monumental amounts of cruft, is Microsoft themselves.
So, now Microsoft is trying to implement an "even better" system with their
Re:Microsoft had it and lost it. (Score:3, Informative)
Except gconf is nothing like the Windows registry.
The Windows registry is a single file (the in
Re:This is also what Microsoft is trying (Score:3, Insightful)
Exactly. .NET's management system only works for things running in the .NET environment.
What I dislike about installing software under Linux (Score:3, Insightful)
Re:What I dislike about installing software under Li (Score:3, Informative)
rpm -ivh --prefix ~/whatever packagename.rpm
That only works IF the package is "relocatable". Some packages, quite naturally, are not, but most apps will (or should) be.
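Whether the parent's command will work can be checked up front: rpm records in the package header whether a custom prefix is allowed, and prints "(not relocatable)" in the Relocations field when it isn't. A small sketch (the helper name is made up; it just inspects `rpm -qpi` output on stdin):

```sh
# Succeeds only if the package header does not declare itself
# "(not relocatable)", i.e. `--prefix` should be honoured.
is_relocatable() {
    ! grep -q 'not relocatable'
}

# Typical use, assuming you have a real package file on hand:
#   rpm -qpi packagename.rpm | is_relocatable &&
#       rpm -ivh --prefix ~/whatever packagename.rpm
```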
-DU-...etc...
duplicate detection, copy on write (Score:4, Insightful)
Something like Zero Install should be combined with some form of duplicate file detection or duplicate block detection and sharing. Furthermore, to avoid a lot of tricky bookkeeping, there should be copy-on-write. And that kind of functionality really is best implemented in the file system itself. So, something to think about for the next major release of "ext". (Note that Microsoft is implementing something like this, but they certainly weren't the first to come up with it.)
Note that the same thing should also happen on downloads: you only download application components you don't already have locally. NFS isn't a good protocol for that, but WebDAV could handle it.
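Pending filesystem support, the duplicate-detection half of this can be sketched in user space: hash each file in the cache and hard-link byte-identical ones together. This is illustrative bookkeeping only; it is safe only for files that are never modified in place (a hard link shares the inode), which is exactly why the copy-on-write part belongs in the filesystem.

```sh
# Hard-link byte-identical files under $1 so each unique content is
# stored once. A hidden .hashes directory maps content hash -> inode.
dedupe() {
    dir=$1
    mkdir -p "$dir/.hashes"
    find "$dir" -path "$dir/.hashes" -prune -o -type f -print |
    while read -r f; do
        h=$(sha256sum "$f" | cut -d' ' -f1)
        if [ -e "$dir/.hashes/$h" ]; then
            ln -f "$dir/.hashes/$h" "$f"   # share the existing inode
        else
            ln "$f" "$dir/.hashes/$h"      # first copy becomes canonical
        fi
    done
}
```

After a run, every set of identical files in the cache occupies the disk (and the page cache) only once.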
Like the DOS days (Score:4, Insightful)
Believe it or not, part of the reason why M$ went with the setup.exe installation was that software had become harder to distribute without self-contained setup binaries.
Funny how things come around full circle.
If you have Knoppix try klik (Score:5, Informative)
Similarities to Archimedes (Score:5, Informative)
An 'application' looked like a single file that started with a '!'. It ran as though it was one file, copied and moved as though it was one file. If you used a modifier to open it (Ctrl-click, or something similar), though, it actually opened up as a folder. The app was really made of a number of files - the icon that the application/folder would have, the actual programs, any config files, a script that was run when the program was launched, and another script that would be run as soon as the OS 'saw' the app.
Part of the config would tell the OS what file types the app could handle, so as long as the app had been 'seen' (i.e., its parent folder had been opened), the filetypes would be recognised until the next reboot.
Re:Similarities to Archimedes (Score:3, Funny)
I mustn't be new around here.
Oh Yeah, Great Idea... (Score:3, Funny)
Clearing up a few things... (Score:5, Informative)
The main one is that there are actually two installation systems being discussed in the article:
ROX application directories can be made available via Zero Install. In that case, running the application is a lot like running a program from a network share (but more aggressively cached). Or, you can DnD them onto your local disk manually (without Zero Install).
You can also use Zero Install for non-ROX type applications.
Secondly, when we say that application directories are self-contained, we mean that a single .tgz download corresponds to a single installed directory. Application directories can (and do) still depend on shared libraries (possibly other application directories).
Without Zero Install, after installing an application by drag-and-drop, running it may tell you that you need to install some other library before it will work.
With Zero Install, the application just tries to access it from its fixed location (URI) and it gets fetched.
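For the curious, the ROX application-directory convention is tiny: the directory just needs an executable named AppRun at the top level, which the filer executes when you click the directory. A hand-rolled sketch (the application name and its contents are made up):

```sh
# Build a minimal self-contained application directory.
mkdir -p MyApp
cat > MyApp/AppRun <<'EOF'
#!/bin/sh
# Locate our own directory so the app keeps working wherever
# the user drags it.
HERE=$(dirname "$0")
exec "$HERE/hello.sh" "$@"
EOF
cat > MyApp/hello.sh <<'EOF'
#!/bin/sh
echo "hello from an AppDir"
EOF
chmod +x MyApp/AppRun MyApp/hello.sh
```

Clicking MyApp in ROX-Filer (or running ./MyApp/AppRun by hand) launches it; deleting the directory uninstalls it.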
Re:What about security? (Score:5, Informative)
System security? Nothing. All code runs as you. As for your own security, it doesn't allow any attack that couldn't have been done without Zero Install too.
Reducing the security risk from traditional installation systems (APT, RPM, etc where you're running a downloaded install script as root) was an important goal for Zero Install.
See The Zero Install system [sourceforge.net]
Plugins break this model (Score:5, Insightful)
The trouble stems if you have some kind of base package, which is extensible via some kind of plug-in architecture, traditionally implemented with DLLs under Windows, or shared object library repositories under Unix and varients. Do the plugins form their own "application" or are they part of the application which they extend? What if I want to manage groups of plugins from a common source, independent of the applications extended? Do all applications have to be so isolated that they can only rely on a common base operating system that can't be extended by third parties (which would then be locked into their own application spaces)? What about multiple users sharing the same applications: will their saved files be intermingled?
Blech. Sounds like the cure is worse than the disease.
But, nevertheless, the idea of organizing independent applications in a convenient hierarchy is a desirable one. The trouble is that the traditional filesystem only offers a single hierarchy in which to organize them, and so we struggle to determine the best hierarchy to use. We really need to organize sets of files that comprise a related unit ("file set", if you will, and "application file set", for the specific case of end-user applications) in multiple hierarchies: a new one created for the file set being added, and existing ones that the file set affects.
"Symlinks!"
What's that?
"Symlinks!"
Well, O.K., symlinks kind of solve this problem: pick a canonical location in the filesystem for your file set and create secondary symlinks to the appropriate files. This is a good idea, and has been used for ages to separate the reference to a file in the filesystem from where it is actually stored, but there are drawbacks:
1. Symlinks are one-way. Typically you'll have an application directory full of files and subdirectories, and a bunch of links into that directory tree. What happens if you move or delete entries? Oh, woe to the one who has broken symlinks.
2. The context in which the symlink is interpreted may restrict where the target may be. Consider startup scripts added under /etc/rc.d/... They don't do much good if they link to files in filesystems that haven't yet been mounted. Some restriction on where things have to be canonically installed, depending on how and when they will be used, is apparent. Fortunately, we generally don't have complicated hierarchies of what parts of the filesystem are mounted, but rather just a few: boot, locally mounted, remotely mounted. So this problem is manageable: we can imagine /opt and /usr/opt, the former available on the root filesystem.
3. Application interaction. The trouble with having one application extend the capabilities of another (and the base O/S can be considered as "one application" from the perspective of third-party software providers, other than the O/S provider) is that adding, moving, or removing files can or should affect running applications. Ideally, an action which would leave a symlink dangling should be picked up by any running applications that might care and either delayed until the application can cope, or vetoed. (And, I suppose, --force and --async are your friends here.) Current practice in most package managers is to have pre-install, post-install, pre-deinstall, and post-deinstall scripts that try to deal with this inter-application issue. The problem is twofold: (1) the things necessary to be communicated to other applications are varied, and (2) the manner in which they are communicated differs between applications (never mind different versions of the same application). Ideally, the inter-application interface that deals with new, removed, or relocated external files should be (a) thin, and (b) supported by t
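Drawback 1, at least, is cheap to detect after the fact. With `find -L` a symlink that still resolves is followed (and so no longer tests as a link), which leaves exactly the dangling ones matching `-type l`. A sketch (the helper name is made up):

```sh
# Print every dangling symlink under the given directory.
broken_links() {
    find -L "$1" -type l
}
```

Running it over /usr/local after deleting an application directory shows everything that needs cleaning up.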
Re:Plugins break this model (Score:3, Interesting)
It's a good start -- I've used it as the "official" organizational standard when building an internal custom GNU/Linux distribution.
But, it still enforces a particular hierarchy, generally to keep the traditional operating system components playing nice as new applications are added. There is no
OS X package? (Score:5, Interesting)
And if I get it, just like in OS X, this doesn't mean your application can't use or install other resources in the Library.
Pretty cool, that's 90% of my Linux gripes gone in one big swipe. I hope this can become mainstream. It also means I can stop posting on the importance of simple installers
What about shared libraries and memory? (Score:4, Insightful)
However, memory isn't so abundant. When loading up an app, is the system intelligent enough to recognize that a given library was already loaded into memory from a different directory, and therefore it won't load another copy of the same library?
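On Linux the answer is visible in /proc: read-only pages of a shared object are shared between processes only when they map the same file (the same inode). A byte-identical copy of the library at a second path is a different inode, so the kernel will not share it. A sketch for inspecting what a process has mapped (the helper name is made up):

```sh
# List the distinct shared-object files mapped into process $1,
# taken from the pathname column of /proc/PID/maps.
shared_mappings() {
    awk '$6 ~ /\.so/ { print $6 }' "/proc/$1/maps" | sort -u
}
```

Comparing the output for two processes shows which libraries they genuinely share.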
Re:What about shared libraries and memory? (Score:3, Informative)
I think future versions of Windows know how to scan the disk periodically, find redundant files, and essentially link them together automatically. That's pretty cool - you deliver your app with FOO.DLL version whatever and drop it in your app's directory. If someone else installs a FOO.DLL in their app's directory that matches the exact same bits, the sy
Re:What about shared libraries and memory? (Score:3, Insightful)
This way you could still have systemwide shared libraries that are updatable, but it wouldn't be mandatory to use them if i
Libraries, Preferences Other Issues. (my mini FAQ) (Score:5, Informative)
Q. Do I have to add a bunch of crap to my $PATH?
A. No, you just use a shell that is application directory aware, and it will find the binary just fine if the application directory is in a directory in $PATH.
Q. Will it let me recompile critical applications, either to patch them or optimize them?
A. Sure. Keep three different versions of Apache around: one with mod_perl, one with mod_rewrite, another with mod_php. Optimize for your new Sexium X CPU. Turn on full foo support, even though it's not recommended!
Q. What about apps with hardcoded pathnames?
A. Edit and recompile. HAND.
Q. What about libraries?
A. (From this page [sourceforge.net] on the ROX Application directory system.) Applications link to libraries in
Q. What about versioning?
A. You can keep different versions of an application around in different directories. I couldn't find any information regarding library versioning. Hopefully libraries in
Q. DND Saving? What's that?
A. ROX-aware apps support dragging files from a save box to a directory in a file browser to save. Finally, someone does this right.
Why not solve BOTH of Linux's major problems? (Score:3, Insightful)
The big problems are to make it possible for an average user to install and deinstall first applications, then, peripherals.
In general, any OS is going to need the same kind of information from any class of peripherals. Why can't someone write software to decode the Windows driver information formats and turn the information into something that can be used to configure Linux to use these peripherals?
If someone plugs a USB scanner or digital camera or printer in, why shouldn't Linux ask for, first a native Linux driver, and if this isn't available, a Windows driver disk?
Wouldn't it be nice to be able to buy peripherals based on price and performance and not have to worry if it's usable with Linux or not?
Wouldn't it be easier to write a translation application or several than for the Open Source community to write thousands of drivers individually and for the rest of us to attempt to find them and then try to figure out if that driver will actually work with the distro one is running?
encaps (Score:3, Insightful)
Re:apps contained in their own directories.... (Score:3)
Re:apps contained in their own directories.... (Score:5, Informative)
Re:Going back in time? (Score:3, Funny)
User settings storage in win32 (Score:5, Informative)
It's still very much like this in Windows, in fact, with the "Program Files" directory often containing everything (although "Documents and Settings" is becoming more used for user settings storage). Personally I like the idea. I've always been confused trying to locate various files which belong to a single application in *nix.
Most *n?x apps seem to store all the per-user settings in a dot-file or dot-folder in the user's home directory. In Windows, they're often strewn about in at least three places: C:/Documents and Settings/Me/Application Data/, C:/Documents and Settings/Me/Local Settings/Application Data/, and HKEY_CURRENT_USER in the registry. In addition, a lot of the apps I have installed on my Windows 2000 machine came bundled with peripherals, where the app and a device driver came as part of the same install, the app in C:/Program Files/ and pieces of the driver in various folders in C:/Windows/.
How does Rox handle it?
Re:User settings storage in win32 (Score:3, Informative)
Usually, ~/Choices/ROX-Filer/Options.xml, etc. Choices cascade (/usr/local/share/Choices, /usr/share/Choices, ~/Choices). You can change the location with CHOICESPATH.
In future, we'll probably move over to the (very similar) freedesktop.org base dir system, which defaults to ~/.config instead but is otherwise pretty-much the same.
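The cascade described above amounts to a first-hit search along CHOICESPATH (colon-separated, most personal directory first). A sketch (the helper name is made up; the default directories are the ones listed above):

```sh
# Return the first existing file named $1 along the Choices cascade.
# The body is a subshell so the IFS change stays local.
find_choice() (
    leaf=$1
    IFS=:
    for dir in ${CHOICESPATH:-$HOME/Choices:/usr/local/share/Choices:/usr/share/Choices}; do
        if [ -f "$dir/$leaf" ]; then
            echo "$dir/$leaf"
            exit 0
        fi
    done
    exit 1
)
```

So a per-user Options.xml shadows the site-wide one, and removing the user copy falls back to the system default.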
Re:User settings storage in win32 (Score:4, Insightful)
Why did Microsoft make this move? Not only is it "all eggs in one basket", so it becomes unsafe in case of a crash, but it's also hard to clean up when the installer doesn't do the job 100%, which it almost never does. Often I simply don't dare to, since it could be spread out in more places than HKCU\Software\Company\blah.
I'm not really interested in Microsoft bashing -- just an answer to why the app settings aren't stored in their respective folders. The OS settings could be stored in a win.ini just like before, or any other file structure that might be faster to navigate (like a hash table).
I think the registry *is* pretty fast, and that's one reason, but it still isn't reason enough, since the apps could just store settings in their own directories with a similar structure via the standard registry APIs. Why put it all in a single multi-megabyte mess?
Re:User settings storage in win32 (Score:4, Interesting)
And, dammit, this is as messy as hell. I don't like having hundreds of
</rant>
Re:Screw drag & drop (Score:4, Insightful)
Most people don't want to learn a new packaging system. Most don't want to wait for someone else to package a program into the repos, assuming they even want to package it. The problem with Apt is it relies on someone else saying 'oh, that's great, I'll make some debs.' And it only works for someone with Debian or a Debian-compatible system like Lindows, Xandros, Knoppix, et al. An AppDir works on any distro that has ROX installed, so there's no more bullshit with having to package programs for 20 distros, or 20 versions of a distro. No 'Well, you're using RedHat 8.2, which had this RPM, so here's a recompiled copy that works with that version, but if you have 9.1 it uses THIS RPM, which needs this recompile, or if-.' You just drop the AppDir in, and it works. No muss, no fuss.
Additionally, to install a deb you need to be root. Most people don't want or need to use root. 0install fixes that, and AppDirs make it easy even without 0install.
With AppDirs and 0install, you no longer have any of the problems software on GNU/Linux does (packaging for a number of distros, and pushing it to the user with dependencies seamlessly). Apt only fixes one of those problems.
Re:Screw drag & drop (Score:4, Informative)
With APT, they don't have to. They just have to be able to double-click on the program they want. I've seen lots of Windows users who love the concept and wish Windows had it.
The problem with Apt is it relies on someone else saying 'oh, that's great, I'll make some debs.'
Isn't that true for installers as well? Somebody has to make those too.
And it only works for someone with Debian or a Debian-compatable system like Lindows, Xandros, Knoppix, et al.
Ximian has a nifty tool called Build Buddy [ximian.com] that automates the process of generating packages compatible with not just RPM and DEB systems, but Solaris and HP-UX package managers too.
Additionally, to install a deb you need to be root.
This needs improvement, yes.
Re:Screw drag & drop (Score:4, Insightful)
BTW, I don't doubt APT is good (I have no idea), but the OS X install process is as simple as it gets; anybody could do it, since you don't have to learn new concepts: all concepts are borrowed from the real world.
I don't know many Windows users (or non-technical Linux users) who can manage an OS or even an application install without running into weird problems; I know no Mac user who can't, and who doesn't do those things on his own.
Re:Screw drag & drop (Score:3, Insightful)
Re:Why do we have shared libraries at all? (Score:3, Informative)