Squaring the Open Source/Open Standards Circle 255
Andy Updegrove writes "Before there was Linux, before there was open source, there was of course (and still is) an operating system called Unix that was robust, stable and widely admired. It was also available under license to anyone that wanted to use it, and partly for that reason many variants grew up and lost interoperability - and the Unix wars began. Those wars helped Microsoft displace Unix with Windows NT, which steadily gained market share until Linux, a Unix clone, in turn began to supplant NT. Unfortunately, one of the very things that makes Linux powerful also makes it vulnerable to the same type of fragmentation that helped to doom Unix - the open source licenses under which Linux distributions are created and made available. Happily, there is a remedy to avoid the end that befell Unix, and that remedy is open standards - specifically, the Linux Standards Base (LSB). The LSB is now an ISO/IEC standard, and was created by the Free Standards Group. In a recent interview, the FSG's Executive Director, Jim Zemlin, and CTO, Ian Murdock, creator of Debian GNU/Linux, tell how the FSG works collaboratively with the open source community to support the continued progress of Linux and other key open source software, and ensure that end users do not suffer the same type of lock in that traps licensees of proprietary software products."
Cpt. RMS to the rescue! (Score:2, Troll)
Re:Cpt. RMS to the rescue! (Score:3, Insightful)
This post and the dozens below it are arguing the difference between Linux the open source kernel project and Linux the brand.
When most normal non-dev people talk about Linux, they aren't talking about a kernel separate from the development projects which rely on it; they are talking about Linux the operating-system alternative. Linux is actually a really good brand, and those of you who try to box it into just the kernel are missing the point of this and many other articles like it.
If we think of Linux
Re:Cpt. RMS to the rescue! (Score:4, Insightful)
Open Standard != standards in Open Source (Score:5, Insightful)
To me Open Standards are much more important than Open Source. Open Standards allow Open Source solutions to be created that are compatible with the other solutions.
Re:Open Standard != standards in Open Source (Score:3, Insightful)
It works both ways - having standards in open source solutions allows software under other licenses to be compatible with it
Re:Open Standard != standards in Open Source (Score:3, Informative)
The most common license, (L)GPL pretty much blocks most other licenses.
Re:Open Standard != standards in Open Source (Score:3, Informative)
The most common license, (L)GPL pretty much blocks most other licenses.
I'm not sure I understand what you mean. I've never seen an open standard (documentation) released under a license designed for code. The free standards group (LSB & other free standards) release their licenses under the GNU free documentation license [freestandards.org].
I'm not exactly an IP lawyer, but I don't see any obstacle to writing code to conform to
Re:Open Standard != standards in Open Source (Score:2)
For example, ODF is an open standard that can be implemented by anyone (I think), but the old StarOffice file format wasn't a standard, it just had an open source implementation.
Re:Open Standard != standards in Open Source (Score:2)
The grandparent's argument was that open source implementations of open standards lead to interoperability.
My argument is that such interoperability depends on the open source licensed used for that particular implementation.
If an implementation requires all source code to be open, closed-source vendors are forced to create their own implementations, with the added risk of less compatibility between implementations. In fact there is the added problem of closed source programmer
Re:Open Standard != standards in Open Source (Score:2)
Utterly incorrect. I wrote:
Either way; open source implementation of an open standard does not necessarily result in better interoperability.
No one said that - the idea is that if there's a standard used to create open software, anyone (open or not) will be able to follow said standard
Re:Open Standard != standards in Open Source (Score:4, Insightful)
I totally agree. I've had Java programs compiled with 1.1 still run under 1.5 (aka 5.0). I've also seen mainframe programs run for almost a decade without needing to be recompiled. In the Linux world you can't have a 0.0.0.1 change in the glibc or kernel version without possibly forcing a recompile.
I'm glad it's not just me. Try explaining to anyone outside the Linux community that they can't install a particular program because the RPM they just downloaded wasn't created for their distro, kernel version, or glibc version, and has dependencies on a dozen or so libraries that are either missing on their machine or incompatible with the ones already installed. Now I know that there are a ton of tools out there to help you with this, but they aren't exactly for the casual user.
It's this type of situation that drives people to Windows and will keep Linux as a distant second place.
Now before anyone labels me a Linux hater, I'm not. I just think it could stand some improvement to make it easier to use in order to reach a larger audience.
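The dependency tangle described above can be made concrete. A minimal sketch (pure POSIX shell; the ldd-style sample output and the library names in it are invented for illustration) that flags the shared libraries a binary can't resolve:

```shell
#!/bin/sh
# list_missing_libs: read ldd-style output on stdin and print the
# names of shared libraries the dynamic loader could not resolve.
# ldd prints unresolved entries as "libfoo.so.1 => not found".
list_missing_libs() {
    awk '/=> not found/ { print $1 }'
}

# Invented sample output for a binary with one unmet dependency;
# in practice you would pipe in `ldd ./some-binary` instead.
printf '%s\n' \
    'linux-vdso.so.1 (0x00007ffd00000000)' \
    'libgtk-x11-2.0.so.0 => not found' \
    'libc.so.6 => /lib/libc.so.6 (0x00007f0000000000)' \
| list_missing_libs
```

For the sample input this prints `libgtk-x11-2.0.so.0` - the one library the user would still have to hunt down by hand.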
Re:Open Standard != standards in Open Source (Score:3, Insightful)
To be honest, I really don't understand why Linux does not have a standard packaging system. And I don't understand why packaging formats need to be so complicated, either.
Imagine if there was a unified packaging system- you could install fedora on
Re:Open Standard != standards in Open Source (Score:2)
Not to mention that if you are running a properly configured Beagle/Slocate, you don't NEED to....
It aint open standards that "killed" Unix (Score:5, Insightful)
Unix was killed by the high price of licenses. Unix during the early 1990's was supposed to be for the big boys --- the enterprise customers willing to pay up to 10,000 USD per seat for a Unix license.
With the license for Windows NT starting at less than 1000 USD, the enterprises which formed the majority of the paying Unix customer base soon found a way to make do with NT and delete their Unix installations.
It wasn't open standards and the fragmentation that did Unix in; it was plain hubris among the Unix vendors, who could not fathom a future where a cheaper Windows NT would replace the robust, stable and widely admired Unix they were selling.
Re:It aint open standards that "killed" Unix (Score:5, Informative)
UNIX vendors also basically stopped workstation development (X11, Motif, CDE etc) in the early 90s when NT showed up, giving up the desktop without much of a fight.
Re:It aint open standards that "killed" Unix (Score:3, Insightful)
I don't think it was Unix, I think it was Java's "safe sandpit" that caught people's interest. Portable C code was common well before Java, and I recall my first impression of Java was... it's an update of that P-CODE [wikipedia.org] thing they taught at uni.
As for the "downfall of Unix", the biggest influence was definitely cost; I recall that during the mid 90's Unix meant you had a fat wallet. Often the same applications that ran on say NT
Re:It aint open standards that "killed" Unix (Score:2)
Re:It aint open standards that "killed" Unix (Score:2)
It wasn't open standards and the fragmentation that did Unix in; it was plain hubris among the Unix vendors, who could not fathom a future where a cheaper Windows NT would replace the robust, stable and widely admired Unix they were selling.
At that time, IIRC, there was a lot of criticism that Unix wasn't as robust or secure as
the mainframes.
It was a Number of Things (Score:5, Insightful)
There was an arrogant attitude toward PC hardware in the mainframe and workstation market. If you wanted to do real computing, you wouldn't use a PC -- those were just toys! Drop 15 grand on our workstation and then we'll talk. Well, PCs WERE toys for a few years, but you had to have blinders on not to see that they were going to make progress. That arrogant attitude persisted while the 386 and then the 486 came out, and all the while Windows NT and, to a smaller extent, OS/2 started stealing more and more business from the traditional UNIX vendors.
And while the UNIX vendors arrogantly believed they had a better product, not a single one of them ever made an effort to push the GUI portion of UNIX beyond CDE (Well... except NeXT and SCO, but SCO's offering was a step back from CDE.) Gnome, KDE and Enlightenment were all efforts of the Open Source community and to my knowledge Sun's really the only one of the old guard to even consider using one of them. Hell, even Afterstep is a step up from the commercial vendors' offerings.
In the end it was cheap Intel hardware and cheap Intel operating systems that did the old guard in. Windows on a Pentium made a server that worked well enough that it was impossible to justify a price jump of an order of magnitude to get just a little bit more. And I doubt there are more than a handful of companies that would even consider putting UNIX on an employee's desk. Had the old guard of UNIX vendors played their cards right and embraced PCs as a natural extension of their high-end UNIX systems, things might have gone differently.
The current situation is rather interesting. The cost of Windows licenses is significantly more than the cost of Linux licenses. Microsoft can't really compete with free, so they have to find other avenues of attack. That, more than fragmentation, is the biggest danger to Linux. Most commercial companies only deal with RedHat or SUSE anyway. I don't know what the future will bring, but we most definitely live in interesting times.
Does it handle KDE/GNOME install paths already? (Score:4, Informative)
All mainstream package formats have the full installation path hard-coded in the archive. LSB does not address this yet. The other problem with RPM, namely binary compatibility between different library versions, is already solved by compiling with apbuild [autopackage.org]. This works surprisingly easily, and allows me to provide one single package that can be installed everywhere [1].
[1] I recommend compiling packages on Slackware, because Slackware ships most packages without patches. Compiling an app on SuSE, for example, made the binaries depend on ABI changes caused by SuSE patches.
Re:Does it handle KDE/GNOME install paths already? (Score:3, Insightful)
I simply don't understand why this has never been addressed.
The Linux community is always talking about expanding and competing with the Windows world, but they shoot themselves in the foot on trivial details like this.
The response I often get when I ask why don't we change to something that makes more sense is, "if you want a product more like Windows, then use Windows. We don't want our product dumbed down."
However, just because a product is difficult to use does
Re:Does it handle KDE/GNOME install paths already? (Score:3, Insightful)
Wrong. Nobody switching to Linux gives a shit what directory their KDE is installed in. Believe it or not most people have more important criteria that they demand from their computers, and are much more likely to switch back to Windows if they are required to look in their KDE directory in the first place.
This is the OSS Godwin. People turn
Re:Does it handle KDE/GNOME install paths already? (Score:3, Insightful)
She was incredibly pissed at Windows and wanted to increase her geek cred (which is substantial). However, she is back to Win x64. She is pretty smart. But she'd download an RPM, try to install it, then have no clue where the program was because it didn't create an entry in the menu, and she'd have no clue where the program directory was.
Are you telling me that isn't something that would annoy someone at
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Yes, we all know RPM sucks... so why didn't you just give her Gentoo al
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:3, Interesting)
Only if it can be added in such a way that it has zero impact on those of us who are not interested in it. Nothing pisses me off more than when I have to relearn how to configure fundamental subsystems because they've been changed to make things easier for users of software that I don't use.
Out of curiosity, why didn't you show your girlfriend the find command?
If that wouldn't have increased her geek-cred, then not
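The find command can in fact answer the "where did the RPM put everything" question above, via the old marker-file trick; a minimal sketch (the temporary directory here stands in for the filesystem being installed to):

```shell
#!/bin/sh
# Touch a marker, perform the install, then list every regular file
# created or modified after the marker's timestamp.
dir=$(mktemp -d)
touch "$dir/marker"
sleep 1                       # make sure new files get a later mtime
echo demo > "$dir/newfile"    # stand-in for "install the package here"
find "$dir" -newer "$dir/marker" -type f
rm -rf "$dir"
```

For a real install you would touch the marker somewhere like /tmp and run find from / afterwards; slower, but it needs no package-manager knowledge at all.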
Re:Does it handle KDE/GNOME install paths already? (Score:2)
The funny thing is that I put on Beagle to speed up searches on her laptop, but I don't think she ever used it. Like me, she is anal in how she organizes things. Having a good file structure means you don't have to hunt for items. All Beagle did was take up over 2 gigs with indexing data. Sheesh!
(Side note, but I think an even happier medium is virtual folders
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:2)
But that wasn't an answer to my questions.
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:3, Interesting)
When I want a computer as a flexible environment, however, in which I will install and uninstall games, media players, various productivity applications that I may be trying out, and the like, I just can't imagine going back to Linux. In the 4+ yea
Re:Does it handle KDE/GNOME install paths already? (Score:2)
They made their program the way they made it, and who is anyone else to suggest it should be changed?
Because of the informal nature of OSS software, and the lack of standards, the products often look unprofessional, despite the functiona
Re:Does it handle KDE/GNOME install paths already? (Score:2, Insightful)
Exactly! You seem to have missed the point poster was getting at. It may well be necessary to look in the KDE, and other, directories to simply get KDE to install on a given system, because the files get
Re:Does it handle KDE/GNOME install paths already? (Score:2)
I agree partially with your post in that a standard directory structure would be useful, but on the other hand I think it's very important that operating systems have the freedom to reorganize things as they see fit.
Personally I think all programs should be flexible enough to be relocated easily without being recompiled, but that's another story.
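Relocation without recompiling is mostly a matter of never baking absolute paths in; one common trick, sketched here (readlink -f is a GNU coreutils extension, so this is Linux-flavoured), is for a program to derive its install prefix from its own location:

```shell
#!/bin/sh
# Derive the install prefix from the script's own path, so the whole
# tree (bin/, share/, lib/) can be moved as a unit without edits.
self=$(readlink -f "$0")      # absolute path of this script
bindir=$(dirname "$self")     # .../prefix/bin
prefix=$(dirname "$bindir")   # .../prefix
echo "data files live under: $prefix/share"
```

Installed as prefix/bin/run, the script reports prefix/share no matter where the tree is copied to.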
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Re:Does it handle KDE/GNOME install paths already? (Score:2)
However, they should all go in a standard place. Windows has 'Program Files'; if you want a binary, you know exactly where it is. Unless it's something like PHP, which only installs into c:/php and breaks if it goes elsewhere. That is rubbish in the Windows world.
The same (doubly so) applies to configuration files. Whereas nearly everything ends up in
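A common convention for configuration files (sketched here; the exact lookup order is an illustration, not something any single standard mandates) is to check a per-user directory before the system-wide one:

```shell
#!/bin/sh
# find_config NAME: print the first matching config file, preferring
# the user's config directory over the system-wide /etc.
find_config() {
    for dir in "${XDG_CONFIG_HOME:-$HOME/.config}" /etc; do
        if [ -f "$dir/$1" ]; then
            printf '%s\n' "$dir/$1"
            return 0
        fi
    done
    return 1
}

# Self-contained demo with a throwaway home directory:
unset XDG_CONFIG_HOME
HOME=$(mktemp -d)
mkdir -p "$HOME/.config"
touch "$HOME/.config/myapp.conf"
find_config myapp.conf
rm -rf "$HOME"
```

The point is that a user override never requires touching /etc, and the program never hard-codes either path.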
Re:Does it handle KDE/GNOME install paths already? (Score:2)
Tell me please: why can't we symlink everything? (Score:3, Interesting)
So different distros will put their files in different places. (Actually, I can't believe programs will actually have the library locations hard-coded in, but whatever; I'll accept that the alternatives have some disadvantages.) So Ubuntu will store its WonderfulLibrary.so in
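Symlinking does work for exactly this case; a sketch using the comment's hypothetical WonderfulLibrary.so (all paths here are a throwaway illustration, not real distro layouts):

```shell
#!/bin/sh
# Give a library installed under one prefix a second, "expected"
# path via a symlink, so either layout resolves to the same file.
root=$(mktemp -d)
mkdir -p "$root/usr/lib" "$root/lib"
echo 'library contents' > "$root/usr/lib/WonderfulLibrary.so"

# Distro A looks in /lib, distro B ships it in /usr/lib:
ln -s "$root/usr/lib/WonderfulLibrary.so" "$root/lib/WonderfulLibrary.so"

cat "$root/lib/WonderfulLibrary.so"    # reads through the symlink
rm -rf "$root"
```

The practical catch, as other replies note, is that the loader also cares about library versions and ABI, which a symlink can't paper over.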
Fear of fork. (Score:5, Interesting)
1) Licensing that allows a fork.
2) Frustrated users who feel like they can't shape the future of the product via existing channels.
This is why there are at least three forks of Java and none of Perl. I suppose one could argue that the forks of Java are not true forks but attempts at re-engineering, but the end result is the same.
Will Linux fork like Unix? Well, in a way it already has - there is a real-time kernel, different kernels for devices, etc. - but not in the way the article talks about it. The article isn't talking about forks per se, it's talking about distros. The author seems to have missed the point that the Unix forks were actual forks in the kernel, not "just" distros.
Weird article, really. Kind of pointless too.
Re:Fear of fork. (Score:2)
Re:Fear of fork. (Score:2)
Just make sure it is statically linked
Re:Fear of fork. (Score:2)
That's a distinction without a difference. You, as a user or developer, never interact with the kernel. The closest you are likely to come is calling a libc function that is a thin wrapper around a system call. If the included libraries are different, or they use different versions of the compiler which conform to a different ABI, then the same code will not easily run on both.
Oh, and the
Re:Fear of fork. (Score:5, Insightful)
As an OS Linux is horribly fragmented. That is why people flock to a popular distro like Ubuntu, regardless of whether or not it is the best distro.
Personally, I do believe that the community needs fewer distros. There should be three methods for installing, period. Something like apt-get, emerge, and then installing from a downloaded RPM. You shouldn't see different binaries for different distros. A Linux app should be a Linux app, period.
If we had true standards, we'd have fewer distros. But how many methods and standards do we have for installing programs? For file structures? For menu structures?
In what I believe to be a perfect world, there would only be maybe 8 major distributions of Linux.
Home/Personal
Developer
Media Center
Server
For each of those 4, you get a focus on either GTK or QT apps. Regardless, the file structure, configuration files, menu structure, etc. would be the same for every distro.
And while this will NEVER happen, I think we need one major development kit, instead of GTK vs QT. When it comes to aesthetics, visual style and usability, I can certainly understand people wanting a choice between Gnome and KDE. But when I design an app, I should build it on one toolkit, and then it should work on both Gnome and KDE, letting Gnome/KDE handle how it looks, etc. As it stands now, the dependency chains are ridiculous. If I use KDE but want a few GTK apps like Firefox or GAIM, I have to install half of Gnome.
Re:Fear of fork. (Score:2)
You missed out 'Ultimate'.
HTH
Re:Fear of fork. (Score:2)
There are plenty of people who want a completely streamlined, tweaked out build specifically for gaming.
Re:Fear of fork. (Score:2)
One of the big problems with Linux is nonfree formats. Simple reassuring things like the ability to play an MP3 or watch a DVD are stumbling blocks for new adopters of Linux.
I understand the free and open principles, and how MP3 has legal problems while Ogg Vorbis is patent-free and anyone can use it. It still doesn't help much when you want to play an MP3 or play a DVD.
Autom
Re:Fear of fork. (Score:2)
I agree with a great deal of what you said, Enderandrew, but I have to disagree with some. This has all been gone over at DistroWatch time and again.
Yes, there should be standards. We already have good installers and package managers. The Fedora installer works great, so why does every distro need a different one? And like you said, a Linux app should be a Linux app. You ought to be able to apt-get what you want and install a package you find at, say, IceWalkers without so much trouble. A standardized file str
Re:Fear of fork. (Score:2)
How is that really helpful?
Enforce Binary Compatibility with Fat Binaries (Score:3, Interesting)
Amen! Not only is it frustrating figuring out where all the config files are, but having an app fail to install or work because of dependency or library version issues is also frustrating. I remember having fits trying to install Oracle 8i (circa 1999-2000) and having the install fail because the linker was choking over libc version incompatibilities and LD_ASSUME_KERNEL settings. Of course, all the problems cou
Re:Fear of fork. (Score:3, Informative)
And while this will NEVER happen, I think we need one major development kit, instead of GTK vs QT. When it comes to aesthetics, visual style and usability, I can certainly understand people wanting a choice between Gnome and KDE. But when I design an app, I should build it on one toolkit, and then it should work on both Gnome and KDE, letting Gnome/KDE handle how it looks, etc.
That's nearly how it works now, and the Free Desktop folks are pushing it closer to that ideal all the time. Programmers should
Re:Fear of fork. (Score:2)
Well, you can always download it. I don't suppose for one minute that the OP is suggesting an arbitrary restriction, like MS's server products (used to?) have - eg MS SQL Server refusing to install on XP because it's not a Windows Server OS.
Re:Fear of fork. (Score:2)
1) Licensing that allows a fork.
2) Frustrated users who feel like they can't shape the future of the product via existing channels.
I'd add:
3) A lack of a passable alternative
There wouldn't be much point forking product X if product Y met the requirements.
Kernel vs userland in Unix Forking (Score:2)
The kernel is largely irrelevant for portability. Userland headers, C library compatibility, file locations, compiler options, linker options, Bourne shell incompatibilities, C compiler incompatibilities, C compiler and library bugs, word
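Bourne shell incompatibilities alone trip up plenty of install scripts; a small sketch of the portable forms (bash happily accepts the non-portable alternatives, a strict POSIX sh does not):

```shell
#!/bin/sh
# Portable string comparison uses a single '=' inside [ ];
# '==' is a bashism that strict POSIX shells reject.
s="linux"
if [ "$s" = "linux" ]; then
    echo "portable comparison matched"
fi

# Prefer $(...) over legacy backticks for command substitution,
# and printf over echo -n, whose behaviour varies between shells.
year=$(date +%Y)
printf 'year: %s\n' "$year"
```

A script written this way runs the same under bash, dash, ksh and the historical Bourne shells that the various Unixes shipped.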
Linux supplanting NT??? (Score:2, Insightful)
When did this happen? I must have missed it.
Re:Linux supplanting NT??? (Score:3, Informative)
But who IS certified? (Score:5, Interesting)
Could someone please explain it to me?
Re:But who IS certified? (Score:3, Interesting)
For distros that have a regular release cycle, something like LSB makes sense. For distros that are moving targets by design (Gentoo, Arch, Debian), any standard that specifies specific versions of libraries and compilers would reduce the value of those distros, so they're better off ignoring those parts of the standard (and thus will never be certified).
Re:But who IS certified? (Score:3, Interesting)
Perhaps it's a matter of opinion, but I'd hardly call Debian stable (plus security updates, of course) a "moving target". Isn't the real reason that LSB requires RPM? (Not wanting to start a flame war, the greatest benefit I found when I switched from R.H. to Debian was no longer having to use RPM. But that's just my personal preference, I guess.) In fact a search leads us to Red Hat package manager for LSB package building [debian.org] which says
Splintering (Score:4, Insightful)
Sure, this leads to some incompatibilities and duplication of work, but there are several ways for developers to mitigate this. Open standards are essential, as they allow code to be ported between distros rapidly. Another good idea is for devs to be involved (in some way) with using multiple distros. Different projects could work together more closely to achieve better interoperability.
It's an essential aspect of forking to accept that many forks are dead ends and should be allowed to die, or merge back into the tree where desirable. There are many good projects out there, and it isn't really in everyone's interest to reinvent the wheel continuously.
Karma Burning (Score:4, Funny)
NT didn't displace UNIX (Score:5, Insightful)
Similarly, Linux isn't displacing NT, it's displacing commercial UNIX.
The overlap of functionality between NT and Linux is, really, quite small. There aren't many cases for which Linux is a good solution, where NT could also be (and vice versa).
Re:NT didn't displace UNIX (Score:2)
Similarly, NT drove off commercial Unixes - you never hear about AIX or HPUX anymore.
However, the factors that made the NT market (i.e. cheap whilst still being good enough for the purpose) should be the factors that make Linux kill NT in just the same way. The trouble is that Linux doesn't provide all that
It was the licensing that killed NetWare. (Score:4, Insightful)
NT did not broadcast its serial number. You could buy a single copy of NT and install it a thousand times. If you needed a new file server or a temporary file server, it was so much easier to set up another NT box.
Yes you do. But they're still in the organizations that had them before. What has changed is that Windows servers swept through the smaller companies. Those companies never had a *nix box. They might have had LANtastic or NetWare or nothing, but they did not have *nix.
Okay, I can agree with you on that. I guess that depends upon what business segment you're talking about. Linux has been showing double-digit growth for the past 5 years (maybe longer). Businesses are deploying it. At the server level.
Now you're talking about the desktop segment. The corporate desktop segment is different than the corporate server segment. And the biggest problem with the corporate desktop segment is all the Access databases that have been built over the years. The 2nd problem is all the not-supported-or-sold-anymore Windows apps that users "absolutely must have to do my job" that they've acquired over the years. Changing 10 servers is easier than changing 10 workstations for users who've spent 10 years with the company.
You might want to take a look at Google before you talk about "hobbyist market".
I'll have to disagree with you on that. While that would be nice, it is far more likely that one distribution will become dominant and that distribution's structure will become the de facto "standard". And it seems we're already on that path with Red Hat and Ubuntu.
Re:NT didn't displace UNIX (Score:3, Informative)
I find using apt-get with a decent GUI front-end to
Re:NT didn't displace UNIX (Score:2, Interesting)
Does not matter to the manager that wants a particular OS deployed for a particular solution. A few years ago I migrated a Netware printing system that handled tens of thousands of documents per day to an NT solution. It ended up requiring 16 NT servers to replace 3 Netware servers. Of course NT was not the correct solution but
Re:NT didn't displace UNIX (Score:2)
Re:NT didn't displace UNIX (Score:3, Funny)
LSB not opensource (Score:3, Insightful)
Re:LSB not opensource (Score:2)
How can packaging be such an issue for the commercial vendors, when huge projects like KDE, GNOME, PostgreSQL, MySQL, etc. manage to have packages for all major distros? I fail to see how hard it is to maintain build scripts for Red Hat, SuSE and Debian boxes to automatically generate RPMs, DEBs and tarballs.
I think the scenario is pretty much defined: we have RPMs for Red Hat/Novell-based distros, DEBs for Debian and its offspring, and TGZs for everybody else.
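The TGZ route in particular is trivial to script; a minimal sketch (the project name, version and layout below are invented for illustration):

```shell
#!/bin/sh
# Build a versioned .tar.gz from a staged install tree, the way a
# distro-neutral "everybody else" package would be produced.
name=myapp
version=1.0
stage=$(mktemp -d)
mkdir -p "$stage/$name-$version/usr/bin"
echo '#!/bin/sh' > "$stage/$name-$version/usr/bin/$name"

tar -C "$stage" -czf "$stage/$name-$version.tar.gz" "$name-$version"
tar -tzf "$stage/$name-$version.tar.gz"   # list what shipped
rm -rf "$stage"
```

Hook the same staging tree up to rpmbuild and dpkg-deb and one `make dist` target covers all three formats.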
LSB is a misleading, limiting and silly name (Score:5, Insightful)
There are no obstacles to Darwin, *BSD and Solaris systems meeting LSB compliance, because it has nothing to do with kernels and everything to do with the specific details of a UNIX userland environment.
Generally I don't get into 'Linux' vs 'GNU' discussions, but the LSB is one case where I feel the name 'Linux' is used completely inappropriately.
Re:LSB is a misleading, limiting and silly name (Score:2, Informative)
"2. How does FSG work with the Linux development team and the Linux process?
Actually, the LSB doesn't specify the kernel--it only specifies the user level runtime, such as the core system libraries and compiler toolchain. Ironically, then, the _Linux_ Standard Base isn't Linux specific at all--it would be entirely possible (and probably not altogether hard) for Solaris to be made LSB compliant. The LSB is entirely concerned wi
FUD alert (Score:4, Informative)
What's this about "various types of licenses" under which Linux is supposed to be available? Linux is GPL, so forking is possible, but there is no risk of UNIX-style fragmentation because the source is open and copyleft. For somebody to create a "closed Linux" they would have to start from scratch. You can't add closed bits to GPL software and keep them hidden, so any incompatible Linuxes ("fragments") could always be re-connected by users irritated about the differences.
The nonsense about UNIX displaced by NT and NT in turn displaced by Linux already set off my alarm, but the above really is FUD designed to further somebody's personal agenda.
It is not possible for UNIX-style fragmentation to happen to Linux, because of the GPL.
Re:FUD alert (Score:2)
That the old Unix was closed is *why* this couldn't happen with them, and why the fragmentation only got worse - because AT&T had their own secret hacks, IBM had theirs, Sun had theirs, etc. And it was all closed and secret, so none of them could implement compatibility with the others.
The 'hobbyists' are the very people developing the software. In a way, it's similar to evolution - what
Unix never died (Score:4, Interesting)
Unfortunately, one of the very things that makes Linux powerful also makes it vulnerable to the same type of fragmentation that helped to doom Unix - the open source licenses under which Linux distributions are created and made available.
I believe fragmentation has very little to do with the issue concerning the doom of UNIX. My three top reasons are:
1) Price of purchase
2) Expensive/hard to administer
3) Stagnation in development
Users want the cheapest, easiest and most feature-filled solution. It's pretty straightforward actually, and a Personal Computer with Windows was the first to fill the niche, if you leave out Apple.
Apple lost because they wanted a monopoly on _both_ hardware and software, while Microsoft only wanted to control the OS (in the beginning). More importantly, Microsoft was better at hyping/marketing their next generation, something that Apple has learned to do better in recent years.
UNIX and IBM lost because they failed to scale down to personal computers, which is where the commoditization of computing happened in the 90's. IBM and other mainframe dealers refused to understand the Personal Computer (too much vested in big contracts), so the clones took over along with Microsoft Windows while the dinosaurs waited it out.
Without the IBM PC clone, the computing world would probably look very different today. In those days it was very attractive to be able to upgrade the PC, exchange parts and use commodity hardware for the whole rig. Many tasks which rented expensive CPU time on UNIX mainframes were moved over to PCs during the 90's.
Fragmentation, no doubt, can be very bad for development, but it is also a boon, since it leaves developers free to explore different avenues regardless of politics and limitations. I think once a system becomes popular enough, like "Linux", the demand for standardization will pull it together. Hey, even the BSDs keep compatibility with "Linux".
What killed UNIX was lack of creativity, focus and commoditization, too much control, and maybe most importantly: arbitrarily high prices just to milk customers.
Linux may have killed off UNIX (oh, what irony), but NT has been beating the crap out of it for many years. Linux and UNIX never actually competed on even terms, because UNIX had already been pretty much abandoned for a long time - its owners only keeping it to milk the last drops.
My pet peeve with bash and the GNU utilities is the lack of standards, and the lack of further development of the command line. In that regard, I hope "Linux" can progress without having to be beaten by Microsoft releasing a better command line.
POSIX is really an antique joke compared to what could be possible via the command-line. So the trap "Linux" might fall into, is the same as for UNIX: stagnation, because most users drool at eye-candy and not the actual implementation in the back-end. However, maybe the cost of switching command-line is not worth the gain, time will tell.
The LSB is not enough (Score:2)
A
Re:The LSB is not enough (Score:2)
look at http://freestandards.org/docs/lsbbook/install-app.html [freestandards.org] and you have something that is no effort for the developer or installer, but will make a hugely beneficial difference to the sysadmin.
A GUI standard would be good too - but I doubt its time is here for Linux yet. Reminds me of the Windows Usability Guidelines that used to exist, it made Windows a much more consistent interface and was a Bible for
Fragmentation is desirable. (Score:3, Informative)
It's hard for me to see, in this chaotic (but necessary) environment, how much external control developers are willing to have "imposed" on them by such standards - unless of course it makes sense from a development/technical standpoint.
My understanding is that Linux really isn't in the game of competing with anybody (Unix, Windows or otherwise) anyway. It's just about the code, a love of all things computing, and a new way of doing things.
E.
Symlink Union? (Score:2)
Linux vs WinNT (Score:2)
Does anyone have figures for this - that Linux has supplanted NT?
Not that I don't believe it - since it's been many years since NT was sold, it may be possible that Linux has begun to supplant NT. Hopefully by 2010 or so (when
Vista will be released
Ah yes but (Score:4, Informative)
LSB isn't preventing fragmentation (Score:3, Insightful)
Re:Unix is dead (Score:2)
NT? Plan9? BeOS? VMS?
You don't seem to like Linux.
Re:Unix is dead (Score:4, Funny)
Re:Unix is dead (Score:4, Funny)
Re:Unix is dead (Score:2)
Re:Unix is dead (Score:2)
% stty
speed 38400 baud;
lflags: echoe echok echoke echoctl
oflags: -oxtabs onocr
cflags: cs8 -parenb
erase intr
^H ^?
I want to SSH in to the server and set up my new wave NFS
Do you think that is the pinnacle of OS design ?
s/pinnacle/nadir/ and you might be on to something
Re:Unix is dead (Score:2)
NFS - puke
SSH - band aid
etc. etc.
it *could* be so much better
But you have to get people to even see it is a problem, like being alcoholic.
Unixaholism is a disease, DrSkwid is the cure !!
Re:Unix is dead (Score:2)
Re: (Score:3, Insightful)
Re:OpenBSD (Score:2)
nothing to do with protecting the BSD machines and everything to do with protecting the win2k box.
I do something similar with an XP box at work. If it's on the network, then I'm required to let the network security goons run their software on it, which lets them monitor and make changes without my permission. By not having it on the network, it remains a stable development platform that I have full control over.
Re:OpenBSD (Score:2)
The kernel, however, leaves a little to be desired. The
Re:most confusing thing about linux (Score:3, Informative)
Any Free Software license allows you to use or distribute the software in any way you choose.
Re:most confusing thing about linux (Score:2)
Any Free Software license allows you to use or distribute the software in any way you choose.
No, no it really doesn't.
Free software licenses vary, but lots require you to distribute the source code with any binary distribution (or a written offer to provide source, see the GPL).
This is important.
fool (Score:2)
the OP's point is that you don't have to worry at all about the legality of burning/uploading free software to which you have access.
Re:most confusing thing about linux (Score:3, Informative)
No. Unlike commercial EULAs (see the one that comes with Windows, for example), Linux licenses are written in clear language - what they say is what they intend.
Here is an explanation, just in case:
There are four major licenses: GPL, LGPL, BSD and MIT-X.
From the user's point of view, all are equally good, as they allow one to use the program and make personal modifications.
From distributo
Re:most confusing thing about linux (Score:3, Informative)
Just like commercial EULAs, the GPL and other Free Software/Open Source licenses use technical language which can be unclear to people unfamiliar with either the area of law or with technical language in computing, and many are perhaps more unclear than many commercial EULAs because, in their attempt to be unintimidating in size and complexity, they avoid t
Re:POSIX (Score:2)
a great thing about a cliché... (Score:3, Insightful)