Comment Re:They should be unifying KDE and GNOME (Score 1) 227
Did you not read what I said?
The LSB 4.1 specifies libpng12. OpenSUSE does not ship it by default (it ships libpng14). Ergo, OpenSUSE is not 100% LSB compliant.
Looking at the LSB now, it specifies that "libpng12" should be available on a system out-of-the-box. In fact, I pointed out earlier that Google Chrome uses libpng12.
So that's at least OpenSUSE which is not LSB compliant, as it provides libpng14, not libpng12 (and the result is that Chrome silently fails to run after you install it).
So considering the second largest distribution is not LSB compliant out-of-the-box, the argument that the LSB provides everything you need is outright misleading. If all major and minor Linux distributions conformed 100% to the LSB, you might have a point, but the fact is that they don't.
And don't say "just install the LSB manually", because the average user (not technical people) has no idea what the LSB even is or what it provides.
Quite a number of the Debian-based systems do not include any form of RPM support (unless the user installs RPM from their repository, but in my experience it is very rarely installed by default).
Now maybe that's changed and the main Debian systems do; but that still only provides you a minimum. You can't specify package dependencies, since packages are named differently between distributions (hell, they're named differently between RPM-based systems, let alone between RPM- and Debian-based systems).
So where does that get you? You either need your package installer to know how to invoke each distribution's package management (along with the package names that go with it), or you bundle the libraries with your program itself. The latter is highly discouraged and frowned upon due to the conflicts and library duplication that arise from it (as well as the fact that relying on LD_LIBRARY_PATH is considered a security issue). Even if the LSB defines that certain libraries should always exist on a Linux system, you can't guarantee that the provided libraries are of the correct version, haven't been patched by an upstream vendor (changing their external interfaces), or any of the other weird and wonderful things you can do to make a library not work quite as expected.
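To illustrate the version problem: here's a minimal Python sketch (the sonames are illustrative) that probes for an exact library version at runtime, rather than trusting that "a" libpng is present on the system.

```python
# Sketch: probe the dynamic loader for an exact soname instead of
# assuming the LSB-mandated version is installed. ctypes.CDLL raises
# OSError when the loader can't find the requested library.
import ctypes

def have_library(soname):
    """Return True if the dynamic loader can find and load this exact soname."""
    try:
        ctypes.CDLL(soname)
        return True
    except OSError:
        return False

# The LSB mandates libpng12; a distro may only ship libpng14, so only a
# probe for the exact soname tells you whether your binary will actually run.
for soname in ("libpng12.so.0", "libpng14.so.14"):
    print(soname, "found" if have_library(soname) else "missing")
```

An installer could run a check like this up front and tell the user what's missing, instead of letting the application fail silently the way Chrome does.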
In the end, if you require any complex libraries (think UI), do you include their dependencies, and then those dependencies' dependencies? And while you're at it, why not just ship an entire Linux distribution to make sure everything works right?
To clarify this response:
You might be able to target the LSB for some essential components of Linux, like kernel interfaces and perhaps filesystem layout (and even that isn't 100% reliable between distributions)... but the moment you start calling upon dynamic libraries for image manipulation, event systems and various other functionality, you run into trouble, because there's no mechanism in Linux to handle a missing library file beyond printing "ld: library XYZ not found" to stderr.
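As a rough sketch of what a friendlier check could look like, this runs the stock `ldd` tool (present on glibc systems) against a binary and lists any dependencies the loader can't find, before the user ever sees the cryptic loader error. The path `/bin/ls` is just an example target.

```python
# Sketch: surface unresolved shared-library dependencies up front,
# rather than letting the loader fail with a one-line error at launch.
import subprocess

def missing_deps(binary):
    """List shared libraries the loader reports as 'not found' for a binary."""
    out = subprocess.run(["ldd", binary], capture_output=True, text=True)
    return [line.split()[0] for line in out.stdout.splitlines()
            if "not found" in line]

# On a healthy system this should report nothing for a core utility:
print(missing_deps("/bin/ls"))
```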
There are essentially two ways to solve this problem: design a package format which works across all distributions (which can be done using a minimal bootstrap executable with the package data attached to the end), or modify the dynamic loader so that it calls upon package management to resolve library dependencies at runtime and prompts the user to install the designated packages (through the X server if it is running!).
I originally thought the former was the better option, but the latter solves quite a few issues: third party developers can link against a dynamic library without having to worry about whether or not it's currently installed on the system, and you can easily distribute binaries without requiring any package wrapping at all. The only issue you might run into with this solution is that non-native applications (such as those written in scripting languages) have no way of having their interpreter resolved, since the system will treat the application as a text file until after installation.
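The "minimal bootstrap executable with the package data attached on the end" idea can be sketched like this in Python. The marker string is made up for illustration, and a real installer would obviously need far more care (signatures, error handling, sanitised paths):

```python
# Sketch of a self-extracting installer stub: the payload archive is
# simply concatenated after a marker at the end of the stub's own file.
import io
import tarfile

MARKER = b"\n__PAYLOAD_BELOW__\n"   # hypothetical delimiter

def extract_payload(path, dest="."):
    """Find the archive appended after MARKER in `path` and unpack it."""
    data = open(path, "rb").read()
    offset = data.index(MARKER) + len(MARKER)
    with tarfile.open(fileobj=io.BytesIO(data[offset:]), mode="r:gz") as tar:
        tar.extractall(dest)

# Building such an installer is then just concatenation, e.g.:
#   cat bootstrap marker payload.tar.gz > installer.run
```

The appeal of this scheme is that the resulting file runs on any distribution without caring which package manager is installed; the drawback is that it sidesteps dependency resolution entirely, which is the half of the problem the loader-based approach tries to fix.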
What causes hesitation among commercial app developers is the absolutely atrocious state of application distribution and dependency resolution. Every distribution has their own package format (or at the very least, different package names and content) and a different set of dependencies are required for every single one.
But why should these projects go out of their way to make life easier for proprietary software vendors? Why?
I didn't say they should. I just explained the reason, from personal experience no less, why you don't see a lot of commercial desktop software on Linux.
It's just not practical to target Linux as a commercial developer
Despite the fact that many vendors DO target Linux? Can you name anyone who has shied away due to the issues you describe?
I think you're going to have to name a few, well-known desktop products that work on Linux, rather than the other way around (since there is no way I can determine whether other individuals or companies have been influenced for this reason).
I can tell you that I personally have, though; it's not practical for a small-time developer to manage and maintain all of those packages in addition to providing support for Windows and Mac, especially when the Linux audience is much smaller.
Oh really? What package format are you using? I'd love to know as it would have to handle the resolution of dependencies regardless of what the packages were called or whether the package manager for said format was actually installed on the local system.
Both of those programs you listed are oriented towards developers and gamers, who have a better understanding of how systems work and how to resolve dependencies than the average user. Hand-holding has nothing to do with it; it's just that the target audience of those programs already knows what it's doing.
The audience that the original post was talking about is your average user, maybe not even your average Linux user*. Software designed for the average user can't rely on them running the program from the command line to see the "ld: library XYZ not found" message and then going hunting for dependencies. They're going to double-click the icon on the desktop and wonder like hell why the program is taking so long to start up. For an example of how confusing this is even for experienced users, you need look no further than installing Google Chrome, only to find that you don't have libpng12 installed (your distro provides libpng14 by default instead).
TL;DR It's a terrible installation experience when installing third-party software and average users aren't going to take the time to find out how to resolve dependencies. They just want it to work.
* Because by bringing commercial software to Linux you're increasing the number of potential Linux users as well; as an example, you only need to look at the people who are tied to Windows because a commercial or open source equivalent is not available for the software they use.
What causes hesitation among commercial app developers is the absolutely atrocious state of application distribution and dependency resolution. Every distribution has their own package format (or at the very least, different package names and content) and a different set of dependencies are required for every single one.
It's just not practical to target Linux as a commercial developer when you have to generate and maintain several different types of packages, their dependency lists AND the software repositories to go with them.
On Windows, I can target the largest audience with a single executable file. On Linux, I can target an insignificant desktop audience by maintaining a package for every variant of the system. So who in their right mind would think it to be cost effective to target Linux for desktop software on a commercial basis?
Valve has previously said that they only have about 9 or 10 people working on Steam at any given time (no, I don't have a citation for that either; it was quite a while ago, from memory).
So basically, the idea that Steam has been holding Valve up in producing games is total crap.
"CDE would be much more useful and safe (not a huge security risk like it would be in its current state) if it prepared distro-specific scripts to use the vendor's package management system to correctly install dependencies, which would be managed and updated as needed from there on out."
It's something that I'm working towards with AppTools (http://code.google.com/p/apptools-dist). The current solution I've planned is a centralized site which provides HTTP APIs to resolve package names and dependencies (because the php5 package on OpenSUSE is not necessarily called that on Ubuntu).
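As a sketch of the name-resolution idea: the table below is an offline stand-in for what such an HTTP API would serve, and the mappings are illustrative, not authoritative (AppTools' actual API may look nothing like this).

```python
# Sketch: map a generic dependency name to a distro-specific package name.
# In the real design this lookup would be an HTTP call to a central service;
# here it's a hard-coded table for illustration only.
PACKAGE_NAMES = {
    ("php5", "opensuse"): "php5",
    ("php5", "ubuntu"): "php5-cli",        # illustrative mapping
    ("libpng12", "ubuntu"): "libpng12-0",  # illustrative mapping
}

def resolve(generic_name, distro):
    """Return this distro's package name, falling back to the generic name."""
    return PACKAGE_NAMES.get((generic_name, distro), generic_name)

print(resolve("libpng12", "ubuntu"))
```

An installer could then feed the resolved names straight into whatever package manager the local system actually uses.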
Unfortunately, the filesystem component of packages is really holding the project back. There's no resizable, compressible, resettable filesystem that can be directly embedded into applications and used through FUSE, so I've had to go down the path of writing my own, which, as I said, is taking a fair amount of time.
While I can see this being useful for scientists, it's probably not going to be so useful in an end-user environment (not that it couldn't be used in one). For starters, this is really only good for command-line packages: GUI applications, the ones end-users are most likely to use, would end up including most of the X and UI libraries during the CDE detection, even when compatible libraries already exist on the target machine.
The other issue with having different projects for different distribution purposes (Klik for applications, CDE for scripts, etc.) is that it induces fragmentation: the very thing these projects are trying to eliminate by no longer relying on the package management system.
Disclaimer: I work on AppTools (http://code.google.com/p/apptools-dist) and therefore I'm biased
"I've seen it. It's rubbish." -- Marvin the Paranoid Android