Comment Re:They should be unifying KDE and GNOME (Score 1) 227

Looking at the LSB now, it specifies that "libpng12" should be available on a system out-of-the-box. In fact, I pointed out earlier that Google Chrome uses libpng12.

So that's at least one distribution, openSUSE, which is not LSB compliant, as it provides libpng14, not libpng12 (and the difference means Chrome silently fails to start after you install it).

So considering the second largest distribution is not LSB compliant out-of-the-box, the argument that the LSB provides everything you need is outright misleading. If all major and minor Linux distributions conformed 100% to the LSB, you might have a point, but the fact is that they don't.

And don't say "just install the LSB manually", because the average user (not technical people) has no idea what the LSB even is or what it provides.
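To make the point concrete, here is a minimal sketch of the kind of up-front check an installer could do instead of failing silently at launch. It uses Python's standard `ctypes.util.find_library`, which asks the system's dynamic linker machinery whether a library is locatable; the library names are the ones from this thread and the check is illustrative, not part of any real installer.

```python
import ctypes.util

def has_library(name):
    """Return True if the dynamic linker can locate lib<name> on this system."""
    return ctypes.util.find_library(name) is not None

# An installer could verify the exact LSB-specified version up front
# rather than letting the application fail silently at startup.
for lib in ("png12", "png14"):
    print(lib, "found" if has_library(lib) else "missing")
```

Which of the two is found (if either) will of course vary by distribution, which is exactly the problem being described.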

Comment Re:They should be unifying KDE and GNOME (Score 1) 227

Quite a number of the Debian-based systems do not include any form of RPM support (unless the user installs RPM from their repository, but in my experience it is very rarely installed by default).

Now maybe that's changed and the main Debian systems do; but that still only gets you a minimum. You can't specify package dependencies, since packages are named differently between distributions (hell, they're named differently between RPM-based systems, let alone between RPM- and Debian-based systems).

So where does that get you? Either your installer needs to know how to invoke each distribution's package manager, along with the package names that go with it, or you bundle the libraries with your program itself. The latter is highly discouraged and frowned upon due to the conflicts and library duplication that arise from it (and because relying on LD_LIBRARY_PATH is considered a security issue). Even if the LSB defines that certain libraries should always exist on a Linux system, you can't guarantee that the provided libraries are the correct version, haven't been patched by a distribution vendor (changing their external interfaces), or haven't been subjected to any of the other weird and wonderful things you can do to make a library not work quite as expected.

In the end, if you require any complex libraries (think UI), do you include their dependencies, and then those dependencies' dependencies? And while you're at it, why not just ship an entire Linux distribution to make sure everything works right?
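The "dependencies' dependencies" problem is just a transitive closure over a dependency graph. A toy sketch, with an entirely hypothetical package graph, shows how quickly bundling snowballs:

```python
# Toy dependency graph: every package name here is hypothetical.
# Installing "myapp" means bundling its dependencies, their
# dependencies, and so on.
DEPS = {
    "myapp": ["gtk", "libpng"],
    "gtk": ["glib", "pango"],
    "pango": ["glib"],
    "glib": [],
    "libpng": ["zlib"],
    "zlib": [],
}

def closure(pkg, seen=None):
    """Transitive dependency closure: the full set you'd have to ship."""
    seen = set() if seen is None else seen
    for dep in DEPS[pkg]:
        if dep not in seen:
            seen.add(dep)
            closure(dep, seen)
    return seen

print(sorted(closure("myapp")))  # -> ['glib', 'gtk', 'libpng', 'pango', 'zlib']
```

Two direct dependencies already drag in five packages; a real GUI application's graph runs to dozens, which is the "why not ship an entire distribution" point above.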

Comment Re:They should be unifying KDE and GNOME (Score 1) 227

To clarify this response:

You might be able to target the LSB for some essential components of Linux, like kernel interfaces and perhaps filesystem layout (and even that isn't 100% reliable between distributions)... but the moment you start calling upon dynamic libraries for image manipulation, event systems and various other functionality, you run into trouble, because there's no mechanism in Linux to handle a missing library beyond the dynamic loader printing an error like "error while loading shared libraries: libXYZ.so: cannot open shared object file" to stderr.
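A launcher can at least detect this situation before exec'ing the real binary, by parsing the output of ldd(1) for unresolved libraries. This is a sketch of that idea, not a robust tool (ldd output parsing is fragile, and running ldd on untrusted binaries is itself discouraged):

```python
import shutil
import subprocess

def missing_libs(binary):
    """List shared libraries the dynamic loader cannot resolve for `binary`,
    as reported by ldd(1). Returns [] if ldd is unavailable."""
    if shutil.which("ldd") is None:
        return []
    out = subprocess.run(["ldd", binary], capture_output=True, text=True)
    # Unresolved entries look like: "\tlibXYZ.so.1 => not found"
    return [line.split()[0]
            for line in out.stdout.splitlines()
            if "not found" in line]
```

A graphical launcher could run a check like this and pop up a dialog listing what's missing, instead of leaving the user staring at an icon that does nothing.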

There are two ways to solve this problem, essentially: design a package format which works across all distributions (this can be done using a minimal bootstrap executable with the package data appended to the end), or modify the dynamic loader so that it calls upon package management to resolve library dependencies at runtime and prompts the user to install the required packages (through the X server if it is running!).

I originally thought the former was the better option, but the latter solves quite a few issues: third-party developers can link against a dynamic library without worrying about whether it's currently installed on the system, and you can easily distribute binaries without any package wrapping at all. The only issue you might run into with this solution is that non-native applications (such as those written in scripting languages) have no way of having their interpreter resolved, since the system will treat the application as a text file until the interpreter is installed.
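The "bootstrap executable with package data appended" idea mentioned above can be sketched in a few lines: concatenate a stub, a delimiter, and a tar archive into one file, and have the stub unpack its own tail at runtime. The marker byte string is a hypothetical delimiter invented for this sketch (real self-extracting formats record an offset instead, so payload contents can't collide with the marker):

```python
import io
import tarfile

MARKER = b"\n__PAYLOAD__\n"  # hypothetical delimiter between stub and archive

def make_installer(stub_code, files, out_path):
    """Build one file: stub code, marker, then a tar of `files`
    (a {name: bytes} mapping)."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, content in files.items():
            info = tarfile.TarInfo(name)
            info.size = len(content)
            tar.addfile(info, io.BytesIO(content))
    with open(out_path, "wb") as f:
        f.write(stub_code + MARKER + buf.getvalue())

def extract_payload(installer_path, dest):
    """What the stub would do at runtime: unpack the archive that
    follows the marker inside its own file."""
    data = open(installer_path, "rb").read()
    payload = data[data.index(MARKER) + len(MARKER):]
    with tarfile.open(fileobj=io.BytesIO(payload)) as tar:
        tar.extractall(dest)
```

In a real installer the stub would be a small native executable rather than Python, precisely so it runs without any interpreter being present, which ties into the non-native-application caveat above.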

Comment Re:They should be unifying KDE and GNOME (Score 1) 227

What causes hesitation among commercial app developers is the absolutely atrocious state of application distribution and dependency resolution. Every distribution has their own package format (or at the very least, different package names and content) and a different set of dependencies are required for every single one.

But why should these projects go out of their way to make life easier for proprietary software vendors? Why?

I didn't say they should. I just explained the reason, from personal experience no less, why you don't see a lot of commercial desktop software on Linux.

It's just not practical to target Linux as a commercial developer

Despite the fact that many vendors DO target Linux? Can you name anyone who has shied away due to the issues you describe?

I think you're going to have to name a few well-known desktop products that work on Linux, rather than the other way around (since there is no way for me to determine whether other individuals or companies have been put off for this reason).

I can tell you that I personally have, though; it's not practical for a small-time developer to manage and maintain all of those packages in addition to providing support for Windows and Mac, especially when the Linux audience is much smaller.

Comment Re:They should be unifying KDE and GNOME (Score 1) 227

Both of the programs you listed are oriented towards developers and gamers, who have a better understanding of how systems work and how to resolve dependencies than the average user. Hand-holding has nothing to do with it; the target audience of those programs already knows what it's doing.

The audience that the original post was talking about is your average user, maybe not even your average Linux user*. Software designed for the average user can't rely on them running the program from a terminal to see the "cannot open shared object file" error and then going hunting for dependencies. They're going to double-click the icon on the desktop and wonder like hell why the program is taking so long to start up. For an example of how confusing this is even for experienced users, you need look no further than installing Google Chrome, only to find that you don't have libpng12 installed (because your distro provides libpng14 by default).

TL;DR: Installing third-party software is a terrible experience, and average users aren't going to take the time to find out how to resolve dependencies. They just want it to work.

* Because bringing commercial software to Linux also increases the number of potential Linux users; as an example, you need only look at the people who are tied to Windows because no commercial or open source equivalent of the software they use is available.

Comment Re:They should be unifying KDE and GNOME (Score 1) 227

What causes hesitation among commercial app developers is the absolutely atrocious state of application distribution and dependency resolution. Every distribution has their own package format (or at the very least, different package names and content) and a different set of dependencies are required for every single one.

It's just not practical to target Linux as a commercial developer when you have to generate and maintain several different types of packages, their dependency lists AND the software repositories to go with them.

On Windows, I can target the largest audience with a single executable file. On Linux, I can target an insignificant desktop audience by maintaining a package for every variant of the system. So who in their right mind would think it to be cost effective to target Linux for desktop software on a commercial basis?

Comment Re:Something neglected to mention (Score 1) 385

"CDE would be much more useful and safe (not a huge security risk like it would be in its current state) if it prepared distro-specific scripts to use the vendor's package management system to correctly install dependencies, which would be managed and updated as needed from there on out."

It's something I'm working towards with AppTools (http://code.google.com/p/apptools-dist). The current solution I have planned is a centralized site providing HTTP APIs to resolve package names and dependencies (because what is called php5 on openSUSE is not necessarily called that on Ubuntu).
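The shape of that name-resolution lookup can be sketched as a simple mapping from a generic dependency name to per-distribution package names. Every package name below is illustrative only, not verified against any real repository, and the real AppTools API may look nothing like this:

```python
# Hypothetical response data for a name-resolution API: one generic
# dependency maps to each distribution's own package name.
NAME_MAP = {
    "php5": {
        "opensuse": "php5",     # illustrative names only
        "ubuntu": "php5-cli",
        "fedora": "php-cli",
    },
}

def resolve(generic_name, distro):
    """Translate a generic dependency name into a distro's package name,
    or None if no mapping is known."""
    return NAME_MAP.get(generic_name, {}).get(distro)

print(resolve("php5", "fedora"))
```

A centralized service means the mapping table is maintained once rather than by every third-party developer separately, which is the whole point of the approach.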

Unfortunately, the filesystem component of packages is really holding the project back. There's no resizable, compressible, resettable filesystem that can be directly embedded into applications and used through FUSE, so I've had to go down the path of writing my own, which, as I said, is taking a fair amount of time.

Comment Good for scientists; not for end-users. (Score 2, Insightful) 385

While I can see this being useful for scientists, it's probably not going to be as useful in an end-user environment (not that it couldn't be used in one). For starters, this is really only good for command-line packages, as GUI applications, the ones that end-users are most likely to use, would end up including most of the X and UI libraries during CDE's detection phase, even if compatible versions of those files already exist on the target machine.

The other issue with having different projects for different distribution purposes (Klik for applications, CDE for scripts, etc.) is that it induces fragmentation: the very thing these projects are trying to get rid of by no longer relying on the package management system.

Disclaimer: I work on AppTools (http://code.google.com/p/apptools-dist) and therefore I'm biased :)

Security

Submission + - Self-Destructing USB Stick Sold as Un-Hackable 3

Hugh Pickens writes: "PC World reports that Victorinox, maker of the legendary Swiss Army Knife, has launched a new super-secure memory stick that sounds like something out of Mission: Impossible. The Secure Pro USB comes in 8GB, 16GB, and 32GB sizes, and provides a variety of security measures including fingerprint identification, a thermal sensor, and even a self-destruct mechanism. Victorinox says the Secure is "the most secure [device] of its kind available to the public." The Secure features a fingerprint scanner and a thermal sensor "so that the finger alone, detached from the body, will still not give access to the memory stick's contents." The product uses integrated single-chip technology, so that there are no external and accessible lines between the different coding/security steps, as on multi-chip solutions, making cracking the hardware impossible. Then there's the self-destruct mechanism. While offering no explanation of how it works, Victorinox will only say that if someone tries to forcibly open the memory stick, it triggers a self-destruct mechanism that "irrevocably burns [the Secure's] CPU and memory chip." At a contest held in London, Victorinox put its money where its mouth is, putting the Secure Pro to the test by offering a £100,000 cash prize ($149,000) to a team of professional hackers if they could break into the USB drive within two hours. They failed."

Submission + - Pirate Party Pillages Private Papers (pirateparty.org.au)

David Crafti writes: "Pirate Party Australia has made the move to host the recently leaked ACTA document in order to highlight the lack of government transparency in the negotiation process. We believe that the document is not under copyright, and we are not party to any NDAs, so there should be no restriction on us posting it. We would like to see what the government (any government) tries to do about it. If it turns out that there is some reason that we have to take it down, then we will, but if this happens, it will only validate the document's authenticity."
Censorship

Australian AvP Ban Reversed 71

Earlier this month, we discussed news that Sega's new Aliens vs. Predator video game had been refused classification in Australia, effectively banning it. After a scathing response from the developer saying they wouldn't censor the game, and later news that the classification scheme may be updated to include an R18+ rating, it now seems that the Classification Board has seen fit to give the game a green light after all. Sega's Darren Macbeth told Kotaku, "We are particularly proud that the game will be released in its original entirety, with no content altered or removed whatsoever. This is a big win for Australian gamers. We applaud the Classification Review Board on making a decision that clearly considers the context of the game, and is in line with the modern expectations of reasonable Australians."
