
Comment Re:a step in the wrong direction (Score 1) 135

It is OS X that puts user experience over developer (OS X developer) convenience. Automatic dependency resolving is some work, which you sidestep by bundling. Bundling costs the user, in terms of security and in terms of performance (esp. less memory). That is why bundles have never been popular in the Linux world. Of course, both of these are sneaky problems, which only eat at you in tiny nibbles.

They've never been popular in the Linux world because Linux users have typically been willing to deal with the extra administrative overhead of package management and the lack of proprietary software. But Linux users are not typical users and they never will be.

If you have a huge base, you will be dragging along crap for a long time, because any library you put in the base has to be supported forever unless you want to break contract on that base. And any library accumulates cruft, stuff that wasn't designed as well as it should be, or rested on assumptions that are no longer true.

You don't need a "huge" base. You only need a sufficiently robust base, which you have with OS X. You're speaking in purely theoretical terms. Open your eyes. OS X is a nice system to use and work with.

I have dozens of proprietary programs on this computer, mainly games. None seem to have a problem. The package management system doesn't get in the way, like you seem to think. It is very easy: you bundle all the shared libraries you need with your program, stuff them in some directory (./lib is popular), and add LD_LIBRARY_PATH=./lib to the launcher. And no, this is not something the user does, it is something the developer does, just like the developer might create a deb package, or an InstallShield package, or whatever OS X uses. See? It's all in your mind.
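The launcher pattern described above can be sketched in a few lines. Everything here is a made-up demo (the app directory, the `run.sh` launcher name, and the stand-in `app` binary are all placeholders); a real bundle would ship actual .so files in ./lib:

```shell
# Build a throwaway app directory to demonstrate the pattern:
# bundled libraries would live in ./lib, and a small launcher
# points the dynamic loader at them before starting the binary.
mkdir -p /tmp/demoapp/lib

cat > /tmp/demoapp/run.sh <<'EOF'
#!/bin/sh
cd "$(dirname "$0")" || exit 1
# Prepend the bundled libs; keep any LD_LIBRARY_PATH already set.
LD_LIBRARY_PATH="./lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" exec ./app "$@"
EOF

# Stand-in for the real binary, so the launcher has something to run.
printf '#!/bin/sh\necho "libs: $LD_LIBRARY_PATH"\n' > /tmp/demoapp/app
chmod +x /tmp/demoapp/run.sh /tmp/demoapp/app

/tmp/demoapp/run.sh   # shows ./lib at the front of the loader path
```

The user never sees any of this; double-clicking (or running) the launcher is all that's required.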

Dozens, huh? And you're telling me they all just worked? Bullshit. Oh wait, I bet this is where you say something like "excluding bugs" or "I only had to do a few tweaks to make it work." Or some qualifier like that. You've never had to backport a program or fiddle with getting a package you want from one distribution to work on yours? I dealt with all of it on Linux and it SUCKED.

The problem with bundling programs on Linux is that you have to bundle far more shared libraries than you would on OS X, because a developer can't expect you to have anything but the most basic libraries like libc. So it is no wonder you have such a low opinion of bundling. Your experience with it is on Linux, a system that doesn't explicitly support bundling.

We need to know, so that we can tick off the checkbox that says "working in 32-bit". As an old sysadmin, I presume you are familiar with the fact that going from 64-bit to 32-bit can break stuff, just like the other way?

On Linux, yes, but I never had a problem on OS X. That's what I'm trying to tell you.

A harmless bug might suddenly not be so harmless when you change the word-length, you know? Had we used Macs, we would have to do the exact same thing. My point was that unless you actually query the system in some way, it's hard to tell. Seems very seamless to me.

You just went over how you have to use schroot to set up a full 32-bit environment and you're going to sit there and tell me that it is seamless? You're being dishonest. When I am developing for OS X, I don't have to maintain two different test environments. If I want to test 32-bit on OS X, I output a 32-bit executable. If I want to test 64-bit, I output a 64-bit one. In most cases, there's no real need to even make a 64-bit version. I can, if I have reason to (like I actually might need 4+ GB of RAM), but for the most part I can ship a 32-bit OS X application and nobody will really notice the difference.

Do you seriously not get just how seamless 32/64-bit is on OS X? I guess you wouldn't, because it sounds like you've never actually used OS X.

That would be because you do not know much about how shared libraries work. It is not only disk space that you waste, it is also memory. Oh yeah, sure, it is cheap also ;) Excuses upon excuses for a lazy, crude design.

Ok, now you're just being insulting. I know EXACTLY how shared libraries work. Nothing I have said indicates otherwise.

What hoops? You install a program, and the system resolves the dependencies and installs what is needed. There are no hoops to jump.

And it works fine, until you want to do something outside the confines of your package management system. On OS X, I can have both. I can use MacPorts for more traditional, open-source-only, Linux-style applications, and I can use bundling for everything else. Though I would never expect the average user to use something like MacPorts, even if you put a pretty graphical interface on it. Bundling is far superior for end users, and it works much better when it is embraced rather than left to developers to implement in their own ways.

Look, it seems you had a bad experience as a Linux sysadmin. That is too bad, but you need to get over it if you ever hope to be a decent developer.

As a sysadmin I had a great experience. For servers, Linux is great and I still use it. Package management is great for servers. For users... for desktops... not so much. You have to be able to recognize when one size does not fit all. Stop treating desktop systems like they are servers.

Comment Re:a step in the wrong direction (Score 1) 135

Only poor developers depend solely on included shared libraries, and yes, the remaining libraries could be reused. You are just making excuses for a crude system.

I'll take a system that puts user experience ahead of developmental purity, thanks. No excuses necessary.

If you think that base and included libraries are sufficient for any real work, you are not a very good developer. You will be taking much longer, with a lot more bugs, than if you used existing libraries extensively.

I'm sure it depends entirely on what I'm writing. Most software out there can rely on base and included libraries. And when it can't, it can just bundle what it needs at very little real cost to the user. The trick is to provide a sufficiently robust base. Something that Linux has utterly failed to do. So it is understandable that you might not think much of "base" systems. On Linux you can't even rely on the user having certain GUI libraries installed beyond the basic X11 libraries. Aren't there like a half dozen different ways to play sound on Linux? It is a disaster, if you ask me.

I already went over why it is crude. It has at least two problems: security, and resource waste. Problems solved for decades on Linux.

Disk space is cheap and security hasn't been an issue. I don't see the problem.

Hardly. You have several options from there. Personally, being a developer, I maintain a complete 32-bit environment to reproduce those bugs that only show up on 32-bit. This is very easy with the schroot package.

It might be "easy" to do, but the question is, should it even have been an issue at all? I hear Linux apologists say "Oh, that's easy to fix, you just do this, this, and this..." That's all fine and good if your job is to develop software, but it doesn't really look good compared to a system where there was no problem to begin with... at least as far as the end user is concerned.

Oh, don't worry, I totally understand where you're coming from. I was like you for the longest time. I would make all kinds of excuses for Linux when people complained about the lack of commercial software or how complicated it was to do simple things like print (I realize it has gotten better). I'd just rattle off the "easy" commands to do it. And to me that was sufficient, because I was a system administrator and part-time programmer at the time. As long as things were straightforward enough for me, it didn't really matter that I was wasting my time doing it in the first place. In fact, I would go so far as to say that I got off on understanding what was going on and how to make things work. I like to solve problems, after all. But I gradually grew out of it, and it started to sink in that none of this was good for the USER. Sure, we could just train users to jump through all these "easy" hoops, or we can just make the shit work. Yeah, maybe we sometimes have to stray from what seems like the best technical solution (i.e. waste some disk space by storing multiarch in everything, or duplicate some shared libraries). But what's important is the user.
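For reference, the schroot setup being argued about earlier comes down to a short config stanza like the following sketch; the chroot name, directory, and user are placeholders, and you'd still have to populate the chroot itself (e.g. with debootstrap):

```ini
# /etc/schroot/schroot.conf — sketch; names and paths are examples
[sid32]
description=32-bit test environment
type=directory
directory=/srv/chroot/sid32
personality=linux32
users=dev
```

Whether that counts as "easy" or as an extra hoop is exactly the disagreement here.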

Or if you prefer, you could install the missing libraries, and use the (crude) OS X solution, like you normally do with proprietary software (games, e.g.). That is outside the scope of package managers, though.

That's just it, though. Modern Linux distributions have invested so much in package management that they've basically created a system where anyone who deviates from the package management does so at their own risk. That makes it very difficult to distribute proprietary software.

You know, in our company we keep a wiki page of which systems run 64-bit, and which 32-bit. Because no one can remember otherwise, and sometimes you need to know.

Do you? I think that's my point. On PCs, you do need to know that information a lot more often than you do with Macs. I've never had to know that information on a Mac. Yes, PPC vs. x86 matters, but 32-bit vs. 64-bit doesn't. I'm not quite sure you're grasping just how seamless Apple has made the 64-bit transition.

As for how exposed this information is, that is a culture thing. Mac owners don't want to know, Linux users do.

Uh huh. You keep thinking that.

Resource wastage and security risks are real, but also something that is popular to ignore. I find it sad that you as a developer do not care about those things, but I suppose it is how it is these days.

I care about those things. I'm just not interested in purity. I am capable of compromise. I can recognize when my sense of technical correctness is getting in the way of the best possible user experience. I can accept some wasted resources if it means the user has a superior experience. You, meanwhile, would rather have users jump through a bunch of hoops, and the best you can offer them is hoops that are not on fire.

Comment Re:a step in the wrong direction (Score 1) 135

Bundles mean multiple copies of shared libraries. This has two implications: 1. the libraries cannot share memory pages, and 2. security updates are not applied across the board, meaning they are cumbersome to apply.

Some shared libraries are copied, but since OS X is a predictable base system, you don't actually have to bundle most things. And the things you do bundle are likely not used by anything else anyway. So they're non-issues. I used Linux for 10 years and I'll tell you, bundling eliminates so much hassle. Packages are fine for the base system, but user applications need to be much more flexible. Most Linux distributions are just big monolithic beasts where every damn application is tightly coupled with the next.

So it might look like a good idea to someone who doesn't know much about the subject,

LOL. Wow, dude. I'm a developer myself. I know what I'm talking about.

but it is in fact a rather bad idea. (But if you like it, I have seen a few distributions and package managers on Linux that do this.)

Yeah... not quite the same.

Sigh. As I said, it is a problem Apple has never handled. Why should they? You cannot install Apple on many architectures. The solution they chose was an interim solution, and as such the solution was rather crude.

Crude? What's crude about allowing multiple architectures to be bundled into a single file? It worked perfectly. How would it not work for architectures besides PPC and x86? You're blustering.

The specific problem of coexisting 32 and 64 bit has been working in Debian for many, many years.

It has worked, but not seamlessly, or we wouldn't have so many Linux users complaining about things like Adobe Flash. And it doesn't work completely. Debian provides ia32-libs, but if you happen to have a piece of software that needs a library not included in that rather limited set, you're SOL. You never even had to think about how many "bits" your system had on OS X. You never had to worry about whether Firefox was 32-bit or 64-bit. You simply cannot say that about Linux. I find it totally laughable that you can sit there and criticize how Apple handled things when your only complaints are theoretical. In reality, everything worked out quite well.

Comment Re:a step in the wrong direction (Score 1) 135

Everything did just work --- bugs excepted. And they still do. The problem this is going to solve is "how to handle N architectures on the same installation, dependencies and all."

A problem that Apple solved years ago in a far more elegant way.

Of course, OS X doesn't handle automatic dependency resolution at all, does it? It relies on bundling instead, wasting resources left and right, if I recall correctly.

Bundling is one of the best parts of OS X, IMO. Who the hell cares if it wastes a bit of disk space? It WORKS. And it is totally hassle free. No installers. No package managers. You just copy the app to your desktop... and run. Don't want it anymore? Delete it. Want to test two versions of the same software (one beta, perhaps) side by side? Keep both copies. You don't have to worry about an installer or package manager trying to overwrite the old version. THAT is what computers should be like. OS X handles dependency resolution by making it a non-issue. Disk space is cheap as hell. I'll take a polished, hassle-free user experience with some wasted disk space any day.

Oh please. Laziness is just stuffing in everything, that's the banal, ham-fisted solution.

Maybe, but it worked. As an OS X user, you never even had to know whether your machine was 32-bit or 64-bit. Most people didn't even realize that the G5 was actually a 64-bit CPU. It was that seamless. It is laughable that you can sit there and criticize how Apple handled it when Debian is only JUST NOW seriously addressing the general problem of managing multiple architectures.

Sigh. What little you are talking about (a base set of libraries) has been working as long as I've used 64-bit.

This thread is full of Linux users directly or indirectly complaining about how the transition to 64-bit has been handled on Linux. Don't tell me it isn't a problem. I've dealt with it myself.

This is about solving the general problem.

What doublespeak is this?

Comment Re:a step in the wrong direction (Score 1) 135

It is more important to make the transition from 32 bit to 64 bit as smooth and seamless as possible. For the most part, 64bit programs are not necessary. In many cases they're actually slower. As long as your kernel is 64 bit (can address all of your memory) and any memory intensive apps are 64bit, that's all you need. Everyone else can keep writing 32bit software and as long as your system handles it gracefully, it doesn't matter.

Comment Re:a step in the wrong direction (Score 1) 135

This idea (nicknamed "FatELF") was considered and rejected across the board --- no one wanted it.

Their loss, because it could have made the current 32- to 64-bit transition a lot smoother had it been generally accepted and used. Think one kernel, at least two different architectures. Also, it would presumably extend down to the driver level too. You wouldn't even have to think about how many "bits" your system is. Everything would just work. The boot loader would pick the optimal arch and you'd be set. But I guess Linux users don't like things to be too easy. (I was a Linux user for many years, and this was actually the case in many ways.)

While I am not an expert, it seems wasteful to me to load a bunch of architectures you won't ever need,

Define "load." It isn't like you load all architectures into memory for every executable. They're just there if you need them. Also, I could be mistaken, but given the way OS X uses message passing, I don't think all shared libraries need to be multiarch. 32-bit programs can more smoothly interact with 64-bit programs and libraries.

and wasteful to install a bunch of libraries for architectures you don't use. Typically, only a very small part of your installation will actually need to be multiarch. Of course, if I were *also* selling hardware, I might be a bit more wasteful :)

Oh please. Disk space is dirt cheap. That's just an excuse for laziness on the part of developers, packagers, and distribution maintainers. Meanwhile, Linux users will continue to bitch and complain about things like Adobe Flash not working correctly due to architecture differences. Not properly handling multiarch has a significant impact on user experience. Linux developers could learn from Apple. But instead, they insist on using Microsoft as the standard they measure themselves against. And Microsoft has had its own 64-bit transition problems. So sad. Bottom line is that Apple managed to transition seamlessly not only from 32-bit to 64-bit, but also from PPC to x86. You have to give them credit.

Typically, only a very small part of your installation will actually need to be multiarch.

Depends on how broadly you want to support 32- and 64-bit side by side. If you want to do it in the most transparent manner possible, you'd best make as much of the base system multiarch as possible. The problem with Linux is that there is no real "base" system. A Linux distribution is more or less one monolithic beast with every package tightly coupled to the next. It is kind of all or nothing with Linux. Anyway, I'm just saying that this "multiarch" support in Debian seems like too little too late, and others have done it much better. But if you just want to keep making excuses, whatever.

Comment Re:a step in the wrong direction (Score 1) 135

Apple officially supported four architectures: PPC32, PPC64, x86, and x86_64. And you could even store binaries optimized for sub-architectures (i.e. G3 vs. G4). All completely transparent to the user. No extra files. What makes this practical is the Mach-O binary format. It was trivial to build and distribute universal binaries and libraries. This was baked in from the start, something inherited from NeXT.
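The whole universal-binary workflow boils down to compiling one slice per architecture and gluing them together with lipo. A sketch, guarded so it does nothing on machines without Apple's toolchain (`hello.c` is a placeholder source file):

```shell
# Sketch of building a fat Mach-O binary; only meaningful on OS X.
if command -v lipo >/dev/null 2>&1; then
  # Compile one slice per architecture...
  cc -arch i386   -o hello32 hello.c
  cc -arch x86_64 -o hello64 hello.c
  # ...then merge the slices into a single universal binary.
  lipo -create hello32 hello64 -output hello
  lipo -info hello          # reports the embedded architectures
  BUILT=universal
else
  BUILT=skipped             # no Apple toolchain on this machine
fi
echo "$BUILT"
```

The kernel picks the right slice at exec time; the user sees one file.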

Considering that we're mainly talking about running x86 binaries on x86_64 Linux machines, I don't see why it wasn't feasible for Debian to support that from the beginning. There'd be no point in installing ARM binaries on an x86 machine because you wouldn't be able to run them anyway. A "universal" install would be a neat trick though not terribly useful for the vast majority of people.

Another neat thing about Apple's universal binary system is that it extends down to the boot level. You can actually boot a PPC Mac and an x86 Mac with the same Leopard disk. This is one of those things that keeps me coming back to OS X/Mac. The shit just works.

Comment Re:Github? (Score 1) 442

If nothing else, it is just handy to have a free/cheap place to publicly (with the option of privacy) host your repository that is user centric rather than project centric. Even without the social aspects like easy forking and such.

Comment Re:Those aren't the real issues here. (Score 1) 145

The OP said he wanted something cross-platform, but made it sound like he'd be selling in a niche market where Windows would most likely be a given. I don't think we know nearly enough about the OP's actual requirements. I think he's probably approaching it all wrong. A locally installed web app is almost never a good idea.

Comment Re:Those aren't the real issues here. (Score 1) 145

That would depend on whether or not I thought I could provide a quality user experience through a standard HTML/JavaScript interface. I could also pare down that list of platforms initially. If I'm targeting businesses, chances are I can require Windows without much risk. Maybe branch out into OS X to get a slightly larger market share. I can probably ignore Linux completely (sorry, Linux users, but that's just the way it is). At this point, the cost of iOS shouldn't be a big deal, and I'm really just writing a UI. The hard bits, the business logic, will all be shared either through common libraries or a centralized server.

Comment Re:Do you remember back before 2000? (Score 0) 145

Except install on a machine without administrative privileges.

Many applications can be written to install and run entirely with user privileges, though some platforms make this easier than others. A lot of apps I download for OS X I just launch from the download location, and I only copy them to /Applications when I know I want to use them long term. And even then, I can make the app owned by me so updates don't require admin privileges.
