Should You Pre-Compile Binaries or Roll Your Own?
Jane Walker writes "The completion of pre-compiled packages and maximizing machine performance are two powerful incentives for Windows admins to use Linux and compile an OSS package." TechTarget has an article taking a look at some of the "why" behind rolling your own. What preferences have other Slashdot users developed, and why?
Re:Gentoo? (Score:5, Interesting)
The story, and the comments, are almost certain to generate a flamefest. So I'll get in early.
I'm a Debian user, and there are three things I know about Gentoo:
As for the first, I think that compiling from source may well give you a speedup. But when my computer is sitting with me at the desktop/ssh session, very few processes are running, and network latency / my thinking time are most likely to be the biggest sources of delay.
True, for heavily loaded servers the compilation might give you a boost, but I'd be surprised if it was significant.
Next we have USE flags. These do strike me as an insanely useful thing. But I have one niggling little doubt: I suspect they only work for code that supports it. e.g. project foo has optional support for libbar. If the upstream/original code doesn't have a feature marked as optional I don't imagine the Gentoo people would rework it to strip it out.
So the ability to remove things from the source must be neutered, right?
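As a concrete sketch of what USE flags look like in practice (the flag names here are illustrative, not a recommendation):

```shell
# /etc/portage/make.conf (Gentoo) -- global USE flags.
# A bare name enables an optional feature; a leading '-' disables it.
# Which flags a given package honours depends on its ebuild.
USE="ssl kerberos -gtk -ldap"
```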
Finally the merging of configuration files in /etc seems useful. But I wonder if this is the correct approach. My distribution of choice, Debian, already does its utmost to preserve all configuration file changes automagically. I find it hard to understand what Gentoo does differently which makes it better.
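For comparison, Gentoo's mechanism is the CONFIG_PROTECT variable plus an interactive merge tool; a sketch from memory (the values shown are the usual defaults, treat them as illustrative):

```shell
# /etc/portage/make.conf -- CONFIG_PROTECT marks directories whose
# files Portage must never overwrite silently; updated versions are
# instead written alongside as ._cfg0000_<name> files.
CONFIG_PROTECT="/etc"
# After an update, an interactive tool walks the pending changes:
#   dispatch-conf    # shows a diff, lets you use/zap/merge each file
#   etc-update       # older, simpler equivalent
```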
Ultimately I guess there are pros and cons to source-based distributions depending on your needs. But one thing is true: if you're building from source and making use of modified USE flags and compiler flags, then chances are you're the only person on the planet with a particular setup - and that means bug reports are hard to manage.
There's a great deal to be said for having a thousand machines running identical binaries when it comes to tracking down bugs. (Sure, diversity is good, especially for security, but there comes a point where maybe people take it a little too far.)
ObDisclaimer: I'm happy to be educated about Gentoo, but be gentle with me, k?
Re:I am Between Self Compiling and Gentoo (Score:4, Interesting)
I used to compile every major package, back when I didn't know as much about Linux or being a sysadmin. Now that I know what I'm doing I have the confidence needed to use a binary package manager to its fullest.
Re:Gentoo? (Score:4, Interesting)
"The distro is based around compiling from source, which many suggest gives a huge speedup."
It probably does, especially when building for specific architectures (like C3 or C3-2, etc.).
"... but I'd be surprised if it was significant."
Well, since you compile the compiler as well as everything else, the gains do accumulate...
But point taken: in most cases it is not a reason in itself.
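For context, the architecture-specific tuning mentioned here is set through CFLAGS in make.conf; a minimal sketch (flag values illustrative):

```shell
# /etc/portage/make.conf -- architecture-specific compiler flags.
# '-march=c3-2' targets the VIA C3-2 mentioned above; every package
# on the system (eventually including GCC itself) is built with these.
CFLAGS="-O2 -march=c3-2 -pipe"
CXXFLAGS="${CFLAGS}"
```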
USE flags: "I suspect they only work for code that supports it"
"If the upstream/original code doesn't have a feature marked as optional I don't imagine the Gentoo people would rework it to strip it out."
Actually, that's not true: the Gentoo devs do apply some very useful patches, including some that make it possible to *remove* unused features like you described. Better yet, these patches do make it upstream eventually, albeit at a slower pace (so the whole community benefits).
Re: configuration files: "Debian, already does its utmost to preserve all configuration file changes automagically. I find it hard to understand what Gentoo does differently which makes it better"
It is not that different, except maybe that Debian does not change as quickly as Gentoo.
"you're the only person in the planet with a particular setup - that means bug reports are hard to manage."
You would be surprised... Check out the Gentoo mailing lists; they are full of people ready to help, even if you try to use that tweaked package XYZ and get into difficulty.
"thousand machines running identical binaries when it comes to tracking down bugs"
Well, if that's what you are looking for, you still can with Gentoo:
(as the parent poster noted) build binary packages on the build machine and deploy them to all the others in binary form.
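The build-once-deploy-everywhere setup is configured through standard Portage settings; a sketch (hostnames and paths are hypothetical):

```shell
# Build host: /etc/portage/make.conf
# 'buildpkg' makes Portage save a binary package for everything
# it compiles, under PKGDIR.
FEATURES="buildpkg"
PKGDIR="/var/cache/binpkgs"

# Client machines: /etc/portage/make.conf
# Point at the build host (served over HTTP) and install the
# pre-built packages instead of compiling locally:
PORTAGE_BINHOST="http://buildhost.example.com/binpkgs"
# then install with:  emerge --usepkgonly <package>
```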
If you want to try it out, why not use UML to boot into it:
http://uml.nagafix.co.uk/ [nagafix.co.uk]
(images and kernels ready to use)
Re:Don't waste your time. (Score:3, Interesting)
Sometimes the results are contrary to expectations, though. For instance, unless you set up the filesystem carefully, over time the mess of files that is Portage, plus the temp files from compiling, will scatter programs all over the filesystem, making the system much slower to use than a binary distro like Ubuntu.
Re:Other benefits (Score:3, Interesting)
In some circles (e.g. #mysql on Freenode) this is considered a Bad Thing. Users come in on Gentoo systems complaining about how 'Unstable' MySQL is. Did they compile from source? Yes. Did they compile from official source? Yes. What EXACTLY did they do to compile from official source? 'I just did "emerge mysql"'
The result is that the user's CFLAGS, Gentoo's patches/defaults, and so on end up producing a binary that is quite a bit different from the stock MySQL install, and it's not terribly surprising to me that the only 'unstable' MySQL situations I've seen are on Gentoo (which is not to say others don't occur).
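The underlying point generalizes: the same "official source" built with different compiler flags is a different binary. A minimal demonstration using a stand-in C file (not MySQL; filenames are hypothetical):

```shell
# Same source, different CFLAGS => different machine code.
cat > sum.c <<'EOF'
int main(void) {
    int s = 0;
    for (int i = 0; i < 1000; i++)
        s += i;
    return s & 0xff;
}
EOF
gcc -O0 -o sum_O0 sum.c   # unoptimized build
gcc -O2 -o sum_O2 sum.c   # optimized build
# The two binaries differ even though the source is identical:
cmp -s sum_O0 sum_O2 && echo same || echo different
```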
Another issue with compiling from source is libraries. Even on Debian (with manual compiles by my predecessor), I've seen situations where I'll compile Apache 2 against libssl, but then a few updates later, I'll recompile PHP or curl, which will pick up a new version of libssl - resulting in hard-to-diagnose incompatibilities. The simplest solution I could find was to move the whole system over to complete debianisation, moving the manual Apache compile, configs, etc. over to the Debian package version. The result? Other packages knew what was installed, I could be guaranteed consistent compilation options (since I had no easy way to find out how Apache was compiled previously), and so on.
Binary packages for the win.
Re:Gentoo? (Score:4, Interesting)
More importantly, they enable parts of programs you do want/need, even if not many other people do.
For example, my desktop is one of the few *ix machines in my office, and our network is primarily based around Win2k3 and Active Directory. I really, really need Kerberos support in every package that supports it, and configuring 'USE="kerberos"' solves that problem.
This exact issue drove me away from Debian way back when. It made me choose between an old Kerberized OpenSSH or a newer un-Kerberized version [debian.org] (as of today: ssh-krb5 3.8.1p1-10 from OpenBSD 3.5, released 2004-05-01, or ssh 1:4.2p1-7). Gentoo didn't make me choose, so that's what I went with.
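USE flags can also be scoped to individual packages instead of set globally; a sketch of the per-package variant (package atoms illustrative):

```shell
# /etc/portage/package.use -- per-package USE flags.
# Enable Kerberos only in the packages that need it,
# rather than system-wide in make.conf:
net-misc/openssh  kerberos
net-nds/openldap  kerberos sasl
```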
Gentoo isn't for everybody, but it has some features that I'd never give up. The ability to pick and choose obscure features that most other people won't need is high on that list.
Compiler more important than compiling? (Score:5, Interesting)
I'm not saying we all have access to icc, but if someone wants to make a binary available, I'm more liable to use that than compiling from source. Call me crazy. And I know someone will.
Re:Gentoo? (Score:5, Interesting)
Only if it breaks API compatibility with the previous version. Otherwise, that's what dynamic linking is for, isn't it?
Right on
Personally, I think the big benefits of running Gentoo over Debian are things like
On the other hand, I'd say on a P4 3GHz desktop system with a very large software set, I'm probably averaging 2-3 hours a week of compiling for various updates; my Debian and FC4 boxes spend more like 5-10 minutes a week downloading and unpacking theirs. But if you're halfway decent at scheduling and don't have constant, insanely high demand everywhere, I'd say that update time isn't even a particularly big deal (after all, it's mostly non-interactive).
Multiple Versions Required (Score:3, Interesting)
For some packages a recompile is merely annoying (downloading, reconfiguring with a new prefix, and rebuilding); for others, it can be a horrible web of configuration options to find numerous dependencies in special locations. This complexity can be really frustrating if all you want to do is relocate the tool so two different versions can be installed.
Pre-built binaries should assume by default that they'll go into a version-specific directory.
There are other details, of course...for example, it may matter what compiler you use, you may want 32-bit and 64-bit, etc. But the basic principle is still simple: have a standard package version tree on all Unix-like systems so you can "just download" binaries without conflicts, once and for all.
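A sketch of such a layout (all paths hypothetical), using a symlink to select the active version:

```shell
# Version-specific install prefixes plus a 'current' symlink.
# Everything under /tmp/demo is hypothetical scratch space.
mkdir -p /tmp/demo/foo-1.2.3/bin /tmp/demo/foo-2.0.1/bin
# Each version lives in its own prefix, so both can coexist;
# switching versions is just repointing one symlink:
ln -sfn /tmp/demo/foo-2.0.1 /tmp/demo/foo-current
readlink /tmp/demo/foo-current    # prints "/tmp/demo/foo-2.0.1"
```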
FreeBSD (Score:5, Interesting)
Not only had I built every package from source (using ports), I also took the trouble to rebuild the base system and kernel with a custom configuration and options.
The benefits of some of this were obvious; the FreeBSD GENERIC kernel at the time seemed (to my eyes) to suffer a massive performance loss from its configuration. Anyone running FreeBSD *must* build at least a custom kernel, even if they use the binary distribution of everything else.
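The custom-kernel workflow the poster describes looks roughly like this (file names and driver entries are illustrative; the FreeBSD Handbook has the authoritative procedure):

```shell
# /usr/src/sys/i386/conf/MYKERNEL -- a trimmed copy of GENERIC.
# Start from GENERIC and delete the device lines for hardware
# your machine doesn't have, then rename the ident, e.g.:
#   ident   MYKERNEL
# Build and install the custom kernel:
#   cd /usr/src
#   make buildkernel KERNCONF=MYKERNEL
#   make installkernel KERNCONF=MYKERNEL
```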
It was a lot of effort. What did I get out of it? By the end, it was one of the speediest systems I had used since the days of DOS. Most programs loaded faster than their binary equivalents (on older machines the differences were more glaringly obvious, such as the time it took to initialize X).
One time I clocked my old machine, running a custom built FreeBSD installation, against the other computers in the house from power-on to a full desktop (after login).
On my machine, the entire affair (BIOS, bootloader, bootstrapping, system loading, X, login, desktop environment (WindowMaker in this case)) cost a mere 45 seconds. My father's machine, which was in all respects a faster computer, loaded Windows 2000 in the course of perhaps two minutes. Also, I stopped timing after the desktop came up, but Windows does continue to load and fidget about for a good while after that. The extra time taken for it to settle down would have cost it another minute, but only because of all the crap my dad had set to load, which I don't blame Windows for.
The kitchen computer also ran Windows 2000, but had a slimmer configuration, so it loaded in a bit over a minute. FreeBSD, however, still beat them both badly.
In light of my own experience, compiling from source can get you some rather wonderful results. However, I noticed that not all systems were created equal. While FreeBSD GENERIC was as slow as molasses, I find in Linux that the binary kernels that come with my distributions seem to load and operate just as fast as, if not faster than, my custom build of FreeBSD. In Linux I have used only binary packages, and the system overall "feels" just as fast, though some operations are a little slower (like loading emacs ;)).
I appreciate the arguments presented by both camps, but I feel the need to point out that some are too quick to downplay the possible performance gains offered by custom builds; they certainly exist, and sometimes they are quite significant.