Should You Pre-Compile Binaries or Roll Your Own? 301

Jane Walker writes "The convenience of pre-compiled packages and maximizing machine performance are two powerful incentives for Windows admins to use Linux and compile an OSS package." TechTarget has an article looking at some of the "why" behind rolling your own. What preferences have other Slashdot users developed, and why?
This discussion has been archived. No new comments can be posted.

  • Re:Gentoo? (Score:5, Interesting)

    by stevey ( 64018 ) on Tuesday March 14, 2006 @05:26PM (#14919248) Homepage

    The story, and its comments, are almost certain to generate a flamefest. So I'll get in early.

    I'm a Debian user, and there are three things I know about Gentoo:

    • The distro is based around compiling from source, which many suggest gives a huge speedup.
    • They have some neat tools for working with file merging conflicts in /etc, which sometimes happen when upgrading.
    • They make use of "USE" flags which can disable parts of programs you don't want/need.
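    For readers who haven't met them: USE flags are normally set globally in /etc/portage/make.conf. A minimal sketch (the flag names are real Gentoo examples, but the exact set is per-system):

```shell
# /etc/portage/make.conf -- global USE flags (sketch, not a recommendation)
# Enable Kerberos and LDAP support wherever packages offer it,
# and strip out GNOME and KDE integration everywhere:
USE="kerberos ldap -gnome -kde"
```

    Per-package overrides go in /etc/portage/package.use.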

    As for the first, I think that compiling from source may well give you a speedup. But when my computer is sitting with me at the desktop/ssh session, very few processes are running, and network latency / my thinking time are most likely the biggest sources of delay.

    True, for heavily loaded servers the compilation might give you a boost, but I'd be surprised if it was significant.

    Next we have USE flags. These do strike me as an insanely useful thing, but I have one niggling doubt: I suspect they only work for code that supports it. E.g. project foo has optional support for libbar. If the upstream/original code doesn't have a feature marked as optional, I don't imagine the Gentoo people would rework it to strip it out.

    So the ability to remove things from the source must be neutered, right?

    Finally the merging of configuration files in /etc seems useful. But I wonder if this is the correct approach. My distribution of choice, Debian, already does its utmost to preserve all configuration file changes automagically. I find it hard to understand what Gentoo does differently which makes it better.

    Ultimately I guess there are pros and cons to source-based distributions depending on your needs. But one thing is true: if you're building from source and making use of modified USE flags and compiler flags, then chances are you're the only person on the planet with that particular setup - and that means bug reports are hard to manage.

    There's a great deal to be said for having a thousand machines running identical binaries when it comes to tracking down bugs. (Sure, diversity is good, especially for security, but there comes a point where maybe people take it a little too far.)

    ObDisclaimer: I'm happy to be educated about Gentoo, but be gentle with me, k?

  • Re:Gentoo? (Score:4, Interesting)

    by autocracy ( 192714 ) <slashdot2007.storyinmemo@com> on Tuesday March 14, 2006 @06:37PM (#14919901) Homepage
    Having lived through the Linux From Scratch days, I can tell you that just about everything has USE flags and parts that can be disabled.
  • by digidave ( 259925 ) on Tuesday March 14, 2006 @06:38PM (#14919919)
    Exactly. I started using Debian because not only are the packages the best in the world, but it's easy to get things working. Now I'm beta testing VMWare Server because that makes it even easier. I created a few virtual machines (one LAMP, one Ruby on Rails/Lighty, one database-only, etc) and can have them running in less than ten minutes + the time it takes to do any specific configuration for whatever app goes on there, which is usually only a few minutes. The VMs are configured to auto-update themselves from Debian's repositories every night, so out of the box I just run apt-get to update from when I made the VM and it's all set to go.

    I used to compile every major package, back when I didn't know as much about Linux or being a sysadmin. Now that I know what I'm doing I have the confidence needed to use a binary package manager to its fullest.
  • Re:Gentoo? (Score:4, Interesting)

    by tota ( 139982 ) on Tuesday March 14, 2006 @06:45PM (#14919965) Homepage
    I'll try to be gentle;)

    "The distro is based around compiling from source, which many suggest gives a huge speedup."
    It probably does, especially when building for specific architectures (like C3 or C3-2, etc.).
    "... but I'd be surprised if it was significant."
    Well, since you compile the compiler as well as everything else, it does accumulate... But point taken: in most cases it is not a reason in itself.

    USE flags: "I suspect they only work for code that supports it."
    "If the upstream/original code doesn't have a feature marked as optional I don't imagine the Gentoo people would rework it to strip it out."
    Actually, that's not true: the Gentoo devs do apply some very useful patches, including some that make it possible to *remove* unused features like you described. Better yet, these patches do make it upstream eventually, albeit at a slower pace (so the whole community benefits).

    Re: configuration files: "Debian, already does its utmost to preserve all configuration file changes automagically. I find it hard to understand what Gentoo does differently which makes it better"
    It is not that different, except maybe that Debian does not change as quickly as Gentoo.

    "you're the only person in the planet with a particular setup - that means bug reports are hard to manage."
    You would be surprised... Check out the Gentoo mailing lists: they are full of people ready to help, even if you try to use that tweaked package XYZ and get into difficulty.

    "thousand machines running identical binaries when it comes to tracking down bugs"
    Well, if that's what you are looking for, you still can with Gentoo:
    (as the parent poster noted) build binary packages on the build machine and deploy them to all the others in binary form.
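    The build-once, deploy-many workflow described above looks roughly like this (a sketch; the package name is just an example):

```shell
# On the build host: keep a binary package of everything you emerge
echo 'FEATURES="buildpkg"' >> /etc/portage/make.conf
emerge mysql                 # compiles, and also drops a package into $PKGDIR

# On the other machines (sharing the same make.conf and PKGDIR):
emerge --usepkgonly mysql    # installs from the prebuilt binary, no compile
```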

    If you want to try it out, why not use UML to boot into it:
    http://uml.nagafix.co.uk/ [nagafix.co.uk]
    (images and kernels ready to use)
  • by 0xABADC0DA ( 867955 ) on Tuesday March 14, 2006 @06:54PM (#14920039)
    Most programs may get an average speedup of 10%. But if the speedup happens in something critical you get a massive speed increase. I was using the resynth plugin in GIMP to remove some text and fill it in with the background texture (interpolated). The difference between running it on Gentoo Linux and the pre-compiled Windows version was over 4x. Depending on the image size it can easily run for hours. Now that's not typical, but on the other hand I didn't do anything special on Linux to get this speedup; it just happened because Gentoo compiled it from source. I only discovered it because the difference was so huge.

    But sometimes the results are contrary to expectations. For instance, unless you set up the filesystem carefully, over time the mess of files that is Portage and the temp files from compiling will scatter programs all over the fs, making the system much slower to use than a binary distro like Ubuntu.
  • Re:Other benefits (Score:3, Interesting)

    by Sentry21 ( 8183 ) on Tuesday March 14, 2006 @07:23PM (#14920288) Journal
    The big benefits of source-based distros are the ability to tailor packages to each install (i.e. the ability to compile certain features in or out) and to choose optimizations for each package (do you want -Os, -O2, -O3, or, if you're really daring, -ffast-math?).
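    Those per-package choices usually end up in /etc/portage/make.conf; a sketch (the values are illustrative, not recommendations):

```shell
# /etc/portage/make.conf (sketch)
CFLAGS="-O2 -march=athlon-xp -pipe"   # or -Os for size; -O3/-ffast-math if daring
CXXFLAGS="${CFLAGS}"
```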

    In some circles (e.g. #mysql on Freenode) this is considered a Bad Thing. Users come in on Gentoo systems complaining about how 'Unstable' MySQL is. Did they compile from source? Yes. Did they compile from official source? Yes. What EXACTLY did they do to compile from official source? 'I just did "emerge mysql"'

    The result is that the user's CFLAGS, Gentoo's patches/defaults, and so on, end up with a binary that is quite a bit different from the stock MySQL install, and it's not terribly surprising to me that the only 'unstable' MySQL situations I've seen are on Gentoo (which is not to say others don't occur).

    Another issue with compiling from source is libraries. Even on Debian (with manual compiles by my predecessor), I've seen situations where I'll compile Apache 2 against libssl, but then a few updates later, I'll recompile PHP or curl, which will pick up a new version of libssl - resulting in hard-to-diagnose incompatibilities. The simplest solution I could find was to move the whole system over to complete debianisation, moving the manual Apache compile, configs, etc. over to the Debian package version. The result? Other packages knew what was installed, I could be guaranteed consistent compilation options (since I had no easy way to find out how Apache was compiled previously), and so on.

    Binary packages for the win.
  • Re:Gentoo? (Score:4, Interesting)

    by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Tuesday March 14, 2006 @07:32PM (#14920369) Homepage Journal
    They make use of "USE" flags which can disable parts of programs you don't want/need.

    More importantly, they enable parts of programs you do want/need, even if not many other people do.

    For example, my desktop is one of the few *ix machines in my office, and our network is primarily based around Win2k3 and Active Directory. I really, really need Kerberos support in every package that supports it, and configuring 'USE="kerberos"' solves that problem.

    This exact issue drove me away from Debian way back when. It made me choose between an old Kerberized OpenSSH, or a newer un-Kerberized version [debian.org] (as of today: ssh-krb5 3.8.1p1-10 from OpenBSD 3.5, released 2004-05-01, or ssh 1:4.2p1-7). Gentoo didn't make me choose, so that's what I went with.

    Gentoo isn't for everybody, but it has some features that I'd never give up. The ability to pick and choose obscure features that most other people won't need is high on that list.

  • Re:Gentoo? (Score:5, Interesting)

    by kbielefe ( 606566 ) * <karl,bielefeldt&gmail,com> on Tuesday March 14, 2006 @07:33PM (#14920371)
    I use Gentoo for a few different reasons, none of which have anything to do with eking out every last cycle from my machine:
    • Nearly universal support for stack smash protection which must be enabled at compile time.
    • Incremental updates. I update my system a little bit at a time instead of doing a major upgrade when the distro makes a new major release. I've used Gentoo for a long time (5+ years), so I don't know how much of a problem this still is with other distros.
    • Better dependency handling. No problems with different packages being compiled by different compilers against different development library versions.
    • Not strongly married to either KDE or Gnome.
    • Multimedia is easier to get working than on any other distro I've tried: DeCSS, Win32 codecs, etc.
    • Can stay bleeding edge where I want to, and extremely stable in other areas.
    • Easy to make small changes to source. I occasionally add a minor feature, change a default, fix a bug, or apply a security patch from a mailing list instead of waiting for the next release.
    • Easy to distribute to multiple machines. It's a snap to compile and test on one machine, then quickly install my custom binary package on many machines.
    • USE flags. Almost everywhere you could use --disable-feature on a manual configure, there is a USE flag for that feature. This is very useful both for enabling features that most distros wouldn't include and for disabling features that most distros include by default. For example, when ALSA was still pretty new and usually not enabled by default, the alsa USE flag made migration much easier.
  • by lawaetf1 ( 613291 ) on Tuesday March 14, 2006 @08:24PM (#14920711)
    Fine, the subject doesn't make complete sense... BUT... doesn't compiling code with Intel's icc result in significantly better binaries than any flag you can throw at gcc? From http://www.intel.com/cd/ids/developer/asmo-na/eng/219902.htm [intel.com], MySQL claims a 20 percent performance improvement over gcc.

    I'm not saying we all have access to icc, but if someone wants to make a binary available, I'm more liable to use that than compiling from source. Call me crazy. And I know someone will.
  • Re:Gentoo? (Score:5, Interesting)

    by ComputerizedYoga ( 466024 ) on Tuesday March 14, 2006 @08:34PM (#14920774) Homepage
    Imagine what happens when libpng4 comes out - every program using libpng must be rebuilt to get the new features, so you've only sidestepped the problem.


    Only if it breaks API compatibility with the previous version. Otherwise, that's what dynamic linking is for, isn't it?
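    The soname convention is what makes this work: a binary records the versioned soname it was linked against, so two library versions coexist side by side. A toy illustration (file names and version numbers made up, using a temp dir as a stand-in for /usr/lib):

```shell
root=$(mktemp -d)                           # stand-in for /usr/lib
touch "$root/libpng.so.3.1.2" "$root/libpng.so.4.0.0"
ln -s libpng.so.3.1.2 "$root/libpng.so.3"   # old binaries keep resolving this soname
ln -s libpng.so.4.0.0 "$root/libpng.so.4"   # newly built binaries resolve this one
readlink "$root/libpng.so.3"                # -> libpng.so.3.1.2
```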

    Having multiple versions of libraries installed isn't a big deal either, unless you're tight on space. And if you're tight on space the idea of compiling large applications probably isn't something you'd appreciate anyway!


    Right on ... openoffice 2 spooled out to fill 8 gigs of free space when I tried to compile it with "nostrip" and with debug symbols. The Linux kernel source, unpacked, weighs in at around 230 megs, while the binary .deb for 2.6.15 is closer to 16 megs. Sure, you don't get the ultra-lean heavily-tuned kernel that you'd get building from source, but it certainly works.

    Personally, I think the big benefits of running Gentoo over Debian are things like ... the runlevel abstraction system, the ability to turn features on and off with more freedom, and the simple slickness of Portage. Oh, and a very helpful and newbie-friendly community.

    On the other hand, on a P4 3GHz desktop system with a very large software set, I'm probably averaging 2-3 hours a week of compiling for various updates, while my Debian and FC4 boxes spend more like 5-10 minutes a week downloading and unpacking them. But if you're halfway decent at scheduling and don't have constant insanely-high demand everywhere, I'd say that update time isn't even a particularly big deal (after all, it's mostly non-interactive ... fire it, forget it, come back when it's done).
  • by Spoke ( 6112 ) on Tuesday March 14, 2006 @09:44PM (#14921124)
    Most packages I have seen on Linux distros are compiled with -O2 or -O3. It is highly unlikely that the various other switches provided by GCC will provide anything significant.
    Besides the obvious -O parameters to gcc, specifying the arch of the platform (-march=i386 for example) can sometimes have a decent effect on performance. A lot of distributions compile for the most common platform, which usually means specifying -march=i386 -mtune=i686. That gets you binaries that run on any i386 or better, while tuning the code for i686 machines. If you're running an older processor or something like a VIA or AMD CPU, compiling with -march=c3-2 or -march=athlon64 or whatever specific CPU you're running can often provide a noticeable benefit, especially on newer versions of gcc.
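    As a toy illustration of the generic-versus-tuned trade-off described above (the mapping below is a made-up example, not gcc's own logic):

```shell
# Pick -march flags for a named CPU; unknown CPUs fall back to the
# distro-style "runs anywhere, tuned for i686" default.
march_flags() {
  case "$1" in
    athlon64) echo "-march=athlon64" ;;           # tuned: only runs on that family
    c3-2)     echo "-march=c3-2" ;;
    *)        echo "-march=i386 -mtune=i686" ;;   # generic distro default
  esac
}

march_flags athlon64      # -> -march=athlon64
march_flags some-old-cpu  # -> -march=i386 -mtune=i686
```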
  • by Killer Eye ( 3711 ) on Tuesday March 14, 2006 @11:10PM (#14921483)
    In my experience, it is often necessary to recompile from source simply to have more than one version of the same package available at once! Too many pre-built binaries assume they are the only version in the universe you could want in /usr/local/bin.

    For some packages a recompile is merely annoying, having to download and reconfigure with a new prefix and rebuild; but for others, it can be a horrible web of configuration options to find numerous dependencies in special locations. This complexity can be really frustrating if all you want to do is relocate the tool so two different versions can be installed.

    Pre-built binaries should assume by default that they'll go into a version-specific directory (say /opt/pkgname/1.0), and at the same time they should assume their dependencies can also be found there. The /usr/local hierarchy would remain, but as a hierarchy of references to specific versions of things. The /usr/local hierarchy would contain selected default versions, it would be used for efficient runtime linking (have "ld" search one "lib", not 20 different packages), and it would be targeted for dependencies that truly don't care about the version that is used.

    There are other details, of course...for example, it may matter what compiler you use, you may want 32-bit and 64-bit, etc. But the basic principle is still simple: have a standard package version tree on all Unix-like systems so you can "just download" binaries without conflicts, once and for all.
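    A tiny sketch of that layout (paths and the package name "foo" are hypothetical, built in a temp dir):

```shell
# Each version lives under its own prefix; /usr/local holds symlinks
# to whichever versions are selected as defaults.
root=$(mktemp -d)
mkdir -p "$root/opt/foo/1.0/bin" "$root/opt/foo/2.0/bin" "$root/usr/local/bin"
printf '#!/bin/sh\necho foo 1.0\n' > "$root/opt/foo/1.0/bin/foo"
printf '#!/bin/sh\necho foo 2.0\n' > "$root/opt/foo/2.0/bin/foo"
chmod +x "$root"/opt/foo/*/bin/foo
ln -s "$root/opt/foo/2.0/bin/foo" "$root/usr/local/bin/foo"  # select 2.0 as default
"$root/usr/local/bin/foo"   # -> foo 2.0; 1.0 stays installed alongside
```

    Switching the default is then just repointing one symlink, with no rebuild.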
  • FreeBSD (Score:5, Interesting)

    by vga_init ( 589198 ) on Wednesday March 15, 2006 @04:14AM (#14922563) Journal
    For years I lived in the world of FreeBSD.

    Not only did I build every package from source (using ports), I also took the trouble to rebuild the base system and kernel with a custom configuration and options.

    The benefits of some of this were obvious; the FreeBSD GENERIC kernel at the time seemed (to my eyes) to suffer a massive performance loss from its configuration. Anyone running FreeBSD *must* build at least a custom kernel, even if they use the binary distribution of everything else.

    It was a lot of effort. What did I get out of it? By the end, it was one of the speediest systems I had used since the days of DOS. Most programs loaded faster than their binary equivalents (on older machines the differences were more glaringly obvious, such as the time it took to initialize X).

    One time I clocked my old machine, running a custom built FreeBSD installation, against the other computers in the house from power-on to a full desktop (after login).

    On my machine, the entire affair (BIOS, bootloader, bootstrapping, system loading, X, login, desktop environment (WindowMaker in this case)) cost a mere 45 seconds. My father's machine, which was in all respects a faster computer, loaded Windows 2000 in the course of perhaps two minutes. Also, I stopped timing after the desktop came up, but Windows does continue to load and fidget about for a good while after that. The extra time taken for it to settle down would have cost it another minute, but only because of all the crap my dad had set to load, which I don't blame Windows for.

    The kitchen computer also ran Windows 2000, but had a slimmer configuration, so it loaded in a little over a minute. FreeBSD, however, still beat them both badly.

    In light of my own experience, compiling from source can get you some rather wonderful results. However, I noticed that not all systems were created equal. While FreeBSD GENERIC was as slow as molasses, I find in Linux that the binary kernels that come with my distributions seem to load and operate just as fast, if not faster, than my custom build of FreeBSD. In Linux I have used only binary packages, and the system overall "feels" just as fast, though some operations are a little slower (like loading emacs ;)).

    I appreciate the arguments presented by both camps, but I feel the need to point out that some are too quick to downplay the possible performance gains offered by custom builds, because they certainly exist. Sometimes they can be noticeably significant.

