Except statically linked binaries. Those were linked at build time, but they don't invoke the dynamic linker when they are executed. Guess what I use musl for? Building static binaries.
But you're moving the goalposts now. What I said was troublesome was using musl (or anything else glibc-incompatible) as a replacement for glibc. Using it for selected static binaries that you build yourself is another matter.
Nope. For binaries I only need one or the other, not both. For libraries, only when I need a musl version of a library.
That's because I was talking about replacing glibc with musl while still being able to run binaries, over which you have no control, written for either of the two libraries. That's the only scenario that would make end users happy, should glibc coexist with musl.
No. Only for shit software that doesn't have any kind of protocol.
Again, that's stuff that is out of our control. Engineers have to design their systems for the worst case, not for the best.
Wrong. ISO C has specified stdint.h since C99: int64_t is the standard way to get a 64-bit signed integer, and uint64_t the standard way to get a 64-bit unsigned one. The fact that you don't know ISO C is illuminating.
Microsoft's stuff is only compatible with ISO C90; uint64_t was unsigned __int64 in their world until very recently (MSVC only gained stdint.h with Visual Studio 2010).
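For reference, a minimal sketch of what that looks like in portable C99 (nothing here is compiler-specific; the printf macros come from inttypes.h):

    /* fixed-width integers per ISO C99: <stdint.h> for the types,
       <inttypes.h> for the matching printf/scanf format macros */
    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        int64_t  s = INT64_C(9000000000);    /* 64-bit signed constant */
        uint64_t u = UINT64_C(18000000000);  /* 64-bit unsigned constant */
        printf("%" PRId64 " %" PRIu64 "\n", s, u);
        return 0;
    }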
Never works in practice. If the prototype for a library function is the same and the calling semantics are the same, then it can be relinked without recompilation, whether it was statically linked (unless the binary was stripped) or dynamically linked. If either has changed, the code needs to be reviewed, rewritten, and recompiled in either case.
With versioned symbols (glibc) you don't have this problem. But if you add support for them in your library, then you add "bloat". So you have to decide whether you want to add "bloat" or to make your users unhappy.
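To make that concrete, here's a minimal sketch of GNU symbol versioning on the library side (the library and version names are invented for illustration; it needs GCC/binutils plus a version script defining the FOO_1.0 and FOO_2.0 nodes, passed with -Wl,--version-script):

    /* libfoo.c: keep an old, ABI-incompatible foo() alive for binaries
       linked against FOO_1.0, while new links get FOO_2.0 by default */

    /* old implementation, bound to the old version node */
    int foo_old(int x) { return x + 1; }
    __asm__(".symver foo_old, foo@FOO_1.0");

    /* new implementation; the double @@ marks the default version */
    int foo_new(int x, int y) { return x + y; }
    __asm__(".symver foo_new, foo@@FOO_2.0");

This is the mechanism glibc itself uses, and it's also the "bloat": every old version of every changed symbol stays in the library forever.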
I don't know what adding getline has to do with existing programs. If they are not already making use of getline, then even if they are recompiled they will still not make use of getline, and will not require its symbol for linking. The Austin Group (POSIX) is careful not to break stuff when revising standards, though I'm sure you can still point to some breakage; in general they tend to revise things by adding new symbols, or by assuming the greatest common behavior between implementations.
They added getline() to the standard, and in doing so they broke all the software that, while perfectly standard-compliant until then, used the getline() name for its own stuff. This is to say that trying to solve binary compatibility through standard compliance simply doesn't work. They're two different problems.
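A sketch of the breakage (the helper's signature is invented, but the mechanism is the real one):

    /* perfectly legal C before POSIX.1-2008; now it conflicts with the
       ssize_t getline(char **, size_t *, FILE *) that <stdio.h> declares
       when _POSIX_C_SOURCE >= 200809L is in effect */
    #define _POSIX_C_SOURCE 200809L
    #include <stdio.h>

    char *getline(char *buf, int size) {   /* error: conflicting types */
        return fgets(buf, size, stdin);
    }

Recompiling such a program on a POSIX.1-2008 system fails, even though the source was perfectly conforming when it was written.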
Autotools gives me, as a user, and has done so since the 90s:
- cross compiling support;
I've bootstrapped my entire 64-bit Linux installation from a 32-bit host using it. It has almost always worked for me, even for packages whose developer had never thought about the possibility of cross-compilation. I have no doubt that packages using autotools were the ones that gave me the fewest problems.
With plain old Makefiles, however, I just set CC, CFLAGS, LD, LDFLAGS and LIBDIR and things just work (plus CXX and CXXFLAGS for C++).
Only for the simplest cases. When things get more complex, you'll have to handle the difference between the host C compiler (used to compile stuff that runs on the build host, such as code generators) and the target C compiler (which can only produce binaries that won't run on the build host).
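A typical case, sketched (file and macro names invented): a build-time code generator that must be compiled with the host compiler, because it runs during the build, before any target binary can execute:

    /* gen.c: emits a generated header consumed by the target build.
       Must be compiled with the *build host's* C compiler (often named
       CC_FOR_BUILD or HOSTCC), not the cross compiler in CC, because it
       has to run on the machine doing the building. */
    #include <stdio.h>

    int main(void) {
        /* hypothetical: compute something at build time and freeze it */
        printf("#define GENERATED_TABLE_SIZE %d\n", 42 * 42);
        return 0;
    }

A plain Makefile with a single CC variable cannot express this distinction; autoconf's --build/--host machinery can.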
- ability to change any installation path;
I can do that with "PREFIX=/foo/bar make install" with any well-written makefile.
That's the difference: with autotools you are almost sure to get those features out of the box, and they're standardized; with other systems you have to hope that the developers wrote their makefiles well, and I can tell you that the current trend among developers is to invest less and less time in packaging their source code. You also have to study the makefiles to see whether it's DESTDIR or INSTALL_ROOT or ROOT_PREFIX or something else. Also, consider the difference between PREFIX and DESTDIR: the first can end up in paths stored in the generated code, the second won't.
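A sketch of why that distinction matters (the macro and path are hypothetical): PREFIX is typically baked into the binary at compile time, while DESTDIR only prefixes the copy destination during "make install" and is never seen by the compiler:

    /* compiled with e.g. -DPREFIX='"/usr/local"' from the Makefile */
    #include <stdio.h>

    #ifndef PREFIX
    #define PREFIX "/usr/local"   /* stored inside the executable */
    #endif

    int main(void) {
        /* this path must be the final runtime prefix, not the staging
           directory that DESTDIR points at during packaging */
        printf("loading " PREFIX "/share/myprog/data.conf\n");
        return 0;
    }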
- support for building shared and static libraries simultaneously using the best compiler options for each case;
and probably something else that I'm forgetting.
make staticlibs; make dynamiclibs
Then you have to take into account the different flags required for building static vs. shared libraries on every platform you want to target (e.g. Linux supports non-PIC code in shared libraries on i386, other OSes don't, and x86_64 doesn't support it at all...). With autotools you get that for free and out of the box.
Fixing a broken homemade Makefile takes me a few minutes. Fixing a broken autobarf takes me hours to days.
Modern .ac files are much, much simpler than the Makefiles and scripts that they generate. I can't see how you find it harder to fix them than the Makefiles themselves. Have you ever tried to debug a problem in a .cmake file, with its arcane language? And what about scons? It basically works with raw Python scripts...
Porting code to 9front is basically a rewrite as it's so alien, so let's put that to one side.
I don't think autotools work well, if at all, outside UNIXish systems. Even on Windows, they require impedance-matching layers such as Cygwin, MinGW, or Interix.