[quote]Truth is, nothing irreplaceable was provided by the GNU project. [/quote]
I can't see how you can possibly think that is relevant. Whether there were other options that *could* have been used doesn't change the fact that a circa 1992 "linux" system was largely a GNU system.
I certainly agree that a modern Linux based desktop is not a GNU OS, but I think it was a perfectly reasonable request in the early 90s. I still call it Linux, mostly because the name is shorter, and I am not about to call it GNU/X11/Gnome/Linux. And the reasons I choose to run Linux over (say) FreeBSD are mostly to do with the kernel and the kernel specific system tools, and not with the userland.
Compiler and toolchain, and all the 'standard' UNIX tools: the shell, the text utils like cat, grep, awk, etc.
Basically, back in the 80s, the FSF reimplemented what was at that time nearly the entirety of what was called UNIX, except the kernel (which is what the HURD project was/is). It was to be the GNU OS. While the kernel was in development, the userspace tools were developed and ported to other UNIX systems like SunOS as replacements for the often deficient historical versions supplied by the UNIX vendors.
So when Linus came along and wrote a UNIX-like kernel using gcc, he could load all those programs onto it and have a mostly functioning UNIX environment. This was the reason RMS objected to calling it just Linux: at that time the majority of the code running on the system was GNU. It was probably a legitimate point at the time. And even if there had been a different compiler, without a set of userspace tools that people could freely get and use, it is unlikely Linux would have been able to take off.
Now, of course, a huge part of the user experience is provided by X11, the desktop environments, and various graphical applications. GNOME is part of the GNU project, but X.org, KDE, and most of the applications are not. So it isn't really true that GNU software is still the majority of the OS. Of course, the kernel is even less important in terms of the user environment, and despite all the other software around it, the GNU utilities are what make it (not) UNIX.
That is a pretty bad example. C++ references hardly count as references since you can't reassign them. They are really just syntactic sugar to make operator overloading look nice and reduce the number of -> operators. They cannot be used alone to create complex data structures.
A better example would be references in lisp, perl, or java. They solve many of the problems of C/C++ pointers. They can't be out of bounds. They must respect the type safety of the language. They can't point to an invalid or destroyed object due to garbage collection. However, they all support a null reference.
Maybe there is a better way to do this where you don't ever need null references, but I know two things for certain: 1) SQL is not it, and 2) people will still make errors where data they expect to be there is not.
You think that quantum computers are not able to justify grants or PhDs? What world are you living in?
[quote]until these quantum computers exist and are cheap enough to fill datacentres[/quote]
Yeah, because classical computers were never useful to anyone (or anyone important) until datacenters existed.
[quote]No, to be really useful, quantum computing has to be as easy to afford and deploy as current computing technology.[/quote]
And until then, developments that bring us closer are irrelevant? Applications that could give us more reason to develop the technology are pointless?
What exactly is your point here?
"Stupidity, like virtue, is its own reward" -- William E. Davidsen