It was the damn dark ages.
You're off by (slightly more than) two hundred years. By convention, the "Dark Ages" end with the close of the 10th century. Not to mention that the term has limited spatio-temporal applicability: applying it to the Eastern Roman (Byzantine) Empire is laughable, with perhaps the sole exception of the period of the early Muslim conquests after Muhammad's death. And even though France, e.g., had its dark periods, the first half of the 9th century rather stood out.
Punched paper tape has better longevity than either floppies or optical media.
If you're going for *actual* longevity, you can't beat fired clay tablets. (Yeah, I know they weren't fired originally, but you have to decide how much you value your MP3 files. I'd certainly take the extra time!)
The C++ mentality is that you should catch as many errors as you can at compile time.
No, the Haskell mentality is that you should catch as many errors as you can at compile time (stopping only barely short of executing the whole program at compile time during type checking).
Every time I've seen reflection used, it was by some terrible programmer who'd just learned about reflection and was looking for an excuse to use it.
That sounds like a rather lame argument to me. You could say exactly the same thing about C++ itself: the majority of the time when C++ gets used (overall population considered), it's some terrible programmer who just discovered that C++ is supposed to be fast, fast, fast (well, it isn't actually all that fast - certainly not on the most recent hardware) and wants to write a game, or perhaps just because the college professor steadfastly requires C++ for his course (and neither of them knows better). Only a small percentage of users ever get to the point where they actually understand the nuances of appropriate language feature usage (that goes for any language). Or perhaps it's like with violins: they don't suck just because most people who try to play them suck. Or do you think that Sturgeon's law somehow doesn't apply to people?
I've never seen its use actually improve a design. I've never even seen it make more loosely-coupled or reusable code, now that I think about it.
Well, perhaps you ought to look at Lisp or Smalltalk one day. The whole Smalltalk IDE effectively runs that way. In fact, any self-sustaining system similar to those essentially *requires* reflection and introspection by definition. Perhaps that's not an argument for C++ code (which uses a rather ancient kind of development workflow, what with the absence of incremental compilers), but as Gtk+ tries to cater to the needs of users of many languages (I'll leave aside how well they succeed at it, that's an entirely different question), it's no surprise to me that they're so heavily reflection-oriented - languages like Python, Ruby, or Lisp essentially *thrive* on this kind of binding-library implementation.
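For the skeptics: here's a minimal Python sketch of the introspect-then-dispatch pattern those binding libraries live on (the `Greeter` class and its methods are invented purely for illustration). A binding generator does essentially this against the library's metadata instead of a toy class:

```python
import inspect

class Greeter:
    """A toy class standing in for some library object."""
    def greet(self, name):
        return f"Hello, {name}!"
    def shout(self, name):
        return f"HELLO, {name.upper()}!"

# Introspection: discover the callable methods at runtime, the way a
# Smalltalk browser or a dynamic binding generator would.
methods = {name for name, fn in inspect.getmembers(Greeter, inspect.isfunction)
           if not name.startswith('_')}
print(sorted(methods))

# Reflection: dispatch by name, without the caller having been compiled
# against any particular method signature.
g = Greeter()
result = getattr(g, 'shout')('world')
print(result)
```

The point being: the set of methods and the dispatch target are both computed at runtime, so nothing here needs recompiling when the "library" side changes.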
Wrong. GTK+ did their weird object system because it's in C, not C++.
Yes, they did. And they also had a number of other reasons. Most of which perhaps disappeared in time, but not all of them.
Reflection isn't a factor
Sure, if you're happy to be forced to recompile all binding libraries, generate new packages, and force hundreds of thousands of people to upgrade them whenever the interfaces change, instead of just upgrading Gtk+.
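To make that concrete, here's a hedged Python sketch of late binding by name, which is roughly what an introspection-based binding does against a library's metadata (I'm using stdlib `math`/`sqrt` as stand-ins for the module and entry-point names that would actually come from that metadata):

```python
import importlib

# The names could come from introspection metadata shipped with the
# library, not from anything compiled into this program.
module_name, func_name = "math", "sqrt"

# Resolve the module and the function at runtime. If the library gains
# or renames entry points, the lookup simply happens again on the next
# run - no recompilation, no new binding package.
mod = importlib.import_module(module_name)
func = getattr(mod, func_name)
print(func(49.0))
```

Contrast that with a C++ binding, where every one of those names is baked in at compile time and an interface change means rebuilding and reshipping the binding.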
and as a C++ programmer I don't feel like I'm missing anything without it.
Oh wait, you're a C++ programmer. Of course, in that case your brain is already so horribly damaged that you can't perceive the comparatively mild benefits this would bring to a C++ programmer (removing only one of the thousand cuts already killing you).
Range is no limitation for lasers and EMF, my friend. For example, do suns suddenly become unviewable at some distance? No? Then neither do your neurons to our radar systems, nor our radar systems to your brains.
Four words: signal-to-noise ratio. You got suckered like Tom Clancy with the life detector in Rainbow Six.
AFAICT, Aura is more than just a UI toolkit; it's a complete window manager.
You are aware that, at least in sane computer system designs, this is mostly the same thing?