When a "high" level language requires half a dozen or so ways to implement a cast, it's time to go.
Of all of the criticisms of C++, this one makes the least sense. The different casts in C++ have very different semantics, and one of the worst things a programming language can do (which C++ does in many places) is give you the same syntax for different semantics. Are you taking the same set of bits and interpreting them as a different type (typesafe languages don't need this because they don't permit it)? Are you explicitly removing a const qualifier, and thus declaring that you know what you're doing when you do something that might break the invariants of an interface (similarly, not permitted in typesafe languages)? Are you asking for a new instance of the target type, initialised from this one?
Remember when a programming language was truly object-oriented?
I still use some that are, but C++ is not one of them. If you try to write Smalltalk-like code in C++, you will write almost as bad code as if you try to write C-like code in C++.
template <typename S, typename T>
requires Sequence<S> && Equality_comparable<Value_type<S>, T>
Iterator_of<S> find(S& seq, const T& value);
But today, you'll instead find something like this (from the C++17 spec):
template< class InputIt, class T >
InputIt find( InputIt first, InputIt last, const T& value );
You broke it because either T doesn't implement the operator== overload that compares against the element type correctly, or InputIt doesn't correctly implement operator++ and operator* with the (documented, but not expressed anywhere in the code) requirements of an input iterator. Compare that with the concepts version above: as soon as you see it, you know that the first argument must be a sequence (of some type S) with elements of some type X, that the second argument must be of a type T such that X == T is well defined, and that the return value will be an iterator into S, which can be dereferenced to give a reference to an X.
If your error is something syntactic, for example you've deleted the operator==(T&) on S, then your compiler will say 'This find function can't be used with this type because you're missing this method', whereas today it will give you a cryptic error about S::operator==(T&) not being defined somewhere in a deep set of template instantiations.
Concepts give you better compile-time error checking, better compile-time error reporting, and better in-code documentation. They're one of the few C++ language features that are purely benefit.
I'm no Java fan, but at least everything is a reference, so you don't have copy-by-accident ooga booga.
That's true, but Java doesn't really have an equivalent of the C++11 move idiom. If you want Java-like semantics from C++, just alias your pointers (ideally wrapping them in something like std::shared_ptr first). The term move is actually a little bit misleading: you're copying the object, but you're transferring ownership of the expensive contents of the object. For example, when you move a string you're creating a new string object, but you're not copying all of the string data, you're just transferring ownership of the underlying buffer. This is even more important for collection types, where you really don't want to do a deep copy and delete.
You can implement the same thing in Java by splitting your objects into cheap wrappers that have a reference to an expensive object and then adding a move() method that transfers ownership of the expensive object to the new wrapper, but it's not integrated into the language. The language integration isn't actually essential in C++ either: people have implemented the same thing using a special Move<> template, which could be used as an argument to an overloaded constructor, which would do the move operation. The nice thing about having it in the standard library and with language support is that tooling understands it. Your compiler / static analyser can warn you if you move an object and then try to do anything with the old version.
If copying is so bad (which apparently it is because you'll definitely get reamed during a code review if you do), force a copy action via clone(), like Java.
Saying 'copying is bad' is just as wrong as most other 'X is always wrong' rules. Copying for small types is fine. A std::pair of an integer and a float is a couple of instructions to copy and move semantics would make no sense for it. clone() in Java is also problematic because the standard library doesn't distinguish between a deep and shallow clone.
Really? Prior to 1998, there was no standard library, though the Standard Template Library from SGI was pretty much treated as the standard library. When C++ was standardised in 1998, most of the STL was incorporated into the C++ standard library, so almost everything that you'd learned from the STL would still be relevant. The next bump to the standard was in 2011. Lots of stuff was added to the standard library, but very few things were changed in incompatible ways (auto_ptr was deprecated, because in 13 years no one had figured out how to use it without introducing more problems than it solved) and almost all C++98 code compiles without problems against a C++11 library. C++14 and C++17 have both added a lot more useful things but removed or made incompatible changes to very few things.
Let's look at a commonly used class, std::vector. The only incompatible changes in the last 18 years have been subtle changes to how two of the types that are accessible after template instantiation are defined. Code using these types will still work (because the changes are not on the public side of the interface), but the chain for defining them is more explicit (e.g. the type of elements is now the type of elements, not the type of things allocated by the thing that allocates references - code would fail to compile if these weren't the same type). The changes in std::map are the same.
That said, you do need to learn new things. Modern C++ shouldn't use bare pointers anywhere and should create objects with std::make_shared or std::make_unique. The addition of std::variant, std::optional, and std::any in C++17 cleans up a lot of code.
Yes, except with compile-time specialisation instead of run-time specialisation. One of the big problems that I have with C++ is that it has entirely separate mechanisms and syntax for implementing the same thing with compile-time and run-time specialisation and they don't always compose well. Languages such as Java sidestep this by providing only run-time specialisation and expecting the JIT compiler to generate the equivalent of compile-time specialisation.
With an abstract class in C++, you'd require that every method be called via a vtable, which makes inlining hard (though modern compilers can do devirtualisation to some extent). This often doesn't matter, but when it's something like an array access, which is 1-2 instructions, the cost of the method call adds up. In contrast, if you use a template then the compiler knows exactly which method implementation is called and will inline any trivial methods (at the cost of now having one version of each templated function for every data type, which can blow away your instruction cache if you're not careful). The downside of the template approach is that you have no (simple) way of saying 'this template argument must be a thing on which these operations are defined' and the error message when you get it wrong is often dozens of layers of template instantiation later and totally incomprehensible without a tool such as Templight.
I'm not convinced by Chris' argument here. GC is an abstract policy (objects go away after they become unreachable), ARC is a policy (GC for acyclic data structures, deterministic destruction when the object is no longer reachable) combined with a mechanism (per object refcounts, refcount manipulation on every update). There is a huge design space for mechanisms that implement the GC policy and they all trade throughput and latency in different ways. It would be entirely possible to implement the C# GC requirements using ARC combined with either a cycle detector or a full mark-and-sweep-like mechanism for collecting cycles. If you used a programming style without cyclic data structures then you'd end up with almost identical performance for both.
Most mainstream GC implementations favour throughput over latency. In ARC, you're doing (at least) one atomic op for every heap pointer assignment. In parallel code, this leads to false sharing (two threads updating references to point to the same object will contend on the reference count, even if they're only reading the object and could otherwise have it in the shared state in their caches). There is a small cost with each operation, but it's deterministic and it doesn't add much to latency (until you're in a bit of code that removes the last reference to a huge object graph and then has to pause while it's all collected - one of the key innovations of OpenStep was the autorelease pool, which meant that this kind of deallocation almost always happens in between runloop iterations). A number of other GC mechanisms are tuned for latency, to the extent that they can be used in hard realtime systems with a few constraints on data structure design (but fewer than if you're doing manual memory management).
This is, unfortunately, a common misconception regarding GC: that it implies a specific implementation choice. The first GC papers were published almost 60 years ago and it's been an active research area ever since, filling up the design space with vastly different approaches.
Of course, the nuclear family of the 1950s had:
a 1200 (not 2200) sqft house,
formica (not granite) counters,
But the house was owned - with a mortgage affordable on a single income and substantial equity in place.
The car was also either owned or being purchased on an auto loan (rather than leased), again with substantial equity from the down payment, and again paid for out of that single income - which was also feeding and clothing the 2.3 children and taking a nontrivial vacation once a year or so.
And I have no idea where you are getting those square footage numbers. Our family's houses (we moved a couple times once Dad got done with his degree and was buying rather than living in a student ghetto) were substantially larger than you describe, and were typical of the neighborhoods around them.
Yes, Formica: It was the big deal of the time. Granite is a recent vanity - and a REALLY STUPID idea if you actually USE the kitchen to prepare food on a regular basis. Drop a ceramic or glass utensil on a granite counter and it breaks. Drop it on Formica-over-plywood-or-hardwood and it usually bounces.
stainless steel appliances,
*might* have had a TV (not a 54" LCD),
Yeah, we had all those boxes (though the appliances were enamel rather than stainless). Also a console sound system - pre "Hi Fi" - AM, FM, and a four-speed record changer with a diamond needle in the pickup.
The non-electronic appliances lasted for decades, too. (Even the electronics lasted a long time with occasional maintenance - which was required for vacuum tube based equipment - and was AVAILABLE.) Quite unlike the modern stuff. (My own family has been in our townhouse for about 17 years now and is on its third set of "stainless steel appliances", thanks to the rotten construction of post-outsourcing equipment by formerly high-end manufacturers. We're even on our third WATER HEATER: The brain of the new, government mandated, eco-friendly replacement flaked out after less than a year - and the manufacturer sent TWO MORE defective replacement brains and one defective gas sensor before lemon-replacing it.)
even Google itself washes their hands of any phone that is older than about 2 years.
Three years. Google devices get system upgrades for two years, and security updates for three years. That's still well short of five years, as you say. On the other hand, while Apple has a history of supporting devices for that long, they've made no commitment to any specific support timeline.
Yes. That's the nature of STL concepts. Unlike Haskell typeclasses, it's not enough just to see that there are operations with the right types, because STL concepts are often defined in terms like "this is a valid expression, which returns a value convertible to bool".
If you are a true master, you should be able to explain concepts in a way that even a child can understand. Richard Feynman was famous for this. So was Albert Einstein. Of course you can go too far, and simplify too much, so the children only think they understand.
Richard Feynman and Albert Einstein both did exactly this. You really can't understand quantum mechanics or general relativity without math. You can think you do, and both of them were great at providing simple explanations that gave the illusion of understanding... but it was only an illusion, which of course they knew perfectly well.
And is that question scribbled on airport walls countrywide?
1 Mole = 007 Secret Agents