On the other hand, many properties of programs themselves can and should be verified. A great deal of current programming language research is devoted to improving the capabilities of automatic program verification and to designing languages more amenable to verification. Functional languages, for instance, rule out entire classes of bugs present in imperative languages. People complain that they're hard to understand. Maybe. I'd argue they're only hard to understand if you don't care whether your program is correct.
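To make "entire classes of bugs" concrete, here's a toy example (my own, in Python, not from any of the research above) of the shared-mutable-state bugs that purely functional languages exclude by construction:

    defaults = [0, 0]
    config = defaults      # meant to be a copy, actually an alias
    config.append(1)
    print(defaults)        # [0, 0, 1]: our "constant" changed underneath us

In a language where data is immutable, nothing can mutate `defaults` behind your back, so this whole category of aliasing bugs simply can't be written.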
Unless and until some unforeseen, miraculous breakthrough happens in language design, GC'd languages will always be slower when it comes to memory management. And because memory management is so critical for complex applications, GC'd languages will effectively always be slower, period.
This isn't true. Have a look at Quantifying the Performance of Garbage Collection vs. Explicit Memory Management. The take-away is that GC'd languages are only slower if you are unwilling to pay an extra memory cost, typically 3-4x that of your explicitly-managed program. Given that GC gives you safety from dangling-pointer dereferences for free, I think that's a fair tradeoff for most applications. (BTW, you can run the Boehm collector on explicitly-managed code to identify pointer safety issues.)
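You can feel the same time-for-space dial inside a single runtime. A quick sketch using CPython's cycle collector (the threshold value and workload here are made up for illustration): raising the collection threshold makes the collector run less often, so the program spends less time collecting and more memory holding on to garbage.

    import gc, time

    def churn(n):
        # allocate lots of short-lived cyclic garbage that only
        # the cycle collector (not refcounting) can reclaim
        for _ in range(n):
            a, b = [], []
            a.append(b)
            b.append(a)

    gc.collect()
    t0 = time.perf_counter(); churn(10**6)
    print(f"default thresholds: {time.perf_counter() - t0:.2f}s")

    gc.set_threshold(700_000)  # collect ~1000x less often: less GC time, more peak memory
    t0 = time.perf_counter(); churn(10**6)
    print(f"lazy thresholds:    {time.perf_counter() - t0:.2f}s")

Same principle as in the paper, just at toy scale: pay memory, save time.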
When you're done with that, go read What Every Computer Scientist Should Know About Floating-Point Arithmetic. I cribbed the example directly from the article.
(0.1 + 0.2) == 0.3
It doesn't matter how many bits you use in floating point. It is always an approximation. 0.1, 0.2, and 0.3 have no finite representation in base 2 (just as 1/3 has none in base 10), so each gets rounded to the nearest representable value, and in base-2 floating point the comparison above will never be true.
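You can check this in any language with IEEE-754 doubles; here it is in a Python REPL:

    >>> 0.1 + 0.2 == 0.3
    False
    >>> 0.1 + 0.2
    0.30000000000000004
    >>> (0.1).hex()    # the nearest double to 0.1, not 0.1 itself
    '0x1.999999999999ap-4'

The rounding error in 0.1 and 0.2 doesn't happen to cancel against the rounding error in 0.3, so the equality fails.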
The only new idea here is using approximate computing specifically to trade high precision for lower power. The research has less to do with new algorithms and more to do with new applications of classic algorithms.
Also, to pre-emptively swat down this counter-argument: while the Mono people and Microsoft may have had some animosity in the past, it is most definitely not the case any more. Most of the Mono people I have spoken to (yes, in person) say that their relationship with Microsoft is pretty good.
Build systems and dependency management for the JVM are their own mini-nightmare.
Coming from the Java world, I was totally blown away by what
As someone who dabbles in techniques from AI to solve problems in my own domain (programming language research): solutions in AI tend to be produced by extremely general algorithms. For example, a robot that can manipulate objects may not even possess a subroutine that tells it how it should move its hands. Often, it learns these things by example instead. It "makes sense" of the importance of these actions through statistical calculations, or logical solvers, or both. Since information in this context is subject to many "interpretations", these algorithms often most closely resemble search algorithms! If a programmer provides anything, it's in the form of "hints" to the algorithm (i.e., heuristics). To an outsider, it's completely non-obvious how "search" and "object manipulation" are related, but when you frame the problem that way, you get weird and sometimes wonderful results. Most notably, autonomy. Sadly, you also sometimes get wrong answers.
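For a concrete picture of "search plus hints", here's a minimal A* sketch in Python (my own illustration, not anyone's production planner). All the domain knowledge enters through two functions the caller supplies: `neighbors`, which says what moves exist, and `heuristic`, the programmer's hint about which states look promising.

    import heapq
    from itertools import count

    def a_star(start, goal, neighbors, heuristic):
        # frontier entries: (estimated total cost, tiebreaker, cost so far, node, path)
        tie = count()
        frontier = [(heuristic(start), next(tie), 0, start, [start])]
        best = {}
        while frontier:
            _, _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if best.get(node, float("inf")) <= cost:
                continue
            best[node] = cost
            for nxt, step in neighbors(node):
                heapq.heappush(frontier, (cost + step + heuristic(nxt),
                                          next(tie), cost + step, nxt, path + [nxt]))
        return None

    # Toy use: walk a 5x5 grid with a Manhattan-distance hint.
    def grid_neighbors(p):
        x, y = p
        return [((x + dx, y + dy), 1)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= x + dx < 5 and 0 <= y + dy < 5]

    print(a_star((0, 0), (4, 4), grid_neighbors,
                 lambda p: abs(p[0] - 4) + abs(p[1] - 4)))

Swap in different `neighbors` and `heuristic` functions and the same loop plans robot motions or schedules jobs; that generality is exactly what makes the results feel non-obvious from the outside.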
If your washing machine could go collect your dirty laundry and wash it without your help, I'd call it a laundry robot. Particularly if you could tell it something like "please do a better job cleaning stains", and it could figure out what you meant by that. Note that a Roomba doesn't know anything about your house until you turn it on the first time.
But it turns out that even if you're a good engineer and you are careful with your floating point numbers, the fact remains: floating point is approximate computation. And for many kinds of mathematical problems, like dynamical systems, this approximation changes the result. One of the founders of chaos theory, Edward Lorenz, of Lorenz attractor fame, discovered the problem when he re-entered numbers from a printout into his simulation: the printout had rounded them to fewer decimal places than the machine carried internally. The simulation behaved completely differently, despite the differences being down in the thousandths. That was a weather simulation. See where I'm going with this?
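Here's the same phenomenon in a few lines, using the logistic map (a textbook chaotic system) instead of Lorenz's weather model. Two starting points that agree to ten decimal places end up completely uncorrelated within a few dozen iterations:

    # Sensitive dependence on initial conditions in the chaotic
    # logistic map x -> 4x(1 - x).
    x, y = 0.3, 0.3 + 1e-10   # differ by one part in ten billion
    for i in range(1, 61):
        x = 4 * x * (1 - x)
        y = 4 * y * (1 - y)
        if i % 15 == 0:
            print(f"step {i:2d}: x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.1e}")

The tiny initial gap roughly doubles every step, so by step 40 or so it has swallowed the entire result. In a real simulation, rounding error plays the role of that 1e-10 perturbation, which is exactly why Lorenz's truncated digits mattered so much.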