That would require the Earth to be very very special indeed, and I just don't see it.
Not at all. For example, I just generated a random number between 1 and 1e9. It was 869,502,332. By your logic, that number must have been very, very special. But no: any particular number was extremely improbable, and that one happened to come up.
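That draw can be sketched in a few lines of Java (the class name and labels are mine, purely for illustration): any single outcome of a draw from 1 to 1e9 has probability one in a billion, yet every draw is guaranteed to land on some number.

```java
import java.util.concurrent.ThreadLocalRandom;

// Illustrative sketch only: one draw from [1, 1e9]. Each specific value
// has probability 1e-9, but some value always comes up.
public class LotterySketch {
    public static void main(String[] args) {
        long bound = 1_000_000_000L;
        long draw = ThreadLocalRandom.current().nextLong(1, bound + 1);
        double pOfThisExactDraw = 1.0 / bound;  // probability of this outcome
        System.out.println("draw = " + draw);
        System.out.println("P(this exact draw) = " + pOfThisExactDraw);
    }
}
```

Observing an improbable outcome after the fact tells you nothing special about that outcome, which is the point of the analogy.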
It may very well be the same case with life. Life could just be extremely improbable, and Earth just happened to be "the number" that was picked. This is what the Anthropic Principle is all about. Our perceptions are colored by the fact that we're here, so we think, "Since the Earth is not special, therefore, other planets must have life like Earth." It might just be that Earth was the lottery winner.
I said this in another post, but I'll say it again: The best evidence against life being common is the fact that it only happened once on Earth. It's fairly conclusive that all life on Earth has a common ancestor. If abiogenesis were easy and common, it wouldn't have stopped after happening once; it would have happened again and again over the billions of years since. But it didn't.
And honestly, life on Earth being completely unique in the universe isn't that hard for me to believe when I look at the utterly insane complexity of cellular machinery. But again, extreme improbability doesn't matter when we're dealing with the anthropic principle. We don't sense how long it took for intelligent life to pop up, just like we didn't sense the 13 billion years that passed before you and I were born to think about all this.
likely
We have zero evidence for life being likely, except wishful thinking in the form of hand-waving like the utterly useless Drake equation. On the other hand, we do have some suggestive evidence that life itself is improbable. The biggest piece of evidence is that, as near as we can determine, it only happened once on Earth. If life were probable, it should have continued to re-occur, but we're fairly certain that all life has a common ancestor.
I think you mean "wrong" rather than "dangerous". There is a limited impact to our having a wrong opinion on the matter.
To be fair, Minecraft is a (surprisingly) quality program that is written in Java. It's also a total memory pig and is much slower than other 3D games, though also to be fair, it's a quite complex 3D environment (infinitely changeable), so it's hard to compare to games with more static worlds. But it does show that it's at least possible to write a good game in Java. It does occasionally freeze up, however, probably doing garbage collection, to my son's infinite annoyance.
Now, a fair comparison is the Java version of the Scratch environment against the Flash version, and the Flash version is about 5-10x the speed.
Java can be nearly as fast as C for very small pieces of code where the runtime can do straightforward JIT compilation; that is true. If you define that as "where Java is used", then your claim is true. However, for code of any size or complexity, Java slows down tremendously. Why do you think Java is "slow on the desktop"? It's because desktop apps are applications of size, not trivial pieces of glue code.
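Here is a rough sketch of the kind of micro-benchmark behind "nearly as fast as C" claims: a tight numeric loop that the JIT can compile down to straightforward machine code. The class name and the loop are mine for illustration; timings from something this crude are only indicative, and a real comparison needs a proper harness (e.g. JMH) rather than a single nanoTime measurement.

```java
// Illustrative micro-benchmark sketch, not a rigorous comparison.
public class HotLoop {
    // A JIT-friendly hot method: simple arithmetic in a tight loop.
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        int n = 10_000_000;
        // Warm-up iterations so the JIT compiles the hot method before timing.
        for (int i = 0; i < 5; i++) sumOfSquares(n);
        long start = System.nanoTime();
        long result = sumOfSquares(n);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("sum = " + result + " in " + elapsedMs + " ms");
    }
}
```

Note that the warm-up loop matters: before the JIT kicks in, the same method runs interpreted and much slower, which is exactly why tiny hot loops flatter Java while large, branchy desktop codebases don't.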
Or, to put it another way, if Java isn't inherently slow and is "as fast as C" as you claim, why would there be an exception around desktop apps or "graphics in general"?
"Engineering without management is art." -- Jeff Johnson