Many of the "improvements" to Java were done without the thought necessary to make them work right, such as the Generics capability.
However, I disagree about the problem of abstract static methods. I have come to dislike static methods and would prefer to see their use limited *further*, since much of the absolutely terrible code I've had to deal with over the years has been a result of the (ab)use of static methods.
Many colleges teach Java as a good first language, as the perceived alternative is C++, which is a terrible first language for anyone. Java's not a good first language, but it's by no means the worst.
Java Generics are indeed a total hack, which is a result of trying to cram features into a language without thinking through the consequences. Generics were the cool thing, therefore, to remain relevant, Java must have Generics... and thus we get this festering sore on the language.
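One commonly cited symptom of that cramming is type erasure: the type parameter exists only at compile time, so two differently-parameterized lists are indistinguishable at runtime. A minimal sketch (class name `ErasureDemo` is mine, for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // Erasure: at runtime both lists have the exact same class,
        // so the type parameter can't be checked or recovered.
        System.out.println(strings.getClass() == ints.getClass()); // true
    }
}
```

This is also why `instanceof List<String>` won't compile and why you can't write `new T[]` inside a generic class.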
I do not agree with the analysis for abstract static methods. In the Java object model, the concept makes no sense (unlike, say, Smalltalk), and, indeed, is arguably worse than useless. A great deal of the terrible code that I've run across over the years has been a direct result of programmers favoring static methods with only the shakiest of justifications.
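The reason "abstract static" makes no sense in Java's object model is that static calls are resolved at compile time against the declared type; there is no dynamic dispatch for an abstract static to hook into. A small sketch (class names are mine, for illustration):

```java
class Base {
    static String name() { return "Base"; }
}

class Derived extends Base {
    // This *hides* Base.name(); it does not override it.
    static String name() { return "Derived"; }
}

public class StaticDispatchDemo {
    public static void main(String[] args) {
        Base b = new Derived();
        // Resolved against the declared type (Base), not the
        // runtime type (Derived) -- no polymorphism here.
        System.out.println(b.name()); // prints "Base"
    }
}
```

Since the call site never consults the runtime type, an abstract static method would have no mechanism by which a subclass's version could ever be selected.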
It's true that meta-programming is awkward, but in the languages where it isn't, and is heavily used, I fail to see a significant improvement in code readability or maintainability. It allows for clever techniques that can be extremely difficult to debug, much less understand from reading the source code. Awkwardness in this domain is a disincentive, which is arguably a good thing.
Exception handling is awkward... and arguably more informative than in any other language in common use. Some languages allow the exception handler to "fix" the problem and resume, which is amazing and powerful and wonderful... until you discover a programmer who uses this capability for mixins and flow control, making the code virtually impossible to follow.
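Java's model is strictly "terminate and handle": once an exception propagates to a catch block, control continues after the try, and there is no way to resume at the throw site. Any fallback has to be explicit, which keeps the control flow readable. A minimal sketch (names are mine):

```java
public class ResumeDemo {
    static int parse(String s) {
        try {
            return Integer.parseInt(s);
        } catch (NumberFormatException e) {
            // The handler cannot resume inside parseInt; control
            // lands here, so the fallback must be spelled out.
            return 0;
        }
    }

    public static void main(String[] args) {
        System.out.println(parse("42"));   // 42
        System.out.println(parse("oops")); // 0, via the explicit fallback
    }
}
```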
But then, I'm one of those throwbacks who consider having to use a debugger to develop or read code to be a bad thing. Code is a form of literature, not a performance art.
Multiple inheritance is an abomination.
Not all types being objects is a wart, and not a significant one. Autoboxing is a hack that's worse than the flaw it attempts to hide.
Java is a language full of flaws, but when one tries to envision a replacement language, one needs to consider not its flaws, but what it did *right*, and /why/ that design decision was right for the language. (I assert that what's "right" may be different in the context of a different language; "these are my favorite things" is a poor way to assert what's right.)
In my opinion, some of the things Java did right were:
1) It ran on several platforms. MSWindows was the dominant desktop environment, and it sucked, and sucked hard. The more useful systems (Solaris and Linux at first) were far nicer for developers, but those systems weren't what the users and managers were using. Java could be developed on the hippie's Linux box, tested on the corporate Solaris server, and demonstrated to the manager on his MSWindows desktop.
That's a huge win. Nobody feels that the language chosen is being used to force someone else's environment on everyone else.
2) It supported concurrent programming out of the box. Most of the time, in most of the code, there's not a need to handle threaded or concurrent code. But in that window where it is useful to separate tasks into concurrent threads of execution -- such as keeping the database-access code out of the GUI drawing thread -- it's made vastly simpler in Java.
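The "out of the box" part matters: `java.lang.Thread` ships with the language, so pushing slow work off the main thread takes only a few lines. A minimal sketch of the database-off-the-GUI-thread idea (the "query" is faked, for illustration):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class BackgroundDemo {
    public static void main(String[] args) throws InterruptedException {
        AtomicInteger rows = new AtomicInteger();

        // The "database" work runs on its own thread, off the GUI thread.
        Thread worker = new Thread(() -> rows.set(3)); // pretend query result
        worker.start();

        // ...the GUI thread would keep painting and handling events here...

        worker.join(); // wait for the result before using it
        System.out.println("rows = " + rows.get());
    }
}
```

No platform-specific threading API, no third-party library: the same code runs on Solaris, Linux, and MSWindows.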
And given that MSWindows at the time had a laughable concurrency model, Java's ability to bring this sort of concurrency to that platform was *very* attractive. You didn't have to rewrite the algorithms developed on a UNIX-type machine to handle the broken MSWindows environment, which ties into point #1.
3) It eliminated many programmer problems. The lack of pointer arithmetic and built-in garbage collection (in contrast to C++) was another major factor in why Java became a big thing. Thought was put in to what the /actual/ problems were (as opposed to what pundits *thought* were the problems), and solutions devised accordingly. (Once the development path started down the road of adding features that were _neat_ instead of solving a real problem, we started getting broken crap like Generics.)
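A concrete instance of "eliminating programmer problems": where C or C++ pointer arithmetic past the end of an array is undefined behavior that silently corrupts memory, Java checks every array access at runtime and throws. A small sketch (class name is mine):

```java
public class BoundsDemo {
    public static void main(String[] args) {
        int[] a = new int[3];
        try {
            int x = a[5]; // in C/C++ this is undefined behavior; Java throws
            System.out.println(x);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught: index 5 out of bounds");
        }
    }
}
```

The bug still exists, but it announces itself loudly at the point of failure instead of trashing unrelated memory.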
This is really where the designers of new languages should concentrate their efforts. Look at the problems that are actually cropping up, and devise solutions for those problems, in the context of all the other problems and solutions devised for the proposed language.
4) It had a large and diverse library. The scope of this sort of effort is difficult to accomplish, as growing this much support code is time-consuming and expensive. There's a /lot/ of code that ships with the Java SDK, making a great many of the mundane-but-frequent tasks not just easier, but *easy*. From basic data structures to network servers, only a few lines of code were needed, using the well-documented libraries.
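To make the "network servers in a few lines" claim concrete, here is a sketch of a one-shot line-echo server using only the JDK's `java.net` and `java.io` classes; the client runs on a second thread in the same process so the example is self-contained (class name is mine):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

public class TinyServerDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // 0 = any free port
            int port = server.getLocalPort();

            // In-process client so the demo runs standalone.
            Thread client = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true);
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()))) {
                    out.println("hello");
                    System.out.println("echo: " + in.readLine());
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            client.start();

            // The actual "server": accept one connection, echo one line.
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println(in.readLine());
            }
            client.join();
        }
    }
}
```

The server half of that is a handful of lines, all from the standard library, all documented in the same place.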
Which brings us to documentation...
5) It came with a built-in documentation system. The Javadoc system is, frankly, amazing. It made it easy to create usable, cross-linked, accurate documentation about a codebase, so that even mediocre developers could generate useful documentation. With a minimum of effort, a developer unacquainted with the code could easily start using a codebase just from the documentation generated with a tool.
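For readers who haven't used it, the input side is just structured comments; running the `javadoc` tool over source like the following (names are mine, for illustration) produces the cross-linked HTML:

```java
/** A trivial class used to illustrate Javadoc comment syntax. */
public class Greeter {
    /**
     * Returns a greeting for the given name.
     *
     * @param name the name to greet; must not be null
     * @return a greeting string containing {@code name}
     * @see String#format(String, Object...)
     */
    public static String greet(String name) {
        return "Hello, " + name + "!";
    }

    public static void main(String[] args) {
        System.out.println(greet("world"));
    }
}
```

Because the documentation lives next to the code it describes, it has a fighting chance of staying accurate as the code changes.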
The large and diverse library was documented with the same tool that developers could use on their own code. The library documentation thus served both as a reference for how to use the code and as a model for how to write documentation. Programmers who had no idea how to write decent documentation now had a ready reference to crib from, which made their managers (and, presumably, fellow developers) happy.
6) It scaled with the size of the development team. From Object-Oriented language features, to interfaces, to JAR files, the language supports large teams of developers. One of the most useful features (not original to Java, but certainly popularized by it) is the "interface". Also called a 'public abstract virtual class', often dismissively, its usefulness lies in the ability to clearly document what a class will look like without having to implement the class.
This lets teams of developers define the points of coupling between their respective modules, without at that time having to mandate how that module will work. With just an abstract class, behavior tends to leak across this boundary, leading to hidden coupling, which makes the overall system more fragile, and requires more manpower to manage the interactions between modules as the size of the team grows.
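A minimal sketch of that boundary (all names are hypothetical): one team writes client code against the interface while another team supplies the implementation later, and neither needs to know how the other works.

```java
// The agreed-upon coupling point between two teams.
interface UserStore {
    /** Returns the display name for an id, or null if unknown. */
    String lookup(int id);
}

// A stand-in implementation; the real one might hit a database.
class InMemoryUserStore implements UserStore {
    public String lookup(int id) {
        return id == 1 ? "alice" : null;
    }
}

public class BoundaryDemo {
    // Client code depends only on the interface, never the implementation.
    static String describe(UserStore store, int id) {
        String name = store.lookup(id);
        return name == null ? "unknown" : name;
    }

    public static void main(String[] args) {
        System.out.println(describe(new InMemoryUserStore(), 1));
    }
}
```

Because `UserStore` carries no behavior of its own, there is nothing to leak across the boundary: the implementation can be swapped without touching the client.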
These sorts of scaling issues should be a paramount concern for anyone devising a language to supersede Java, and at the forefront of any language designer's mind -- what are the REAL problems, and how does any particular feature solve them?
I don't get that sense from this new language. It seems to be more of a hodgepodge of "cool features from C# we would like to see in a language but won't fit in Java".
Personally, I don't like the idea of having Just One Language to be used everywhere. I *liked* Pascal as an introductory language, and found its limitations useful in pushing the student toward other languages once the pedagogical goals had been reached. I *like* the idea of having at least one language for systems programming, and a different set of languages for applications, and a different set for scripting, and so on and so forth.