All languages are syntactic sugar for the IR code the compiler generates. Fundamentally, every compiler translates from a source language to a target language (machine code, in many cases).
Whether the compiler checks that types match, that an operation can overflow, or that you're doing a signed vs. unsigned comparison is up to the compiler and its developers. Clearly, one compiler can have more features than another. To expose those new features, the compiler needs to see new syntactic constructs in the source code. If you can't extend JavaScript (or it's not feasible, or you just don't want to) to incorporate those syntax changes, you can always develop a new language, such as Dash (or Dart?), with the risk that it will fail, but with the opportunity to fix the bad parts of JavaScript and add the features you want.
This is just the natural way languages evolve (in the broad sense of translating source code to machine code). You can't anticipate all future requirements, so you design a language that solves the previous problems (or a relevant subset) plus the current ones. In time it will become obsolete, as people build on top of your language to solve the problems they face at that point.
Obviously, everyone hates change. I say: give it a shot! At least don't dismiss it until you've looked at it and figured out how it may or may not help you. If it doesn't, stick with what works for you. If it does, hey, you're now better off. Chances are, people aren't wasting their time developing a new language unless it solves someone's problem (their problem, your problem, everyone's problem; it depends on the goals of that language).
I, for one, always welcome a new programming language, just to see what it brings to the table.