Oak (the language that later became Java) was originally designed for household appliances.
D looks intriguing, certainly superior in theory to C++ or C#, but I'm seeing nothing substantial in it so far.
For other C derivatives, there's Aspect C and related attempts at adding high-level abstraction. On the other end of the spectrum, you've got Cilk and UPC, efforts to make parallelism simpler, safer and usable. Again, though, how many here have even got these compilers, never mind written anything in them?
For highly protected work, Occam-Pi is unbeatable. And almost unusable. Extraordinarily powerful, but extraordinarily formal. You could easily write an OS or virtual machine in it that exploited multicore, SMP and clustering transparently. You just couldn't easily get it to do anything else, like hot-swap resources, add memory, access the buses, support RDMA, exploit hardware...
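Occam-Pi's CSP model is roughly what Rust's channels give you: isolated processes that share nothing and communicate only over typed channels, which is what makes transparent scheduling across cores possible. A minimal sketch of that style (an analogy in Rust, not occam-pi itself):

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (tx, rx) = mpsc::channel();
    // A "process" that produces values; it owns no shared state,
    // so the runtime is free to place it on any core.
    let producer = thread::spawn(move || {
        for i in 0..5 {
            tx.send(i * i).unwrap();
        }
    });
    // The consuming "process" sees only what arrives on the channel.
    let total: i32 = rx.iter().sum();
    producer.join().unwrap();
    println!("{}", total); // 0 + 1 + 4 + 9 + 16 = 30
}
```

The point of the discipline is visible even here: with no shared memory, there is nothing for the two sides to corrupt, which is exactly why the formal style scales so cleanly and also why it struggles with messy hardware access.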
That's the rub. Most of what is needed in an OS is inherently unsafe. It's why there's so much interest in splitting operating systems into unsafe parts (which often need to be fast and low-level) and safe parts (the stuff that does all the managing and abstraction). So long as the unsafe parts are well-behaved when given valid data AND the safe parts provably hand down only valid data (they don't have to be provably correct), the system is guaranteed to be stable.
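That split is exactly what Rust's safe/unsafe boundary expresses. A minimal sketch (my illustration, not any particular OS): the unsafe core trusts its caller completely, and the safe layer validates everything before handing it down, so the pair is stable even though the core alone is not.

```rust
/// Unsafe core: reads a byte with no bounds check.
/// Caller must guarantee `off < buf.len()`.
unsafe fn read_unchecked(buf: &[u8], off: usize) -> u8 {
    *buf.get_unchecked(off)
}

/// Safe layer: provably passes only valid offsets to the core.
fn read(buf: &[u8], off: usize) -> Option<u8> {
    if off < buf.len() {
        // Safety: bounds were checked just above.
        Some(unsafe { read_unchecked(buf, off) })
    } else {
        None
    }
}

fn main() {
    let buf = [10u8, 20, 30];
    assert_eq!(read(&buf, 1), Some(20));
    assert_eq!(read(&buf, 9), None); // rejected before reaching the core
    println!("ok");
}
```

Note that `read` doesn't need to be proven correct in any deep sense; it only needs to be proven to never feed the core an invalid offset.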
You ideally want to split these up further. The safe part should consult an independent security kernel that handles all the access control, for example. The security kernel should be provably correct, which is a very different constraint from that imposed on the other safe sections. Some sections of code should be able to self-replicate or migrate, to take advantage of resources rather than create bottlenecks. That would require greater emphasis on abstraction and adaptability, rather than validity or correctness.
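The value of keeping the security kernel independent is that it stays small enough to verify: safe code never decides access itself, it only asks. A hypothetical sketch (all names here are invented for illustration):

```rust
#[derive(PartialEq)]
enum Op { Read, Write }

/// The interface the rest of the system sees: one question, one answer.
/// Keeping it this narrow is what makes proving it correct plausible.
trait SecurityKernel {
    fn permits(&self, subject: u32, object: u32, op: Op) -> bool;
}

/// A deliberately tiny policy: subject 0 may do anything;
/// everyone else may only read object 0.
struct TinyPolicy;

impl SecurityKernel for TinyPolicy {
    fn permits(&self, subject: u32, object: u32, op: Op) -> bool {
        subject == 0 || (object == 0 && op == Op::Read)
    }
}

fn main() {
    let k = TinyPolicy;
    assert!(k.permits(0, 5, Op::Write));
    assert!(k.permits(3, 0, Op::Read));
    assert!(!k.permits(3, 0, Op::Write)); // denied by policy
    println!("ok");
}
```

Because the policy object is a few lines with no side effects, it can be held to the "provably correct" standard without dragging the rest of the safe code along with it.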
No single language can handle this level of versatility. Every language obtains its specific characteristics through its constraints and freedoms. This means you need superior linkage between languages, plus optimization that accounts for two things: different paradigms are used to solve different problems, and there is insufficient data to optimize at compile time, so it has to be done at link time.