I have been reading through all these comments to date with amusement, but also with a little déjà vu.
You see, I know someone who got an iPad when they first came out, along with their iPhone, and they were telling me all about how wonderful FaceTime is, and how you can see and talk to people all over the world! The point is, they talked about these ideas as though they were something new, and more importantly, as though Apple gave them to the world and now we're all better for it. Hoorah!
So I politely explained that I'd been using microphones and talking with "team members" since the late 1990s, and even had webcams back in the days when you stuck them on top of your CRT monitor. Oh, and it was all free. A look of puzzlement and surprise crossed their face, because in their mind, and in that of so much of the public, Apple brings these things to the world, not just its own flavour of them.
My point here is that all the same promises are made by Java and C#: your code runs in a safe sandbox, and the language encourages better, more reliable code through well-tested frameworks. So I'm really not jumping up and down with excitement here. Those languages are still heavily in demand and have massive support. I realise we're talking about machine-level code here, which is fine, but you don't get the benefits of a sandbox without having a sandbox. A compiler can only do so much checking; most of it has to happen at runtime, as I'm sure you already know, and if your code isn't performing those checks explicitly, then you might want to think again.

I'm sure that with Java, as with C#, you don't have to use any proprietary frameworks, and your code can compile down to an executable (I don't mean MSIL/IL or Java byte code!). But then, who wants all the hassle of writing reusable objects that sit between OS-level objects and your code? Even calling system libraries doesn't always yield the results you're after, and you end up writing what I call code gymnastics to get at what you really want, in a form that's most easily consumable (tasty!). To that degree, you can see my point about needing well-tested frameworks to take the donkey work out of it. I believe this is why Java has been so ubiquitous: it successfully divorces the concerns of the operating system it's hosted on from the code you want to execute, so the write-once, run-anywhere approach seems tangibly real.

But as we now want to scale across an ever greater range of devices, there are concerns about display orientation, size, aspect ratios and so on, and so, alongside web development, we now have the likes of Xamarin and their Forms technology, with claims of 90% code reusability and the rest.
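To make the point about explicit checks concrete, here's a minimal sketch in plain C (the function and names are made up purely for illustration, not taken from any real library) of the kind of bounds check a managed runtime performs for you automatically, and which in C you have to remember to write yourself:

    #include <stdio.h>

    /* In Java or C#, reading past the end of an array throws an exception
     * at runtime. In C, nothing stops you unless you write the check. */
    int read_sample(const int *buffer, size_t length, size_t index)
    {
        if (buffer == NULL || index >= length) {
            /* The "sandbox" here is whatever we choose to do on failure. */
            fprintf(stderr, "read_sample: index %zu out of range (length %zu)\n",
                    index, length);
            return -1;
        }
        return buffer[index];
    }

    int main(void)
    {
        int samples[4] = { 10, 20, 30, 40 };
        printf("%d\n", read_sample(samples, 4, 2));  /* prints 30 */
        printf("%d\n", read_sample(samples, 4, 9));  /* caught, returns -1 */
        return 0;
    }

Nothing clever, but multiply that by every buffer, handle and index in an OS-sized code base and you can see why the sandbox pitch keeps getting recycled.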
I'm sure this has been a boring recap of the state of things, but I'm going to have to side with the C/C++ guys here, simply because many of the problems raised so far could be caught by unit tests (code allowing!), and so I think we could see more robust code. It's just that the code base of so many libraries is so old, but equally so well tested, that we're not likely to see a rewrite to enable unit testing in an effort to find bugs in code that has long since been accepted as fit for purpose. I seriously doubt anything is going to take the OS development crown from C and C++, simply because the alternative future only holds the nightmare of needing to know the flavour-of-the-month language as well as C/C++ in order to migrate or maintain that code base.
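For what it's worth, unit testing that sort of legacy C routine doesn't need a big framework; even a crude assert-based harness like the sketch below (clamp_percent is a made-up stand-in for whatever old routine you'd actually be testing) pins the accepted behaviour down:

    #include <assert.h>
    #include <stdio.h>

    /* Stand-in for some ancient, well-trusted library routine. */
    static int clamp_percent(int value)
    {
        if (value < 0)   return 0;
        if (value > 100) return 100;
        return value;
    }

    int main(void)
    {
        /* Each assert documents behaviour the rest of the code relies on. */
        assert(clamp_percent(-5)  == 0);
        assert(clamp_percent(42)  == 42);
        assert(clamp_percent(250) == 100);
        puts("all clamp_percent tests passed");
        return 0;
    }

The hard part isn't writing tests like this; it's that nobody wants to restructure thirty-year-old code just to make it testable.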
Don't get me wrong, I do believe there are more elegant languages that could be adopted for OS development, such as Java and C# in their raw forms, but if it hasn't happened for those languages, I can't see why Apple would succeed in this space, where the same frustrations surface from time to time and still the world goes on. Good luck with that one, Apple!