It's not even the first time Windows has tried this, either. Microsoft ported Windows NT to several other CPU architectures, and the version for the DEC Alpha had x86 instruction set emulation built into it. Not sure how well it ran, but clearly it didn't convince anybody to move off the x86 instruction set.
Even if it worked perfectly, there is little to be gained from emulation for the majority of users. At best they get functioning but slow emulation. Windows software is so inextricably tied to x86 that, as a precursor, Microsoft really needs to change its toolchains to target something like LLVM instead. Let people build an architecture-neutral executable that is compiled to the host architecture on first invocation.
So it's understandable why they might be close to war with each other.
Since 80% of type 2 diabetes is caused by obesity, the simplest way to avoid it is not to get fat. It's mostly a self-inflicted disease. If someone can follow a calorie-restricted diet to put diabetes into remission, then maybe they have the willpower to not eat so much, exercise more, and not get into that situation in the first place.
In the UK very few companies offer health insurance as a perk because people pay for it out of their taxes. And since everyone pays into the system, and the system itself is not for-profit, the "premium" is far less per capita too.
Then the computer makes an ESTIMATE based on adding up those values, and your pizza resides in a particular state according to how far along the chain it is. Is it "smoke and mirrors"? No, because the estimate will usually be accurate assuming the system is working, traffic is normal, the driver doesn't get lost, etc.
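The mechanism being described is trivial: the ETA is just the sum of per-stage time estimates for whatever stages remain. A minimal sketch (the stage names and minute values here are made up for illustration):

```rust
// Hypothetical order-tracker ETA: sum the time estimates for the
// current stage and every stage after it. The stages and numbers
// are invented; a real system would feed in measured averages.
fn eta_minutes(stage_estimates: &[u32], current_stage: usize) -> u32 {
    stage_estimates[current_stage..].iter().sum()
}

fn main() {
    // Assumed stages: prep, bake, box, drive (minutes each).
    let stages = [5, 12, 3, 15];
    // The order is currently in the oven (stage index 1),
    // so the remaining time is 12 + 3 + 15 = 30 minutes.
    println!("{} minutes remaining", eta_minutes(&stages, 1));
}
```

The point is that there's no magic in it: the accuracy of the number depends entirely on how good the per-stage estimates are, which is exactly why the failure modes below don't invalidate the idea.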
And like any system it's only as good as its inputs. Maybe "Melinda" is some dude, or Melinda doesn't like the "app truther" creep who tracks her online and swapped deliveries, or maybe it's just whoever logged in that day. Maybe the driver did get lost. Maybe the pizza order got screwed up and so the tracking is out of whack with reality. Does that render the system worthless for the 99% of the time that it works as intended? Of course not.
All the domains where speed isn't the biggest deal and where reliability / uptime / portability / maintainability are more important. That's why languages like Java thrive there.
So where C/C++ tended to be all-encompassing, they're now relegated to performance-critical areas where until recently there wasn't much choice: kernels, embedded, system services, games. Places where performance and/or memory footprint are critical.
But even there choice is opening up. Rust in particular produces code that is, for all intents and purposes, as fast as C/C++ but which tends to be safer, more portable, and more reliable. If you prefer to trade off some speed for programming niceties then you can go for Swift or Go too.
If I were writing software from scratch these days I'd definitely consider other languages before C++. I might reject them for various reasons, but C++ and C would be at the bottom of the pile.
Personally I'm quite okay with ignoring games that pull this shit. Grind stinks, Skinner-box gambling stinks. But clearly this common sense hasn't permeated the mainstream consciousness, or 99% of mobile games wouldn't be this way. I expect EA knows it too.
Speed still remains a concern in games, video / audio capture, telemetry, databases, HMIs, industrial control, services / daemons, etc., the space that tends to be referred to as systems programming. C and C++ still dominate here, but I suspect Rust and Go will eat significantly into that.
And the general sentiment that C or C++ is good enough if you program them right flies in the face of reality. Yeah, they can be programmed right, but rarely are. Rust, for example, prevents whole classes of programming error from even happening.
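A concrete example of one such class: in C++, holding a pointer into a vector and then pushing to it can invalidate the pointer, which is silent undefined behavior. The equivalent mistake in Rust is a compile error, not a runtime crash. A minimal sketch (the names here are invented for illustration):

```rust
// Length of the first element, borrowing the slice immutably.
fn first_len(names: &[String]) -> usize {
    names[0].len()
}

fn main() {
    let mut names = vec![String::from("alice")];
    let first = &names[0]; // shared borrow of `names`
    // names.push(String::from("bob")); // compile error if uncommented:
    //   cannot borrow `names` as mutable while `first` is still in use
    println!("first name has {} chars", first.len());
    names.push(String::from("bob")); // fine: `first` is no longer used
    assert_eq!(names.len(), 2);
}
```

The commented-out line is the C++ iterator-invalidation bug; the borrow checker rejects it at compile time, which is what "shutting down a class of error" means in practice.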
Tom Bombadil will probably get one all to himself, gaily prancing around the forest and singing for 50 minutes.