I wonder if, in time, we will see a regression back to city-states once urban populations get big enough. Tokyo is basically its own country, and the same goes for SF, LA, and NYC.
I believe the limiting factors on country size are 1) communication ability, and 2) transportation (force projection) ability.
Roads were a major factor in the size of the Roman Empire, for example. City-states were common when there was no regional force large enough to conquer the city. City-states also needed to maintain the farmland surrounding them so they could remain fed.
There is no accountability.
That's not important. Really.
Bridges aren't designed and tested by "trial & error"--if they were then half of them would fall down within a few weeks. Neither are buildings or pacemakers or computer chips.
Should we also assume that rockets are programmed with the same careful methods you (conveniently) omitted?
Rockets are known to never fail, after all.
> but they already have a far better safety record than the average human driver.
I want you to realize that the only source we have for this is Google. It's not from a scientific journal, an independent research team, or an auditor; it's from the same people who want you to eventually buy their product.
Furthermore, I want you to realize that the Google team is very careful about what information they reveal. All the information they present is shaped to make them look good and to increase demand for the car. Now, maybe they've built the perfect driverless car, and it somehow hit the ground running with a near-perfect driving record, but the information they've given us isn't enough to determine that.
Does the sun go around the earth or does the earth go around the sun?
That's a tough question.
>liar and an attention seeker aka "every human being ever"
You just don't hear about the humans who aren't attention seekers, because... they aren't seeking attention. I think this is somewhat obvious?
As the years have gone by, the x86 decode overhead has been dwarfed by the overhead of other structures: functional units, reorder buffers, branch predictors, caches, etc. The years have been kind to x86, making the decode overhead look like noise in overall performance. It's just an extra stage in an already long pipeline.
And that long pipeline takes power to run (recently this argument comes up in discussions of mobile devices more than the server room, because battery life is hugely important, and ARM in the server room is still a joke). ARM chips sometimes don't even have a cache, let alone reorder buffers or branch prediction. When you remove all that stuff, the ISA becomes more important in terms of power consumption.
Of course, as someone else pointed out, they were comparing 90nm chips to 32nm and 45nm. Why that is a problem will be left as an exercise for the reader.
It's an old-school attitude to not touch things, from back in the day when software was so flaky that chances were someone had already 'exploited' the bug to do something non-malicious.
Ah, that actually makes sense, good analysis.
It's pretty obvious from the description what the bug is, so saying you aren't going to fix it is, as you say, pure laziness.
This sort of thing worries me about glibc, and the attitude that 'bugs are no big deal' is a dangerous one that is infecting software developers all over.
"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs