Tonight? Now? This has to be an Onion post.
Otherwise it will go down as the greatest non-Onion headline that should have been on The Onion.
"[Chris] Urmson says these sorts of questions might be unresolved simply because engineers haven’t yet gotten to them."
Chris Urmson is the "director of the Google car team." He's guessing as to why his own team hasn't solved these problems, or whether they've even attempted them? It's like Google isn't even really taking this seriously.
I have been saying this for years. We are nowhere near the human-level AI that would be required for an actual self-driving car; there are an infinite number of scenarios that are impossible for any modern computer to handle. And when we do achieve that true AI, in decades or maybe hundreds of years, a self-driving car will be one of the least important possibilities. I will absolutely stand by my prediction right up until I see a self-driving car handle real-world variable conditions. I admit I do not understand why Google is pretending they can achieve this; they, better than anyone, should know the impossibilities.
Yes, and it's indefensible. I'm sure we'd all love to have the world forget our past mistakes, but that would be nice only in a completely one-sided way: it fails to take into account anyone else affected by those mistakes, who will now lose the ability to search for and reference events that legitimately affected their lives.
Yes, very much so.
Also, anyone with the resources/money can still find any of this information. This "privacy through obscurity" will shift informational power back toward those with money, rather than the obviously preferable democratization of information the internet has enabled and encouraged.
There is no justification, and can be none, for removing indexes of factual reference. There may be debates about the pros and cons of "too much" information, but that Pandora's box is already open and we have to live with it.
Any given program will expand to fill available memory.