You'd be hard-pressed to find anyone who still has their original 1996 Hotmail account.
I know of two. And there's no convincing them to consider anything else.
Your eyes are far better at matching features between the two viewpoints to get the depth mapping correct. A standard camera only captures 24 bits of color per pixel; at that level you can derive somewhat of a depth map from stereo matching, but not a very good one.
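For what it's worth, the camera-pair depth mapping being discussed is just triangulation: depth is inversely proportional to the pixel disparity between the two matched views. A minimal sketch (the focal length, baseline, and disparity values below are made-up illustrations, not Waymo's or anyone's real numbers):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    # Larger disparity means the point is closer; zero disparity
    # would put it at infinity, so guard against it.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With a 700 px focal length and a 6 cm baseline (roughly eye spacing),
# a 42 px disparity puts the point 1 m away.
print(stereo_depth(700.0, 0.06, 42.0))  # → 1.0
```

The hard part in practice isn't this formula, it's reliably finding the matching pixel in the other image, which is exactly where limited color depth and sensor noise hurt.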
Waymo uses LIDAR, not visible-light cameras. It gets an extremely accurate depth map, far more accurate than any human could produce, because LIDAR measures the time it takes light to reach the "seen" object and bounce back to the receptor.
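The time-of-flight arithmetic behind that is simple: halve the round-trip time and multiply by the speed of light. A minimal sketch (the 100 ns figure is just an illustrative example):

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    # The pulse travels out and back, so the one-way distance
    # is half the round trip.
    return C_M_PER_S * round_trip_s / 2

# A return after 100 ns corresponds to a target roughly 15 m away.
print(tof_distance_m(100e-9))  # → ~14.99
```

Note the timescales involved: resolving depth to a few centimeters means timing light pulses to fractions of a nanosecond, which is why LIDAR receivers need such fast electronics.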
In a 3D mapped world, all the depth information is 100% accurate.
Which is only slightly better than LIDAR-derived depth information.
Maslow, what happened to your hierarchy?
So, when are they releasing you?
They invest the time and the learning to master a workflow. They expect a payoff from this investment in their ability to use these workflows to achieve other ends. When you mess with a workflow, you negate that investment. They have to spend time learning and mastering a workflow all over again before they can apply it toward their actual goals.
Nobody uses software "to be using software" or "for a good experience." They use it to get things done. If they have to spend two weeks mastering a new workflow then your improvements had better deliver a multiple of that value in return, or they're going to come back with "that's cool, but it would trip me up for all of my muscle and click memory to be invalidated."
People aren't averse to improvements. They're averse to evolutionary improvements that cost the user more in practice (time invested, mistakes made) than they deliver on the other end. "Small tweaks" often fall into this category. Some dev moves a button to a more "logical" placement, and for the next two weeks the users lose five or ten seconds every single time they need to use it, because their absent-minded clicking (absent-minded because they're focusing on what they're really trying to accomplish, not on 'using the software') keeps ending up in the wrong place vs. what they're accustomed to.
Dev says "BUT IT'S BETTER." User experience is actually that of being irritated and not getting things done as efficiently as usual, so their response is "IN PRACTICE, IN THE CURRENT CONTEXT OF MY LIFE, NO IT'S NOT."
So, not for moral reasons at all
They saw hacking as a "moral crusade", said Paul Hoare, senior manager at the NCA's cybercrime unit, who led the research. Others were motivated by a desire to tackle technical problems and prove themselves to friends.
I realize that reading the article is too much to ask, but reading the summary really isn't.
I think I speak for all of us here when I say: Duh?
I mean, I'm glad they've realized this, but rather disappointed they didn't figure it out, oh, 30 years ago, back when kids were hacking the phone system. I mean, even back then some of them "stole" quite a bit of value in the form of hours-long international telephone calls (which used to be really expensive, not like now), but clearly the monetary value was irrelevant, except perhaps as a way to keep score.
Some of those kids grow up and turn their skills to deliberate crime for profit, sure. But I think it's always been clear that basically none of them start that way. Honestly, I don't think it's even possible. There has to be an overpowering love of, and fascination with, the technology at the beginning, one that almost certainly overshadows any interest in material gain. Later, the glamor of the tech fades a bit, but that takes years.
And the Republicans insist climate change isn't real... well, maybe when half the red-leaning states are under water they'll open their eyes. It'll probably be way too late by that point, though.
I wouldn't count on that. A lot of red-leaning states are inland, while the coasts are 2/3 blue.
All the simple programs have been written.