Apple II used 5.25" floppies. Mac is what forced 3.5" disks into the market.
Revealing which models of devices they bought doesn't reduce their security, unless they're using units with widely known security flaws that they leave open.
Either they're really, really stupid or they think we are. Perhaps both?
If they're basing this on owning rights to the name "Olympics" (which would be a trademark, not a copyright - names can't be copyrighted), this isn't going to work - owning rights to the name of a thing doesn't mean that you can prevent anyone from talking about your thing, just that nobody else can sell under it. Lawsuits like this fail often - confused people think that they can use IP law to do more than control copying and branding...
Why not let users send their position by hitting a button? I'm thinking it could insert your location as text into a message, or as speech into a phone call, when you hit a "send my location" button. Then it'd work not just magically with E911 services (which, of course, is a great thing) but also on normal phone calls (e.g. a kid calling Mom for help) or SMS (e.g. a kid texting Mom for help). The phone has the info, and it'd be easier to deploy, because it doesn't require integration with anything outside the phone.
Actually, collision data is quite clear - reducing collision speed saves lives because the fatality rate of the accidents drops. The reason is pretty simple - F=ma. The mass of a car, a person, etc., is constant, and the crush distance over which a collision stops you is roughly fixed, so the faster a car is going, the greater the deceleration during the collision (a = v^2 / 2d), and thus the greater the force acting on the driver and passengers. And enough force kills people.
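To make the speed-squared scaling concrete, here's a quick sketch of the F=ma argument. All the specific numbers (a 75 kg occupant, 0.5 m of crush distance) are illustrative assumptions, not real crash data:

```python
# Force on an occupant stopping over a fixed crush distance d:
# F = m*a, with a = v^2 / (2*d). Numbers here are made up for illustration.
def impact_force(speed_mph: float, mass_kg: float = 75.0, crush_m: float = 0.5) -> float:
    v = speed_mph * 0.44704          # mph -> m/s
    a = v ** 2 / (2 * crush_m)       # deceleration over the crush distance
    return mass_kg * a               # Newtons

for mph in (30, 50, 70):
    print(f"{mph} mph -> {impact_force(mph) / 1000:.1f} kN")
```

Since v is squared, the force grows with the square of the speed: a 50 mph crash hits the occupant with (50/30)^2 ≈ 2.8 times the force of a 30 mph crash over the same crush distance.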
There's also quite clear aggregate data that highway fatality rates dropped when speed limits were reduced, both nationally and at the state level. Of course, at various times seatbelts, airbags, etc., also helped...
The car requires you to manually confirm that you understand that you're responsible for being alert and ready to take over from Autopilot, every single time you enable it. And it's exactly the same as autopilot in airplanes, used by pilots for decades. In both cases, the Autopilot is a driver assist, automating a boring task, but ultimately the driver is responsible. Some people might not understand that (as evidenced by some really stupid YouTube videos, and your post) but Tesla is quite clear about what Autopilot means, communicates it consistently to every Tesla driver with Autopilot, and it's consistent with industry use of the term "autopilot".
In the case of this accident, as often happens in truck under-run collisions, the driver didn't see the truck. Happens many times a year without Autopilot. It's possible that the driver this time wasn't being alert, but truck under-runs kill alert drivers (without Autopilot) routinely, so it's not clear that you particularly need to find anyone to blame this time.
To put it in perspective, look at the numbers. Tesla is at 1 fatality in 130m miles driven, or about 0.77 per 100m miles. The US average is 1.2 fatalities per 100m miles driven. So while you can't prove anything with small sample sizes (wait for 1B miles driven with Autopilot), it certainly indicates that Autopilot is relatively safe.
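The arithmetic behind those rates, using the figures as quoted above:

```python
# 1 fatality in 130 million Autopilot miles, expressed per 100 million miles,
# against the quoted US average of 1.2 per 100 million miles.
autopilot_rate = 1 / 130e6 * 100e6   # fatalities per 100M miles, ~0.77
us_average = 1.2
print(f"Autopilot: {autopilot_rate:.2f} vs US average: {us_average:.2f}")
```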
Exactly - the car knows speed limits from the maps. The reason that they don't enforce speed limits (outside of residential areas) is that buyers of high-end sports cars don't want speed limits enforced.
If there were a wall across the road, Autopilot would have seen that. Though avoiding a wall that suddenly appears across a highway might be problematic.
In this case, though, it wasn't a brick wall, it was a truck with a raised body. Which means that Autopilot saw clear road ahead (under the body of the truck), with a large flat object above it, like a sign over a highway. Incorrect in this case, but since people make the same mistake routinely (truck under-runs are common) it's not a trivial case. Should Autopilot be better than human drivers? Sure. But that takes lots of experience on the road, tuning the software. So, "silver lining", this accident will make future Autopilot versions safer.
I agree that people can be stupid, and that the software should be improved. Legally, though, since pilots have been flying airplanes with autopilots that do the same thing Tesla's Autopilot does, and Tesla informs drivers repeatedly that they need to stay alert and ready to take over, just like airplane pilots, I suspect that Tesla's legal situation is pretty clean. The legal/regulatory situation will get more complex once cars are autonomous, rather than semi-autonomous. Until then, drivers are responsible for driving their cars safely, and it's more a matter of education, so that people learn to use the various safety mechanisms appropriately. If someone intentionally drove into a wall, they couldn't sue because the anti-collision braking didn't prevent them from doing so.
Might as well ask if they'd like to lose their pilot's license. They're required by law (and ethics) to always be prepared to take control away from the Autopilot, in a fraction of a second.
To support your post, Autopilot has already demonstrated that it's more situationally aware and always alert, and has faster reflexes than a human driver.
The guy who died in this accident had previously posted a video where Autopilot saved his life. A truck changed lanes into his car (presumably he was in the truck's "blind spot"). Autopilot saw the truck and got out of its way, avoiding the accident, in less than a second. A human driver would have been side-swiped by the truck before they could react.
Perhaps. But airplanes have been flying with autopilots for decades, and the legal situation is quite clear - the pilot is responsible for flying the plane, and the autopilot is just an assist that automates some of the boring stuff. The pilot is required to be alert and prepared to jump in and take over whenever needed. Exactly the same as Tesla's Autopilot - which is probably why they named it Autopilot: to remind people of that.
Airplanes have had Autopilots for decades, and the pilots are responsible for flying the airplane. And every time you turn on Tesla's Autopilot you have to manually confirm that you know that the driver has to remain alert and hands-on-wheel.
"We just don't have the kind of evolving programs that you think we do."
First, you misread the OP. He didn't say that AI would fix the problem, he said that the software would be improved as a result of this accident, making future cars safer. Which you agreed with.
Second, people have been doing ML with software tuning itself on production data for several decades now. More recently, perhaps you've heard of Google, Facebook and Amazon? Hand-coded rules don't scale and are nearly impossible to QA in complex situations, which is why people working with large volumes of data use a wider range of techniques, including machine learning / AI. In particular, computer vision often relies on tuned neural nets and the like, so in this case it's highly likely that Tesla will in fact be retraining their computer vision networks (etc.) to recognize trucks seen from the side as obstacles rather than as billboards over a highway.
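The retrain-on-production-data loop described above can be sketched in miniature. This is a hypothetical toy: the features, labels, and numbers are all made up, and a real system retrains deep vision networks, not a two-weight logistic regression - but the shape of the loop (fold newly labeled production examples into the training set, refit) is the same:

```python
import math

def train(data, epochs=1000, lr=0.5):
    """Fit a toy logistic-regression classifier with SGD."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x0, x1), y in data:
            p = 1 / (1 + math.exp(-(w[0] * x0 + w[1] * x1 + b)))
            g = p - y                      # gradient of the log loss
            w[0] -= lr * g * x0
            w[1] -= lr * g * x1
            b -= lr * g
    return w, b

def predict(model, x):
    (w, b), (x0, x1) = model, x
    return w[0] * x0 + w[1] * x1 + b > 0   # True = "obstacle"

# Made-up features: (apparent solidity, clearance under the object);
# label 1 = obstacle, 0 = overhead sign.
data = [((0.9, 0.1), 1), ((0.8, 0.2), 1), ((0.3, 0.9), 0), ((0.2, 0.8), 0)]
model = train(data)

# New production example: a truck seen side-on has lots of clearance
# underneath, but is still an obstacle. Fold it in and retrain.
data.append(((0.8, 0.6), 1))
model = train(data)
```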
Yes, the report is that the truck pulled square across a road with oncoming traffic (the Tesla). Presumably the truck didn't see the Tesla.
One correction - truck under-runs are common for drivers, and highly fatal. While you'd think a truck would be easy to see, in reality a light-colored truck pulled square across the road, against a bright sky, is surprisingly hard to see. In particular, keep in mind that you can see the road ahead under the truck's elevated body. So, in reality, not trivial to avoid.
There are countries that require trucks to have side guards and under-run bumpers, which make the truck more visible and make collisions with it less fatal.
As a data point, so far Autopilot has 1 fatality in 130m miles driven (as of a month ago, more now), which is about 0.77 fatalities per 100m miles. The US average is about 1.2 fatalities per 100m miles driven. The numbers are small, so they don't prove anything (wait for 1B miles driven to start drawing real conclusions), but it certainly suggests that Autopilot is relatively safe.
There is no royal road to geometry. -- Euclid