If there were a wall across the road, Autopilot would have seen it. Though avoiding a wall that suddenly appeared across a highway might still be problematic.
In this case, though, it wasn't a brick wall; it was a truck with a raised body. That means Autopilot saw clear road ahead (under the body of the truck) with a large flat object above it, like a sign over a highway. Incorrect in this case, but since human drivers make the same mistake routinely (truck underride crashes are common), it's not a trivial case. Should Autopilot be better than human drivers? Sure. But that takes lots of experience on the road and tuning of the software. So, "silver lining": this accident will make future Autopilot versions safer.
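To make the failure mode concrete, here is a deliberately oversimplified, hypothetical sketch of the kind of overhead-object filtering described above. This is not Tesla's actual code; the threshold value and function are invented for illustration. The idea is that a perception stack must discard high objects (signs, bridges) as passable, and a trailer body sitting above that clearance threshold would be filtered out the same way:

```python
# Hypothetical sketch only -- not Tesla's implementation.
# Assumption: detections whose bottom edge sits above some clearance
# threshold are classified as overhead structures (signs, bridges).
CLEARANCE_THRESHOLD_M = 1.5  # invented value for illustration

def is_drivable(obstacle_bottom_height_m: float) -> bool:
    """Return True if the space under the detected object looks passable."""
    return obstacle_bottom_height_m > CLEARANCE_THRESHOLD_M

# An overhead sign at 5 m clearance: correctly treated as passable.
print(is_drivable(5.0))
# A low trailer body at 1.2 m: correctly flagged as an obstacle.
print(is_drivable(1.2))
# But a raised trailer floor just above the threshold is misclassified
# as clear road -- the exact failure mode described above.
print(is_drivable(1.6))
```

The hard part, of course, is not this comparison but choosing and refining the classification logic from real-world data, which is why road experience matters.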
I agree that people can be stupid, and that the software should be improved. Legally, though, Tesla's situation looks pretty clean: pilots have long flown airplanes with autopilots that do essentially what Tesla's Autopilot does, and Tesla repeatedly informs drivers that they need to stay alert and ready to take over, just like airline pilots. The legal/regulatory situation will get more complex once cars are fully autonomous rather than semi-autonomous. Until then, drivers are responsible for driving their cars safely, and it's mostly a matter of education, so that people learn to use the various safety mechanisms appropriately. If someone intentionally drives into a wall, they can't sue because the anti-collision braking didn't stop them.