Get a grip. The article didn't come to the conclusion of your strawman at all. And your reactionary stereotyping of Americans isn't "helpful" either.
The article merely notes that Boeing and Airbus have different philosophies about keeping "humans in the loop," and suggests that this may be a reflection of cultural differences.
Obviously the companies do have different philosophies, but whether that's really a reflection of American vs. European culture or worldview is of course highly debatable. It probably shouldn't have been included in the article, but it is an interesting topic to debate.
But let's set that argument aside and get back to the real topic.
If we presume that the accident was due to faulty data from the pitot/static system, and the software had no way to compensate for this, then wouldn't Boeing's philosophy of allowing pilot override make sense? Obviously, it would in this scenario. But would such overrides ultimately result in more accidents than they prevent?
Personally, I take the Boeing side of the argument. Not because I'm an American, or because of anything to do with "individual freedom" or whatever, but simply because it makes sense to me.
Software has bugs. Hardware can fail. Sensor systems can fail, even highly redundant ones. I think it's dangerous to presume that the engineers who designed an aircraft's systems and software have imagined or anticipated every scenario, every failure mode, every situation. I like the idea of a pilot being able to 'stick & rudder' the aircraft when the computers and associated systems aren't working right.
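To put that last point in concrete terms, here's a toy sketch of the kind of fallback logic being argued for. It's purely illustrative, not real avionics code; the function names, the 10-knot tolerance, and the numbers are all made up. The idea is simply: cross-check the redundant airspeed sources, and the moment they disagree, stop second-guessing the pilot and pass the stick input straight through.

```python
# Toy illustration only -- not real avionics logic; all names/values are hypothetical.

def airspeed_sources_agree(readings, tolerance_kts=10.0):
    """True if all redundant airspeed readings fall within a small band of each other."""
    return max(readings) - min(readings) <= tolerance_kts

def elevator_command(pilot_input, computed_input, airspeed_readings):
    """Decide whose command goes to the control surface.

    Normal case: trust the flight computer's envelope-protected command.
    Degraded case (sensors disagree): hand authority back to the pilot,
    i.e. the 'stick & rudder' fallback argued for above.
    """
    if airspeed_sources_agree(airspeed_readings):
        return computed_input   # automation stays in the loop
    return pilot_input          # sensors suspect -> pilot has the final say

# Example: three pitot-derived airspeeds, one reading wildly low (say, an iced probe).
# The sources disagree, so the pilot's input (0.3) wins over the computer's (-0.1).
print(elevator_command(pilot_input=0.3,
                       computed_input=-0.1,
                       airspeed_readings=[255.0, 60.0, 250.0]))
```

The real engineering question, of course, is the one raised above: whether letting the human win in the degraded case prevents more accidents than it causes.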