I have -7.0 diopters of myopia, you insensitive clod!
Based on personal experience.
That's a myth you're misleading people with there.
The important part is not the physics, fundamentally this is a statistics problem across some population. "heavier cars are safer than lighter cars in equal-mass collisions"...right, but that also means the heavier your car, the fewer cars you'll encounter on the road that are heavier than you are. The person in a 90th percentile weight vehicle drives in a world where they are on the better side of a head-on collision 90% of the time. And because of that, you can't transplant cars from a vastly different weight distribution population and expect the same safety results for them.
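The percentile argument can be sketched with a toy simulation. The weight distributions below are made up purely for illustration, not real fleet data:

```python
import random

random.seed(42)

# Hypothetical fleet weight distributions in kg -- illustrative numbers only.
us_fleet = [random.gauss(1850, 300) for _ in range(100_000)]
eu_fleet = [random.gauss(1400, 250) for _ in range(100_000)]

def odds_of_outweighing(my_weight, fleet):
    """Fraction of random head-on encounters where our car is the heavier one."""
    return sum(my_weight > w for w in fleet) / len(fleet)

# A 90th-percentile car outweighs ~90% of its own fleet, by construction...
us_90th = sorted(us_fleet)[int(0.9 * len(us_fleet))]
print(odds_of_outweighing(us_90th, us_fleet))  # ~0.90

# ...but transplant a median car from the lighter fleet into the heavier
# one and it is on the losing side of most encounters.
eu_median = sorted(eu_fleet)[len(eu_fleet) // 2]
print(odds_of_outweighing(eu_median, us_fleet))  # well below 0.5
```

Same car, different population, very different odds; that's the whole point about not transplanting safety results.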
The Autobahn does put speed limits on larger vehicles like buses and trucks, to try and limit the worst of the high mass + high velocity combinations possible. That's far easier to do than something like parallel roadways.
It's also worth noting that the states along the specific chunk of US highway I referenced (I-95) have roughly the same car fatality rate as Germany. There's a handy chart comparing Autobahn safety that breaks things down per-state in the US. The best US entries on that list overlap heavily with the busy parts of I-95. Delaware, Maryland, Maine, Massachusetts, New York, Virginia, New Jersey, New Hampshire, those are all states where I-95 is the primary north/south motorway. Those also happen to be some of the richest states in the country, meaning people are buying higher quality cars too--which may also be the case for typical Autobahn traffic. There are a lot of things that correlate with highway safety in some way.
I don't have an agenda, I just completely goofed when selecting a source to support what was supposed to be a factual observation. See my better comment for the argument I should have made the first time. Thanks for calling me out.
Lighter cars are typically safer than heavier cars (as is indicated by your link).
I screwed up with that source and deserved the moderation down, but this isn't true either. Heavier cars are safer for the person driving them. The direction US cars have gone is based on things like this 1997 weight study, where the conclusion was that passenger cars would be better with an extra 100 pounds.
However, having a fleet of heavy cars around is more dangerous for the average person, which is what the EU statistics show, and that study points it out too. At the same time as showing cars would be better if heavier, the study also shows making light trucks lighter would be good. The important point in their words, and I'll quote it directly because it's the most important thing here: "When trucks are reduced in weight and size, they become less crashworthy for their own occupants, but they become less capable of damaging other vehicles."
If everyone has a light car, the average accident isn't as bad as two heavy cars colliding. That's Europe right now. On American roads the average car is heavier, but you're also in a heavy car yourself. Worse overall, but it's not as bad if you are in one of the heavy cars! The really bad case is when you're driving a light car and you hit a heavy one. That's what I was describing with the EU car on I-95 example. The end result is a sort of arms race in American car design. Everyone has a personal incentive to drive something heavier for their own safety, but everyone would be safer if, collectively, we didn't do that.
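The arms-race incentive has the structure of a classic prisoner's dilemma. A toy payoff table with invented risk numbers shows why "heavy" wins individually while losing collectively:

```python
# Stylized expected-injury-risk table (lower is better for "you") given your
# car choice and everyone else's. The numbers are made up to show the shape
# of the incentive, not real crash statistics.
risk = {
    # (your_car, their_car): your_risk
    ("light", "light"): 2,   # everyone light: mild collisions all around
    ("light", "heavy"): 9,   # the worst seat in the house
    ("heavy", "light"): 1,
    ("heavy", "heavy"): 4,   # everyone heavy: worse than everyone light
}

for others in ("light", "heavy"):
    best = min(("light", "heavy"), key=lambda mine: risk[(mine, others)])
    print(f"If others drive {others} cars, your best choice is {best}")
# "heavy" is the individually rational pick either way, yet
# risk[("heavy", "heavy")] > risk[("light", "light")] -- the arms race.
```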
Another reason the busy American highways are dangerous is all of the trucking used to move things around. My personal distaste for being in a light car here in the US comes from watching a few car -> tractor-trailer accidents back when I used to drive quite a lot here. Whenever I'm in something like a London taxi, worrying about a collision with a truck in that tiny vehicle makes me crazy. I have to remind myself that the road isn't filled with those big trucks though, and overall that's an improvement.
There is only one solution to the problem of how to get devices that last longer: you make them have longer warranties, so manufacturers have an incentive to make cost/longevity trade-offs on the lifetime side. That will drive up prices on everything. People would need to think of cost in terms of $/year assuming the lifetime is at least the warranty, to get a price metric that drops when quality improves.
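A quick sketch of the $/year metric, using hypothetical prices and warranty terms:

```python
def cost_per_year(price, warranty_years):
    """Price metric that rewards longer-lived devices, assuming the
    device lasts at least as long as its warranty."""
    return price / warranty_years

# Hypothetical appliances: the pricier one is cheaper per covered year.
print(cost_per_year(500, 2))  # 250.0 $/yr
print(cost_per_year(900, 6))  # 150.0 $/yr
```

On this metric the more expensive, longer-warranted device is the better buy, which is the incentive flip the paragraph describes.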
Your run at finding easier answers has two major issues. First you're assuming that manufacturers know, in advance, which parts will wear out fast and which won't. The way things fail in the field is unpredictable. The last thing I bothered to repair was a TV that failed due to the capacitor plague. Quoth Wikipedia: "these capacitors should have a life expectancy of about 18 years of continuous operation; a failure after 1.5 to 2 years is very premature".
The idea that this could have been prevented by buying higher quality parts is not well founded. They already bought capacitors that were overbuilt by at least a 6X factor over their warranty period. But shit happens. You cannot overbuild to where shit doesn't happen. That's the road to the crazy town that's given us things like super-expensive "mil-spec" parts. And assemblies of things made from that quality level of part still fail early anyway; see "shit happens", again. Also, device failures are dictated by the first failing component. There's no sense overbuilding plastic parts into metal if the lifetime is normally dictated by a motor.
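Back-of-envelope on where that "6X" comes from. I'm assuming a 3-year warranty term here; the actual term isn't stated above:

```python
rated_life_years = 18        # capacitor life expectancy from the Wikipedia quote
assumed_warranty_years = 3   # my assumption, for illustration
overbuild = rated_life_years / assumed_warranty_years
print(overbuild)  # 6.0 -- and the parts still failed inside 2 years
```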
Second major flaw: designing for maintenance and repair is way more expensive than you give it credit for, and it's not clear it's even productive. Splitting a design into usefully modular components makes things more expensive, and while repairs are easier the failure rate goes up in the process. The way you've connected the modules becomes a whole new failure mode. Take a washing machine that was reliable as a single mechanism, split it into easy to repair modules, and the new type of failure you'll see in the field are modules that vibrate out of their module interface over time. There's a reason we've moved toward giant monolithic designs: they're simply more reliable than modular ones, on top of being cheaper to build and design too. People don't really like less reliable but easier to repair, and in a high labor cost world that's a correct preference.
You can have clean, renewable electricity, or you can have a ton of electricity. The reason we keep burning so much fuel is that you can't have both at once.
He suggested 32 MPH is good for a 10-year-old car that's built to the safety standards in America. US cars from 2009 are a lot better too.
And the main reason European cars get better mileage is that they're smaller and lighter. We drive serious distances here in the US, and if our cars were as light as European ones, our fatal crash statistics would suffer enormously. I would not want to be driving the style of car that gets better mileage in the EU, because they're smaller and lighter, into a car accident on a big American road like I-95.
Visit List of countries by traffic-related death rate and sort by "Road fatalities per 100 000 motor vehicles" if you want some hard numbers on it. The highest entries are Malta, Norway, Iceland, Sweden, Denmark, Chile, Spain, Switzerland, UK, Finland, Ireland, Germany, and the Netherlands. Notice a pattern? That's the trade-off when everyone drives around tiny cars. The EU Econobox is a deathtrap by American standards.
Sounds about right. I played enough tournament games to estimate I was about a 1450 player at my best, and playing Sargon II on the Apple was a pretty evenly matched game. The key to beating early chess games like that, and this is still useful for any small memory chess opponent, is to play something weird. You need to get the computer out of its opening book library as soon as possible, without making an overtly bad move. Moving a pawn a single space forward where most players would take advantage of being able to move it forward two can be enough to break you out of a small book. You could easily tell when Sargon went "off book" because the time it spent thinking about moves went up dramatically, especially on its highest difficulty setting.
I learned some ideas like this from David Levy's excellent 1983 book Computer Gamesmanship. With Sargon, I recall I would do somewhere around 5 moves from the standard opening library before inserting one aimed to go off-book. The first few moves in a chess game tend to be very similar because they work. You don't want to yield control of the middle of the board in favor of breaking out of the book on your first move; that's counterproductive.
En passant moves are Chess's Easter Egg. "Dude, check it out...if you move the pawn like this, the game just silently takes it!"
In grade-school chess it's the 9 year old girls you have to watch out for.
There are two main types of chess games. In one, someone manages to checkmate while there are still a lot of pieces on the board. You seem to only be familiar with this type of game. It's possible to prioritize for that over holding onto pieces, with strategies like "gambits" taking that idea back to the opening move.
But when both players are good enough that this doesn't happen, you get a drawn out type of game where very subtle position advantages allow picking off pawns, or exchanging a better piece for a worse one. Eventually those swaps knock out most of the pieces on the board, and then the person with an advantage in "material"--the pieces they still have--will normally win. One of the things you need to learn as a competitive chess player is how to checkmate when you only have a small advantage like that. Can you win a game where you have a king and a bishop left vs. just a king? There's a whole body of research on pawnless chess endings that to this day still hasn't covered every possibility.
So how do you tell which type of game you're playing? That's the trick--you can't until it's over. If you goof on a risky push to checkmate and it fails, you can easily end up down in material and then playing the other type of game at a disadvantage. That's where people who are good at tactics instead of memorization can really shine--no one memorizes optimal play when you're already down a piece or two. The entire risk-reward evaluation changes when you're in a position where you must do something risky to win, because being conservative will eventually result in you losing to the person with more pieces.
And if you think there are so few combinations here that it's possible for the person who memorizes more to always win, you really need to revisit just who has the "small mind" here because you don't understand Chess at all. Go is really the simpler game here because it only has the long-term strategy to worry about. Chess players have to worry about a long-term game of position and material trade-offs, but at the same time you have to guard against short-term win approaches too. Your long-term game is worthless if you get nailed by a Fool's Mate.
At the vaccination research lab funded by Jenny McCarthy, all of the workers who were treated for exposure are now autistic.
Now that the requirement for Dual_EC_DRBG has been dropped from NIST's checklist, it would be possible to have LibreSSL meet FIPS requirements without having the troublesome component. Most of FIPS certification is about throwing money at testing vendors, as described by OpenSSL themselves. Doing that would really be incompatible with the crusade LibreSSL is on though, because the result is believed by some to be less secure than using a library that isn't bound to the FIPS process. I don't see those developers ever accepting a process that prioritizes code stability over security.