I would be a little surprised if the engine-monitoring and satellite-link circuitry were on battery backup, since it is unlikely engines and passengers would have much use for a satellite link after the plane hits water. For the satellite link to work, the antenna would also need to remain above water, since submersion adds horrible attenuation to radio signals. Additionally, cabin electronics aren't water-tight, so submersion in ocean water would ruin them in fairly short order.
It probably does just before air breaks down and the arc strikes. Bright flashing lights with loud bangs probably scare animals a whole lot more than a dim purple/violet haze and a soft steady hum.
And light years ahead of Windows 3.0's or OS/2's grid of icons on the desktop... or the grid of icons in graphical DOS-based programs... or the grids of text-based buttons in text-mode programs.
Yeah, re-inventing the grid is so patent-worthy.
There already is protection against "wholesale copying" of a product's look and feel; it is called a trademark.
The stupid USPTO rules are letting companies patent trademark elements, which is beyond dumb.
The most puzzling thing is how those patents ever slipped through in the first place. With the amount of tablet and touch-screen technology shown in Star Trek: TNG/DS9/Voyager/Enterprise, I'm having a hard time imagining how most of them were ever granted - most of the cosmetic ideas precede the first iPod by a decade.
Yeah, Canada is about 100C too warm.
Everyone who drives already has to buy liability insurance.
Some places have multiple layers of insurance drivers must sign up for. Where I live, basic collective insurance is built into the license and plate fees, and drivers are required to buy additional vehicle liability insurance for a minimum of $1M on top of that. The mandatory collective regime covers the balance of claims that exceed the mandatory private minimum, as well as cases where there is no insurer to file claims against, such as hit-and-run drivers who don't get caught or people who drive illegally without vehicle insurance.
While I agree that taxis do feel a lot like a cartel in many ways, I can think of at least two legitimate reasons to treat them differently:
- in places with a mandatory public insurance regime built into license and plate fees, the higher taxi fees are there to cover the higher risk exposure associated with those drivers and vehicles
- a licensed taxi is (hopefully) more trustworthy and reliable than a random person - at the very least, you can file complaints against the driver, and if those get taken seriously by the powers that be, too many or too-severe complaints will get the bad taxi's license reviewed and possibly revoked
Which is more efficient?
1- a GPU architecture that can accept API calls almost straight-through
2- a GPU that requires middleware to re-arrange code and data going through the API
Can you honestly tell me AMD did not coordinate Mantle and GCN design efforts to provide close to the thinnest middleware layer possible between the API and GCN? Can you honestly tell me the middleware for other architectures won't be thicker to match API features GCN handles natively but other GPUs have no native direct equivalent for? Can you honestly tell me a thicker middleware is going to perform equally well?
I'm not saying there won't be any performance benefits for other vendors. Just that the decks are almost certainly stacked in AMD's favor to some potentially non-negligible extent. Unless Nvidia decides to adopt Mantle, we will never find out exactly how much that is.
That article merely says that GCN is not mandatory.
Just because you can beat the API into "working" on something else does not mean it will be as efficient. Mantle was designed around GCN so the API likely has tons of things that map almost exactly 1:1 with GCN hardware but not necessarily quite that neatly on anything else. That's where the extra middleware comes in.
If you read the conclusion of your cited article, they say exactly what I said, albeit in different words: Mantle's roots are likely too close to GCN to be truly portable between vendors (or AMD's own previous and possibly future post-GCN architectures) unless other vendors change their architectures to something more GCN-like.
AFAIK, AMD never answered when asked what will happen to Mantle beyond GCN - apart from saying GCN would be around for the foreseeable future.
Since AMD was very clear that Mantle only works with their GPUs based on the GCN architecture, it seems to imply that although the API might be portable to other GPUs, the amount of middleware necessary to bridge the gap between the API and other architectures may not be worth the trouble - at the very least, it was too much trouble for AMD to bother porting it to their own older GPUs. It certainly won't be worth it if DX12 delivers on most of those closer-to-the-metal promises.
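The thin-vs-thick middleware argument above can be sketched with a toy example. Everything here is hypothetical - the function names and command model are made up for illustration and bear no relation to real Mantle or driver code:

```python
# Toy model of API-to-hardware translation cost. Architecture A maps the
# API's commands 1:1 to native commands; architecture B has no direct
# equivalent and must decompose each call into several native operations.

def thin_translate(api_cmd):
    # 1:1 mapping: forward the command essentially unchanged
    return [("native", api_cmd)]

def thick_translate(api_cmd):
    # No native equivalent: the middleware re-arranges each call into
    # separate setup, binding and dispatch operations
    return [("setup", api_cmd), ("bind", api_cmd), ("dispatch", api_cmd)]

api_stream = ["draw_indexed"] * 1000
thin = [op for cmd in api_stream for op in thin_translate(cmd)]
thick = [op for cmd in api_stream for op in thick_translate(cmd)]
print(len(thin), len(thick))  # 1000 3000 -- the thick layer does 3x the work
```

The real cost difference would show up as extra CPU work and state tracking in the driver rather than a simple command count, but the shape of the argument is the same: the closer the API's command model is to the hardware's, the less the middleware has to do per call.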
Trying to set maximum speeds based on the 85th percentile sounds like a futile exercise: if that causes speeds to rise, the 85th percentile will likely rise the next time it gets re-evaluated, and so on until speeds are high enough that people do not dare go any faster. And no matter what the maximum speed is, people still need the common sense to adjust their speed to driving conditions - drivers who fail to slow down when entering fog or wet/snowy/icy roads are where monster pileups tend to start.
There is a road I used to drive on a regular basis where the speed limit is 70km/h but people often drive through at over 120km/h. With the number of blind corners and hills on that road, I would be nervous driving it at over 90km/h - I once totaled a car there when a traffic jam backed up all the way to one of those blind hills, and I was only going at the 70km/h limit. I would not want to find out how fast people would drive through there if the speed limit were raised to 100km/h, and I prefer not to think about how much nastier that crash would have been with twice the kinetic energy involved.
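The "twice the kinetic energy" figure checks out, since kinetic energy scales with the square of speed:

```python
# KE = (1/2) m v^2, so for the same car the energy ratio depends only on speed.
ratio = (100 / 70) ** 2  # a 100km/h crash vs the same crash at 70km/h
print(round(ratio, 2))   # 2.04 -- roughly double the energy
```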
That would not necessarily work: it would definitely fry the IO front-end but most of the NVRAM matrix would likely remain intact and recoverable by stripping the top encapsulation and top metal layers then scanning the NVRAM cells with a magnetic force microscope.
Also, if the device self-destructs through high voltage, someone who has already dissected one of these phones would know where the high-voltage components are, how they operate and how they are triggered, and would likely be able to come up with a way to prevent the high-voltage pulse from reaching the NVRAM chips - such as using a pneumatic framing nailer to destroy/short the high-voltage circuitry faster than the tamper sensors can trigger it.
So, even with physical destruction built-in, you would still need strong device-level encryption as a fail-safe.
The most beautiful thing about having the decryption key embedded in a secure microcontroller that manages the tamper-proofing sensors (itself embedded in the SoC running the rest of the device's functions) is that disabling the tamper-proofing is impossible without disabling the secure microcontroller - and disabling it, either physically or by cutting power, kills the decryption key just as tripping the tamper sensors would.
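A minimal sketch of that fail-safe property, with hypothetical class and method names - real secure elements enforce this in hardware, not in software like this:

```python
import os

class SecureMicro:
    """Models a tamper-monitoring microcontroller holding the only copy
    of the decryption key in volatile memory."""

    def __init__(self):
        self._key = os.urandom(32)  # the key exists nowhere else on the device
        self.powered = True

    def can_decrypt(self):
        return self.powered and self._key is not None

    def kill(self):
        # A tripped sensor, physical attack or power cut all land here:
        # the event that disables the controller also zeroizes the key.
        self._key = None
        self.powered = False

mcu = SecureMicro()
assert mcu.can_decrypt()
mcu.kill()                    # attacker cuts power to bypass the sensors...
assert not mcu.can_decrypt()  # ...and destroys the key in the same act
```

The point the paragraph makes falls out of the structure: there is no code path that disables tamper monitoring while leaving the key intact, because both live and die with the same controller.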
You might want to re-check your definition of reckless driving.
By most states' definition, a reckless driver must display *wanton* (violent, intentional and unprovoked) disregard for the public's safety and traffic rules. As long as your electronic-device-using driver sticks to his lane, maintains safe speed and distances, follows signs, etc. reasonably well, there is nothing reckless there. Distracted and potentially dangerous, sure. But not reckless.
Weaving through traffic, ignoring speed limits and road signs, tailgating, street racing, etc. are reckless driving - deliberately ignoring safety rules and putting others at significant risk.
BTW, which one do you think is the most dangerous driver on the road:
1- a lost driver getting flustered from having no clue where he is
2- a lost driver using a GPS to pull a map and directions
From my experience getting lost both with and without a GPS as backup, I would say having the GPS is safer: yes, the initial setup carries a potentially higher risk, but once that is done, it immediately removes the stress of having no clue where I am and allows me to get off the road much quicker than driving around flustered until I reach a road or highway I'm familiar with, continuing on my way from there and hopefully not getting lost on my next attempt.
Except many studies have shown that hands-free phone operation is just about as bad as hands-on.
Most of the distraction-based accidents are caused by people picking the wrong time to do something, even simple things like changing radio station, heating/AC settings or checking their speedometer.
Hands-free does not prevent people from letting themselves get distracted by, or otherwise focusing their attention on, the wrong things at the wrong time. Some people have suggested locking out non-essential controls while the vehicle is in motion so drivers have no choice but to focus on the road, but going to such an extreme would likely become a grievance for many people and cause its own set of problems, such as passengers being unable to access those controls either.
Ideally, people should be able to gauge circumstances and their own abilities to decide the most appropriate moments to do something safely but most people grossly over-estimate their abilities and the safety margins around them so we end up with stiff restrictions to eliminate most variables.