For copyright purposes, both physical and electronic mail is the property of the sender or, more precisely, the author.
Freefall, strictly speaking, means accelerating at 9.8 m/s/s, which after 228 seconds multiplies out to about 5,000 mph. That's an order of magnitude more than Baumgartner's speed. Wikipedia explains:
"The example of a falling skydiver who has not yet deployed a parachute is not considered free fall from a physics perspective, since they experience a drag force which equals their weight once they have achieved terminal velocity (see below). However, the term "free fall skydiving" is commonly used to describe this case in everyday speech, and in the skydiving community."
Still, terminal velocity for a human at sea level is about 120 mph, which is 4.5 times slower than the quoted 536 mph. Terminal velocity goes as one over the square root of air density, so squaring that speed ratio gives a density about 20 times less than at sea level, which translates to him hitting that speed at around 25,000 metres. Actually he had a pressure suit, which would probably slow him down, so it could have been higher than that.
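Here's a quick back-of-envelope of both numbers in Python. It's a sketch only: it assumes drag-free acceleration for the first figure and a simple exponential atmosphere with an 8.5 km scale height (my assumption, not a figure from the jump) for the second.

    import math

    G = 9.8            # m/s/s, gravitational acceleration
    MS_TO_MPH = 2.237  # metres/second to miles/hour

    # Drag-free freefall speed after 228 seconds.
    v = G * 228
    print(f"drag-free speed: {v * MS_TO_MPH:.0f} mph")   # ~5000 mph

    # Terminal velocity goes as 1/sqrt(air density), so the density where
    # 536 mph is terminal velocity is (536/120)^2, about 20x less than sea level.
    density_ratio = (536 / 120) ** 2

    # Altitude for that density, assuming an exponential atmosphere.
    SCALE_HEIGHT_KM = 8.5   # assumed scale height
    altitude = SCALE_HEIGHT_KM * math.log(density_ratio)
    print(f"density {density_ratio:.0f}x less -> altitude ~{altitude:.0f} km")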
Since we can predict the next (2013) close approach very accurately, we're very confident it will be a miss. Therefore that approach doesn't rate a mention in the table.
The trouble is that, while we know the 2013 approach distance will be greater than 0 km from the surface (more than 6,400 km from the centre), there's still some uncertainty. The earth is massive, and the close approach will cause a relatively large change in the orbit of DA14: gravity falls off with the square of the distance, so the closer the pass, the bigger the kick it delivers. Thus even a small uncertainty for 2013 results in a large uncertainty for subsequent approaches. Celestial billiards at work.
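To put rough numbers on it, here's a crude sketch using the standard impulse approximation for a gravitational flyby, delta-v ~ 2GM/(b*v). The nominal approach distance and encounter speed below are my guesses for illustration, not figures from the table, and the kick here isn't small compared to v, so treat it as indicative only:

    # Impulse approximation: velocity kick from a flyby at distance b, speed v.
    GM_EARTH = 3.986e14   # m^3/s^2, Earth's gravitational parameter
    V_FLYBY = 7.8e3       # m/s, assumed encounter speed

    def delta_v(b_metres):
        return 2 * GM_EARTH / (b_metres * V_FLYBY)

    # Nominal pass ~34,000 km from the centre, with +/- 1,000 km uncertainty.
    for b_km in (33_000, 34_000, 35_000):
        print(f"b = {b_km} km -> kick ~ {delta_v(b_km * 1e3):,.0f} m/s")

A 1,000 km error in the approach distance shifts the kick by roughly 90 m/s, and 90 m/s sustained over a year is nearly three million kilometres of drift, which is why predictions beyond 2013 are so fuzzy.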
In my view, an important property of any ballot is that the great majority of people must be able to understand the whole process. That's the only way for people to have confidence that there's a reasonable chance of detecting and preventing rigging. That requirement rules out pretty well any form of electronic voting. Internet security involves very serious maths that very few people can handle.
Around here we still write numbers in squares on pieces of paper and drop them in the ballot box. It works. The cost is tiny compared to the cost of government. I just can't see the advantages of more automation being worth the risk.
People might think it weird that an IT guy would have this Luddite view, but I think, on the contrary, I'm better placed than most to know what could go wrong.
The fun is in considering what recourse Symantec has. If they didn't have some really expensive penalty clause in the non-disclosure agreement that will have been involved here, they'll be kicking themselves right now. They'll also be wishing they'd given themselves some way to identify the source of the leak. Their smart move would have been to insert some minor changes, e.g. to indentation or comments, to make each version released to third parties unique and therefore traceable.
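For illustration, a toy version of that watermarking idea in Python. It encodes a hash of each recipient's name into harmless trailing whitespace, one bit per line; the recipient names and stand-in source are made up:

    import hashlib

    def watermark(source, recipient):
        # Derive 64 bits from the recipient's name.
        digest = hashlib.sha256(recipient.encode()).hexdigest()
        bits = format(int(digest[:16], 16), "064b")
        # Encode one bit per line: a '1' bit adds a harmless trailing space.
        return "\n".join(line + " " if bits[i % 64] == "1" else line
                         for i, line in enumerate(source.splitlines()))

    def trace(leaked, original, suspects):
        # The leaker is whichever recipient's watermarked copy matches the leak.
        return next((s for s in suspects if watermark(original, s) == leaked), None)

    src = "\n".join(f"line {i};" for i in range(64))   # stand-in for the source
    copy_a = watermark(src, "Partner A")
    print(trace(copy_a, src, ["Partner A", "Partner B"]))   # -> Partner A

Of course trailing spaces wouldn't survive a pretty-printer; a serious scheme would vary things like comment wording or code ordering instead.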
I propose that, for the people to trust their democracy, they must be able to understand all aspects of the voting system. This rules out pretty well all automated systems, especially computers with cryptography and hashes. Just go back to people writing on paper and ballot boxes.
Sure, counting the ballots by hand is expensive, but it's tiny compared to the cost of travel and time for the voters. The risk of serious, undetected fixing of results can't be eliminated with automated systems.
Yes, it happens all the time, and satellites get hit way less than the earth because, think about it, their surface area is *way* less. Sadly, hitting satellites will make the orbital debris problem worse, since every hit just makes more, smaller pieces. Even little pieces are a disaster for other satellites at 10 km/second, though they fall out of orbit faster.
Interestingly, the frequency of hits is inversely proportional to the mass (weight) of the object. Guessing this thing weighs about 100 tonnes (probably more) and one hits earth every two years (burning up in the atmosphere), that means a 1,000 tonne object will hit about every 20 years, a 10,000 tonne object every 200 years, etc.
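Taking that scaling at face value (the 100 tonnes every two years anchor is the guess above, not a measured rate):

    # Frequency inversely proportional to mass means the interval between
    # hits scales linearly with mass.
    REF_MASS, REF_INTERVAL = 100.0, 2.0   # tonnes, years (guessed anchor)

    def interval_years(mass_tonnes):
        return REF_INTERVAL * mass_tonnes / REF_MASS

    for m in (100, 1_000, 10_000):
        print(f"{m:>6} tonnes -> about every {interval_years(m):.0f} years")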
I have a simple commercial site that uses Google maps but is otherwise trivial. Google webmaster tools tells me my average page load time is 19.5 seconds, slower than 99% of sites. Guess how happy I'm going to be using Google maps if it causes my Google page rank to fall?
Personally I see much faster load times over a 1.5 Mbps link. Getting to 19.5 seconds implies the timings are coming from robots, or from customers with slow links or computers.
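The back-of-envelope supports that: at 1.5 Mbps, 19.5 seconds is enough time to move several megabytes, far more than a trivial page with one map on it should need.

    # How much data fits through a 1.5 Mbps link in 19.5 seconds?
    link_mbps, seconds = 1.5, 19.5
    print(f"~{link_mbps * seconds / 8:.1f} MB")   # ~3.7 MB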
"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs