The key is not to use the fingerprint as a key for online authentication; we already have a technique for that, called cryptographic keys (either symmetric or asymmetric). Now, people are generally bad at remembering strong keys (and even worse at using them), so instead they use a trusted device (it used to be a desktop computer, but that day has passed; now it's a phone) to both store and use those keys. The user can then authenticate locally to their device using a weaker mechanism (traditionally a password). Apple has this right: the device is the only thing that should use the fingerprint to authenticate the user (local authentication is by its nature two-factor, since you need the device). There is no advantage, and there are clear disadvantages, to using fingerprints directly for online authentication (passwords too, as we have seen time and time again).
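To sketch what I mean (a toy only, with a symmetric key standing in for whatever keys a real protocol would use, and every name here made up for illustration): the device holds the key and answers a random challenge from the server, while the fingerprint's only job is unlocking the device locally.

```python
import hashlib
import hmac
import secrets

# Toy challenge-response sketch: the key never leaves the device;
# the device proves possession by answering a fresh random challenge.
device_key = secrets.token_bytes(32)   # provisioned onto the trusted device
server_copy = device_key               # symmetric case: server holds the same key

def device_respond(key: bytes, challenge: bytes) -> bytes:
    # The device computes an HMAC over the server's random challenge.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def server_verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)    # fresh nonce per login, never reused
response = device_respond(device_key, challenge)
assert server_verify(server_copy, challenge, response)
```

The fingerprint never appears anywhere in the exchange; it only gates the device's willingness to run `device_respond` at all.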
Instead of storing data in the box directly, where you rely on the media remaining viable for 25 years, you could always strongly encrypt the data you would like to logically store in the box, write down (or etch in stone, whatever) the decryption key, store that human-readable quantity of data in the box, and then maintain the ciphertext outside the box in a redundant, distributed fashion across multiple generations of media. Of course, I fail to see what advantage keeping the data secret over that period would bring, and it prevents transcoding to new file formats, so I'd just suggest keeping copies of the data as you would any data you want to have in 25 years (not locked in a box).
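The mechanics are trivial; here's a toy sketch of the split (this is an illustration, not archival-grade cryptography; a real scheme would use a vetted authenticated cipher, and the keystream construction here is just to keep the example self-contained):

```python
import hashlib
import secrets

# Toy sketch: a short random key goes into the box in human-readable
# form; the ciphertext lives outside the box, copied across media.
def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Stretch the short key into a keystream as long as the data.
    stream = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

document = b"Whatever you would have physically put in the capsule."
key = secrets.token_bytes(16)             # 32 hex chars: easy to etch in stone
ciphertext = keystream_xor(key, document) # maintained redundantly, outside the box

print("key for the box:", key.hex())
# 25 years later: retrieve a surviving copy of the ciphertext,
# open the box, and decrypt.
assert keystream_xor(key, ciphertext) == document
```

The point is the asymmetry of sizes: 16 bytes survive fine on stone, while the bulky ciphertext gets the normal copy-it-forward treatment.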
You see, physical objects are placed into a time capsule because they would normally deteriorate and not be archived properly if they weren't removed from the harsh existence of everyday life. Data doesn't work like that; neglect is its biggest problem, and hence a time capsule is not a good means of preserving data the way it is for preserving objects.
I'll miss Spring Scape; watching the frog and ladybug go through their day was great.
Additionally, I would highly recommend Leonard Susskind's Stanford continuing-education physics series (available on iTunes, YouTube, etc.), which is currently in the third quarter of its second run. The first quarter covers classical mechanics, the second quantum mechanics, and the third (ongoing) special relativity and classical field theory. The fourth, I believe, will cover general relativity, and the fifth will head into quantum field theory and the standard model.
If only we had some sort of theory that could explain this inexplicable change in weather patterns.
A remotely controlled armed weapon should use only a one-time pad for secure communications, as that is provably secure (or rather, as provably secure as putting a pilot in a plane, since ground crews could be subverted to steal the pad). The threat model is then reduced from an adversary controlling the aircraft to denial of service and other jamming techniques, which is much more acceptable (considering the plane could be designed to self-destruct if a watchdog signal is not received).
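The pad itself is about the simplest cryptosystem there is; here's a minimal sketch (names are illustrative, and this shows confidentiality only; a real link would also need authentication, e.g. a pad-derived MAC, so forged or replayed commands get rejected):

```python
import secrets

# One-time pad sketch: the pad is generated once, shared out of band
# with the aircraft by the ground crew, and each portion of it is
# used for exactly one message and then discarded, never reused.
def otp_xor(pad: bytes, message: bytes) -> bytes:
    assert len(pad) >= len(message), "never reuse or run past the pad"
    return bytes(p ^ m for p, m in zip(pad, message))

pad = secrets.token_bytes(64)               # provisioned before the mission
command = b"WAYPOINT 51.5N 0.1W"
ciphertext = otp_xor(pad, command)          # transmitted over the air
assert otp_xor(pad, ciphertext) == command  # aircraft decrypts with its copy
```

The provable-security claim rests entirely on the pad being truly random, as long as the traffic, and never reused; break any of those and you're back to ordinary breakable crypto.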
This is completely wrong.
We don't have to abandon paper money just because it is impossible to keep forgeries from being manufactured. The government just needs a private key with which to digitally sign each paper bill it produces (similar to the current serial numbers, but with the power of PKI), and then when you accept paper money as payment you use a computer to read the bill and verify that the digital signature is valid. This would solve the problem (with the added expense of verifying bills), but the government won't propose such a simple solution because they would rather force people off paper currency in order to track them better.
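The scheme is nothing exotic; here's a textbook-RSA toy with tiny, trivially breakable primes just to show the shape of it (a real mint would use proper key sizes and padding, and the serial here is made up):

```python
import hashlib

# Textbook-RSA toy: the mint signs each bill's serial number with the
# private exponent d; anyone with the public key (n, e) can verify.
p, q = 61, 53
n = p * q   # 3233, public modulus
e = 17      # public exponent, printed on every verifier
d = 2753    # private exponent, held only by the mint

def digest(serial: str) -> int:
    # Reduce the serial to an integer below the modulus.
    return int.from_bytes(hashlib.sha256(serial.encode()).digest(), "big") % n

def sign(serial: str) -> int:
    # Done once at the mint; the signature is printed on the bill.
    return pow(digest(serial), d, n)

def verify(serial: str, sig: int) -> bool:
    # Done at the point of sale with only the public key.
    return pow(sig, e, n) == digest(serial)

serial = "A1234567B"
sig = sign(serial)
assert verify(serial, sig)  # a bill with a mismatched signature would fail
```

Note this alone doesn't stop someone photocopying a valid bill, signature and all; it stops invented serials, and duplicate detection would have to catch the rest.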
Perhaps unfortunately, neither factoring nor discrete log is known to be NP-hard, yet (fortunately) polynomial-time algorithms have thus far eluded us, although BQP algorithms (Shor's algorithm) have been found. Of course, an NP-hard problem in BQP would be a major discovery. Simulation of quantum mechanical systems (e.g. protein folding) is also known to be in BQP, although no polynomial-time algorithm is known and it isn't known to be NP-hard. So while it's true that a great many interesting problems that apparently aren't in P but are in NP are NP-hard, the above are examples of important problems that aren't.
I would fully expect that verifying that a set of dynamical equations does indeed fit the experimental evidence is in P, so in this case (physics) the problem is NP-complete, certainly for classical mechanics. Verifying predictions in quantum mechanics may not be in P, but it is certainly in BQP.
In this day and age of virtualization, cloud deployments, and the like, the idea that moving servers offshore is equivalent to physically shipping boxes across the ocean seems absurd. You set up some new machines at the new location, sync the data across this thing called the Internet, flip a switch, and then wipe the old boxes and sell off the hardware (if you ever owned it to begin with).
The designer of the car broke the law: the vehicle is defective, it is breaking traffic laws, and it needs to be impounded and its builder fined for endangering the public.
When a computer is a box sitting on someone's desk, computing figures and showing lights on a display, there is no reason to restrict who can do what with the machine; it should be open to hacking and modification. When computers are connected to networks the burden goes up a bit, and maybe code has to be signed or restricted to a safe API on top of a trusted, locked OS (but probably not, in my opinion). But by the time the computer is connected to hardware fully capable of killing people, both inside and outside the vehicle, the game has changed: the system needs to be locked down so it can't be hacked, and the developers need to take responsibility for their actions. The owner of a car no longer has the right to hack the device just because they own it; at least, they can't then put it on public roads. Just as drivers must pass a test, the design of an autonomous vehicle must pass a (regulated) test to use our roads. This will probably mean leased vehicles owned by the builder, with per-mile, per-minute, or per-month fee structures to generate revenue that offsets settlements for accidents (which will still happen). The law should then limit the cost of a computer-caused accident to the same penalties a human driver would face for an unintentional accident under the same circumstances.
Rather than starting with the FFT (an O(n lg n) algorithm for computing the DFT, which is naively an O(n^2) operation), you should first understand how the basic DFT equation works; it is computed independently for each frequency. It just takes each of the n elements in your discrete signal, multiplies it by a (complex) sinusoid of that frequency, and sums the products. If the data correlates well with the sinusoid, these products will be large and of a consistent sign (+ for direct correlation, - for anti-correlation, small values when uncorrelated). Once you can see why the DFT works, it is just an algorithmic exercise to show that the FFT produces the same result in fewer computations.
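Here's that definition written directly as code, a naive O(n^2) DFT with no FFT cleverness, showing a pure cosine lighting up exactly its own frequency bin (the signal and sizes are just example choices):

```python
import cmath
import math

# Direct DFT straight from the definition, one independent sum per
# frequency bin k: X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N).
def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A cosine making 3 cycles across the window correlates only with bin 3
# (and its mirror, bin N-3); every other bin's products cancel to ~zero.
N = 16
signal = [math.cos(2 * math.pi * 3 * n / N) for n in range(N)]
spectrum = [abs(X) for X in dft(signal)]

peak = max(range(N // 2), key=lambda k: spectrum[k])
assert peak == 3                        # energy lands in the right bin
assert abs(spectrum[3] - N / 2) < 1e-9  # with magnitude N/2 for a unit cosine
```

Once this version makes sense, the FFT is purely a bookkeeping trick: it reuses the shared sub-sums between bins instead of recomputing them n times.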
The whole concept of price-capping these books at a low level, putting a textbook in the same price range as a novel (I don't believe fiction is price-capped, and apps certainly aren't), is insane and downright offensive. And the exclusivity requirements should simply be illegal.
So a satellite costing $2 billion to design, construct, and launch failed due to a small error. How much of that money was truly wasted? How much would it cost to construct a replacement using the same design? One would hope that the majority of the costs associated with this thing are design- and testing-related and would not be lost by the need to try again.