Dude I think he covered that: Moore Coors.
This seems a little bit more appropriate.
The real key here is that there is no advantage to the device at all.
The cryptographic protocol that the authors (all physicists) believe to be novel, but which every cryptographer is aware of, works like this:
1. The authors have a perfectly secure channel (separate from the one established in the protocol).
2. They exchange as much information over that channel as the device stores.
3. The channel established later can only carry that many bits.
For real excitement they XOR together their OTPs. Sorry, guys, but this is called a pre-shared key, and the crypto world is quite aware of it. Good luck with the window dressing getting you past the program committee of a physics venue.
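The three steps above amount to a one-time pad with a pre-shared key, which can be sketched in a few lines (all names and the 16-byte pad size here are illustrative, not from the paper):

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # XOR two equal-length byte strings together.
    return bytes(x ^ y for x, y in zip(a, b))

# Steps 1-2: exchange a pad over the "perfectly secure" channel,
# exactly as many bits as the device stores.
pad = os.urandom(16)

# Step 3: the channel established later can carry at most len(pad) bytes.
message = b"16-byte message!"
ciphertext = xor_bytes(message, pad)

# The receiver, holding the same pre-shared pad, recovers the message.
assert xor_bytes(ciphertext, pad) == message
```

Nothing about the device enters the picture: the security comes entirely from the pad exchanged in advance.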
Best slashvertisement. Ever.
Best editing of a summary. Ever.
Lowest point? We should be handing out awards for this shit.
Now that it has actually gone live and we can see what photos are being selected for relevance, there is a longer and more complex answer:
Organising your paragraphs around trolls - it is a brave strategy. I guess that you know some grammar nazi will bite on the first, and who could resist correcting the abortion of a description of a fusion bomb on a tech forum? Don't mind me, I'll just sit here and watch.
Do you know anything at all about the Blue Brain project?
Serious question: if you do not then there is a video floating around from ICC'11 with Henry Markram explaining an overview of the project. Given that they are building artificial simulations of biology specifically so that they can explore how they work, build hypotheses and then experimentally validate them it is somewhat hard to see how this approach can be described as cargo-cult AI.
Hopefully these guys can solve that problem.
Do you know why the target bandwidth for USR (15 Gb/s) is lower than the bandwidth for SR (28 Gb/s)?
It seems strange that they would not take advantage of the shorter distance to increase the transfer speed.
Obligatory grammar nitpick: surely you mean play too loose?
If bandwidth is finite, serializing downloads means one finishes first, and can be used while the others download.
No. If you run all of the downloads in parallel then one of them still finishes first and can be used while the others finish off.
Also, when the available bandwidth per-stream is lower than the available bandwidth per-link it is quicker to run the downloads in parallel. Lastly, when the total bandwidth across all the streams is still less than the link (which is frequently true) then the sequential time of each is unaffected by running them in parallel, but the total time is greatly reduced.
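The arithmetic behind both cases can be worked through with made-up numbers (the link speeds, caps, and file sizes below are purely illustrative):

```python
def total_times(link, cap, size, n):
    """Total seconds to fetch n files of `size` Mb over a `link` Mb/s
    connection, with each stream capped at `cap` Mb/s by the far end."""
    seq = n * size / min(link, cap)   # one after another, each alone on the link
    par = size / min(cap, link / n)   # all at once, sharing the link fairly
    return seq, par

# Each stream capped below the link (45 Mb/s streams on a 90 Mb/s link):
# parallel downloads use the spare link capacity and halve the total time.
print(total_times(link=90, cap=45, size=600, n=3))   # (40.0, 20.0)

# All streams together still fit in the link (3 x 20 <= 90 Mb/s): each
# parallel download takes the same time it would alone, but the three
# finish together instead of back to back.
print(total_times(link=90, cap=20, size=600, n=3))   # (90.0, 30.0)
```

In neither case does serializing finish the whole batch sooner; its only edge is that the first file arrives earlier, which the parent comment already concedes happens in parallel too.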
I think that you underestimate the value of soft modes of failure, particularly in maintaining quality standards. Examiners are human and it is much easier to say "not yet" than it is to say "and you're out of here".
So it really is like AIDS, Cancer or Death?
Thanks for the reply - that's a really interesting use for them.
What do you use it for? If you are plugging secure data into an untrusted box it seems that you have no defense against something on the box simply reading all of the data. For example if Spotlight indexes the drive then it has leaked data immediately.