An anonymous reader writes:
Google has a system for shipping terabytes of information around the world. The system builds on work started by Microsoft researcher Jim Gray, who delivered copies of the TerraServer mapping data to people around the world.
Google's open source team is working on ways to physically transfer huge data sets of up to 120 terabytes.
From the BBC article:
"We have started collecting these data sets and shipping them out to other scientists who want them," said Google's Chris DiBona.
The program is currently informal and not open to the general public. Google either approaches bodies that it knows have large data sets or is contacted by the scientists themselves.
One of the largest data sets copied and distributed so far came from the Hubble telescope: 120 terabytes in all.
"We have a number of machines about the size of brick blocks, filled with hard drives.
"We send them out to people who copy the data on them and ship them back to us. We dump them on to one of our data systems and ship it out to people."
Google keeps a copy, and the data is always in an open format, either in the public domain or covered by a Creative Commons license.