His music will be made available via an API powered by Musopen, so anyone can come up with ways to explore and present Chopin's life."
At BGI they have 180 machines, and each run from a machine produces approximately 3 TB of raw data. A single run takes one week, so that is roughly 77 TB per day of data being produced. And that is only BGI; there are at least 180 more machines outside of BGI scattered across the world, so imagine roughly 154 TB per day.
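As a quick back-of-the-envelope check of those figures (a sketch, assuming 3 TB per run and one run per machine per week, as quoted above):

    # Rough sequencing-throughput arithmetic from the figures above.
    TB_PER_RUN = 3      # raw data per run
    RUN_DAYS = 7        # one run per machine per week

    def daily_output_tb(machines):
        """Raw data produced per day, in TB, for a given machine count."""
        return machines * TB_PER_RUN / RUN_DAYS

    print(daily_output_tb(180))   # BGI alone: ~77 TB/day
    print(daily_output_tb(360))   # BGI plus ~180 machines elsewhere: ~154 TB/day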
CERN is nothing compared to Genomic data.
When the LHC is running at full luminosity, it produces roughly a megabyte per event per detector (for CMS and ATLAS, at least). Of course, the events are happening at ~40 MHz, which works out to roughly 144 PB of raw data per hour per detector, or about 288 PB per hour for the two combined. That's why they have to trigger, and hence throw out well over 99% of the data.
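And the same sanity check for those rates (a sketch, assuming ~1 MB per event and a ~40 MHz event rate per detector, as stated above):

    # Rough LHC raw-data-rate arithmetic, assuming ~1 MB/event at ~40 MHz per detector.
    MB_PER_EVENT = 1
    EVENT_RATE_HZ = 40e6  # ~40 MHz bunch-crossing rate

    tb_per_second = MB_PER_EVENT * EVENT_RATE_HZ / 1e6        # ~40 TB/s per detector
    pb_per_hour = tb_per_second * 3600 / 1000                  # ~144 PB/hour per detector
    pb_per_hour_both = 2 * pb_per_hour                         # ~288 PB/hour for CMS + ATLAS
    print(pb_per_hour, pb_per_hour_both)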
Genomic data is nothing compared to elementary particles.
"If you own a machine, you are in turn owned by it, and spend your time serving it..." -- Marion Zimmer Bradley, _The Forbidden Tower_