His music will be made available via an API powered by Musopen so anyone can come up with ways to explore and present Chopin's life."
Just to be clear: we at Tech News Today have posted a counter-notice, and YouTube requires our show to stay off YouTube for 10 days to give UMG the opportunity to decide whether or not to take us to court. We also did not submit this story to Slashdot.
At BGI they have 180 machines; each run from a machine produces approximately 3 TB of raw data, and a single run takes one week. That works out to roughly 77 TB of raw data per day. And that is only BGI... there are at least another 180 machines outside of BGI scattered across the world, so imagine over 150 TB per day.
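The throughput estimate above is simple arithmetic; here is a quick sketch (the machine count, run size, and run length are the figures quoted in the comment, not independent measurements):

```python
# Back-of-the-envelope sequencing throughput, using the figures quoted above.
MACHINES_AT_BGI = 180   # sequencers at BGI (quoted figure)
TB_PER_RUN = 3          # raw data per run per machine (quoted figure)
RUN_DAYS = 7            # one run takes a week (quoted figure)

bgi_tb_per_day = MACHINES_AT_BGI * TB_PER_RUN / RUN_DAYS
print(f"BGI alone: ~{bgi_tb_per_day:.0f} TB/day")      # ~77 TB/day

# The comment assumes at least as many machines again outside BGI:
world_tb_per_day = 2 * bgi_tb_per_day
print(f"Worldwide: ~{world_tb_per_day:.0f} TB/day")    # ~154 TB/day
```

Doubling the BGI rate gives about 154 TB/day, a bit above the round number the comment reaches.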
CERN is nothing compared to Genomic data.
When the LHC is running at full luminosity, it produces roughly a megabyte per event per detector (for CMS and ATLAS, at least). Of course, events happen at ~40 MHz, so there is on the order of 288 TB of raw data per hour. That's why they have to trigger, and hence throw out over 99% of the data.
Genomic data is nothing compared to elementary particles.
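Taking the figures from the two comments at face value, the comparison is easy to sketch (both rates are the numbers quoted above, not independent measurements):

```python
# Compare the two quoted data rates, in TB per day.
genomic_tb_per_day = 180 * 3 / 7 * 2  # ~154 TB/day for sequencers worldwide (quoted estimate)
lhc_raw_tb_per_day = 288 * 24         # from the quoted 288 TB/hour of raw LHC data

print(f"Genomics: ~{genomic_tb_per_day:.0f} TB/day")
print(f"LHC raw:  ~{lhc_raw_tb_per_day} TB/day")
print(f"LHC raw rate is ~{lhc_raw_tb_per_day / genomic_tb_per_day:.0f}x the genomic rate")
```

On these numbers the raw LHC rate is roughly 45 times the worldwide genomic rate, though the LHC figure is before triggering, i.e. before more than 99% of it is discarded.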
Good day to avoid cops. Crawl to work.