Hugh Pickens writes: "Live Science has an interesting story on the flood of information that will pour from the Large Hadron Collider (LHC), the world's next-generation particle accelerator, a 27-kilometer underground ring at the European Organization for Nuclear Research (CERN) near Geneva, Switzerland, starting in mid-2008. Detectors stationed around the LHC ring will produce roughly 15 million gigabytes (15 petabytes) of data every year, data that will be farmed out to computing centers worldwide. In the LHC computing model, data from the experiments flows through tiers: the Tier 0 center at CERN takes the data directly from the experiments, stores a copy, and sends it on to the Tier 1 sites. The Compact Muon Solenoid (CMS) experiment has seven Tier 1 sites in seven nations; each site partitions its portion of the data by the types of particles detected and sends these sub-samples on to one of the 30 CMS Tier 2 sites, where researchers and students finally get their hands on the data. "We are really good at moving data from Fermilab to our Tier 2 center," says physicist Ken Bloom of the University of Nebraska, where scientists have achieved the fastest rates of any Tier 1-to-Tier 2 connection worldwide. "We can manage a terabyte an hour easily, and a terabyte in half an hour is possible.""
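To put Bloom's figures in more familiar networking terms, a quick back-of-the-envelope conversion (an illustrative sketch, not from the article; it assumes decimal terabytes and counts only payload, ignoring protocol overhead) turns "a terabyte per hour" into a sustained line rate:

```python
def tbytes_to_gbps(terabytes, hours):
    """Convert a bulk transfer (TB moved over a given number of hours)
    into the sustained line rate in gigabits per second."""
    bits = terabytes * 1e12 * 8      # decimal TB -> bytes -> bits
    seconds = hours * 3600.0
    return bits / seconds / 1e9     # bits/s -> Gbit/s

# "a terabyte an hour easily"
print(round(tbytes_to_gbps(1, 1.0), 2))   # → 2.22
# "a terabyte in half an hour is possible"
print(round(tbytes_to_gbps(1, 0.5), 2))   # → 4.44
```

So the Nebraska link is sustaining on the order of 2 Gbit/s, with peaks around 4.4 Gbit/s, which is a substantial fraction of a 10-gigabit research network link.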