
Submission + - U.S. says exascale unlikely before 2020-22 because of budget woes

dcblogs writes: The U.S. Dept. of Energy is now targeting 2020 to 2022 for an exascale system, two to four years later than earlier expectations. William Harrod, research division director for advanced scientific computing in the DOE Office of Science, previewed the planned Exascale Computing Initiative report at the SC12 supercomputing conference last week. "When we started this, [the timetable was] 2018; now it's become 2020 but really it is 2022," said Harrod. DOE will soon release its report on the Exascale Computing Initiative as part of an effort to get funding approved in the FY 2014 budget. But current fiscal problems in Congress, the so-called fiscal cliff in particular, make Harrod pessimistic about funding for next year. "To be honest, I would be somewhat doubtful of that at this point in time," he said. "The biggest problem is the budget. Until I have a budget, I really don't know what I'm doing." DOE has not said how much money it will need, but analysts say billions of dollars will be needed to develop an exascale system. A major research effort is required because of the power, memory, concurrency and resiliency challenges posed by exascale. Data transport may be the leading problem: in today's systems, data has to travel a long way, which uses up power. Datasets being generated are so large that "it's basically impractical to write the data out to disk and bring it all back in to analyze it," said Harrod. "We need systems that have large memory capacity. If we limit the memory capacity, we limit the ability to execute the applications as they need to be run."

Submission + - Climate change research gets petascale supercomputer

dcblogs writes: The National Center for Atmospheric Research (NCAR) has begun using a 1.5-petaflop IBM system called Yellowstone. For NCAR researchers it is an enormous leap in compute capability, roughly a 30-fold improvement over its existing 77-teraflop supercomputer. Yellowstone is a 1,500-teraflop system capable of 1.5 quadrillion calculations per second using 72,288 Intel Xeon cores. The supercomputer gives researchers new capabilities: they can run more experiments, with increased complexity and at higher resolution. The new system may be able to refine model resolution to as fine as 10 km (6.2 miles), giving scientists the ability to examine climate impacts in greater detail. Increased complexity lets researchers add more conditions to their models, such as the effect of methane gas released from thawing tundra on polar sea ice. NCAR believes it is the world's most powerful computer dedicated to the geosciences.
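The flop figures quoted in these two submissions can be sanity-checked with a bit of arithmetic. The sketch below (plain Python; the core count and flop ratings are taken from the summaries, everything else is illustrative) works out average per-core throughput and how far a 1.5-petaflop machine like Yellowstone sits from the exascale target discussed above:

```python
# Back-of-the-envelope arithmetic for the figures quoted in the submissions.
PETA = 1e15  # flops in a petaflop
EXA = 1e18   # flops in an exaflop

yellowstone_flops = 1.5 * PETA  # 1.5 petaflops = 1.5 quadrillion calc/sec
yellowstone_cores = 72_288      # Intel Xeon cores

# Average peak throughput per core, in gigaflops.
per_core_gflops = yellowstone_flops / yellowstone_cores / 1e9
print(f"per-core peak: {per_core_gflops:.2f} GFLOPS")  # roughly 20.75 GFLOPS

# How many Yellowstone-class machines would one exaflop represent?
exascale_gap = EXA / yellowstone_flops
print(f"Yellowstones per exaflop: {exascale_gap:.0f}")  # roughly 667
```

In other words, even a leading petascale system delivers only about 1/667th of an exaflop, which helps explain why DOE expects exascale to demand major research rather than simple scaling.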
