dcblogs writes: The U.S. Dept. of Energy is now targeting 2020 to 2022 for an exascale system, two to four years later than earlier expectations. William Harrod, research division director for advanced scientific computing in the DOE Office of Science, previewed the planned Exascale Computing Initiative report at the SC12 supercomputing conference last week. "When we started this, [the timetable was] 2018; now it's become 2020 but really it is 2022," said Harrod. DOE will soon release its report on the Exascale Computing Initiative as part of an effort to get funding approved in the FY 2014 budget. But the current fiscal problems in Congress, the so-called fiscal cliff in particular, make Harrod pessimistic about funding for next year. "To be honest, I would be somewhat doubtful of that at this point in time," he said. "The biggest problem is the budget. Until I have a budget, I really don't know what I'm doing." DOE has not said how much money it will need, but analysts say billions of dollars will be needed to develop an exascale system. A major research effort is needed because of the power, memory, concurrency, and resiliency challenges posed by exascale computing. Data transport may be the leading problem: in today's systems, data has to travel a long way, which consumes power. Datasets being generated are so large that "it's basically impractical to write the data out to disk and bring it all back in to analyze it," said Harrod. "We need systems that have large memory capacity. If we limit the memory capacity, we limit the ability to execute the applications as they need to be run," he said.