Stats

Humans Dominating Poker Supercomputer 87

Posted by timothy
from the maybe-it's-a-ringer dept.
New submitter IoTdude writes: The Claudico supercomputer uses an algorithm designed to cope with the gargantuan number of possible decisions in Heads-Up No-Limit Texas Hold'em. Claudico also updates its strategy as it goes along, but its basic approach to the game involves getting into every hand by calling bets. And it's not working out so far. Halfway through the competition, the four human pros had a cumulative lead of 626,892 chips. Though much could change in the week remaining, a lead of around 600,000 chips is considered statistically significant.
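The significance claim lends itself to a back-of-envelope z-test. The hand count, blind sizes, and per-hand variance below are illustrative assumptions, not figures reported from the match:

```python
import math

# Illustrative assumptions (not from the match): an 80,000-hand match,
# blinds of 50/100 chips, and a per-hand standard deviation of roughly
# 15 big blinds (1,500 chips).
total_hands = 80_000
played = total_hands // 2          # halfway through the competition
per_hand_std = 1_500               # assumed chip swing per hand

lead = 626_892                     # cumulative human lead at the halfway point

# Standard error of a cumulative chip difference after `played` hands.
std_error = per_hand_std * math.sqrt(played)
z = lead / std_error

print(f"z = {z:.2f}")  # z = 2.09 under these assumptions
```

Under these assumptions the lead sits right around two standard errors, consistent with the blurb's "statistically significant" characterization; a noisier per-hand swing would push the z-score below the usual threshold.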
Supercomputing

Nuclear Fusion Simulator Among Software Picked For US's Summit Supercomputer 57

Posted by samzenpus
from the how-about-a-nice-game-of-chess dept.
An anonymous reader writes: Today, The Register has learned of 13 science projects approved by boffins at the US Department of Energy to run on the 300-petaFLOPS Summit. These software packages, selected for the Center for Accelerated Application Readiness (CAAR) program, will be ported to the massively parallel machine in the hope of making full use of the supercomputer's architecture. They range from astrophysics, biophysics, chemistry, and climate modeling to combustion engineering, materials science, nuclear physics, plasma physics, and seismology.
Intel

US Blocks Intel From Selling Xeon Chips To Chinese Supercomputer Projects 229

Posted by Soulskill
from the demands-recall-of-intel-inside-stickers-too dept.
itwbennett writes: U.S. government agencies have stopped Intel from selling microprocessors for China's supercomputers, apparently reflecting concern about their use in nuclear tests. In February, four supercomputing institutions in China were placed on a U.S. government list that effectively bans them from receiving certain U.S. exports. The institutions were involved in building Tianhe-2 and Tianhe-1A, both of which have allegedly been used for 'nuclear explosive activities,' according to a notice (PDF) posted by the U.S. Department of Commerce. Intel has been selling its Xeon chips to Chinese supercomputers for years, so the ban represents a blow to its business.
Intel

US Pens $200 Million Deal For Massive Nuclear Security-Focused Supercomputer 74

Posted by timothy
from the get-calculatin' dept.
An anonymous reader writes: For the first time in over twenty years of supercomputing history, a chipmaker [Intel] has been awarded the contract to build a leading-edge national computing resource. The machine, expected to reach a peak performance of 180 petaflops, will provide massive compute power to Argonne National Laboratory, which will receive the HPC gear in 2018. Supercomputer maker Cray, which has had a remarkable couple of years contract-wise in government and commercial spheres, will be the integrator and manufacturer of the "Aurora" super, a next-generation variant of its "Shasta" supercomputer line. The new $200 million system will be installed at Argonne's Leadership Computing Facility, rounding out a trio of systems aimed at bolstering nuclear security initiatives as well as pushing the performance of key technical computing applications valued by the Department of Energy and other agencies.
Supercomputing

NSF Commits $16M To Build Cloud-Based and Data-Intensive Supercomputers 29

Posted by Soulskill
from the petaflops-for-megabucks dept.
aarondubrow writes: As supercomputing becomes central to the work and progress of researchers in all fields, new kinds of computing resources and more inclusive modes of interaction are required. The National Science Foundation announced $16M in awards to support two new supercomputing acquisitions for the open science community. The systems — "Bridges" at the Pittsburgh Supercomputing Center and "Jetstream," co-located at the Indiana University Pervasive Technology Institute and The University of Texas at Austin's Texas Advanced Computing Center — respond to the needs of the scientific computing community for more high-end, large-scale computing resources while helping to create a more inclusive computing environment for science and engineering.

Reader 1sockchuck adds this article about why funding for the development of supercomputers is more important than ever: America's high-performance computing (HPC) community faces funding challenges and growing competition from China and other countries. At last week's SC14 conference, leading researchers focused on outlining the societal benefits of their work, and how it touches the daily lives of Americans. "When we talk at these conferences, we tend to talk to ourselves," said Wilf Pinfold, director of research and advanced technology development at Intel Federal. "We don't do a good job communicating the importance of what we do to a broader community." Why the focus on messaging? Funding for American supercomputing has been driven by the U.S. government, which is in a transition with implications for HPC funding. As ComputerWorld notes, climate change skeptic Ted Cruz is rumored to be in line to chair a Senate committee that oversees NASA and the NSF.
Supercomputing

Does Being First Still Matter In America? 247

Posted by timothy
from the by-jingo dept.
dcblogs writes: At the supercomputing conference, SC14, this week, a U.S. Dept. of Energy official said the government has set a goal of 2023 as its delivery date for an exascale system. That much lead time may be a risky path given increasing international competition. There was a time when the U.S. didn't settle for second place. President John F. Kennedy delivered his famous "we choose to go to the moon" speech in 1962, and seven years later a man walked on the moon. The U.S. exascale goal is nine years away. China, Europe and Japan all have major exascale efforts, and the U.S. has already fallen behind in supercomputing. The European forecast of Hurricane Sandy in 2012 was so far ahead of U.S. models in predicting the storm's path that the National Oceanic and Atmospheric Administration was called before Congress to explain how it happened. It was told by a U.S. official that NOAA wasn't keeping up in computational capability. It's still not keeping up. Cliff Mass, a professor of meteorology at the University of Washington, wrote on his blog last month that the U.S. is "rapidly falling behind leading weather prediction centers around the world" because it has yet to catch up in computational capability to Europe. That criticism followed the recent $128 million purchase of a Cray supercomputer by the U.K.'s Met Office, its meteorological agency.
Supercomputing

US DOE Sets Sights On 300 Petaflop Supercomputer 127

Posted by timothy
from the who-is-this-we-paleface? dept.
dcblogs writes: U.S. officials on Friday announced plans to spend $325 million on two new supercomputers, one of which may eventually be built to support speeds of up to 300 petaflops. The U.S. Department of Energy, the major funder of supercomputers used for scientific research, wants to have the two systems — each with a base speed of 150 petaflops — possibly running by 2017. Going beyond the base speed to reach 300 petaflops will take additional government approvals. If the world stands still, the U.S. may conceivably regain the lead in supercomputing speed from China with these new systems. How adequate this planned investment will look three years from now is an open question. Lawmakers weren't reading from the same script as U.S. Energy Secretary Ernest Moniz when it came to assessing the U.S.'s place in the supercomputing world. Moniz said the awards "will ensure the United States retains global leadership in supercomputing." But Rep. Chuck Fleischmann (R-Tenn.) put U.S. leadership in the past tense. "Supercomputing is one of those things that we can step up and lead the world again," he said.
Supercomputing

Researchers Simulate Monster EF5 Tornado 61

Posted by Soulskill
from the any-way-the-wind-blows dept.
New submitter Orp writes: I am a member of a research team that created a supercell thunderstorm simulation that is getting a lot of attention. Presented at the 27th Annual Severe Local Storms Conference in Madison, Wisconsin, Leigh Orf's talk was produced entirely as high-def video and put on YouTube shortly after the presentation. In the simulation, the storm's updraft is so strong that it essentially peels rain-cooled air near the surface upward and into the storm's updraft, which appears to play a key role in maintaining the tornado. The simulation was based upon the environment that produced the May 24, 2011 outbreak, which included a long-track EF5 tornado near El Reno, Oklahoma (not to be confused with the May 31, 2013 EF5 tornado that killed three storm researchers).
Earth

Interviews: Ask CMI Director Alex King About Rare Earth Mineral Supplies 62

Posted by timothy
from the dude-I-loved-their-2nd-album dept.
The modern electronics industry relies on inputs and supply chains, both material and technological, and none of them are easy to bypass. These include, besides expertise and manufacturing facilities, the actual materials that go into electronic components. Some of them are as common as silicon; rare earth minerals, not so much. One story linked from Slashdot a few years back predicted that then-known supplies would be exhausted by 2017, though such predictions of scarcity are notoriously hard to get right, as people (and prices) adjust to changes in supply. There's no denying that there's been a crunch on rare earths, though, over the last several years. The minerals themselves aren't necessarily rare in an absolute sense, but they're expensive to extract. The most economically viable deposits are found in China, and rising prices for exports to the U.S., the EU, and Japan have raised political hackles. At the same time, those rising prices have spurred exploration and reexamination of known deposits off the coast of Japan, in the midwestern U.S., and elsewhere.

Alex King is director of the Critical Materials Institute, a part of the U.S. Department of Energy's Ames Laboratory. CMI is heavily involved in making rare earth minerals slightly less rare by means of supercomputer analysis; researchers there are approaching the ongoing crunch by looking both for substitute materials for things like gallium, indium, and tantalum, and easier ways of separating out the individual rare earths (a difficult process). One team there is working with "ligands – molecules that attach with a specific rare-earth – that allow metallurgists to extract elements with minimal contamination from surrounding minerals" to simplify the extraction process. We'll be talking with King soon; what questions would you like to see posed? (This 18-minute TED talk from King is worth watching first, as is this Q&A.)
Supercomputing

16-Petaflops, £97m Cray To Replace IBM At UK Meteorological Office 125

Posted by Soulskill
from the crayzy-powerful dept.
Memetic writes: The UK weather forecasting service is replacing its IBM supercomputer with a Cray XC40 containing 17 petabytes of storage and capable of 16 petaflops. This is Cray's biggest contract outside the U.S. With 480,000 CPU cores, it should be 13 times faster than the current system. It will weigh 140 tons. The aim is to enable more accurate modeling of the unstable UK climate, with UK-wide forecasts at a resolution of 1.5km run hourly, rather than every three hours, as currently happens. (Here's a similar system from the U.S.)
Supercomputing

First Demonstration of Artificial Intelligence On a Quantum Computer 98

Posted by Soulskill
from the teaching-a-new-dog-old-tricks dept.
KentuckyFC writes: Machine learning algorithms use a training dataset to learn how to recognize features in images, then use this "knowledge" to spot the same features in new images. The computational complexity of this task is such that the time required to solve it grows polynomially with the number of images in the training set and the complexity of the "learned" feature. So it's no surprise that quantum computers ought to be able to speed up this process dramatically. Indeed, a group of theoretical physicists last year designed a quantum algorithm that solves this problem in logarithmic rather than polynomial time, a significant improvement.

Now, a Chinese team has successfully implemented this artificial intelligence algorithm on a working quantum computer, for the first time. The information processor is a standard nuclear magnetic resonance quantum computer capable of handling 4 qubits. The team trained it to recognize the difference between the characters '6' and '9' and then asked it to classify a set of handwritten 6s and 9s accordingly, which it did successfully. The team says this is the first time that this kind of artificial intelligence has ever been demonstrated on a quantum computer and opens the way to the more rapid processing of other big data sets — provided, of course, that physicists can build more powerful quantum computers.
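The classical task being sped up here — distance-based classification against trained examples — can be sketched in a few lines. This is an illustrative classical analogue only, not the quantum algorithm (which evaluates such distances on superposed states); the tiny 4-pixel "images" are invented for the example:

```python
import math

# Toy training set: each "image" is a flattened vector of pixel intensities.
# The vectors and labels here are made up purely for illustration.
training = {
    "6": [[0.9, 0.1, 0.8, 0.9], [0.8, 0.2, 0.7, 0.9]],
    "9": [[0.1, 0.9, 0.9, 0.2], [0.2, 0.8, 0.8, 0.1]],
}

def centroid(vectors):
    """Average the training vectors for one class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

centroids = {label: centroid(vs) for label, vs in training.items()}

def classify(image):
    """Assign the label whose class centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(image, centroids[label]))

print(classify([0.85, 0.15, 0.75, 0.9]))  # → 6
```

Computing each distance classically costs time linear in the vector length; the quantum speed-up comes from encoding those vectors in the amplitudes of a handful of qubits.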
Software

Brown Dog: a Search Engine For the Other 99 Percent (of Data) 23

Posted by Soulskill
from the because-it-fetches-data dept.
aarondubrow writes: We've all experienced the frustration of trying to access information on websites, only to find that the data is trapped in outdated, difficult-to-read file formats and that metadata — the critical data about the data, such as when and how and by whom it was produced — is nonexistent. Led by Kenton McHenry, a team at the National Center for Supercomputing Applications is working to change that. Recipients in 2013 of a $10 million, five-year award from the National Science Foundation, the team is developing software that allows researchers to manage and make sense of vast amounts of digital scientific data that is currently trapped in outdated file formats. The NCSA team recently demonstrated two publicly-available services to make the contents of uncurated data collections accessible.
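Some of the "data about the data" described above can be recovered from the filesystem even when a file's format is opaque. A minimal sketch using only the Python standard library — this is not Brown Dog's actual API, just an illustration of the kind of metadata involved:

```python
import mimetypes
import os
import tempfile
import time

def basic_metadata(path):
    """Recover what little metadata the filesystem still holds for a file:
    size, modification date, and a guessed MIME type for its format."""
    st = os.stat(path)
    mime, _ = mimetypes.guess_type(path)
    return {
        "path": path,
        "bytes": st.st_size,
        "modified": time.strftime("%Y-%m-%d", time.gmtime(st.st_mtime)),
        "mime_type": mime or "unknown",  # outdated formats often guess as None
    }

# Demonstrate on a throwaway CSV file.
with tempfile.NamedTemporaryFile(suffix=".csv", delete=False) as f:
    f.write(b"year,value\n2014,42\n")
print(basic_metadata(f.name))
os.remove(f.name)
```

The hard part Brown Dog tackles is everything this sketch can't do: actually parsing uncurated, legacy formats to extract their contents and provenance.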
Supercomputing

Supercomputing Upgrade Produces High-Resolution Storm Forecasts 77

Posted by samzenpus
from the clearer-pictures dept.
dcblogs writes: A supercomputer upgrade is paying off for the U.S. National Weather Service, with new high-resolution models that will offer better insight into severe weather. The National Oceanic and Atmospheric Administration, which runs the weather service, has put into production two new IBM supercomputers, each at 213 teraflops, running Linux on Intel processors. These systems replaced four-year-old, 74-teraflop machines. More computing power means the models can do more mathematics, increasing the resolution, or detail, of forecast maps from 8 miles to 2 miles.
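The cost of that resolution jump can be estimated with a standard scaling argument (a back-of-envelope sketch, not NOAA's actual accounting): quadrupling horizontal resolution multiplies the number of grid columns by 16, and a stability-limited (CFL) timestep must shrink by the same factor of 4, so the compute requirement grows roughly 64-fold.

```python
# Back-of-envelope scaling for an 8-mile -> 2-mile grid refinement.
# Illustrative only: real model costs also depend on vertical levels
# and the physics packages being run.
old_spacing_miles = 8
new_spacing_miles = 2

refinement = old_spacing_miles / new_spacing_miles      # 4x per horizontal axis
grid_point_factor = refinement ** 2                     # 16x more grid columns
timestep_factor = refinement                            # CFL: 4x more timesteps
compute_factor = grid_point_factor * timestep_factor    # ~64x total

print(f"{refinement:.0f}x resolution -> ~{compute_factor:.0f}x compute")
```

This is why forecast resolution improves in modest steps even when machine speed jumps by large factors.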
Google

Google To Build Quantum Information Processors 72

Posted by Soulskill
from the remember-when-google-was-a-search-company dept.
An anonymous reader writes The Google Quantum AI Team has announced that they're bringing in a team from the University of California at Santa Barbara to build quantum information processors within the company. "With an integrated hardware group the Quantum AI team will now be able to implement and test new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture." Google will continue to work with D-Wave, but the UC Santa Barbara group brings its own areas of expertise with superconducting qubit arrays.
Cloud

IBM Opens Up Its Watson Supercomputer To Researchers 28

Posted by samzenpus
from the try-it-out dept.
An anonymous reader writes: IBM has announced the "Watson Discovery Advisor," a cloud-based tool that will let researchers comb through massive troves of data, looking for insights and connections. The company says it's a major expansion in capabilities for the Watson Group, which IBM seeded with a $1 billion investment. "Scientific discovery takes us to a different level as a learning system," said Steve Gold, vice president of the Watson Group. "Watson can provide insights into the information independent of the question. The ability to connect the dots opens up a new world of possibilities."
Supercomputing

How a Supercomputer Beat the Scrap Heap and Lived On To Retire In Africa 145

Posted by Unknown Lamer
from the spread-the-computing dept.
New submitter jorge_salazar (3562633) writes: Pieces of the decommissioned Ranger supercomputer, 40 racks in all, were shipped to researchers in South Africa, Tanzania, and Botswana to help seed their supercomputing aspirations. They say they'll need supercomputers to solve their growing science problems in astronomy, bioinformatics, climate modeling and more. Ranger's own beginnings were described by the co-founder of Sun Microsystems as a "historic moment in petaflop computing."
Supercomputing

A Peek Inside D-Wave's Quantum Computing Hardware 55

Posted by Soulskill
from the hamsters-are-neither-alive-nor-dead dept.
JeremyHsu writes: A one-second delay can still seem like an eternity for a quantum computing machine capable of running calculations in mere millionths of a second. That delay represents just one of the challenges D-Wave Systems overcame in building its second-generation quantum computing machine known as D-Wave Two — a system that has been leased to customers such as Google, NASA and Lockheed Martin. D-Wave's rapid-scaling approach to quantum computing has plenty of critics, but the company's experience in building large-scale quantum computing hardware could provide valuable lessons for everyone, regardless of whether the D-Wave machines live up to quantum computing's potential by proving they can outperform classical computers. (D-Wave recently detailed the hardware design changes between its first- and second-generation quantum computing machines in the June 2014 issue of the journal IEEE Transactions on Applied Superconductivity.)

"We were nervous about going down this path," says Jeremy Hilton, vice president of processor development at D-Wave Systems. "This architecture requires the qubits and the quantum devices to be intermingled with all these big classical objects. The threat you worry about is noise and impact of all this stuff hanging around the qubits. Traditional experiments in quantum computing have qubits in almost perfect isolation. But if you want quantum computing to be scalable, it will have to be immersed in a sea of computing complexity."
Supercomputing

Computing a Cure For HIV 89

Posted by Soulskill
from the petaflops-for-science dept.
aarondubrow writes: The tendency of HIV to mutate and resist drugs has made it particularly difficult to eradicate. But in the last decade scientists have begun using a new weapon in the fight against HIV: supercomputers. Using some of the nation's most powerful supercomputers, teams of researchers are pushing the limits of what we know about HIV and how we can treat it. The Huffington Post describes how supercomputers are helping scientists understand and treat the disease.
Bitcoin

NSF Researcher Suspended For Mining Bitcoin 220

Posted by Unknown Lamer
from the probably-shouldn't-do-that dept.
PvtVoid (1252388) writes "In the semiannual report to Congress by the NSF Office of Inspector General, the organization said it received reports of a researcher who was using NSF-funded supercomputers at two universities to mine Bitcoin. The computationally intensive mining took up about $150,000 worth of NSF-supported computer use at the two universities to generate bitcoins worth about $8,000 to $10,000, according to the report. It did not name the researcher or the universities."