Submission + - Simulating 800,000 years of California earthquake history to pinpoint risks (utexas.edu)

aarondubrow writes: A study in the Bulletin of the Seismological Society of America presents results from a new earthquake simulator, RSQSim, that simulates hundreds of thousands of years of seismic history in California. Coupled with another code, CyberShake, the framework can calculate the amount of shaking that would occur for each quake. The framework makes use of two of the most powerful supercomputers on the planet: Frontera, at the Texas Advanced Computing Center, and Summit, at Oak Ridge National Laboratory. The new approach improves seismologists' ability to pinpoint how large an earthquake might strike a given location, allowing building code developers, architects, and structural engineers to design more resilient buildings that can survive earthquakes.

Comment Where we're at with Mars exploration (Score 1) 1

"Imagine, you're an alien and you know almost nothing about Earth, and you land on seven or eight points on Earth and drive a few hundred kilometers. Does that alien species know enough about Earth?" Ono asked. "No. If we want to represent the huge diversity of Mars we'll need more measurements on the ground, and the key is substantially extended distance, hopefully covering thousands of miles."

Submission + - Deep learning helps future Mars Rovers go farther, faster, and do more science (utexas.edu) 1

aarondubrow writes: Researchers at NASA's Jet Propulsion Laboratory (JPL) are developing autonomous capabilities that could allow future Mars rovers to go farther, faster and do more science. Training machine learning models on the Maverick2 supercomputer at the Texas Advanced Computing Center, their team developed and optimized models for Drive-By Science and Energy-Optimal Autonomous Navigation. The team presented results of their work at the IEEE Aerospace Conference in March 2020. The project was a finalist for the NASA Software Award.

Submission + - Texas boosts U.S. science with fastest academic supercomputer in the world (utexas.edu)

aarondubrow writes: The Texas Advanced Computing Center (TACC) at The University of Texas at Austin today launched Frontera, the fastest supercomputer at any university and the 5th most powerful system in the world. TACC is also home to Stampede2, the second fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin's place among the world's academic leaders in this realm.

Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.

First announced in August 2018, Frontera was built in early 2019, and earned the #5 spot on the twice-annual TOP500 list in June, achieving 23.5 PetaFLOPS (23.5 thousand million million floating-point operations per second) on the high-performance LINPACK benchmark, a measure of the system's computing power.
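The quoted unit can be sanity-checked with simple arithmetic: one PetaFLOPS is 10^15 floating-point operations per second, i.e. the "thousand million million" in the parenthetical. A minimal conversion sketch (the 23.5 figure comes from the story; the script is just unit arithmetic):

```python
# Convert Frontera's HPL (LINPACK) benchmark result to raw operations per second.
PETA = 10**15  # 1 PetaFLOPS = 10^15 (a thousand million million) ops/sec

frontera_pflops = 23.5          # result reported on the June TOP500 list
ops_per_second = frontera_pflops * PETA

print(f"{ops_per_second:.2e} floating-point operations per second")
# prints "2.35e+16 floating-point operations per second"
```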

In August, Frontera added two new subsystems — using technologies from NVIDIA, IBM and Green Revolution Cooling (GRC) — which provide 11 PetaFLOPS of additional single-precision performance and allow researchers to explore alternate computational architectures for the future.

Submission + - Supercomputers help peel back the darkness on M87 (utexas.edu)

aarondubrow writes: Supercomputers at the Texas Advanced Computing Center (TACC) made vital contributions to the first-ever image of a black hole in the galaxy M87. Those systems helped lay the groundwork for black hole imaging, and provided the theoretical foundation that allowed scientists to read the mass, underlying structure, and orientations of the black hole and its environment. Using data collected by the Event Horizon Telescope (EHT), a global network of radio telescopes, research teams employed TACC's Stampede1 and Stampede2 supercomputers to three-dimensionally simulate the physical properties of M87, and predict observational features of the black hole. Further models rendered the dynamics of the phenomenon into an image of how it would appear from Earth, using ray-tracing methods. Another team used TACC's Jetstream cloud environment to develop cloud-based data analysis pipelines, used to combine massive EHT data troves, and to share the data worldwide.

Submission + - Anticipating the dangers of space (utexas.edu)

aarondubrow writes: Astronauts and future space tourists face risks from radiation, which can cause illness and injure organs. Researchers from Texas A&M, NASA and the University of Texas Medical Branch used supercomputers at the Texas Advanced Computing Center to investigate the radiation exposure related to the Manned Orbiting Laboratory mission, planned for the 1960s and 70s, during which a dangerous solar storm occurred. They also explored the historical limitations of radiation research and how such limitations could be addressed in future endeavors.

Submission + - Supercomputers assist in search for new, better cancer drugs (utexas.edu)

aarondubrow writes: Finding new drugs that can more effectively kill cancer cells or disrupt the growth of tumors is one way to improve survival rates for ailing patients. Researchers are using supercomputers at the Texas Advanced Computing Center to find new chemotherapy drugs and to test known compounds to determine if they can fight different types of cancer. Recent efforts have yielded promising drug candidates, potential plant-derived compounds and new target sites that can lead to more effective drugs.

Submission + - Scientists turn mammalian cells into complex biocomputers (sciencemag.org)

sciencehabit writes: Computer hardware is getting a softer side. A research team has come up with a way of genetically engineering the DNA of mammalian cells to carry out complex computations, in effect turning the cells into biocomputers. The group hasn’t put those modified cells to work in useful ways yet, but down the road researchers hope the new programming techniques will help improve everything from cancer therapy to on-demand tissues that can replace worn-out body parts.
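The "computations" described are Boolean: engineered genetic elements act like logic gates, expressing an output gene only when particular input signals are present. As a purely schematic illustration — a software model of the gate behavior, not the actual DNA-editing mechanism the researchers engineered — a cell-as-logic-gate could be sketched like this:

```python
# Schematic model of genetic logic gates: the output gene is "expressed"
# (True) only for the right combination of input signals (e.g. small molecules).

def and_gate_cell(signal_a: bool, signal_b: bool) -> bool:
    """A cell engineered as an AND gate: expresses output only if both signals are present."""
    return signal_a and signal_b

def majority_circuit(a: bool, b: bool, c: bool) -> bool:
    """A more complex circuit: expresses output when at least two of three inputs are present."""
    return (a and b) or (a and c) or (b and c)

print(and_gate_cell(True, False))         # prints "False": one signal is not enough
print(majority_circuit(True, True, False))  # prints "True": two of three inputs suffice
```

Composing such gates, as the second function does, is what makes the cells "complex biocomputers" rather than single switches.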

Submission + - Climate Change Is Altering Global Air Currents (independent.co.uk)

An anonymous reader writes: One of the scientists who demonstrated conclusively that global warming was an unnatural event with the famous “hockey stick” graph is now warning that the giant jetstreams which circle the planet are being altered by climate change. Jetstreams are influenced by the difference in temperatures between the Arctic and the equator. But the Arctic has been warming much faster than tropical climates — the island of Svalbard, for example, was 6.5 degrees Celsius warmer last year compared to the average between 1961 and 1990. The land has also been warming faster than the sea. Both of those factors were changing the flow of these major air currents to create “extreme meanders” which were helping to cause “extreme weather events”, Professor Michael Mann said. In a paper in the journal Scientific Reports, Professor Mann and other researchers wrote that evidence of the effect of climate change on the jetstreams had “only recently emerged from the background noise of natural variability.” They said that projections of the effect on the jetstreams in “state-of-the-art” climate models were “mirrored” in “multiple” actual temperature measurements. The jetstream normally flows reasonably consistently around the planet, but can develop loops extending north and south. The researchers, who studied temperature records going back to 1870 as well as satellite data, said these loops could grow “very large” or even “grind to a halt” rather than moving from west to east. The effect has been most pronounced during the past 40 years, they found.

Submission + - Supercomputers help researchers improve severe hail storms forecasts

aarondubrow writes: Researchers working on the Severe Hail Analysis, Representation and Prediction (SHARP) project at the University of Oklahoma used the Stampede supercomputer to gain a better understanding of the conditions that cause severe hail to form, and to produce hail forecasts with far greater accuracy than those currently used operationally. The model the team used has six times the resolution of the National Weather Service's highest-resolution official forecasts and applies machine learning algorithms to improve its predictions. The researchers will publish their results in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.

Submission + - Fighting food poisoning in Las Vegas with machine learning

aarondubrow writes: Computer science researchers from the University of Rochester developed an app for health departments that uses natural language processing and artificial intelligence to identify likely food poisoning hot spots. Las Vegas health officials recently used the app, called nEmesis, to improve the city's inspection protocols and found it was 63% more effective at identifying problematic venues than the current state of the art. The researchers estimate that if every inspection in Las Vegas became adaptive, it could prevent over 9,000 cases of foodborne illness and 557 hospitalizations annually. The team presented the results at the 30th Association for the Advancement of Artificial Intelligence conference in February.

Submission + - NSF and federal partners award $37M to advance nation's co-robots

aarondubrow writes: Today, NSF, in partnership with DOD, NASA, NIH and USDA, announced $37 million in new awards to spur the development and use of co-robots, robots that work cooperatively with people. From unmanned vehicles that can inspect and fix ailing infrastructure to co-robots that can collaborate with workers on manufacturing tasks, scientists and engineers are developing the next generation of robots that can handle critical tasks in close proximity to humans, providing for unprecedented safety and resilience. This year, the initiative funded 66 new research proposals at 49 distinct institutions in 27 states.

Submission + - NSF awards $74.5 million to support interdisciplinary cybersecurity research (nsf.gov)

aarondubrow writes: The National Science Foundation announced $74.5 million in grants for basic research in cybersecurity. Among the awards are projects to understand and offer reliability to cryptocurrencies; invent technologies to broadly scan large swaths of the Internet and automate the detection and patching of vulnerabilities; and establish the science of censorship resistance by developing accurate models of the capabilities of censors. According to NSF, long-term support for fundamental cybersecurity research has resulted in public key encryption, software security bug detection, spam filtering and more.

Submission + - Robots to the Rescue: 5 Lessons From 23 Emergency Robot Deployments (huffingtonpost.com)

aarondubrow writes: Robin Murphy, director of the Center for Robot-Assisted Search and Rescue at Texas A&M University and one of the leading researchers in the field of disaster robotics, has used robots and UAVs for search-and-rescue missions and structural inspections during more than 20 disasters, from 9/11 to Katrina to Fukushima and the 2015 Texas floods. The Huffington Post carried a story in which she describes five lessons she's learned from her robot deployments and research.

Submission + - Democratizing the Maker Movement (huffingtonpost.com)

aarondubrow writes: To its advocates and participants, the Maker Movement resonates with the characteristics that we believe make America great: independence and ingenuity, creativity and resourcefulness. But as impressive as today's tools are, they're not accessible to many Americans simply because of their cost and the high technological barrier to entry. An article in the Huffington Post describes efforts supported by the National Science Foundation and other federal agencies to create new tools, technologies and approaches to make the Maker Movement more inclusive and democratic.
