China

The Engineering Marvel That China Hopes Will Help Wean It Off Foreign Energy (wsj.com) 58

China has begun construction of a $167 billion hydropower facility on Tibet's Yarlung Tsangpo River that would generate triple the output of the Three Gorges Dam. The project employs a run-of-the-river design, drilling deep tunnels through mountains to bypass the Yarlung Tsangpo Grand Canyon, where the river drops nearly two vertical miles over 300 miles. Water diverted through the tunnels will drive turbines at both ends without creating a large reservoir. The river currently produces just 2% of its hydropower potential. A $7 billion transmission network will deliver electricity to Guangdong province, Hong Kong, and Macau. China imported nearly a quarter of its energy supply in 2023.
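For a rough sense of scale, the headline figures can be sanity-checked with the standard hydropower formula P = ρ·g·Q·H·η. The head comes from the summary above ("nearly two vertical miles" of drop); the flow rate and efficiency are hypothetical placeholders for illustration, not reported figures:

```python
# Back-of-the-envelope check of the Yarlung Tsangpo project's scale.
# Hydropower potential: P = rho * g * Q * H * eta

RHO = 1000.0         # water density, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
HEAD_M = 2 * 1609.0  # "nearly two vertical miles" of drop, in meters
FLOW_M3S = 2000.0    # flow diverted through the tunnels (HYPOTHETICAL)
ETA = 0.9            # turbine/generator efficiency (HYPOTHETICAL)

power_w = RHO * G * FLOW_M3S * HEAD_M * ETA
power_gw = power_w / 1e9

three_gorges_gw = 22.5  # installed capacity of Three Gorges, for comparison
ratio = power_gw / three_gorges_gw

print(f"~{power_gw:.0f} GW, about {ratio:.1f}x Three Gorges")
```

Even with a modest assumed flow, the enormous head alone puts the output in the tens of gigawatts, which is why a figure of roughly triple Three Gorges is plausible.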
Programming

'Hour of Code' Announces It's Now Evolving Into 'Hour of AI' (hourofcode.com) 35

Last month Microsoft pledged $4 billion (in cash and AI/cloud technology) to "advance" AI education in K-12 schools, community and technical colleges, and nonprofits (according to a blog post by Microsoft President Brad Smith). But in the launch event video, Smith also says it's time to "switch hats" from coding to AI, adding that "the last 12 years have been about the Hour of Code, but the future involves the Hour of AI."

Long-time Slashdot reader theodp writes: This sets the stage for Code.org CEO Hadi Partovi's announcement that his tech-backed nonprofit's [annual educational event] Hour of Code is being renamed to the Hour of AI... Explaining the pivot, Partovi says: "Computer science for the last 50 years has had a focal point around coding that's been — sort of like you learn computer science so that you create code. There's other things you learn, like data science and algorithms and cybersecurity, but the focal point has been coding.

"And we're now in a world where the focal point of computer science is shifting to AI... We all know that AI can write much of the code. You don't need to worry about where did the semicolons go, or did I close the parentheses or whatnot. The busy work of computer science is going to be done by the computer itself.

"The creativity, the thinking, the systems design, the engineering, the algorithm planning, the security concerns, privacy concerns, ethical concerns — those parts of computer science are going to be what remains with a focal point around AI. And what's going to be important is to make sure in education we give students the tools so they don't just become passive users of AI, but so that they learn how AI works."

Speaking to Microsoft's Smith, Partovi vows to redouble the nonprofit's policy work to "make this [AI literacy] a high school graduation requirement so that no student graduates school without at least a basic understanding of what's going to be part of the new liberal arts background [...] As you showed with your hat, we are renaming the Hour of Code to an Hour of AI."

Security

Google Says Its AI-Based Bug Hunter Found 20 Security Vulnerabilities (techcrunch.com) 17

"Heather Adkins, Google's vice president of security, announced Monday that its LLM-based vulnerability researcher Big Sleep found and reported 20 flaws in various popular open source software," reports TechCrunch: Adkins said that Big Sleep, which is developed by the company's AI department DeepMind as well as its elite team of hackers Project Zero, reported its first-ever vulnerabilities, mostly in open source software such as audio and video library FFmpeg and image-editing suite ImageMagick. [There's also a "medium impact" issue in Redis]

Because the vulnerabilities have not yet been fixed, details of their impact and severity are not available; Google is withholding them, a standard policy while fixes are pending. But the simple fact that Big Sleep found these vulnerabilities is significant, as it shows these tools are starting to get real results, even if a human was involved in this case.

"To ensure high quality and actionable reports, we have a human expert in the loop before reporting, but each vulnerability was found and reproduced by the AI agent without human intervention," Google's spokesperson Kimberly Samra told TechCrunch.

Google's vice president of engineering posted on social media that this demonstrates "a new frontier in automated vulnerability discovery."
The Almighty Buck

OpenAI Pays Bonuses Ranging Up To Millions of Dollars To 1,000 Researchers, Engineers (theinformation.com) 19

An anonymous reader shares a report: OpenAI is paying bonuses to around 1,000 employees on its technical research and engineering teams, or about a third of the company, ranging from the low hundreds of thousands of dollars to millions, as the company gears up to release its latest flagship GPT-5 model and faces an intensifying battle for AI talent, according to a person with knowledge of the bonuses.
Security

Google Suffers Data Breach in Ongoing Salesforce Data Theft Attacks (bleepingcomputer.com) 3

Google is the latest company to suffer a data breach in an ongoing wave of Salesforce CRM data theft attacks conducted by the ShinyHunters extortion group. BleepingComputer: In June, Google warned that a threat actor they classify as 'UNC6040' is targeting companies' employees in voice phishing (vishing) social engineering attacks to breach Salesforce instances and download customer data. This data is then used to extort companies into paying a ransom to prevent the data from being leaked.

In a brief update to the article last night, Google said that it too fell victim to the same attack in June after one of its Salesforce CRM instances was breached and customer data was stolen. "In June, one of Google's corporate Salesforce instances was impacted by similar UNC6040 activity described in this post. Google responded to the activity, performed an impact analysis and began mitigations," reads Google's update.

News

Deadly Titan Submersible Implosion Was Preventable Disaster, Coast Guard Concludes 124

The U.S. Coast Guard determined the implosion of the Titan submersible that killed five people while traveling to the wreckage of the Titanic was a preventable disaster caused by OceanGate Expeditions' failure to meet safety and engineering standards. WSJ: A 335-page report [PDF] detailing a two-year inquiry from the U.S. Coast Guard's Marine Board of Investigation found the company that owned and operated the Titan failed to follow maintenance and inspection protocols for the deep-sea submersible.

OceanGate avoided regulatory review and managed the submersible outside of standard protocols "by strategically creating and exploiting regulatory confusion and oversight challenges," the report said. The Coast Guard opened its highest-level investigation into the event in June 2023, shortly after the implosion occurred. "There is a need for stronger oversight and clear options for operators who are exploring new concepts outside of the existing regulatory framework," Jason Neubauer, the chair of the Coast Guard Marine Board of Investigation for the Titan submersible, said in a statement.
Power

Hyundai To Help Build Nuclear-Powered Datacenter In Texas (theregister.com) 44

Fermi America is planning to build a colossal AI datacenter complex in Amarillo, Texas, powered by up to six gigawatts of nuclear energy. According to The Register, the company has selected Hyundai to support the deployment of the "HyperGrid," describing it as the "world's largest advanced energy campus." From the report: The project is backed by Rick Perry, who served as Texas governor and US Energy Secretary, and investor Toby Neugebauer, and aims to establish Texas as the US's largest energy and intelligence campus. Construction of the first of four Westinghouse AP1000 reactors is set to begin next year in Amarillo with the plant funneling behind-the-meter power to GPU bit barns by 2032, at least that's according to a memorandum of understanding (MoU). In other words, there is no guarantee the 23 million square meter project (1.1 MilliWales) will actually be built in its entirety, but if it is, Hyundai will oversee it.

"This agreement is significant in that it allows us to participate from the early stages of this project and contribute to the creation of the world's largest integrated energy and artificial intelligence campus, which leverages a diverse range of energy infrastructure," Hyundai said in a canned statement. At the very least, Hyundai knows what it's doing when it comes to nuclear developments. The industrial giant has led the deployment of some 22 reactors. Ambitious as the project may be, it won't be cheap. A single AP1000 reactor was estimated to cost $6.8 billion two years ago. That's a lot of money, but nothing compared to what the hyperscalers and neo-clouds are pumping into datacenters these days. Meta, for reference, expects to spend $66-72 billion on bit barns this year. [...] How exactly Fermi America or its founders Perry and Neugebauer expect to pay for one AP1000 reactor, let alone four, isn't clear. [...]

IT

'A Black Hole': America's New Graduates Discover a Dismal Job Market (nbcnews.com) 200

NBC News reports that in the U.S., many recent graduates looking to enter the labor force "are painting a dire picture of their job search." NBC News asked people who recently finished technical school, college or graduate school how their job application process was going, and in more than 100 responses, the graduates described months spent searching for a job, hundreds of applications and zero responses from employers — even with degrees once thought to be in high demand, like computer science or engineering.

Some said they struggled to get an hourly retail position or are making salaries well below what they had been expecting in fields they hadn't planned to work in. "It was very frustrating," said Jensen Kornfeind, who graduated this spring from Temple University with a degree in international trade. "Out of 70-plus job applications, I had three job interviews, and out of those three, I got ghosted from two of them."

The national economic data backs up their experience. The unemployment rate among recent graduates has been increasing this year to an average of 5.3%, compared to around 4% for the labor force as a whole, making it one of the toughest job markets for recent graduates since 2015, according to an analysis by the Federal Reserve Bank of New York released Friday. "Recent college graduates are on the margin of the labor market, and so they're the first to feel when the labor market slows and hiring slows," said Jaison Abel, an economist at the Federal Reserve Bank of New York.

Across the economy, hiring in recent months has ground to its slowest pace since the start of the pandemic, with employers adding just 73,000 jobs in July, according to data released Friday... Tech workers have been some of the hardest hit in a slowing job market, with more than 400 employers including Meta, Intel and Cisco announcing more than 130,000 jobs cut in 2025, according to tech job site TrueUp.

The article cites an economist at Indeed Hiring Lab who believes early adoption of AI "is also likely driving some of the cuts and leading employers to rethink hiring plans in anticipation of AI's future role." So besides federal policy changes, the article blames "the emergence of AI, which some companies have said they are using to replace certain entry-level jobs, like those in customer support or basic software development."

Seven months after graduating, one CS major told NBC News he'd applied for 100 jobs, and got one job offer — for the 4 a.m. shift at Starbucks.
Microsoft

Internal Microsoft Documents Detail Pay Scales (businessinsider.com) 43

Microsoft's internal pay guidelines show exactly how much the company will pay new engineering hires, according to documents obtained by Business Insider. The guidelines, updated in May, break down salary ranges, stock awards, and bonuses for every level from entry-level engineers to the company's most senior technical talent.

The documents come with an important caveat: recruiters can get approval to pay more when competing for exceptional candidates. At Microsoft's highest tier, Level 70 "distinguished engineers" can earn up to $408,000 in annual salary. But the real money comes from stock: these hires get up to $1.9 million in stock when they join, plus annual stock awards reaching $1.476 million.

The company uses different pay scales depending on location. Engineers in expensive markets like San Francisco get higher ranges than those at Microsoft's Redmond headquarters, where most hiring happens. For entry-level engineers at Level 57, Microsoft offers salaries between $83,000 and $108,000 in its main markets, with higher ranges of $95,800 to $124,600 in expensive areas like San Francisco. These new hires get modest stock awards of $5,000 to $13,000 and signing bonuses up to $9,000.

The company considers levels 57 through 59 as entry-level positions. The compensation jumps significantly as engineers advance. By Level 63, when engineers reach senior status, salaries range from $145,000 to $237,600 depending on location, with stock awards reaching $220,000.
The Internet

Scammers Unleash Flood of Slick Online Gaming Sites (krebsonsecurity.com) 29

Brian Krebs writes via KrebsOnSecurity: Fraudsters are flooding Discord and other social media platforms with ads for hundreds of polished online gaming and wagering websites that lure people with free credits and eventually abscond with any cryptocurrency funds deposited by players. Here's a closer look at the social engineering tactics and remarkable traits of this sprawling network of more than 1,200 scam sites. The scam begins with deceptive ads posted on social media that claim the wagering sites are working in partnership with popular social media personalities, such as Mr. Beast, who recently launched a gaming business called Beast Games. The ads invariably state that by using a supplied "promo code," interested players can claim a $2,500 credit on the advertised gaming website.

The gaming sites all require users to create a free account to claim their $2,500 credit, which they can use to play any number of extremely polished video games that ask users to bet on each action. At the scam website gamblerbeast[.]com, for example, visitors can pick from dozens of games like B-Ball Blitz, in which you play a basketball pro who is taking shots from the free throw line against a single opponent, and you bet on your ability to sink each shot. The financial part of this scam begins when users try to cash out any "winnings." At that point, the gaming site will reject the request and prompt the user to make a "verification deposit" of cryptocurrency -- typically around $100 -- before any money can be distributed. Those who deposit cryptocurrency funds are soon asked for additional payments. However, any "winnings" displayed by these gaming sites are a complete fantasy, and players who deposit cryptocurrency funds will never see that money again. Compounding the problem, victims likely will soon be peppered with come-ons from "recovery experts" who peddle dubious claims on social media networks about being able to retrieve funds lost to such scams. [...]

[T]hreat hunting platform Silent Push reveals at least 1,270 recently-registered and active domains whose names all invoke some type of gaming or wagering theme. Here is a list of all domains that Silent Push found were using the scambling network's chat API.

Google

Google Execs Say Employees Have To 'Be More AI-Savvy' 88

An anonymous reader quotes a report from CNBC: Google executives are pushing employees to act with more urgency in their use of artificial intelligence as the company looks for ways to cut costs. That was the message at an all-hands meeting last week, featuring CEO Sundar Pichai and Brian Saluzzo, who runs the teams building the technical foundation for Google's flagship products. "Anytime you go through a period of extraordinary investment, you respond by adding a lot of headcount, right?" Pichai said, according to audio obtained by CNBC. "But in this AI moment, I think we have to accomplish more by taking advantage of this transition to drive higher productivity. [...] We are competing with other companies in the world," Pichai said at the meeting. "There will be companies which will become more efficient through this moment in terms of employee productivity, which is why I think it's important to focus on that." [...]

"We are going to be going through a period of much higher investment and I think we have to be frugal with our resources, and I would strive to be more productive and efficient as a company," Pichai said, adding that he's "very optimistic" about how Google is doing. At the meeting, Saluzzo highlighted a number of tools the company is building for software engineers, or SWEs, to help "everybody at Google be more AI-savvy." "We feel the urgency to really quickly and urgently get AI into more of the coding workflows to address top needs so you see a much more rapid increase in velocity," Saluzzo said. Saluzzo said Google has a portfolio of AI products available to employees "so folks can go faster." He mentioned an internal site called "AI Savvy Google" which has courses, toolkits and learning sessions, including some for individual product areas.

Google's engineering education team, which develops courses for internal and external use, partnered with DeepMind on a training called "Building with Gemini" that the company will start promoting soon, Saluzzo said. He also referenced a new internal AI coding tool called Cider that helps software engineers with various aspects of the development process. Since Cider's introduction in May, 50% of its users have tapped the service on a weekly basis, Saluzzo said. Regarding Google's internal AI tools, Saluzzo said that employees should "expect them to continuously get better" and that "they'll become a pretty integral part of most SWE work."
China

Huawei Shows Off 384-Chip AI Computing System That Rivals Nvidia's Top Product (msn.com) 118

Long-time Slashdot reader hackingbear writes: China's Huawei Technologies showed off an AI computing system on Saturday that can rival Nvidia's most advanced offering, even though the company faces U.S. export restrictions. The CloudMatrix 384 system made its first public debut at the World Artificial Intelligence Conference (WAIC), a three-day event in Shanghai where companies showcase their latest AI innovations, drawing a large crowd to the company's booth. The CloudMatrix 384 incorporates 384 of Huawei's latest 910C chips, optically connected through an all-to-all topology, and outperforms Nvidia's GB200 NVL72, which uses 72 B200 chips, on some metrics, according to SemiAnalysis. A full CloudMatrix system can now deliver 300 PFLOPs of dense BF16 compute, almost double that of the GB200 NVL72. With more than 3.6x aggregate memory capacity and 2.1x more memory bandwidth, Huawei and China "now have AI system capabilities that can beat Nvidia's," according to a report by SemiAnalysis.

The trade-off is that it draws 4.1x the power of a GB200 NVL72, with 2.5x worse power per FLOP, 1.9x worse power per TB/s of memory bandwidth, and 1.2x worse power per TB of HBM capacity, but SemiAnalysis noted that China has no power constraints, only chip constraints. Nvidia had announced the DGX H100 NVL256 "Ranger" platform [with 256 GPUs], SemiAnalysis writes, but "decided to not bring it to production due to it being prohibitively expensive, power hungry, and unreliable due to all the optical transceivers required and the two tiers of network. The CloudMatrix Pod requires an incredible 6,912 400G LPO transceivers for networking, the vast majority of which are for the scaleup network."
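SemiAnalysis's efficiency figures hang together arithmetically. Using only the numbers quoted above, plus the roughly 180 dense BF16 PFLOPs for the GB200 NVL72 implied by "almost double", the 2.5x power-per-FLOP penalty falls out directly:

```python
# Cross-check of the efficiency figures using the numbers quoted above.
# The NVL72 throughput is inferred from "almost double" 300 PFLOPs.

cloudmatrix_pflops = 300.0
nvl72_pflops = 180.0  # inferred dense BF16 figure for GB200 NVL72
power_ratio = 4.1     # CloudMatrix draws 4.1x the power of an NVL72

perf_ratio = cloudmatrix_pflops / nvl72_pflops     # ~1.67x the compute
power_per_flop_penalty = power_ratio / perf_ratio  # ~2.5x worse power/FLOP

print(f"perf ratio {perf_ratio:.2f}x, "
      f"power-per-FLOP penalty {power_per_flop_penalty:.2f}x")
```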

Also at this event, Chinese e-commerce giant Alibaba released a new flagship open-source reasoning model, Qwen3-235B-A22B-Thinking-2507, which has "already topped key industry benchmarks, outperforming powerful proprietary systems from rivals like Google and OpenAI," according to industry reports. On the AIME25 benchmark, a test designed to evaluate sophisticated, multi-step problem-solving skills, Qwen3-Thinking-2507 achieved a remarkable score of 92.3, placing it ahead of some of the most powerful proprietary models, notably Google's Gemini-2.5 Pro. Qwen3-Thinking also secured a top score of 74.1 on LiveCodeBench, comfortably ahead of both Gemini-2.5 Pro and OpenAI's o4-mini, demonstrating its practical utility for developers and engineering teams.
EU

To Fight Climate Change, Norway Wants to Become Europe's Carbon Dump (msn.com) 69

Liquefied CO2 will be transported by ship to "the world's first carbon shipping port," reports the Washington Post — an island in the North Sea where it will be "buried in a layer of spongy rock a mile and a half beneath the seabed."

Norway's government is covering 80% of the $1 billion first phase, with another $714 million from three fossil fuel companies toward an ongoing expansion, plus an additional $150 million E.U. subsidy. As Europe's top oil and gas producer, Norway is using its fossil fuel income to see if it can make "carbon dumping" work. The world's first carbon shipment arrived this summer, carrying 7,500 metric tons of liquefied CO2 from a Norwegian cement factory that otherwise would have gone into the atmosphere... If all goes as planned, the project's backers — Shell, Equinor and TotalEnergies, along with Norway — say their facility could pump 5 million metric tons of carbon dioxide underground each year, or about a tenth of Norway's annual emissions...

[At the Heidelberg Materials cement factory in Brevik, Norway], when hot CO2-laden air comes rushing out of the cement kilns, the plant uses seawater from the neighboring fjord to cool it down. The cool air goes into a chamber where it gets sprayed with amine, a chemical that latches onto CO2 at low temperatures. The amine mist settles to the bottom, dragging carbon dioxide down with it. The rest of the air floats out of the smokestack with about 85 percent less CO2 in it, according to project manager Anders Pettersen. Later, Heidelberg Materials uses waste heat from the kilns to break the chemical bonds, so that the amine releases the carbon dioxide. The pure CO2 then goes into a compressor that resembles a giant steel heart, where it gets denser and colder until it finally becomes liquid. That liquid CO2 remains in storage tanks until a ship comes to carry it away. At best, operators expect this system to capture half the plant's CO2 emissions: 400,000 metric tons per year, or the equivalent of about 93,000 cars on the road...
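As a quick arithmetic check on the equivalence quoted above, 400,000 metric tons spread across 93,000 cars works out to roughly 4.3 tons of CO2 per car per year, consistent with commonly cited averages for a passenger vehicle:

```python
# Check the article's equivalence: 400,000 metric tons of CO2 per year
# captured at Brevik, "the equivalent of about 93,000 cars on the road".

captured_t_per_year = 400_000
cars = 93_000

t_per_car = captured_t_per_year / cars
print(f"{t_per_car:.1f} t CO2 per car per year")
```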

[T]hree other companies are lined up to follow: Ørsted, which will send CO2 from two bioenergy plants in Denmark; Yara, which will send carbon from a Dutch fertilizer factory; and Stockholm Exergi, which will capture carbon from a Swedish bioenergy plant that burns wood waste. All of these projects have gotten significant subsidies from national governments and the European Union — essentially de-risking the experiment for the companies. Experts say the costs and headaches of installing and running carbon-capture equipment may start to make more financial sense as European carbon rules get stricter and the cost of emitting a ton of carbon dioxide goes up. Still, they say, it's hard to imagine many companies deciding to invest in carbon capture without serious subsidies...

The first shipments are being transported by Northern Pioneer, the world's biggest carbon dioxide tanker ship, built specifically for this project. The 430-foot ship can hold 7,500 metric tons of CO2 in tanks below deck. Those tanks keep it in a liquid state by cooling it to minus-15 degrees Fahrenheit and squeezing it with the same pressure the outside of a submarine would feel 500 feet below the waves. While that may sound extreme, consider that the liquefied natural gas the ship uses for fuel has to be stored at minus-260 degrees. "CO2 isn't difficult to make it into a liquid," said Sally Benson, professor of energy science and engineering at Stanford University. Northern Pioneer is designed to emit about a third less carbon dioxide than a regular ship — key for a project that aims to eliminate carbon emissions. The ship burns natural gas, which emits less CO2 than marine diesel (though gas extraction is associated with methane leaks). The vessel uses a rotor sail to capture wind power. And it blows a constant stream of air bubbles to reduce friction as the hull cuts through the water, allowing it to burn less fuel. For every 100 tons of CO2 that Northern Lights pumps underground, it expects to emit three tons of CO2 into the atmosphere, mainly by burning fuel for shipping.
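The storage conditions quoted above can be checked with basic hydrostatics: 500 feet of seawater corresponds to roughly 15 bar of gauge pressure, and minus-15 degrees Fahrenheit is about minus-26 Celsius, conditions under which CO2 sits on the liquid side of its vapor curve. A minimal sketch (seawater density assumed):

```python
# Sanity check of the tanker's storage conditions: -15 F and
# "the same pressure the outside of a submarine would feel 500 feet
# below the waves". Hydrostatic pressure: p = rho * g * h.

RHO_SEAWATER = 1025.0   # kg/m^3 (assumed typical seawater density)
G = 9.81                # m/s^2
DEPTH_M = 500 * 0.3048  # 500 feet in meters

gauge_pa = RHO_SEAWATER * G * DEPTH_M
gauge_bar = gauge_pa / 1e5

temp_f = -15.0
temp_c = (temp_f - 32) * 5 / 9

print(f"~{gauge_bar:.0f} bar gauge at {temp_c:.0f} C")
```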

Eventually the carbon flows into a pipeline "that plunges through the North Sea and into the rocky layers below it — an engineering feat that's a bit like drilling for oil in reverse..." according to the article.

"Over the centuries, it should chemically react with the rock, eventually being locked away in minerals."
Businesses

Graduate Job Postings Plummet, But AI May Not Be the Primary Culprit (ft.com) 41

Job postings for entry-level roles requiring degrees have dropped by nearly two-thirds in the UK and by 43% in the US since ChatGPT launched in 2022, according to Financial Times analysis of Adzuna data. The decline spans sectors with varying AI exposure -- UK graduate openings fell 75% in banking and 65% in software development, but also 77% in human resources and 55% in civil engineering.

Indeed research found only weak correlation between occupations mentioning AI most frequently and those with the steepest job posting declines. US Bureau of Labor Statistics data showed no clear relationship between an occupation's AI exposure and young worker losses between 2022-2024. Economists say economic uncertainty, post-COVID workforce corrections, increased offshoring, and reduced venture capital funding are likely primary drivers of the graduate hiring slowdown.
Microsoft

Microsoft Used China-Based Support for Multiple U.S. Agencies, Potentially Exposing Sensitive Data (propublica.org) 15

Microsoft used China-based engineering teams to maintain cloud computing systems for multiple federal departments including Justice, Treasury, and Commerce, extending the practice beyond the Defense Department that the company announced last week it would discontinue. The work occurred within Microsoft's Government Community Cloud, which handles sensitive but unclassified federal information and has been used by the Justice Department's Antitrust Division for criminal and civil investigations, as well as parts of the Environmental Protection Agency and Department of Education.

Microsoft employed "digital escorts" -- U.S.-based personnel who supervised the foreign engineers -- similar to the arrangement it used for Pentagon systems. Following ProPublica's reporting, Microsoft issued a statement indicating it would take "similar steps for all our government customers who use Government Community Cloud to further ensure the security of their data." Competing cloud providers Amazon Web Services, Google, and Oracle told ProPublica they do not use China-based support for federal contracts.
Intel

Intel Will Shed 24,000 Employees This Year, Retreat In Germany, Poland, Costa Rica, and Ohio (theverge.com) 43

Intel announced it will cut approximately 24,000 jobs in 2025 and cancel or scale back projects in Germany, Poland, Costa Rica, and Ohio as part of CEO Lip-Bu Tan's sweeping restructuring efforts. By the end of the year, the struggling chipmaker plans to have "just around 75,000 'core employees' in total," according to The Verge. "It's not clear if the layoffs will slow now that we're over halfway through the year, but Intel states today that it has already 'completed the majority of the planned headcount actions it announced last quarter to reduce its core workforce by approximately 15 percent.'" From the report: Intel employed 109,800 people at the end of 2024, of which 99,500 were "core employees," so the company is pushing out around 24,000 people this year -- shrinking Intel by roughly one-quarter. (It has also divested other businesses, shrinking the larger organization as well.) [...] Today, on the company's earnings call, Intel said that it had overinvested in new factories before it had secured enough demand, that its factories had become "needlessly fragmented," and that it needs to grow its capacity "in lock step" with achieving actual milestones. "I do not subscribe to the belief that if you build it, they will come. Under my leadership, we will build what customers need when they need it, and earn their trust," says Tan.

Now, in Germany and Poland, where Intel was planning to spend tens of billions of dollars on "mega-fabs" that would employ 3,000 workers and on an assembly and test facility that would employ 2,000 workers, respectively, the company will "no longer move forward with planned projects" and is apparently axing them entirely. Intel has had a presence in Poland since 1993, however, and the company did not say its R&D facilities there are closing. (Intel had previously pressed pause on the new Germany and Poland projects "by approximately two years" back in 2024.)

In Costa Rica, where Intel employs over 3,400 people, the company will "consolidate its assembly and test operations in Costa Rica into its larger sites in Vietnam." Metzger tells The Verge that over 2,000 Costa Rica employees should remain to work in engineering and corporate, though. The company is also cutting back in Ohio: "Intel will further slow the pace of construction in Ohio to ensure spending is aligned with market demand." Intel CFO David Zinsner says Intel will continue to make investments there, though, and construction will continue.

Communications

Starlink Suffers Worldwide Outage (mirror.co.uk) 43

Longtime Slashdot reader gbkersey shares a report from The Mirror: Elon Musk's satellite internet service Starlink has been hit with a global outage preventing thousands of users from accessing the internet. According to DownDetector, reports of issues began to surge around 8pm GMT, with nearly 60,000 global users affected at the peak of the outage. "Starlink is currently in a network outage and we are actively implementing a solution," the company said in a post on X. "We appreciate your patience, we'll share an update once this issue is resolved."

Outages are being reported across the U.S., as well as along the Ukrainian frontline. Meanwhile, more than 10,000 people in the UK have logged issues with Starlink since 8pm this evening. "The majority of the reports (64%) are concerning a total blackout, while the rest point to internet problems," the report says.

UPDATE: Michael Nicolls, VP of Starlink Engineering, wrote in a post: "Starlink has now mostly recovered from the network outage, which lasted approximately 2.5 hours. The outage was due to failure of key internal software services that operate the core network. We apologize for the temporary disruption in our service; we are deeply committed to providing a highly reliable network, and will fully root cause this issue and ensure it does not occur again."

UPDATE #2: Starlink said in an update at 5:18 PM PT: "The network issue has been resolved, and Starlink service has been restored. We understand how important connectivity is and apologize for the disruption."
AI

Nvidia's CUDA Platform Now Supports RISC-V (tomshardware.com) 20

An anonymous reader quotes a report from Tom's Hardware: At the 2025 RISC-V Summit in China, Nvidia announced that its CUDA software platform will be made compatible with the RISC-V instruction set architecture (ISA) on the CPU side of things. This is a major step toward enabling RISC-V ISA-based CPUs in performance-demanding applications. The announcement makes it clear that RISC-V can now serve as the main processor for CUDA-based systems, a role traditionally filled by x86 or Arm cores. While nobody expects RISC-V in hyperscale datacenters any time soon, RISC-V can be used on CUDA-enabled edge devices, such as Nvidia's Jetson modules. However, it looks like Nvidia does indeed expect RISC-V to be in the datacenter.

Nvidia's commitment to RISC-V appears substantial: the keynote at the RISC-V Summit China was delivered by Frans Sijsterman, who appears to be Vice President of Hardware Engineering at Nvidia. The presentation outlined how CUDA components will now run on RISC-V. A diagram shown at the session illustrated a typical configuration: the GPU handles parallel workloads, while a RISC-V CPU executes CUDA system drivers, application logic, and the operating system. This setup lets the CPU orchestrate GPU computations entirely within the CUDA environment. Given Nvidia's current focus, the workloads are presumably AI-related, though the company did not confirm this.

Also featured in the diagram was a DPU handling networking tasks, rounding out a system of GPU compute, CPU orchestration, and data movement. This configuration suggests Nvidia envisions heterogeneous compute platforms in which a RISC-V CPU manages workloads while Nvidia's GPUs, DPUs, and networking chips handle the rest. Even with this low-profile announcement, Nvidia is essentially bridging its proprietary CUDA stack to an open architecture, one that is developing fast in China. Unable to ship its flagship GB200 and GB300 offerings to China, the company has to find other ways to keep CUDA thriving there.
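The division of labor the diagram describes — the GPU running parallel kernels while the host CPU handles drivers, orchestration, and application logic — is visible in even the smallest CUDA program. Below is a minimal illustrative sketch (not from Nvidia's presentation): the host-side code is what would now compile for and run on a RISC-V CPU, while the `__global__` kernel targets the GPU regardless of the host ISA.

```cuda
#include <cstdio>

// Device code: compiled for the GPU, independent of the host CPU's ISA.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1024;
    float *d = nullptr;

    // Host code: allocation, kernel launch, and synchronization.
    // This orchestration layer is the part that RISC-V support
    // allows to run on a RISC-V CPU instead of x86 or Arm.
    cudaMalloc(&d, n * sizeof(float));
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();
    cudaFree(d);

    printf("kernel launch complete\n");
    return 0;
}
```

Because device code is compiled to PTX and executed by the GPU, porting CUDA to a new host architecture is chiefly a matter of the host toolchain, driver, and runtime — which is what Nvidia's announcement covers.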

Microsoft

Microsoft Poaches Top Google DeepMind Staff in AI Talent War (ft.com) 26

Microsoft has recruited more than 20 AI employees from Google's DeepMind research division, the newest front in a talent war being waged by Silicon Valley's tech giants as they jostle to gain an edge in the nascent technology. From a report: Amar Subramanya, the former head of engineering for Google's Gemini chatbot, is the latest to move to Microsoft from its rival, according to a post on his LinkedIn profile on Tuesday. "The culture here is refreshingly low ego yet bursting with ambition," he wrote, confirming his appointment as corporate vice-president of AI.

Subramanya will join other DeepMind staff including engineering lead Sonal Gupta, software engineer Adam Sadovsky and product manager Tim Frank, according to people familiar with Microsoft's recruiting. The Seattle-based company has persuaded at least 24 staff to join in the past six months, they added.

NASA

How NASA Saved a Camera From 370 Million Miles Away (phys.org) 38

An anonymous reader quotes a report from Phys.org: The mission team of NASA's Jupiter-orbiting Juno spacecraft executed a deep-space save in December 2023, repairing its JunoCam imager in time to capture photos of the Jovian moon Io. Results from the long-distance fix were presented during a technical session on July 16 at the Institute of Electrical and Electronics Engineers Nuclear & Space Radiation Effects Conference in Nashville. JunoCam is a color, visible-light camera. The optical unit for the camera is located outside a titanium-walled radiation vault, which protects sensitive electronic components for many of Juno's engineering and science instruments. This is a challenging location because Juno's travels carry it through the most intense planetary radiation fields in the solar system. While mission designers were confident JunoCam could operate through the first eight orbits of Jupiter, no one knew how long the instrument would last after that. Throughout Juno's first 34 orbits (its prime mission), JunoCam operated normally, returning images the team routinely incorporated into the mission's science papers. Then, during its 47th orbit, the imager began showing hints of radiation damage. By orbit 56, nearly all the images were corrupted.

While the team knew the issue might be tied to radiation, pinpointing what was specifically damaged within JunoCam was difficult from hundreds of millions of miles away. Clues pointed to a damaged voltage regulator that was vital to JunoCam's power supply. With few options for recovery, the team turned to a process called annealing, where a material is heated for a specified period before slowly cooling. Although the process is not well understood, the idea is that heating can reduce defects in the material. Soon after the annealing process finished, JunoCam began cranking out crisp images for the next several orbits. But Juno was flying deeper and deeper into the heart of Jupiter's radiation fields with each pass. By orbit 55, the imagery had again begun showing problems.

"After orbit 55, our images were full of streaks and noise," said JunoCam instrument lead Michael Ravine of Malin Space Science Systems. "We tried different schemes for processing the images to improve the quality, but nothing worked. With the close encounter of Io bearing down on us in a few weeks, it was Hail Mary time: The only thing left we hadn't tried was to crank JunoCam's heater all the way up and see if more extreme annealing would save us." Test images sent back to Earth during the annealing showed little improvement in the first week. Then, with the close approach of Io only days away, the images began to improve dramatically. By the time Juno came within 930 miles (1,500 kilometers) of the volcanic moon's surface on Dec. 30, 2023, the images were almost as good as the day the camera launched, capturing detailed views of Io's north polar region that revealed mountain blocks covered in sulfur dioxide frosts rising sharply from the plains and previously uncharted volcanoes with extensive flow fields of lava. To date, the solar-powered spacecraft has orbited Jupiter 74 times. Recently, the image noise returned during Juno's 74th orbit.
