Businesses

Snapchat Blames AI As It Cuts 1,000 Jobs 43

Snap is laying off about 1,000 employees, or 16% of its workforce, while closing 300 open roles as it tries to cut costs and push toward profitability with more AI-driven efficiency. "While these changes are necessary to realize Snap's long-term potential, we believe that rapid advancements in artificial intelligence enable our teams to reduce repetitive work, increase velocity, and better support our community, partners, and advertisers," CEO Evan Spiegel wrote in a memo, which was included in the company's 8-K filing (PDF). "We have already witnessed small squads leveraging AI tools to drive meaningful progress across several important initiatives." The Verge reports: The changes are expected to save Snap $500 million by the second half of 2026. Snap had about 5,261 full-time employees as of December 2025, and now joins the growing list of tech companies that have already announced significant layoffs this year, including Meta, Amazon, Oracle, GoPro, and Jack Dorsey's Block.

"Last fall, I described Snap as facing a crucible moment, requiring a new way of working that is faster and more efficient, while pivoting towards profitable growth," Spiegel wrote. "Over the past several months, we have carefully reviewed the work required to best serve our community and partners, and made tough choices to prioritize the investments we believe are most likely to create long-term value."
AI

Jack Dorsey's Block Accused of 'AI-Washing' to Excuse Laying Off Nearly Half Its Workforce (entrepreneur.com) 28

When Block cut 4,000 jobs — nearly half its workforce — co-founder Jack Dorsey "pointed to AI as the culprit," writes Entrepreneur magazine. "Dorsey claimed that AI tools now allow fewer employees to accomplish the same work."

"But analysts see a different explanation: poor management." Block more than tripled its employee base between 2019 and 2022, growing from 3,835 to 12,430 workers. The company's stock had fallen 40% since early 2025, creating pressure to cut costs. "This is more about the business being bloated for so long than it is about AI," Zachary Gunn, a Financial Technology Partners analyst, told Bloomberg.

The phenomenon has earned a nickname: "AI-washing," where companies use artificial intelligence as cover for traditional cost-cutting. Goldman Sachs economists estimate that AI is eliminating only 5,000 to 10,000 jobs per month across all U.S. sectors, hardly enough to justify Block's massive cuts.

"European Central Bank President Christine Lagarde told lawmakers in Brussels last week that ECB economists are monitoring for signs that AI is causing job losses," reports Bloomberg, "and are 'not yet seeing' the 'waves of redundancies that are feared'..." And "a recent survey of global executives published in the Harvard Business Review found that while AI has been cited as the reason for some layoffs, those cuts are almost entirely anticipatory: executives expect big efficiency gains that have not yet been realized."

Even a former senior Block executive "is questioning whether AI is truly the reason behind the cuts," writes Inc.: In a recent opinion piece for The New York Times, Aaron Zamost, Block's former head of communications, policy, and people, asked whether the layoffs reflect a genuine "new reality in which the work they do might no longer be viable," or whether artificial intelligence is "just a convenient and flashy new cover for typical corporate downsizing." Zamost acknowledged that the answer is unclear and perhaps unknowable, even within Block itself...

Looking more closely at the layoffs, Zamost argued that the specific roles affected suggest more traditional corporate cost-cutting than a sweeping AI transformation... Many of the responsibilities being eliminated, he argued, rely on distinctly human skills that AI systems still cannot replicate. "A chatbot can't meet with the mayor, cast commercial actors, or negotiate with the Securities and Exchange Commission," Zamost wrote. "Not all the roles I've heard that Block is eliminating can be handled by AI, yet executives are treating it as equally useful today to all disciplines."

Ultimately, Zamost suggested that the sincerity of companies' AI explanations may not really matter. "It matters less whether a company knows how to deploy AI and more whether investors believe it is on track to do so," he wrote.

Indeed, whatever the rationale for Dorsey's statement, "Wall Street didn't seem to mind," notes Entrepreneur magazine, since Block's stock shot up 15% after the announcement.
AI

Sam Altman Would Like To Remind You That Humans Use a Lot of Energy, Too (techcrunch.com) 142

OpenAI CEO Sam Altman is pushing back on growing concerns about AI's environmental footprint, dismissing claims about ChatGPT's water consumption as "totally fake" and arguing that the fairer way to measure AI's energy use is to compare it against humans.

In an interview with Indian Express, Altman acknowledged that evaporative cooling in data centers once made water usage a real concern but said that is no longer the case, calling internet claims of 17 gallons of water per query "completely untrue, totally insane, no connection to reality."

On energy, he conceded it is "fair" to worry about total consumption given how heavily the world now relies on AI, and called for a rapid shift toward nuclear, wind and solar power. He took particular issue with comparisons that pit the cost of training a model against a single human inference, noting it "takes like 20 years of life and all of the food you eat" before a person gets smart -- and that on a per-query basis, AI has "probably already caught up on an energy efficiency basis."
Microsoft

Microsoft Pledges Full Power Costs, No Tax Breaks in Response To AI Data Center Backlash (geekwire.com) 33

Microsoft announced Tuesday what it calls a "community first" initiative for its AI data centers, pledging to pay full electricity costs and reject local property tax breaks following months of growing opposition from residents facing higher power bills. The announcement in Washington, D.C. marks a clear departure from past practices; Microsoft has previously accepted tax abatements for data centers in Ohio and Iowa.

Brad Smith, Microsoft's president, said the company has been developing the initiative since September. Residential power prices in data center hubs like Virginia, Illinois, and Ohio jumped 12-16% over the past year, faster than the U.S. average. Three Democratic senators launched an investigation last month into whether tech giants are raising residential bills. Microsoft also pledged a 40% improvement in water efficiency by 2030 and committed to replenishing more water than it uses in each district where it operates.
Earth

How Aviation Emissions Could Be Halved Without Cutting Journeys (theguardian.com) 118

Climate-heating emissions from aviation could be slashed in half -- without reducing passenger journeys -- by getting rid of premium seats, ensuring flights are near full and using the most efficient aircraft, according to analysis. The Guardian: These efficiency measures could be far more effective in tackling the fast-growing carbon footprint of flying than pledges to use "sustainable" fuels or controversial carbon offsets, the researchers said. They believe their study, which analysed more than 27m commercial flights out of approximately 35m in 2023, is the first to assess the variation in operational efficiency of flights across the globe. The study, led by Prof Stefan Gossling at Sweden's Linnaeus University, examined flights between 26,000 city pairs carrying 3.5 billion passengers across 6.8 trillion kilometers. First and business class passengers are responsible for more than three times the emissions of economy travelers, and up to 13 times more in the most spacious premium cabins.
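The premium-cabin multiplier follows from allocating a flight's total emissions in proportion to the floor area each seat occupies: a business seat taking three times the space of an economy seat carries three times the footprint. A minimal sketch of that allocation, with hypothetical seat counts and areas (the study's actual methodology is more involved):

```python
def emissions_per_passenger(total_kg_co2, cabins):
    """Split a flight's total CO2 across passengers in proportion
    to the floor area each seat class occupies.

    cabins: list of (passenger_count, area_per_seat_m2) tuples.
    Returns kg CO2 per passenger for each cabin class, in order.
    """
    total_area = sum(n * area for n, area in cabins)
    return [total_kg_co2 * area / total_area for n, area in cabins]

# Hypothetical narrow-body flight: 150 economy seats at 0.5 m^2 each,
# 12 business seats at 1.5 m^2 each, 10 tonnes of CO2 for the flight.
econ, biz = emissions_per_passenger(10_000, [(150, 0.5), (12, 1.5)])
# The business-class share is exactly 3x economy, because the per-seat
# area ratio is 3:1 regardless of how many seats each cabin holds.
```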

The average seat occupancy across all flights in 2023 was almost 80%. US airports accounted for a quarter of all aviation emissions and were 14% more polluting than the global average. Atlanta and New York ranked among the least efficient airports overall, nearly 50% worse than top performers like Abu Dhabi and Madrid.
Programming

Rust's 'Vision Doc' Makes Recommendations to Help Keep Rust Growing (rust-lang.org) 80

The team authoring the Rust 2025 Vision Doc interviewed Rust developers to find out what they liked about the language — and have now issued three recommendations "to help Rust continue to scale across domains and usage levels."

— Enumerate and describe Rust's design goals and integrate them into our processes, helping to ensure they are observed by future language designers and the broader ecosystem.

— Double down on extensibility, introducing the ability for crates to influence the developer experience and the compilation pipeline.

— Help users to navigate the crates.io ecosystem and enable smoother interop.


The real "empowering magic" of Rust arises from achieving a number of different attributes all at once — reliability, efficiency, low-level control, supportiveness, and so forth. It would be valuable to have a canonical list of those values that we could collectively refer to as a community and that we could use when evaluating RFCs or other proposed designs... We recommend creating an RFC that defines the goals we are shooting for as we work on Rust... One insight from our research is that we don't need to define which values are "most important". We've seen that for Rust to truly work, it must achieve all the factors at once...

We recommend doubling down on extensibility as a core strategy. Rust's extensibility — traits, macros, operator overloading — has been key to its versatility. But that extensibility is currently concentrated in certain areas: the type system and early-stage proc macros. We should expand it to cover supportive interfaces (better diagnostics and guidance from crates) and compilation workflow (letting crates integrate at more stages of the build process)... Doubling down on extensibility will not only make current Rust easier to use, it will enable and support Rust's use in new domains. Safety-critical applications in particular require a host of custom lints and tooling to support the associated standards. Compiler extensibility allows Rust to support those niche needs in a more general way.

We recommend finding ways to help users navigate the crates.io ecosystem... [F]inding which crates to use presents a real obstacle when people are getting started. The Rust org maintains a carefully neutral stance, which is good, but also means that people don't have anywhere to go for advice on a good "starter set" of crates... Part of the solution is enabling better interop between libraries.

AI

Does AI Really Make Coders Faster? (technologyreview.com) 139

One developer tells MIT Technology Review that AI tools weaken the coding instincts he used to have. And beyond that, "It's just not fun sitting there with my work being done for me."

But is AI making coders faster? "After speaking to more than 30 developers, technology executives, analysts, and researchers, MIT Technology Review found that the picture is not as straightforward as it might seem..." For some developers on the front lines, initial enthusiasm is waning as they bump up against the technology's limitations. And as a growing body of research suggests that the claimed productivity gains may be illusory, some are questioning whether the emperor is wearing any clothes.... Data from the developer analytics firm GitClear shows that most engineers are producing roughly 10% more durable code — code that isn't deleted or rewritten within weeks — since 2022, likely thanks to AI. But that gain has come with sharp declines in several measures of code quality. Stack Overflow's survey also found trust and positive sentiment toward AI tools falling significantly for the first time. And most provocatively, a July study by the nonprofit research organization Model Evaluation & Threat Research (METR) showed that while experienced developers believed AI made them 20% faster, objective tests showed they were actually 19% slower...

Developers interviewed by MIT Technology Review generally agree on where AI tools excel: producing "boilerplate code" (reusable chunks of code repeated in multiple places with little modification), writing tests, fixing bugs, and explaining unfamiliar code to new developers. Several noted that AI helps overcome the "blank page problem" by offering an imperfect first stab to get a developer's creative juices flowing. It can also let nontechnical colleagues quickly prototype software features, easing the load on already overworked engineers. These tasks can be tedious, and developers are typically glad to hand them off. But they represent only a small part of an experienced engineer's workload. For the more complex problems where engineers really earn their bread, many developers told MIT Technology Review, the tools face significant hurdles...

The models also just get things wrong. Like all LLMs, coding models are prone to "hallucinating" — it's an issue built into how they work. But because the code they output looks so polished, errors can be difficult to detect, says James Liu, director of software engineering at the advertising technology company Mediaocean. Put all these flaws together, and using these tools can feel a lot like pulling a lever on a one-armed bandit. "Some projects you get a 20x improvement in terms of speed or efficiency," says Liu. "On other things, it just falls flat on its face, and you spend all this time trying to coax it into granting you the wish that you wanted and it's just not going to..." There are also more specific security concerns. Researchers have discovered a worrying class of hallucinations where models reference nonexistent software packages in their code. Attackers can exploit this by creating packages with those names that harbor vulnerabilities, which the model or developer may then unwittingly incorporate into software.
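One defense against that attack is simply to vet dependency names against a trusted index snapshot before installing anything an assistant suggests. A minimal sketch of such a check — the index contents, the requirements list, and the lookalike name "requestz" are all hypothetical:

```python
def find_suspect_packages(requirements, known_index):
    """Return requirement names absent from a trusted package index.

    A cheap guard against hallucinated dependencies: before installing
    AI-suggested packages, flag any name the trusted index has never
    published, so a human can review it.
    """
    known = {name.lower() for name in known_index}
    return [req for req in requirements
            if req.split("==")[0].lower() not in known]

# Hypothetical trusted-index snapshot and an AI-generated requirements
# list; "requestz" is a plausible-looking hallucinated name.
index = ["requests", "numpy", "flask"]
wanted = ["requests==2.31.0", "requestz==1.0.0"]
suspects = find_suspect_packages(wanted, index)  # flags "requestz==1.0.0"
```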

Other key points from the article:
  • LLMs can only hold limited amounts of information in context windows, so "they struggle to parse large code bases and are prone to forgetting what they're doing on longer tasks."
  • "While an LLM-generated response to a problem may work in isolation, software is made up of hundreds of interconnected modules. If these aren't built with consideration for other parts of the software, it can quickly lead to a tangled, inconsistent code base that's hard for humans to parse and, more important, to maintain."
  • "Accumulating technical debt is inevitable in most projects, but AI tools make it much easier for time-pressured engineers to cut corners, says GitClear's Harding. And GitClear's data suggests this is happening at scale..."
  • "As models improve, the code they produce is becoming increasingly verbose and complex, says Tariq Shaukat, CEO of Sonar, which makes tools for checking code quality. This is driving down the number of obvious bugs and security vulnerabilities, he says, but at the cost of increasing the number of 'code smells' — harder-to-pinpoint flaws that lead to maintenance problems and technical debt."

Yet the article cites a recent Stanford University study that found employment among software developers aged 22 to 25 dropped nearly 20% between 2022 and 2025, "coinciding with the rise of AI-powered coding tools."

The story is part of MIT Technology Review's new Hype Correction series of articles about AI.


Microsoft

Extortion and Ransomware Drive Over Half of Cyberattacks — Sometimes Using AI, Microsoft Finds (microsoft.com) 23

Microsoft said in a blog post this week that "over half of cyberattacks with known motives were driven by extortion or ransomware... while attacks focused solely on espionage made up just 4%."

And Microsoft's annual digital threats report found operations expanding even more through AI, with cybercriminals "accelerating malware development and creating more realistic synthetic content, enhancing the efficiency of activities such as phishing and ransomware attacks." [L]egacy security measures are no longer enough; we need modern defenses leveraging AI and strong collaboration across industries and governments to keep pace with the threat...

Over the past year, both attackers and defenders harnessed the power of generative AI. Threat actors are using AI to boost their attacks by automating phishing, scaling social engineering, creating synthetic media, finding vulnerabilities faster, and creating malware that can adapt itself... For defenders, AI is also proving to be a valuable tool. Microsoft, for example, uses AI to spot threats, close detection gaps, catch phishing attempts, and protect vulnerable users. As both the risks and opportunities of AI rapidly evolve, organizations must prioritize securing their AI tools and training their teams...

Amid the growing sophistication of cyber threats, one statistic stands out: more than 97% of identity attacks are password attacks. In the first half of 2025 alone, identity-based attacks surged by 32%. That means the vast majority of malicious sign-in attempts an organization might receive are via large-scale password guessing attempts. Attackers get usernames and passwords ("credentials") for these bulk attacks largely from credential leaks. However, credential leaks aren't the only place where attackers can obtain credentials. This year, we saw a surge in the use of infostealer malware by cybercriminals...

Luckily, the solution to identity compromise is simple. The implementation of phishing-resistant multifactor authentication (MFA) can stop over 99% of this type of attack even if the attacker has the correct username and password combination.

"Security is not only a technical challenge but a governance imperative..." Microsoft adds in their blog post. "Governments must build frameworks that signal credible and proportionate consequences for malicious activity that violates international rules." (The report also found that America is the #1 most-targeted country — and that many U.S. companies have outdated cyber defenses.)

But while "most of the immediate attacks organizations face today come from opportunistic criminals looking to make a profit," Microsoft writes that nation-state threats "remain a serious and persistent threat." More details from the Associated Press: Russia, China, Iran and North Korea have sharply increased their use of artificial intelligence to deceive people online and mount cyberattacks against the United States, according to new research from Microsoft. This July, the company identified more than 200 instances of foreign adversaries using AI to create fake content online, more than double the number from July 2024 and more than ten times the number seen in 2023.
Examples of foreign espionage cited by the article:
  • China is continuing its broad push across industries to conduct espionage and steal sensitive data...
  • Iran is going after a wider range of targets than ever before, from the Middle East to North America, as part of broadening espionage operations...
  • "[O]utside of Ukraine, the top ten countries most affected by Russian cyber activity all belong to the North Atlantic Treaty Organization (NATO) — a 25% increase compared to last year."
  • North Korea remains focused on revenue generation and espionage...

There was one especially worrying finding. The report found that critical public services are often targeted, partly because their tight budgets limit their incident response capabilities, "often resulting in outdated software.... Ransomware actors in particular focus on these critical sectors because of the targets' limited options. For example, a hospital must quickly resolve its encrypted systems, or patients could die, potentially leaving no other recourse but to pay."


Open Source

Rust Foundation Announces 'Innovation Lab' to Support Impactful Rust Projects (webpronews.com) 30

Announced this week at RustConf 2025 in Seattle, the new Rust Innovation Lab will offer open source projects "the opportunity to receive fiscal sponsorship from the Rust Foundation, including governance, legal, networking, marketing, and administrative support."

And their first project will be the TLS library Rustls (for cryptographic security), which they say "demonstrates Rust's ability to deliver both security and performance in one of the most sensitive areas of modern software infrastructure." Choosing Rustls "underscores the lab's focus on infrastructure-critical tools, where reliability is paramount," explains WebProNews. But "Looking ahead, the foundation plans to expand the lab's portfolio, inviting applications from promising Rust initiatives. This could catalyze innovations in areas like embedded systems and blockchain, where Rust's efficiency shines."

Their article notes that the Rust Foundation "sees the lab as a way to accelerate innovation while mitigating the operational burdens that often hinder open-source development." [T]he Foundation aims to provide a stable, neutral environment for select Rust endeavors, complete with governance oversight, legal and administrative backing, and fiscal sponsorship... At its core, the Rust Innovation Lab addresses a growing need within the developer community for structured support amid Rust's rising adoption in sectors like systems programming and web infrastructure. By offering a "home" for projects that might otherwise struggle with sustainability, the lab ensures continuity and scalability. This comes at a time when Rust's memory safety features are drawing attention from major tech firms, including those in cloud computing and cybersecurity, as a counter to vulnerabilities plaguing languages like C++...

Industry observers note that such fiscal sponsorship could prove transformative, enabling projects to secure funding from diverse sources while maintaining independence. The Rust Foundation's involvement ensures compliance with best practices, potentially attracting more corporate backers wary of fragmented open-source efforts... By providing a neutral venue, the foundation aims to prevent the pitfalls seen in other ecosystems, such as project abandonment due to maintainer burnout or legal entanglements... For industry insiders, the Rust Innovation Lab represents a strategic evolution, potentially accelerating Rust's integration into mission-critical systems.

Biotech

As Demand for Plant-Based Meat Weakens in the US, Beyond Disappoints Wall Street (msn.com) 222

On Wednesday, Beyond Meat "missed Wall Street estimates for second-quarter revenue," reports Reuters. "Consumers' growing concerns about processed foods are severely diminishing the appeal of Beyond Meat's product line, causing retailers and quick service restaurants to pull back sharply on orders," Rachel Wolff, analyst at Emarketer, said.

Retail sales of refrigerated plant-based meat alternative products in the U.S. have fallen 17.2% so far this year, and frozen plant-based meat alternatives have fallen 8.1%, according to data from SPINS... [Beyond's] revenue for the quarter ended June 28 fell nearly 20% to $75 million, compared with analysts' average estimate of $82 million, according to data compiled by LSEG.

While the company arguably invented a new market for plant-based meat substitutes, it also "owns no real intellectual property," argues The Street. "And every company in the meat and grocery business (more or less) now sells a take-off of a product that already had limited appeal..." Beyond Meat has admitted it's in trouble by hiring corporate restructuring expert John Boken from consultancy AlixPartners as interim chief transformation officer [with a focus that includes "operating expense reduction" and "broader operational efficiency"]. It has also let go of 44 employees in North America (6% of its global workforce) as it seeks to cut operating expenses amid disappointing sales... Beyond Meat also has a significant cash problem. As of June 28, 2025, Beyond Meat's cash and cash equivalents balance was $117.3 million, and total outstanding debt was $1.2 billion. The company does have time to fend off a Chapter 11 bankruptcy filing, but it also has limited, if any, prospects to meet its impending cash needs.
Programming

Apple Migrates Its Password Monitoring Service to Swift from Java, Gains 40% Performance Uplift (infoq.com) 109

Meta and AWS have used Rust, and Netflix uses Go, reports the programming news site InfoQ. But using another language, Apple recently "migrated its global Password Monitoring service from Java to Swift, achieving a 40% increase in throughput, and significantly reducing memory usage."

This freed up nearly 50% of their previously allocated Kubernetes capacity, according to the article, and even "improved startup time, and simplified concurrency." In a recent post, Apple engineers detailed how the rewrite helped the service scale to billions of requests per day while improving responsiveness and maintainability... "Swift allowed us to write smaller, less verbose, and more expressive codebases (close to 85% reduction in lines of code) that are highly readable while prioritizing safety and efficiency."

Apple's Password Monitoring service, part of the broader Password app's ecosystem, is responsible for securely checking whether a user's saved credentials have appeared in known data breaches, without revealing any private information to Apple. It handles billions of requests daily, performing cryptographic comparisons using privacy-preserving protocols. This workload demands high computational throughput, tight latency bounds, and elastic scaling across regions... Apple's previous Java implementation struggled to meet the service's growing performance and scalability needs. Garbage collection caused unpredictable pause times under load, degrading latency consistency. Startup overhead — from JVM initialization, class loading, and just-in-time compilation — slowed the system's ability to scale in real time. Additionally, the service's memory footprint, often reaching tens of gigabytes per instance, reduced infrastructure efficiency and raised operational costs.

Originally developed as a client-side language for Apple platforms, Swift has since expanded into server-side use cases.... Swift's deterministic memory management, based on reference counting rather than garbage collection (GC), eliminated latency spikes caused by GC pauses. This consistency proved critical for a low-latency system at scale. After tuning, Apple reported sub-millisecond 99.9th percentile latencies and a dramatic drop in memory usage: Swift instances consumed hundreds of megabytes, compared to tens of gigabytes with Java.
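The latency difference comes down to when memory is reclaimed. CPython happens to use reference counting for most objects, as Swift does, so the deterministic behavior can be illustrated in a few lines (this is CPython-specific; a tracing-GC runtime like the JVM makes no such timing guarantee):

```python
import weakref

class Connection:
    """Stand-in for any resource-holding object."""
    pass

conn = Connection()
probe = weakref.ref(conn)  # observes the object without keeping it alive

# Under reference counting, dropping the last strong reference frees the
# object immediately and predictably -- there is no background collector
# that might pause the program at an arbitrary later moment.
del conn
assert probe() is None  # already reclaimed, deterministically
```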

"While this isn't a sign that Java and similar languages are in decline," concludes InfoQ's article, "there is growing evidence that at the uppermost end of performance requirements, some are finding that general-purpose runtimes no longer suffice."
United States

CISA Loses Nearly All Top Officials (cybersecuritydive.com) 56

Multiple readers shared the following report about the executive departures at CISA: Virtually all of the top officials at the Cybersecurity and Infrastructure Security Agency (CISA) have departed the agency or will do so this month, according to an email obtained by Cybersecurity Dive, further widening a growing void in expertise and leadership at the government's lead cyber defense force at a time when tensions with foreign adversaries are escalating.

Five of CISA's six operational divisions and six of its 10 regional offices will have lost top leaders by the end of the month, the agency's new deputy director, Madhu Gottumukkala, informed employees in an email on Thursday. [...] The exits of these leaders could undermine the efficiency and strategic clarity of CISA's partnerships with critical infrastructure operators, private security firms, foreign allies, state governments and local emergency managers, experts say.

Open Source

OSU's Open Source Lab Eyes Infrastructure Upgrades and Sustainability After Recent Funding Success (osuosl.org) 11

It's a nonprofit that provides hosting for the Linux Foundation, the Apache Software Foundation, Drupal, Firefox, and 160 other projects — delivering nearly 430 terabytes of information every month. (It's currently hosting Debian, Fedora, and Gentoo Linux.) But hosting only provides about 20% of its income, with the rest coming from individual and corporate donors (including Google and IBM). "Over the past several years, we have been operating at a deficit due to a decline in corporate donations," the Open Source Lab's director announced in late April.

It's part of the CS/electrical engineering department at Oregon State University, and while the department "has generously filled this gap, recent changes in university funding make our current funding model no longer sustainable. Unless we secure $250,000 in committed funds, the OSL will shut down later this year."

But "Thankfully, the call for support worked, paving the way for the OSU Open Source Lab to look ahead, into what the future holds for them," reports the blog It's FOSS.

"Following our OSL Future post, the community response has been incredible!" posted director Lance Albertson. "Thanks to your amazing support, our team is funded for the next year. This is a huge relief and lets us focus on building a truly self-sustaining OSL." To get there, we're tackling two big interconnected goals:

1. Finding a new, cost-effective physical home for our core infrastructure, ideally with more modern hardware.
2. Securing multi-year funding commitments to cover all our operations, including potential new infrastructure costs and hardware refreshes.


Our current data center is over 20 years old and needs to be replaced soon. With Oregon State University evaluating the future of this facility, it's very likely we'll need to relocate in the near future. While migrating to the State of Oregon's data center is one option, it comes with significant new costs. This makes finding free or very low-cost hosting (ideally between Eugene and Portland for ~13-20 racks) a huge opportunity for our long-term sustainability. More power-efficient hardware would also help us shrink our footprint.

Speaking of hardware, refreshing some of our older gear during a move would be a game-changer. We don't need brand new, but even a few-generations-old refurbished systems would boost performance and efficiency. (Huge thanks to the Yocto Project and Intel for a recent hardware donation that showed just how impactful this is!) The dream? A data center partner donating space and cycled-out hardware. Our overall infrastructure strategy is flexible. We're enhancing our OpenStack/Ceph platforms and exploring public cloud credits and other donated compute capacity. But whatever the resource, it needs to fit our goals and come with multi-year commitments for stability. And, a physical space still offers unique value, especially the invaluable hands-on data center experience for our students....

[O]ur big focus this next year is locking in ongoing support — think annualized pledges, different kinds of regular income, and other recurring help. This is vital, especially with potential new data center costs and hardware needs. Getting this right means we can stop worrying about short-term funding and plan for the future: investing in our tech and people, growing our awesome student programs, and serving the FOSS community. We're looking for partners, big and small, who get why foundational open source infrastructure matters and want to help us build this sustainable future together.

The It's FOSS blog adds that "With these prerequisites in place, the OSUOSL intends to expand their student program, strengthen their managed services portfolio for open source projects, introduce modern tooling like Kubernetes and Terraform, and encourage more community volunteers to actively contribute."

Thanks to long-time Slashdot reader I'm just joshin for suggesting the story.
AI

Indian IT Faces Its Kodak Moment (indiadispatch.com) 54

An anonymous reader shares a report: Generative AI offers remarkable efficiency gains while presenting a profound challenge for the global IT services industry -- a sector concentrated in India and central to its export economy.

For decades, Indian technology firms thrived by deploying their engineering talent to serve primarily Western clients. Now they face a critical question. Will AI's productivity dividend translate into revenue growth? Or will fierce competition see these gains competed away through price reductions?

Industry soundings suggest the deflationary dynamic may already be taking hold. JPMorgan's conversations with executives, deal advisors and consultants across India's technology hubs reveal growing concern -- AI-driven efficiencies are fuelling pricing pressures. This threatens to constrain medium-term industry growth to a modest 4-5%, with little prospect of acceleration into fiscal year 2026. This emerging reality challenges the earlier narrative that AI would primarily unlock new revenue streams.

AI

Meet the Journalists Training AI Models for Meta and OpenAI (niemanlab.org) 18

After completing a journalism graduate degree, Carla McCanna took a job "training AI models to optimize accuracy and efficiency," according to an article by Nieman Journalism Lab: Staff jobs are scarce... and the competition for them is daunting. (In 2024, the already beleaguered U.S. news industry cut nearly 5,000 jobs, up 59% from the previous year, according to an annual report from Challenger, Gray & Christmas....) For the past couple of months, McCanna has been working close to full-time for [AI training data company] Outlier, picking up projects on its gig platform at about $35 per hour. Data work has quickly become her primary source of income and a hustle she's recommended [to her journalism program classmates]. "A lot of us are still looking for jobs. Three times I told someone what I do, and they're like, please send it to me," she said. "It's hard right now, and a lot of my colleagues are saying the same thing."

McCanna is just one of many journalists who have been courted by Outlier to take on part-time, remote data work over the past year... Several of them told me they have taken on Outlier projects to supplement their income or to replace their work in journalism entirely, because staff jobs are dwindling or freelance assignments have dried up. Some are early-career journalists like McCanna, but others are reporters with over a decade of experience. One thing they all had in common? Before last year they'd never heard of Outlier or even knew that this type of work existed.

Launched back in 2023, Outlier is a platform owned and managed by Scale AI, a San Francisco-based data annotation company valued at $13.8 billion. It counts among its customers the world's largest AI companies, including OpenAI, Meta, and Microsoft. Outlier, and similar platforms like CrowdGen and Remotasks, use networks of remote human workers to improve the AI models of their clients. Workers are paid by the hour for tasks like labeling training data, drafting test prompts, and grading the factual accuracy and grammar of outputs. Often their work is fed back into an AI model to improve its performance, through a process called reinforcement learning from human feedback (RLHF). This human feedback loop has been core to building models like OpenAI's GPT and Meta's Llama.
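The core of that feedback loop, a human grader picking the better of two model outputs and a reward model being nudged to agree, can be sketched in miniature. Everything below is illustrative (the linear "reward model," the two-number feature vectors, the learning rate), not any vendor's actual pipeline:

```python
# Toy sketch of the RLHF preference step: a human grader picks the better of
# two model outputs, and a simple "reward model" (here, one weight per output
# feature) is updated so the preferred output scores higher.
import math

def score(weights, features):
    """Reward model: a linear score over output features."""
    return sum(w * f for w, f in zip(weights, features))

def preference_update(weights, preferred, rejected, lr=0.1):
    """Bradley-Terry style update: raise the preferred output's margin."""
    margin = score(weights, preferred) - score(weights, rejected)
    p = 1.0 / (1.0 + math.exp(-margin))   # P(grader prefers "preferred")
    grad = 1.0 - p                        # push margin up while p < 1
    return [w + lr * grad * (fp - fr)
            for w, fp, fr in zip(weights, preferred, rejected)]

# Features might stand in for the qualities graders are asked to judge,
# e.g. factual accuracy and grammar.
weights = [0.0, 0.0]
preferred = [0.9, 0.8]   # stronger answer
rejected = [0.2, 0.4]    # weaker answer
for _ in range(100):
    weights = preference_update(weights, preferred, rejected)
```

After enough graded comparisons the reward model ranks the preferred output higher, and that learned signal is what steers the underlying language model.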

Aside from direct recruitment messages, I also found dozens of recent public job postings that underscore this growing trend of hiring journalists for data work... Rather than training a replacement, McCanna sees her data work as an asset, growing her knowledge of AI tools as they continue to embed in the workplace. "Actually doing this work you realize AI models still need us ... I think it's going to be a really, really long time until they can truly write like humans."

AI

After DeepSeek Shock, Alibaba Unveils Rival AI Model That Uses Less Computing Power (venturebeat.com) 59

Alibaba has unveiled a new version of its AI model, called Qwen2.5-Max, claiming benchmark scores that surpass both DeepSeek's recently released R1 model and industry standards like GPT-4o and Claude-3.5-Sonnet. The model achieves these results using a mixture-of-experts architecture that requires significantly less computational power than traditional approaches.

The release comes amid growing concerns about China's AI capabilities, following DeepSeek's R1 model launch last week that sent Nvidia's stock tumbling 17%. Qwen2.5-Max scored 89.4% on the Arena-Hard benchmark and demonstrated strong performance in code generation and mathematical reasoning tasks. Unlike U.S. companies that rely heavily on massive GPU clusters -- OpenAI reportedly uses over 32,000 high-end GPUs for its latest models -- Alibaba's approach focuses on architectural efficiency. The company claims this allows comparable AI performance while reducing infrastructure costs by 40-60% compared to traditional deployments.
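The mixture-of-experts idea behind that efficiency claim is that each token activates only a few "expert" sub-networks rather than the whole model. A minimal sketch, with hypothetical expert counts and toy experts (not Qwen2.5-Max's actual configuration):

```python
# Illustrative mixture-of-experts routing: a gate scores all experts per
# token, but only the top-k actually run, so active compute is a fraction
# of total parameters.
import random

NUM_EXPERTS = 8
TOP_K = 2

def route(gate_scores, k=TOP_K):
    """Pick the indices of the k highest-scoring experts."""
    ranked = sorted(range(len(gate_scores)), key=lambda i: -gate_scores[i])
    return ranked[:k]

def moe_forward(token, experts, gate_scores):
    chosen = route(gate_scores)
    # Only the chosen experts execute; the rest are skipped entirely.
    return sum(experts[i](token) for i in chosen) / len(chosen)

experts = [lambda x, w=i: x * (w + 1) for i in range(NUM_EXPERTS)]
gate_scores = [random.random() for _ in range(NUM_EXPERTS)]
out = moe_forward(3.0, experts, gate_scores)

# With 8 experts and top-2 routing, each token activates 2/8 = 25% of the
# expert parameters -- the source of the claimed compute savings.
active_fraction = TOP_K / NUM_EXPERTS
```

The savings scale with the ratio of routed to total experts, which is why an MoE model can match a dense model's quality while running far fewer parameters per token.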
Power

US Data-Center Power Use Could Nearly Triple By 2028, DOE-Backed Report Says (reuters.com) 37

U.S. data center power demand could nearly triple in the next three years, and consume as much as 12% of the country's electricity, as the industry undergoes an AI transformation, according to an unpublished Department of Energy-backed report seen by Reuters. The publication adds: The Lawrence Berkeley National Laboratory report, which is expected to be released on Friday, comes as the U.S. power industry and government agencies attempt to understand how the sudden rise of Big Tech's data-center demand will affect electrical grids, power bills and the climate.

By 2028, data-center power demand could reach between 74 and 132 gigawatts, or between 6.7% and 12% of total U.S. electricity consumption, according to the Berkeley Lab report. The industry standard-setting report included ranges that depended partly on the availability and demand for a type of AI chip known as GPUs. Currently, data centers make up a little more than 4% of the country's power load. "This really signals to us where the frontier is in terms of growing energy demand in the U.S.," said Avi Shultz, director of the DOE's Industrial Efficiency and Decarbonization Office.

Power

Can Heat Pumps Still Save the Planet from Climate Change? (msn.com) 310

"One technology critical to fighting climate change is lagging," reports the Washington Post, "thanks to a combination of high interest rates, rising costs, misinformation and the cycle of home construction. Adoption of heat pumps, one of the primary ways to cut emissions from buildings, has slowed in the United States and stalled in Europe, endangering the switch to clean energy.

"Heat pump investment in the United States has dropped by 4 percent in the past two years, even as sales of EVs have almost doubled, according to data from MIT and the Rhodium Group. In 13 European countries, heat pump sales dropped nearly in half in the first half of 2024, putting the European Union off-track for its climate goals." "Many many markets are falling," said Paul Kenny, the director general of the European Heat Pump Association. "It takes time to change people's minds about a heating system." Heat pumps — essentially air conditioners that can also work in reverse, heating a space as well as cooling it — are crucial to making buildings more climate-friendly. Around 60 percent of American homes are still heated with furnaces running on oil, natural gas, or even propane; to cut emissions from homes, all American houses and apartments will need to be powered by electricity...

In the United States, experts point to lags in construction, high interest rates, and general belt-tightening from inflation... [Cora Wyent, director of research for the electrification advocacy group Rewiring America] added, heat pumps are still growing as a share of overall heating systems, gaining ground on gas furnaces. In 2023, heat pumps made up 55 percent of combined heat pump and gas furnace sales, while gas furnaces made up just 45 percent. "Heat pumps are continuing to increase their total market share," she said.

Homeowners may also run into trouble when trying to find contractors to install heat pumps. Barton James, the president and CEO of the Air Conditioning Contractors of America, says many contractors don't have training on how to properly install heat pumps; if they install them incorrectly, the ensuing problems can sour consumers on the technology... In the United States, low gas prices also make the economics of heat pumps more challenging. Gas is around three times cheaper than electricity — while heat pumps make up most of that ground with efficiency, they aren't the most cost-effective option for every household.
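The economics described above can be put in rough numbers: if gas costs about a third as much as electricity per unit of energy, a heat pump delivering roughly three units of heat per unit of electricity (its coefficient of performance, or COP) nearly closes the gap. The prices and heat load below are assumptions for illustration, not figures from the article:

```python
# Back-of-envelope seasonal operating-cost comparison: gas furnace vs.
# heat pump. All inputs are assumed round numbers for illustration.
def heating_cost(heat_needed_kwh, fuel_price_per_kwh, efficiency):
    """Cost to deliver a given amount of heat at a given efficiency/COP."""
    return heat_needed_kwh * fuel_price_per_kwh / efficiency

HEAT_NEEDED = 10_000   # kWh of heat per season (assumed)
GAS_PRICE = 0.05       # $/kWh of gas (assumed, ~1/3 the price of electricity)
ELEC_PRICE = 0.15      # $/kWh of electricity (assumed)

gas_furnace = heating_cost(HEAT_NEEDED, GAS_PRICE, efficiency=0.95)
heat_pump = heating_cost(HEAT_NEEDED, ELEC_PRICE, efficiency=3.0)  # COP ~3
```

Under these assumptions the two come out within a few percent of each other, which is why the calculus tips on local fuel prices and how well the heat pump's COP holds up in cold weather.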

The Post also spoke to the manager for the carbon-free buildings team at the clean energy think tank RMI. They pointed out that heating systems need to be replaced roughly every 15 years — and the next cycle doesn't start until 2035.

The article concludes that "even with government policies and subsidies, many parts of the move to clean energy will require individual people to make changes to their lives. According to the International Energy Agency, the number of heat pumps will have to triple by 2030 to stay on track with climate goals. The only way to do that, experts say, is if incentives, personal beliefs, and technology all align."
Graphics

The Future of Halo Is Being Built With Unreal Engine 5 (theverge.com) 21

Along with 343 Industries now becoming Halo Studios, future Halo games will be developed using Unreal Engine 5. The Verge's Tom Warren reports: Halo moving to Unreal Engine 5 is being positioned as the first step of a transformation for Halo Studios to change its technology, structure, processes, and even culture. "We're not just going to try to improve the efficiency of development, but change the recipe of how we make Halo games," says Pierre Hintze, studio head at Halo Studios. The team building Halo will move from the studio's Slipspace Engine to Unreal, after the proprietary engine it built for Halo Infinite became difficult to use and strained development. Halo Studios has had to dedicate a lot of staff to developing the Slipspace Engine, and parts of it are almost 25 years old.

"One of the primary things we're interested in is growing and expanding our world so players have more to interact with and more to experience," says Chris Matthews, art director at Halo Studios. "Nanite and Lumen [Unreal's rendering and lighting technologies] offer us an opportunity to do that in a way that the industry hasn't seen before. As artists, it's incredibly exciting to do that work." Halo Studios isn't committing to any release dates or new Halo game announcements just yet, but the team has been building some examples of Halo running in Unreal. Dubbed Project Foundry, the work is "neither a game nor a tech demo," but more of a research, development, and training tool. It's also the foundation for how the studio is changing up the way it builds Halo games.

Project Foundry has been built as if it were a shipping game so that much of it can appear in Halo games in the future. "It's fair to say that our intent is that the majority of what we showcased in Foundry is expected to be in projects which we are building, or future projects," says Hintze. Project Foundry includes more detailed landscapes for Halo biomes, as well as foliage levels we haven't seen in Halo games in the past. Master Chief's armor has even been remodeled in this footage [...]. Halo Studios is now working on multiple Halo games, while the Slipspace Engine will continue to power Halo Infinite. "We had a disproportionate focus on trying to create the conditions to be successful in servicing Halo Infinite," says Hintze. "[But switching to Unreal] allows us to put all the focus on making multiple new experiences at the highest quality possible."

Wireless Networking

Bluetooth Upgrade Boosts Precision Tracking and Device Efficiency 55

The Bluetooth Special Interest Group (SIG) has released version 6.0 of the Bluetooth Core Specification, introducing several new features and enhancements. The update includes Bluetooth Channel Sounding, which brings true distance awareness to devices, potentially improving "Find My" solutions and digital key security.
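The "true distance awareness" of Channel Sounding rests on measuring how long signals take to travel between devices. The spec combines round-trip timing with phase-based ranging; the sketch below shows only the timing half, with made-up numbers:

```python
# Toy sketch of distance-from-timing, the idea behind Bluetooth Channel
# Sounding: measure the round trip, subtract the peer's known reply delay,
# halve it, and multiply by the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_rtt(t_round_s, t_reply_s):
    """Estimate distance from round-trip time minus the peer's reply delay."""
    time_of_flight = (t_round_s - t_reply_s) / 2
    return SPEED_OF_LIGHT * time_of_flight

# A peer 15 m away: the signal takes ~50 ns each way, dwarfed by an assumed
# 150-microsecond reply delay -- which is why that delay must be known and
# subtracted precisely.
tof = 15.0 / SPEED_OF_LIGHT
estimate = distance_from_rtt(2 * tof + 150e-6, 150e-6)
```

Because light covers 15 m in about 50 nanoseconds, tiny timing errors translate into meters of distance error, which is why the specification pairs timing with phase-based measurements for precision.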

Other additions include decision-based advertising filtering to improve scanning efficiency, and a Monitoring Advertisers feature to inform devices when other Bluetooth units move in and out of range. The specification also enhances the Isochronous Adaptation Layer to reduce latency in certain use cases. Version 6.0 expands the Link Layer Extended Feature Set to support a larger number of features, reflecting Bluetooth LE's growing sophistication. Additionally, it introduces negotiable frame spacing in connections and connected isochronous streams, moving away from the fixed 150 µs value of previous versions.

Slashdot Top Deals