Earth

Water on Earth May Not Have Originated from an Asteroid Impact, Study Finds (discovermagazine.com) 29

Discover magazine reports that a team of researchers has produced evidence that the ancient building blocks for water have been here on Earth "since early in the planet's history, according to a study published in the journal Icarus." Pinpointing when and where Earth's hydrogen [originated] is an essential key to understanding how life arose on the planet. Without hydrogen, there's no water, and without water, life can't exist here. Ironically, researchers turned to a meteorite containing hydrogen to prove that such foreign bodies did not provide the H2 ingredient of water's H2O recipe. They examined a rare type of meteorite — known as an enstatite chondrite — that was built similarly to early Earth 4.5 billion years ago, and the team discovered hydrogen present in its chemical makeup. The logic is that if this material, which resembles early Earth's composition, can contain hydrogen, so too could the young planet....

Since the proto-Earth was made of material similar to enstatite chondrites, by the time the immature planet had grown large enough to be struck by asteroids, it would have already stashed enough hydrogen to explain Earth's present-day water supply. Although this study likely won't resolve the debate over Earth's original water source, it tilts the table toward an internal source rather than an external one. "We now think that the material that built our planet — which we can study using these rare meteorites — was far richer in hydrogen than we thought previously," James Bryson, an Oxford professor and an author of the paper, said in a press release. "This finding supports the idea that the formation of water on Earth was a natural process, rather than a fluke of hydrated asteroids bombarding our planet after it formed."

Data Storage

China Develops Flash Memory 10,000x Faster With 400-Picosecond Speed (interestingengineering.com) 91

Longtime Slashdot reader hackingbear shares a report from Interesting Engineering: A research team at Fudan University in Shanghai, China has built the fastest semiconductor storage device ever reported, a nonvolatile flash memory dubbed "PoX" that programs a single bit in 400 picoseconds (0.0000000004 s) -- roughly 2.5 billion operations per second. Conventional static and dynamic RAM (SRAM, DRAM) write data in 1-10 nanoseconds but lose everything when power is cut, while current flash chips typically need microseconds to milliseconds per write -- far too slow for modern AI accelerators that shunt terabytes of parameters in real time.
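As a sanity check on the headline rate, the per-bit program time converts directly into a sequential write rate (assuming, for simplicity, that writes happen one at a time with no parallelism across cells):

```python
# Convert a 400-picosecond per-bit program time into a sequential
# write rate, ignoring any parallelism across cells.
write_time_s = 400e-12                 # 400 picoseconds per bit
ops_per_second = 1 / write_time_s
print(f"{ops_per_second:.2e} ops/s")   # 2.50e+09, i.e. about 2.5 billion
```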

The Fudan group, led by Prof. Zhou Peng at the State Key Laboratory of Integrated Chips and Systems, re-engineered flash physics by replacing silicon channels with two-dimensional Dirac graphene and exploiting its ballistic charge transport. Combining ultralow energy with picosecond write speeds could eliminate separate high-speed SRAM caches and remove the longstanding memory bottleneck in AI inference and training hardware, where data shuttling, not arithmetic, now dominates power budgets. The team [which is now scaling the cell architecture and pursuing array-level demonstrations] did not disclose endurance figures or fabrication yield, but the graphene channel suggests compatibility with existing 2D-material processes that global fabs are already exploring.
The result is published in the journal Nature.
Google

Federal Judge Declares Google's Digital Ad Network Is an Illegal Monopoly (apnews.com) 47

Longtime Slashdot reader schwit1 shares a report from the Associated Press: Google has been branded an abusive monopolist by a federal judge for the second time in less than a year, this time for illegally exploiting some of its online marketing technology to boost the profits fueling an internet empire currently worth $1.8 trillion. The ruling issued Thursday by U.S. District Judge Leonie Brinkema in Virginia comes on the heels of a separate decision in August that concluded Google's namesake search engine has been illegally leveraging its dominance to stifle competition and innovation. [...] The next step in the latest case is a penalty phase that will likely begin late this year or early next year. The same so-called remedy hearings in the search monopoly case are scheduled to begin Monday in Washington D.C., where Justice Department lawyers will try to convince U.S. District Judge Amit Mehta to impose a sweeping punishment that includes a proposed requirement for Google to sell its Chrome web browser.

Brinkema's 115-page decision centers on the marketing machine that Google has spent the past 17 years building around its search engine and other widely used products and services, including its Chrome browser, YouTube video site and digital maps. The system was largely built around a series of acquisitions that started with Google's $3.2 billion purchase of online ad specialist DoubleClick in 2008. U.S. regulators approved the deals at the time they were made before realizing that they had given the Mountain View, California, company a platform to manipulate the prices in an ecosystem that a wide range of websites depend on for revenue and provides a vital marketing connection to consumers.

The Justice Department lawyers argued that Google built and maintained dominant market positions in a technology trifecta used by website publishers to sell ad space on their webpages, as well as the technology that advertisers use to get their ads in front of consumers, and the ad exchanges that conduct automated auctions in fractions of a second to match buyer and seller. After evaluating the evidence presented during a lengthy trial that concluded just before Thanksgiving last year, Brinkema reached a decision that rejected the Justice Department's assertions that Google has been mistreating advertisers while concluding the company has been abusing its power to stifle competition to the detriment of online publishers forced to rely on its network for revenue.

"For over a decade, Google has tied its publisher ad server and ad exchange together through contractual policies and technological integration, which enabled the company to establish and protect its monopoly power in these two markets," Brinkema wrote. "Google further entrenched its monopoly power by imposing anticompetitive policies on its customers and eliminating desirable product features." Despite that rebuke, Brinkema also concluded that Google didn't break the law when it snapped up DoubleClick, nor when it followed up that deal a few years later by buying another service, AdMeld. The Justice Department "failed to show that the DoubleClick and AdMeld acquisitions were anticompetitive," Brinkema wrote. "Although these acquisitions helped Google gain monopoly power in two adjacent ad tech markets, they are insufficient, when viewed in isolation, to prove that Google acquired or maintained this monopoly power through exclusionary practices." That finding may help Google fight off any attempt to force it to sell its advertising technology to stop its monopolistic behavior.

EU

Meta Starts Using Data From EU Users To Train Its AI Models (engadget.com) 29

Meta said the company plans to start using data collected from its users in the European Union to train its AI systems. Engadget reports: Starting this week, the tech giant will begin notifying Europeans through email and its family of apps, with the message set to include an explanation of the kind of data it plans to use as part of the training. Additionally, the notification will link out to a form users can complete to opt out of the process. "We have made this objection form easy to find, read, and use, and we'll honor all objection forms we have already received, as well as newly submitted ones," says Meta.

The company notes it will only use data it collects from public posts and Meta AI interactions for training purposes. It won't use private messages in its training sets, nor any interactions, public or otherwise, made by users under the age of 18. As for why the company wants to start using EU data now, it claims the information will allow it to fine-tune its future models to better serve Europeans.
"We believe we have a responsibility to build AI that's not just available to Europeans, but is actually built for them. That's why it's so important for our generative AI models to be trained on a variety of data so they can understand the incredible and diverse nuances and complexities that make up European communities," Meta states.

"That means everything from dialects and colloquialisms, to hyper-local knowledge and the distinct ways different countries use humor and sarcasm on our products. This is particularly important as AI models become more advanced with multi-modal functionality, which spans text, voice, video, and imagery."
United States

Nvidia To Make AI Supercomputers in US for First Time (nvidia.com) 37

Nvidia has announced plans to manufacture AI supercomputers entirely within the United States, commissioning over 1 million square feet of manufacturing space across Arizona and Texas. Production of Blackwell chips has begun at TSMC's Phoenix facilities, while supercomputer assembly will occur at new Foxconn and Wistron plants in Houston and Dallas respectively.

"The engines of the world's AI infrastructure are being built in the United States for the first time," said Jensen Huang, Nvidia's founder and CEO. "Adding American manufacturing helps us better meet the incredible and growing demand for AI chips and supercomputers, strengthens our supply chain and boosts our resiliency."

The company will deploy its own AI, robotics, and digital twin technologies in these facilities, using Nvidia Omniverse to create digital twins of factories and Isaac GR00T to build manufacturing automation robots. Nvidia projects an ambitious $500 billion in domestic AI infrastructure production over the next four years, with manufacturing expected to create hundreds of thousands of jobs.
Transportation

An Electric Racecar Drives Upside Down (jalopnik.com) 57

Formula One cars, the world's fastest racecars, need to grip the track for speed and safety on the curves — leading engineers to design cars that create downforce. And racing fans are even told that "a Formula 1 racecar generates enough downforce above a certain speed that it could theoretically drive upside down," writes the automotive site Jalopnik.

"McMurtry Automotive turned this theory into reality after having its Spéirling hypercar complete the impressive feat..." Admittedly, the Spéirling's success can be solely attributed to its proprietary 'Downforce-on-Demand' fan system that produces 4,400 pounds of downforce at the push of a button... For those looking to do the math, Spéirling weighs 2,200 pounds. With the stopped car's fan whirling at 23,000 rpm, the rig was rotated to invert the road deck... Then, the hypercar rolled forward a few feet before stopping while inverted. The rig rotated the road deck back down, and the Spéirling drove off like nothing happened.
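The article invites the math, and it is a simple force balance: while inverted, the fan's downforce must exceed the car's weight for the tires to stay loaded against the road deck. Using the figures quoted above:

```python
# Force balance for inverted driving, using the article's figures.
downforce_lb = 4400   # 'Downforce-on-Demand' fan output
weight_lb = 2200      # Spéirling curb weight
net_load_lb = downforce_lb - weight_lb   # force pressing the tires to the deck
print(net_load_lb)    # 2200 -- a 2:1 margin over the car's own weight
```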

The McMurtry Spéirling, as a 1,000-hp twin-motor electric hypercar, didn't have to clear the other hurdles that an F1 car would face to drive upside down. Dry-sump combustion engines aren't designed to run inverted and would eventually fail catastrophically. Oil wouldn't be able to cycle through and keep the engine lubricated.

The car is "an electric monster purpose-built to destroy track records," Jalopnik wrote in 2022 when the car shaved more than two seconds off a long-standing record. The "Downforce-on-Demand" feature gives it tremendous acceleration — in nine seconds it can go from 0 to 186.4 mph (300 km/h), according to Jalopnik.

"McMurtry is working towards finalizing a production version of its hypercar, called the Spéirling PURE. Only 100 will be produced."
Facebook

Facebook Whistleblower Alleges Meta's AI Model Llama Was Used to Help DeepSeek (cbsnews.com) 10

A former Facebook employee/whistleblower alleges Meta's AI model Llama was used to help DeepSeek.

The whistleblower — former Facebook director of global policy Sarah Wynn-Williams — testified before U.S. Senators on Wednesday. CBS News found this earlier response from Meta: In a statement last year on Llama, Meta spokesperson Andy Stone wrote, "The alleged role of a single and outdated version of an American open-source model is irrelevant when we know China is already investing over $1 trillion to surpass the US technologically, and Chinese tech companies are releasing their own open AI models as fast, or faster, than US ones."

Wynn-Williams encouraged senators to continue investigating Meta's role in the development of artificial intelligence in China, as they continue their probe into the social media company founded by Zuckerberg. "The greatest trick Mark Zuckerberg ever pulled was wrapping the American flag around himself and calling himself a patriot and saying he didn't offer services in China, while he spent the last decade building an $18 billion business there," she said.

The testimony also left some of the lawmakers skeptical of Zuckerberg's commitment to free speech after the whistleblower also alleged Facebook worked "hand in glove" with the Chinese government to censor its platforms: In her almost seven years with the company, Wynn-Williams told the panel she witnessed the company provide "custom built censorship tools" for the Chinese Communist Party. She said a Chinese dissident living in the United States was removed from Facebook in 2017 after pressure from Chinese officials. Facebook said at the time it took action against the regime critic, Guo Wengui, for sharing someone else's personal information. Wynn-Williams described the use of a "virality counter" that flagged posts with over 10,000 views for review by a "chief editor," which Democratic Sen. Richard Blumenthal of Connecticut called "an Orwellian censor." These "virality counters" were used not only in Mainland China, but also in Hong Kong and Taiwan, according to Wynn-Williams's testimony.

Wynn-Williams also told senators Chinese officials could "potentially access" the data of American users.

United Kingdom

Gas Boiler Fittings Outnumbered Heat Pumps By 15 To One in UK Last Year - Report (theguardian.com) 132

An anonymous reader shares a report: Gas boiler fittings outnumbered new heat pump installations by more than 15 to one last year, and only one in eight new homes were equipped with the low-carbon alternative despite the government's clean energy targets.

Poorer households are also being shut out of the heat pump market as the grants available are inadequate and should be increased, according to a report by the Resolution Foundation thinktank. The UK has the slowest introduction of heat pumps in Europe: fewer than 100,000 were fitted last year, compared with 1.5m gas boilers. Most of the boilers were replacements for existing units, but new houses are still being built with gas as standard -- only 13% of new homes came with heat pumps last year.

If the government is to meet its net zero targets, switching people to heat pumps will be essential: about 450,000 households will need to install them each year by 2030. But the grant available through the boiler upgrade scheme -- $9,700 in England and Wales -- still leaves homeowners paying about $7,000 on average.

AI

The AI Therapist Can See You Now (npr.org) 115

New research suggests that, given the right kind of training, AI bots can deliver mental health therapy as effectively as human clinicians -- or more so. From a report: The recent study, published in the New England Journal of Medicine, shows results from the first randomized clinical trial for AI therapy. Researchers from Dartmouth College built the bot as a way of taking a new approach to a longstanding problem: The U.S. continues to grapple with an acute shortage of mental health providers. "I think one of the things that doesn't scale well is humans," says Nick Jacobson, a clinical psychologist who was part of this research team. For every 340 people in the U.S., there is just one mental health clinician, according to some estimates.

While many AI bots already on the market claim to offer mental health care, some have dubious results or have even led people to self-harm. More than five years ago, Jacobson and his colleagues began training their AI bot in clinical best practices. The project, says Jacobson, involved much trial and error before it led to quality outcomes. "The effects that we see strongly mirror what you would see in the best evidence-based trials of psychotherapy," says Jacobson. He says these results were comparable to "studies with folks given a gold standard dose of the best treatment we have available."

Google

Samsung and Google Partner To Launch Ballie Home Robot with Built-in Projector (engadget.com) 25

Samsung Electronics and Google Cloud are jointly entering the consumer robotics market with Ballie, a yellow, soccer-ball-shaped robot equipped with a video projector and powered by Google's Gemini AI models. First previewed in 2020, the long-delayed device will finally launch this summer in the US and South Korea. The mobile companion uses small wheels to navigate homes autonomously and integrates with Samsung's SmartThings platform to control smart home devices.

Running on Samsung's Tizen operating system, Ballie can manage calendars, answer questions, handle phone calls, and project video content from services including YouTube and Netflix. Samsung EVP Jay Kim described it as a "completely new Ballie" compared to the 2020 version, with Google Cloud integration being the most significant change. The robot leverages Gemini for understanding commands, searching the web, and processing visual data for navigation, while using Samsung's AI models for accessing personal information.
China

China Launches GPMI, a Powerful Alternative To HDMI and DisplayPort (tomshardware.com) 136

AmiMoJo writes: The Shenzhen 8K UHD Video Industry Cooperation Alliance, a group made up of more than 50 Chinese companies, just released a new wired media communication standard called the General Purpose Media Interface or GPMI. This standard was developed to support 8K and reduce the number of cables required to stream data and power from one device to another. According to HKEPC, the GPMI cable comes in two flavors -- a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard.

Because 8K has four times the number of pixels of 4K and 16 times more pixels than 1080p resolution, it means that GPMI is built to carry a lot more data than other current standards. There are other variables that can impact required bandwidth, of course, such as color depth and refresh rate. The GPMI Type-C connector is set to have a maximum bandwidth of 96 Gbps and deliver 240 watts of power. This is more than double the 40 Gbps data limit of USB4 and Thunderbolt 4, allowing you to transmit more data on the cable. However, it has the same power limit as that of the latest USB Type-C connector using the Extended Power Range (EPR) standard. GPMI Type-B beats all other cables, though, with its maximum bandwidth of 192 Gbps and power delivery of up to 480 watts.
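For scale, a rough uncompressed-bandwidth estimate shows why 96 Gbps is a meaningful threshold. The frame rate and color depth below are illustrative assumptions, not figures from the GPMI spec:

```python
# Uncompressed video bandwidth for 8K at an assumed 60 Hz with
# 10-bit RGB (30 bits per pixel), ignoring blanking and overhead.
width, height = 7680, 4320             # 8K UHD
fps, bits_per_pixel = 60, 30
gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbps")              # 59.7 -- within GPMI Type-C's 96 Gbps

# The pixel-count ratios quoted above:
assert width * height == 4 * (3840 * 2160)    # 4x the pixels of 4K
assert width * height == 16 * (1920 * 1080)   # 16x the pixels of 1080p
```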

Mars

Could We Reach Mars Faster With Nuclear Fusion-Powered Rockets? (cnn.com) 81

Nuclear fusion — which releases four times the energy of fission — could theoretically happen sooner in space than on Earth, reports CNN.

"And it could help spacecraft achieve speeds of up to 500,000 miles (805,000 kilometers) per hour — more than the fastest object ever built..." With funding from the UK Space Agency, British startup Pulsar Fusion has unveiled Sunbird, a space rocket concept designed to meet spacecraft in orbit, attach to them, and carry them to their destination at breakneck speed using nuclear fusion... For now, Sunbird is in the very early stages of construction and it has exceptional engineering challenges to overcome, but Pulsar says it hopes to achieve fusion in orbit for the first time in 2027. [Pulsar's founder/CEO says the first functional Sunbird would be ready four to five years later.]

If the rocket ever becomes operational, it could one day cut the journey time of a potential mission to Mars in half.

CNN says the proposed Sunbird process would use helium-3 — which may be abundant on the Moon — to generate protons which "can be used as a 'nuclear exhaust' to provide propulsion". (And without generating any dangerous radioactive material.) "It's very unnatural to do fusion on Earth," says Richard Dinan, founder and CEO of Pulsar. "Fusion doesn't want to work in an atmosphere. Space is a far more logical, sensible place to do fusion, because that's where it wants to happen anyway...."

Sunbirds would operate similarly to city bikes at docking stations, according to Dinan: "We launch them into space, and we would have a charging station where they could sit and then meet your ship," he says. "You turn off your inefficient combustion engines, and use nuclear fusion for the greater part of your journey. Ideally, you'd have a station somewhere near Mars, and you'd have a station on low Earth orbit, and the (Sunbirds) would just go back and forth...." Initially, the Sunbirds will be offered for shuttling satellites in orbit, but their true potential would come into play with interplanetary missions. The company illustrates a few examples of the missions that Sunbird could unlock, such as delivering up to 2,000 kilograms (4,400 pounds) of cargo to Mars in under six months, deploying probes to Jupiter or Saturn in two to four years (NASA's Europa Clipper, launched in 2024 towards one of Jupiter's moons, will arrive after 5.5 years), and an asteroid mining mission that would complete a round trip to a near-Earth asteroid in one to two years instead of three.

Other companies are working on nuclear fusion engines for space propulsion, including Pasadena-based Helicity Space, which received investment from aerospace giant Lockheed Martin in 2024. San Diego-based General Atomics and NASA are working on another type of nuclear reactor — based on fission rather than fusion — which they plan to test in space in 2027.

AI

Two Teenagers Built 'Cal AI', a Photo Calorie App With Over a Million Users (techcrunch.com) 24

An anonymous reader quotes a report from TechCrunch: In a world filled with "vibe coding," Zach Yadegari, teen founder of Cal AI, stands in ironic, old-fashioned contrast. Ironic because Yadegari and his co-founder, Henry Langmack, are both just 18 years old and still in high school. Yet their story, so far, is a classic. Launched in May, Cal AI has generated over 5 million downloads in eight months, Yadegari says. Better still, he tells TechCrunch that the customer retention rate is over 30% and that the app generated over $2 million in revenue last month. [...]

The concept is simple: Take a picture of the food you are about to consume, and let the app log calories and macros for you. It's not a unique idea. For instance, the big dog in calorie counting, MyFitnessPal, has its Meal Scan feature. Then there are apps like SnapCalorie, which was released in 2023 and created by the founder of Google Lens. Cal AI's advantage, perhaps, is that it was built wholly in the age of large image models. It uses models from Anthropic and OpenAI, along with retrieval-augmented generation (RAG), to improve accuracy, and is trained on open source food calorie and image databases from sites like GitHub.

"We have found that different models are better with different foods," Yadegari tells TechCrunch. Along the way, the founders coded through technical problems like recognizing ingredients from food packages or in jumbled bowls. The result is an app that the creators say is 90% accurate, which appears to be good enough for many dieters.
The report says Yadegari began mastering Python and C# in middle school and went on to build his first business in ninth grade -- a website called Totally Science that gave students access to unblocked games (cleverly named to evade school filters). He sold the company at age 16 to FreezeNova for $100,000.

Following the sale, Yadegari immersed himself in the startup scene, watching Y Combinator videos and networking on X, where he met co-founder Blake Anderson, known for creating ChatGPT-powered apps like RizzGPT. Together, they launched Cal AI and moved to a hacker house in San Francisco to develop their prototype.
Security

Google Launches Sec-Gemini v1 AI Model To Improve Cybersecurity Defense 2

Google has introduced Sec-Gemini v1, an experimental AI model built on its Gemini platform and tailored for cybersecurity. BetaNews reports: Sec-Gemini v1 is built on top of Gemini, but it's not just some repackaged chatbot. Actually, it has been tailored with security in mind, pulling in fresh data from sources like Google Threat Intelligence, the OSV vulnerability database, and Mandiant's threat reports. This gives it the ability to help with root cause analysis, threat identification, and vulnerability triage.

Google says the model performs better than others on two well-known benchmarks. On CTI-MCQ, which measures how well models understand threat intelligence, it scores at least 11 percent higher than competitors. On CTI-Root Cause Mapping, it edges out rivals by at least 10.5 percent. Benchmarks only tell part of the story, but those numbers suggest it's doing something right.
Access is currently limited to select researchers and professionals for early testing. If you meet those criteria, you can request access here.
Power

Open-Source Tool Designed To Throttle PC and Server Performance Based On Electricity Pricing (tomshardware.com) 56

Robotics and machine learning engineer Naveen Kul developed WattWise, a lightweight open-source CLI tool that monitors power usage via smart plugs and throttles system performance based on electricity pricing and peak hours. Tom's Hardware reports: The simple program, called WattWise, came about when Naveen built a dual-socket EPYC workstation with plans to add four GPUs. It's a power-intensive setup, so he wanted a way to monitor its power consumption using a Kasa smart plug. The enthusiast has released the monitoring portion of the project to the public now, but the portion that manages clocks and power will be released later. Unfortunately, the Kasa Smart app and the Home Assistant dashboard were inconvenient and couldn't do everything he desired. He already had a terminal window running monitoring tools like htop, nvtop, and nload, and decided to take matters into his own hands rather than dealing with yet another app.

Naveen built a terminal-based UI that shows power consumption data through Home Assistant and the TP-Link integration. The app monitors real-time power use, showing wattage and current, as well as providing historical consumption charts. More importantly, it is designed to automatically throttle CPU and GPU performance. Naveen's power provider uses Time-of-Use (ToU) pricing, so using a lot of power during peak hours can cost significantly more. The workstation can draw as much as 1400 watts at full load, but by reducing the CPU frequency from 3.7 GHz to 1.5 GHz, he's able to reduce consumption by about 225 watts. (No mention is made of GPU throttling, which could potentially allow for even higher power savings with a quad-GPU setup.)

Results will vary based on the hardware being used, naturally, and servers can pull far more power than a typical desktop -- even one designed and used for gaming. WattWise optimizes the system's clock speed based on the current system load, power consumption as reported by the smart plug, and the time -- with the latter factoring in peak pricing. From there, it uses a Proportional-Integral (PI) controller to manage the power and adapts system parameters based on the three variables.
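A minimal sketch of that control loop, as an illustration of the idea rather than WattWise's actual code (the gains and the target wattages here are assumptions; only the 1.5-3.7 GHz band and the ~225 W saving come from the article):

```python
class PIPowerGovernor:
    """Proportional-integral loop that steers CPU frequency so that
    measured wall power tracks a target that drops in peak-price hours."""

    def __init__(self, kp=0.5, ki=0.1, f_min=1.5, f_max=3.7):
        self.kp, self.ki = kp, ki              # illustrative gains
        self.f_min, self.f_max = f_min, f_max  # GHz bounds from the article
        self.integral = 0.0
        self.freq = f_max

    def step(self, measured_watts, target_watts):
        error = target_watts - measured_watts  # positive means headroom
        self.integral += error
        # Scale watts of error into a small GHz adjustment, then clamp.
        delta_ghz = (self.kp * error + self.ki * self.integral) / 1000
        self.freq = min(self.f_max, max(self.f_min, self.freq + delta_ghz))
        return self.freq

gov = PIPowerGovernor()
# Peak hours: target 1175 W (the 1400 W full load minus the ~225 W saving).
freq = gov.step(measured_watts=1400, target_watts=1175)
print(f"{freq:.2f} GHz")  # backs off from 3.7 GHz toward the 1.5 GHz floor
```

Each tick would compare the smart plug's wattage reading against the current price-tier target and nudge the CPU frequency within bounds; repeated over-target readings accumulate in the integral term and push the clock down to the floor.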
A blog post with more information is available here.

WattWise is also available on GitHub.
Nintendo

Nintendo Switch 2 Arrives on June 5, Priced at $450 (engadget.com) 46

Nintendo's Switch 2, priced at $450, launches June 5 with a 7.9-inch LCD screen offering 1080p resolution, HDR support, and 120Hz refresh capability. The device maintains the original Switch's 13.99mm thickness while increasing internal storage to 256GB from the previous 32GB.

The console outputs at 4K/60fps when docked, with the dock featuring a built-in cooling fan. Two USB-C ports handle accessories and charging. The system supports microSD Express cards but not original Switch microSD cards. Joy-Con controllers now attach via magnets instead of sliding rails and feature mouse-like functionality with compatible games. Both Joy-Cons and the new Pro Controller include a "C" button that activates a chat menu for the new "Game Chat" feature.

Game cards for Switch 2 will be red rather than black. The system maintains backward compatibility with original Switch cartridges and introduces a "Game Share" feature for local game sharing between consoles.
Power

Nuclear Is Now 'Clean Energy' In Colorado (cpr.org) 135

With the signing of HB25-1040 on Monday, Colorado now defines nuclear as a "clean energy resource" since it doesn't release large amounts of climate-warming emissions. "The category was previously reserved for renewables like wind, solar and geothermal, which don't carry the radioactive stigma that's hobbled fission power plants following disasters like Chernobyl and Fukushima," notes Colorado Public Radio. From the report: In an emailed statement, Ally Sullivan, a spokesperson for the governor's office, said the law doesn't advance any specific nuclear energy project, and no utility has proposed building a nuclear power plant in Colorado. It does, however, allow nuclear energy to potentially serve as one piece of the state's plan to tackle climate change. "If nuclear energy becomes sufficiently cost-competitive, it could potentially become part of Colorado's clean energy future. However, it must be conducted safely, without harming communities, depleting other natural resources or replacing other clean energy sources," Sullivan said.

By redefining nuclear energy as "clean," the law would let future fission-based power plants obtain local grants previously reserved for other carbon-free energy sources, and it would allow those projects to contribute to Colorado's renewable energy goals. It also aligns state law with a push to reshape public opinion of nuclear energy. Nuclear energy proponents promise new reactor designs are smaller and safer than hulking power plants built in the 20th century. By embracing those systems, bill supporters claimed Colorado could meet rising energy demand without abandoning its ambitious climate goals.

Mozilla

Mozilla To Launch 'Thunderbird Pro' Paid Services (techspot.com) 36

Mozilla plans to introduce a suite of paid professional services for its open-source Thunderbird email client, transforming the application into a comprehensive communication platform. Dubbed "Thunderbird Pro," the package aims to compete with established ecosystems like Gmail and Office 365 while maintaining Mozilla's commitment to open-source software.

The Pro tier will include four core services: Thunderbird Appointment for streamlined scheduling, Thunderbird Send for file sharing (reviving the discontinued Firefox Send), Thunderbird Assist offering AI capabilities powered by Flower AI, and Thundermail, a revamped email client built on Stalwart's open-source stack. Initially, Thunderbird Pro will be available free to "consistent community contributors," with paid access for other users.

Mozilla Managing Director Ryan Sipes indicated the company may consider limited free tiers once the service establishes a sustainable user base. This initiative follows Mozilla's 2023 announcement about "remaking" Thunderbird's architecture to modernize its aging codebase, addressing user losses to more feature-rich competitors.

Programming

'There is No Vibe Engineering' 121

Software engineer Sergey Tselovalnikov weighs in on the new hype: The term caught on and Twitter quickly flooded with posts about how AI has radically transformed coding and will soon replace all software engineers. While AI undeniably impacts the way we write code, it hasn't fundamentally changed our role as engineers. Allow me to explain.

[...] Vibe coding is interacting with the codebase via prompts. As the implementation is hidden from the "vibe coder", all the engineering concerns will inevitably get ignored. Many of the concerns are hard to express in a prompt, and many of them are hard to verify by only inspecting the final artifact. Historically, all engineering practices have tried to shift all those concerns left -- to the earlier stages of development when they're cheap to address. Yet with vibe coding, they're shifted very far to the right -- when addressing them is expensive.

The question of whether an AI system can perform the complete engineering cycle and build and evolve software the same way a human can remains open. However, there are no signs of it being able to do so at this point, and if it one day happens, it won't have anything to do with vibe coding -- at least the way it's defined today.

[...] It is possible that there'll be a future where software is built from vibe-coded blocks, but the work of designing software able to evolve and scale doesn't go away. That's not vibe engineering -- that's just engineering, even if the coding part of it will look a bit different.

Microsoft

As Microsoft Turns 50, Four Employees Remember Its Early Days (seattletimes.com) 38

"Microsoft built things. It broke things."

That's how the Seattle Times kicks off a series of articles celebrating Microsoft's 50th anniversary — adding that Microsoft also gave some people "a lucrative retirement early in their lives, and their own stories to tell."

What did they remember from Microsoft's earliest days? Scott Oki joined Microsoft as employee no. 121. The company was small; Gates was hands-on, and hard to please. "One of his favorite phrases was 'that's the stupidest thing I've ever heard,'" Oki says. "He didn't use that on me, so I feel pretty good about that."

Another, kinder phrase that pops to Oki's mind when discussing the international division he founded at Microsoft is "bringing home the bacon." An obsession with rapid revenue growth permeated Microsoft in those early days. Oki was about three weeks into the job as marketing manager when he presented a global expansion plan to Gates. "Had I done business internationally before? No," Oki said. "Do I speak a language other than English? No." But Gates gave Oki a $1 million budget to found the international division and sell Microsoft products overseas.

He established subsidiaries in the most important markets at the time: Japan, the United Kingdom, Germany and France. And, because he had a few bucks left over, Australia. "Of the initial subsidiaries we started, every single one of them was profitable in its first year," he says...

Oki left Microsoft on March 1, 1992, 10 years to the day after he was hired.

Other memories shared by early Microsoft employees:
  • One recent graduate remembered her parents in Spokane saying, "I think that's Mary and Bill Gates' son's company. If that kid is anything like those two, that is going to be a great company." She got her first job at Microsoft in 1992 — and 33 years later, she's a senior director at Microsoft Philanthropies.
  • The Times also interviewed one of Microsoft's first lawyers, who remembers that "The day the U.S. government sued Microsoft ... that was a tough day for me. It kind of turned my world upside down for about the next eight years."
  • Microsoft senior VP Brad Chase remembers negotiating with the Rolling Stones for the rights to their song "Start Me Up" for the Windows 95 ad campaign. ("Chase is quick to dispel any rumor that Mick Jagger called up Bill Gates and got $12 million. But he won't say how much the company paid.")

    But Chase does tell the Times that Bill Gates "used to say all of the time, 'We're going to bet the company on Windows.' That was a huge bet because Windows, frankly, was a lousy product in its early days."
