Portables (Apple)

Apple's MacBook Neo Makes Repairs Easier, Cheaper Than Other MacBooks (arstechnica.com) 56

Apple's new MacBook Neo is "easier to repair than other modern MacBooks," according to Ars Technica's Andrew Cunningham. It introduces a more repairable internal design that makes components like the battery and keyboard easier and cheaper to replace. An anonymous reader quotes an excerpt from the report: Replacements for pretty much any component in the Neo are simpler and involve fewer steps and tools than in the M5 MacBook Air. That includes the battery, which in the MacBook Air is attached to the chassis with multiple screws and adhesive strips but which in the Neo comes out relatively easily after you get some shielding and flex cables out of the way. But the most significant change in the Neo is that the keyboard is its own separate component. For essentially all modern MacBooks, going back at least as far as the late-2000s unibody aluminum MacBook designs, the keyboard has been integrated into the top part of the laptop case and is extremely difficult, if not impossible, to replace independently.

[...] Apple hasn't yet listed MacBook Neo components in its parts store, but based on the repair prices it has announced, Neo components should cost quite a bit less than those for higher-end MacBooks. An out-of-warranty battery replacement for the Neo will cost $149, down from $199 for current Airs and $229 for current MacBook Pros; fixing accidental screen or external enclosure damage will cost AppleCare+ subscribers $49 for a Neo, down from $99 for other MacBooks.

AMD

AMD Will Bring Its 'Ryzen AI' Processors To Standard Desktop PCs For First Time (arstechnica.com) 27

An anonymous reader quotes a report from Ars Technica: AMD has been selling "Ryzen AI"-branded laptop processors for around a year and a half at this point. In addition to including modern CPU and GPU architectures, these chips attempt to capitalize on the generative AI craze by including neural processing units (NPUs) suitable for running language and image-generation models locally, rather than on some company's server. But so far, AMD's desktop chips have lacked both these higher-performance NPUs and the Ryzen AI label. That changes today, at least a little: AMD is announcing its first Ryzen AI chips for desktops using its AM5 CPU socket. These Ryzen AI 400-series CPUs are direct replacements for the Ryzen 8000G processors, rather than the Ryzen 9000-series, and they combine Zen 5-based CPU cores, RDNA 3.5 GPU cores, and an NPU capable of 50 trillion operations per second (TOPS). This makes them AMD's first desktop chips to qualify for Microsoft's Copilot+ PC label, which enables a handful of unique Windows 11 features like Recall and Click to Do.

The six chips AMD is announcing today -- the 65 W Ryzen AI 7 Pro 450G, Ryzen AI 5 Pro 440G, and Ryzen AI 5 Pro 435G, along with low-power 35 W "GE" variants -- all bear AMD's "Ryzen Pro" branding as well, which means they support a handful of device management capabilities that are important for business PCs managed by IT departments. At this point, it doesn't seem as though AMD will be offering boxed versions to regular consumers; the Ryzen AI desktop chips will appear mainly in business PCs that don't need a dedicated graphics card but still benefit from more robust graphics than AMD offers in regular Ryzen desktop CPUs. Like past G-series Ryzen chips, these are essentially laptop silicon repackaged for desktop systems. They share most of their specs with Ryzen AI 300 laptop processors, despite their Ryzen AI 400-series branding. The two chip generations are extremely similar overall, but the Ryzen AI 400-series laptop CPUs include slightly faster 55 TOPS NPUs.

Businesses

Majority of CEOs Report Zero Payoff From AI Splurge 53

A PwC survey of more than 4,500 CEOs found that over half report no revenue growth or cost savings from their AI investments so far, despite massive spending. Of the 4,454 business leaders surveyed, only 12% saw both lower costs and higher revenue, while 56% saw neither benefit. "26% saw reduced costs, but nearly as many experienced cost increases," adds The Register. From the report: AI adoption remains limited. Even in top use cases like demand generation (22 percent), support services (20 percent), and product development (19 percent), only a minority are deploying AI extensively. Last year, a separate PwC study found that only 14 percent of workers indicated they were using generative AI daily in their work. Despite the CEOs' responses, PwC concludes more investment is required. It claims that "isolated, tactical AI projects" often don't deliver measurable value, and that tangible returns instead come from enterprise-wide deployments consistent with business strategy. [...]

In terms of the broader picture, PwC says it found CEO confidence has hit a five-year low, with only 30 percent optimistic about revenue growth (down from 38 percent last year). PwC attributes the decline to growing geopolitical risk and intensifying cyber threats, as well as uncertainty over the benefits and downsides of AI. Unsurprisingly, concern remains over tariffs as the Trump administration continues its erratic approach to policy, with almost a third of company chiefs saying tariffs are expected to reduce their company's profit margin in the year ahead. In the U.S., 22 percent indicate their corporation is highly or extremely exposed to tariffs. PwC warns that companies avoiding major investments due to geopolitical uncertainty underperform peers by two percentage points in growth and three points in profit margins.

Space

Could We Provide Better Cellphone Service With Fewer, Bigger Satellites? (reuters.com) 37

European satellite operator Eutelsat "plans to launch 440 Airbus-built LEO satellites in the coming years to replenish and expand its constellation," Reuters reported Friday. And last week America's Federal Communications Commission approved SpaceX's request to deploy another 7,500 Starlink satellites, while Starlink "projects it will eventually have a constellation of 34,000 satellites," writes Fast Company, and Amazon's Project Leo "plans to launch more than 3,200 satellites."

Meanwhile "Beijing and some Chinese companies are planning two separate mega-constellations, Guowang and G60 Starlink, totaling nearly 26,000 satellites," and this week the Chinese government "applied for launch permits for 200,000 satellites."

But a small Texas-based company called AST SpaceMobile "believes it can provide better service with fewer than 100 gigantic satellites in space." AST SpaceMobile has developed a direct-to-cell technology that utilizes large satellites called BlueBirds. These machines use thousands of antennas to deliver broadband coverage directly to standard mobile phones, says the company's president, Scott Wisniewski. "This approach is remarkably efficient: We can achieve global coverage with approximately 90 satellites, not thousands or even tens of thousands required by other systems," Wisniewski writes in an email...

The key is its satellites' size and sophistication. AST's first generation of commercial satellites, the BlueBird 1-5, unfolds into a massive 693-square-foot array in space. Today, the company has five operational BlueBird 1-5 satellites in orbit, but its ambitions are much bigger. On December 24, 2025, AST launched the first of its next-generation satellites from India — called Block 2 — and this one broke records. The BlueBird 6 has a surface area of almost 2,400 square feet, making it the largest single satellite in low Earth orbit. The company plans to launch up to 60 more by the end of 2026. "This large surface area is essential for gathering faint signals from standard, unmodified mobile phones on the ground," Wisniewski explains. It is essentially a single, extremely powerful and sensitive cell tower in the sky, capable of serving a huge geographical area...
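Wisniewski's emphasis on surface area tracks with basic antenna physics: for an ideal aperture, gain scales linearly with collecting area (G = η·4πA/λ²). A rough sketch of what the two BlueBird sizes buy, where the cellular frequency and aperture efficiency are illustrative assumptions rather than AST figures:

```python
import math

def aperture_gain_dbi(area_m2: float, freq_hz: float, efficiency: float = 0.6) -> float:
    """Idealized aperture antenna gain G = eta * 4*pi*A / lambda^2, in dBi."""
    wavelength = 3e8 / freq_hz
    gain = efficiency * 4 * math.pi * area_m2 / wavelength ** 2
    return 10 * math.log10(gain)

SQFT_TO_M2 = 0.09290304
freq = 850e6  # a typical low-band cellular frequency (assumption)

for name, sqft in [("BlueBird 1-5 (~693 sq ft)", 693), ("BlueBird 6 (~2,400 sq ft)", 2400)]:
    print(f"{name}: ~{aperture_gain_dbi(sqft * SQFT_TO_M2, freq):.1f} dBi")
```

Under these assumed numbers the jump from 693 to almost 2,400 square feet is worth roughly 5 dB of gain, the same factor-of-3.5 as the area ratio, which is what lets one satellite hear faint, unmodified handsets across a wide footprint.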

To be clear, AST SpaceMobile's approach is not without its own controversies. The sheer size of the company's satellites makes them incredibly bright in the night sky, a significant source of frustration for ground-based astronomers. Astronomer Jonathan McDowell confirms that when it launched in 2022, AST's prototype satellite, BlueWalker 3, became "one of the top 10 brightest objects in the night sky for a while."

"It's a serious issue, and we are working directly with the astronomy community to mitigate our impact," Wisniewski says. The company is exploring solutions like anti-reflective coatings and operational adjustments to minimize the time its satellites are at maximum brightness...

AST SpaceMobile has already proven its technology works, the article points out, with six working satellites now transmitting at typical 5G speeds directly to regular phones.

The Military

Israel Deploys World's First Drone Defense Laser (tomshardware.com) 173

Israel has operationally deployed Iron Beam, a 100,000-watt laser air-defense system capable of shooting down drones, rockets, and mortars at negligible per-shot cost. According to Tom's Hardware, it marks the first real-world deployment of a high-energy laser as part of a modern, multi-layered missile defense network. From the report: The Iron Beam is a short-range line-of-sight laser interceptor that is extremely cheap to run and, therefore, perfectly suited for intercepting low-cost, high-volume threats. According to the official Israeli announcement, Iron Beam systems have "successfully intercepted rockets, mortars, and UAVs."

A complex mix of government, military, scientific, and commercial interests was responsible for the research and development of the Iron Beam laser system. Central to the Iron Beam are "an advanced laser source and a unique electro-optical targeting system, enabling the interception of a wide range of targets at an enhanced operational range, with maximum precision and superior efficiency," boasted the press release by Israel's MoD. Moreover, it works "at a negligible marginal cost, which constitutes the laser system's primary advantage."

We don't get much more by way of technical details, perhaps understandably. However, Rafael Advanced Defense Systems execs heralded the system's "unique adaptive optics technology," in what the company calls "the world's most advanced laser-based system for intercepting aerial threats." Its operational debut "marks the beginning of the era of high-energy laser defense," they claimed.

Biotech

Scientists Edit Gene in 15 Patients That May Permanently Reduce High Cholesterol (cnn.com) 21

A CRISPR-based drug given to study participants by infusion is raising hopes for a much easier way to lower cholesterol, reports CNN: With a snip of a gene, doctors may one day permanently lower dangerously high cholesterol, possibly removing the need for medication, according to a new pilot study published Saturday in the New England Journal of Medicine.

The study was extremely small — only 15 patients with severe disease — and was meant to test the safety of a new medication delivered by CRISPR-Cas9, a biological sort of scissors that cuts a targeted gene to modify it or turn it on or off. Preliminary results, however, showed nearly a 50% reduction in low-density lipoprotein, or LDL, the "bad" cholesterol which plays a major role in heart disease — the No. 1 killer of adults in the United States and worldwide. The study, which will be presented Saturday at the American Heart Association Scientific Sessions in New Orleans, also found an average 55% reduction in triglycerides, a different type of fat in the blood that is also linked to an increased risk of cardiovascular disease.

"We hope this is a permanent solution, where younger people with severe disease can undergo a 'one and done' gene therapy and have reduced LDL and triglycerides for the rest of their lives," said senior study author Dr. Steven Nissen, chief academic officer of the Sydell and Arnold Miller Family Heart, Vascular & Thoracic Institute at Cleveland Clinic in Ohio.... Today, cardiologists want people with existing heart disease or those born with a predisposition for hard-to-control cholesterol to lower their LDL well below 100, which is the average in the US, said Dr. Pradeep Natarajan, director of preventive cardiology at Massachusetts General Hospital and associate professor of medicine at Harvard Medical School in Boston...

People with a nonfunctioning ANGPTL3 gene — which Natarajan says applies to about 1 in 250 people in the US — have lifelong levels of low LDL cholesterol and triglycerides without any apparent negative consequences. They also have exceedingly low or no risk for cardiovascular disease. "It's a naturally occurring mutation that's protective against cardiovascular disease," said Nissen, who holds the Lewis and Patricia Dickey Chair in Cardiovascular Medicine at Cleveland Clinic. "And now that CRISPR is here, we have the ability to change other people's genes so they too can have this protection."

Phase 2 clinical trials will begin soon, quickly followed by Phase 3 trials, which are designed to show the effect of the drug on a larger population, Nissen said.

And CNN quotes Nissen as saying "We hope to do all this by the end of next year. We're moving very fast because this is a huge unmet medical need — millions of people have these disorders and many of them are not on treatment or have stopped treatment for whatever reason."

Supercomputing

Nvidia's New Product Merges AI Supercomputing With Quantum (thequantuminsider.com) 14

NVIDIA has introduced NVQLink, an open system architecture that directly connects quantum processors with GPU-based supercomputers. The Quantum Insider reports: The new platform connects the high-speed, high-throughput performance of NVIDIA's GPU computing with quantum processing units (QPUs), allowing researchers to manage the intricate control and error-correction workloads required by quantum devices. According to an NVIDIA statement, the system was developed with guidance from researchers at major U.S. national laboratories including Brookhaven, Fermi, Lawrence Berkeley, Los Alamos, MIT Lincoln, Oak Ridge, Pacific Northwest, and Sandia.

Qubits, the basic units of quantum information, are extremely sensitive to noise and decoherence, making them prone to errors. Correcting and stabilizing these systems requires near-instantaneous feedback and coordination with classical processors. NVQLink is meant to meet that demand by providing an open, low-latency interconnect between quantum processors, control systems, and supercomputers -- effectively creating a unified environment for hybrid quantum applications.
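The "near-instantaneous feedback" is the crux: error correction repeatedly measures syndrome bits on the QPU, decodes them on classical hardware, and applies a fix before the qubits drift. The shape of that loop can be sketched with a toy three-qubit repetition code in plain Python (an illustration of the decode-and-feedback pattern only, not NVQLink or CUDA-Q code):

```python
import random

def measure_syndrome(bits):
    """Parity checks for a 3-bit repetition code: (b0^b1, b1^b2)."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(syndrome):
    """Map a syndrome to the index of the flipped bit, or None if clean."""
    return {(1, 0): 0, (1, 1): 1, (0, 1): 2, (0, 0): None}[syndrome]

def correct(bits):
    """One measure -> decode -> feedback round: the loop NVQLink accelerates."""
    flipped = decode(measure_syndrome(bits))
    if flipped is not None:
        bits[flipped] ^= 1  # classical result fed back as a correction
    return bits

# Encode logical 0 as [0, 0, 0], inject one random bit flip, then recover.
state = [0, 0, 0]
state[random.randrange(3)] ^= 1
assert correct(state) == [0, 0, 0]
```

On a real device this round trip has to finish within the qubits' coherence time, typically microseconds, which is why a low-latency interconnect between the QPU's control system and the GPUs doing the decoding matters.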

The architecture offers a standardized, open approach to quantum integration, aligning with the company's CUDA-Q software platform to enable researchers to develop, test, and scale hybrid algorithms that draw simultaneously on CPUs, GPUs, and QPUs. The U.S. Department of Energy (DOE) -- which oversees several of the participating laboratories -- framed NVQLink as part of a broader national effort to sustain leadership in high-performance computing, according to NVIDIA.

First Person Shooters (Games)

Programmer Gets Doom Running On a Space Satellite (zdnet.com) 28

An Icelandic programmer successfully ran Doom on the European Space Agency's OPS-SAT satellite, proving that the iconic 1993 shooter can now run not just everywhere on Earth -- but in orbit. ZDNet reports: Olafur Waage, a senior software developer from Iceland who now works in Norway, explained at Ubuntu Summit 25.10 how he, a self-described "professional keyboard typist" and maker of funny videos, ended up making what is perhaps the game's most outlandish port yet: Doom running on a real satellite in orbit, the European Space Agency (ESA) OPS-SAT satellite. OPS-SAT, a "flying laboratory" for testing novel onboard computing techniques, was equipped with an experimental computer approximately 10 times more powerful than the norm for spacecraft. Waage explained, "OPS-SAT was the first of its kind, devoted to demonstrating drastically improved mission control capabilities when satellites can fly more powerful onboard computers. The point was to break the curse of being too risk-averse with multi-million-dollar spacecraft." (The satellite was decommissioned in 2024.) [...]

Running Doom in orbit was partly a challenge of portability and partly a challenge of the limitations of space hardware and mission control. The on-board ARM dual-core Cortex-A9 processor, while hot stuff for space computing hardware (which tends to be low-powered and radiation-hardened), was slow even by Earth-bound standards. Waage chose Chocolate Doom 2.3, a popular open-source version of Doom, for its compatibility with the Ubuntu 18.04 Long Term Support (LTS) distro, which was already running on OPS-SAT. Besides, Waage noted, "We picked Chocolate Doom 2.3 because of the libraries available for 18.04 -- that was the last one that would actually build."

Updating software in orbit is extremely difficult, so the port needed to upload as little code as possible. As Waage said, "Doom is relatively straightforward C with a few external dependencies." In other words, it's easy to port. [...] The only sign that Doom was running in space at first was a lone log entry. So, the team used the satellite's camera to snap real-time images of the Earth, then swapped Doom's Mars skybox for actual satellite photos. "The idea was to take a screenshot from the satellite and use that as the sky, all rendered in software using the game's restricted 256-color palette," explained Waage. Even this posed unexpected difficulties: "Trying to draw all of these beautiful colors with those colors," said Waage, "it's probably not going to work right off. But we tried gradient tests, NASA demo photos. It took quite a bit of tweaking." Eventually, instead of a fantasy Mars as the sky background, they got a good-looking, real Earth in the game's sky. The game itself ran flawlessly. After all, Waage said, "It ran beautifully. It's on Ubuntu."
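The palette step Waage describes is classic color quantization: each true-color pixel of the Earth photo has to be mapped to the nearest entry in Doom's fixed 256-color palette. A minimal sketch with a toy palette (a real port would index into Doom's actual 256-entry PLAYPAL lump):

```python
def nearest_palette_index(pixel, palette):
    """Index of the palette color closest to an (r, g, b) pixel,
    by squared Euclidean distance in RGB space."""
    r, g, b = pixel
    return min(
        range(len(palette)),
        key=lambda i: (palette[i][0] - r) ** 2
                      + (palette[i][1] - g) ** 2
                      + (palette[i][2] - b) ** 2,
    )

def quantize(image, palette):
    """Map every pixel of a true-color image onto a fixed palette."""
    return [[palette[nearest_palette_index(px, palette)] for px in row]
            for row in image]

# Toy 4-color "palette" and a 1x3 "photo" -- illustrative values only.
palette = [(0, 0, 0), (255, 255, 255), (0, 0, 200), (40, 120, 40)]
photo = [[(10, 10, 10), (30, 40, 230), (50, 130, 60)]]
print(quantize(photo, palette))
```

Naive nearest-color matching in raw RGB is exactly why "beautiful colors" come out wrong at first; the "bit of tweaking" Waage mentions usually amounts to tuned test images, gradients, or dithering.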

Bug

Security Bug In India's Income Tax Portal Exposed Taxpayers' Sensitive Data (techcrunch.com) 9

A now-fixed security flaw in India's income tax e-filing portal exposed millions of taxpayers' personal and financial data due to a basic IDOR vulnerability that let users view others' records by swapping PAN numbers. "The exposed data included full names, home addresses, email addresses, dates of birth, phone numbers, and bank account details of people who pay taxes on their income in India," reports TechCrunch. "The data also exposed citizens' Aadhaar number, a unique government-issued identifier used as proof of identity and for accessing government services." From the report: The researchers found that when they signed into the portal using their Permanent Account Number (PAN), an official document issued by the Indian income tax department, they could view anyone else's sensitive financial data by swapping out their PAN for another PAN in the network request as the web page loads. This could be done using publicly available tools like Postman or Burp Suite (or using the web browser's in-built developer tools) and with knowledge of someone else's PAN, the researchers told TechCrunch.

The bug was exploitable by anyone who was logged-in to the tax portal because the Indian income tax department's back-end servers were not properly checking who was allowed to access a person's sensitive data. This class of vulnerability is known as an insecure direct object reference, or IDOR, a common and simple flaw that governments have warned is easy to exploit and can result in large-scale data breaches.
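In code terms, an IDOR comes down to a handler that trusts the object identifier supplied in the request instead of checking it against the authenticated session. A minimal sketch of the flaw and the fix (hypothetical handler names and made-up PANs, not the portal's actual code):

```python
# Made-up records keyed by made-up PANs, for illustration only.
RECORDS = {
    "ABCDE1234F": {"name": "Taxpayer One", "bank": "XXXX-01"},
    "PQRST5678Z": {"name": "Taxpayer Two", "bank": "XXXX-02"},
}

def get_record_vulnerable(session_pan, requested_pan):
    """IDOR: any logged-in user can fetch any record by swapping the PAN."""
    return RECORDS[requested_pan]  # no ownership check at all

def get_record_fixed(session_pan, requested_pan):
    """Authorize first: the requested object must belong to the session's user."""
    if requested_pan != session_pan:
        raise PermissionError("not authorized for this PAN")
    return RECORDS[requested_pan]

# A user authenticated as ABCDE1234F requests someone else's PAN:
leaked = get_record_vulnerable("ABCDE1234F", "PQRST5678Z")  # succeeds: data leak
try:
    get_record_fixed("ABCDE1234F", "PQRST5678Z")
except PermissionError:
    pass  # the fixed handler refuses
```

The fix is a per-request server-side ownership check; nothing the client sends (headers, hidden fields, the identifier itself) can be trusted to establish it.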

"This is an extremely low-hanging thing, but one that has a very severe consequence," the researchers told TechCrunch. In addition to the data of individuals, the researchers said that the bug also exposed data associated with companies who were registered with the e-Filing portal. [...] It remains unclear how long the vulnerability has existed or whether any malicious actors have accessed the exposed data.

United Kingdom

UK Government Trial of M365 Copilot Finds No Clear Productivity Boost 85

A UK government trial of Microsoft's M365 Copilot found no clear productivity gains despite user satisfaction with tasks like summarizing meetings and writing emails. While the tool sped up some routine work, it actually slowed down more complex tasks like Excel analysis and PowerPoint creation, often producing lower-quality results. The Register reports: The Department for Business and Trade received 1,000 licenses for use between October and December 2024, with the majority of these allocated to volunteers and 30 percent to randomly selected participants. Some 300 of these people consented to their data being analyzed. An evaluation of time savings, quality assurance, and productivity was then calculated in the assessment (PDF). Overall, 72 percent of users were satisfied or very satisfied with their digital assistant and voiced disappointment when the test ended. However, the reality of productivity gains was more nuanced than Microsoft's marketing materials might suggest. Around two-thirds of the employees in the trial used M365 Copilot at least once a week, and 30 percent used it at least once a day -- which doesn't sound like great value for money. [...]

According to the M365 Copilot monitoring dashboard made available in the trial, an average of 72 M365 Copilot actions were taken per user. "Based on there being 63 working days during the pilot, this is an average of 1.14 M365 Copilot actions taken per user per day," the study says. Word, Teams, and Outlook were the most used, and Loop and OneNote usage rates were described as "very low," less than 1 percent and 3 percent per day, respectively. "PowerPoint and Excel were slightly more popular; both experienced peak activity of 7 percent of license holders using M365 Copilot in a single day within those applications," the study states. The three most popular tasks involved transcribing or summarizing a meeting, writing an email, and summarizing written comms. These also had the highest satisfaction levels, we're told.

Participants were asked to record the time taken for each task with M365 Copilot compared to colleagues not involved in the trial. The assessment report adds: "Observed task sessions showed that M365 Copilot users produced summaries of reports and wrote emails faster and to a higher quality and accuracy than non-users. Time savings observed for writing emails were extremely small." "However, M365 Copilot users completed Excel data analysis more slowly and to a worse quality and accuracy than non-users, conflicting with time savings reported in the diary study for data analysis. PowerPoint slides [were] over 7 minutes faster on average, but to a worse quality and accuracy than non-users." This means corrective action was required.

A cross-section of participants was asked questions in an interview -- qualitative findings -- and they claimed routine admin tasks could be carried out with greater efficiency with M365 Copilot, letting them "redirect time towards tasks seen as more strategic or of higher value, while others reported using these time savings to attend training sessions or take a lunchtime walk." Nevertheless, M365 Copilot did not necessarily make them more productive, the assessment found. Quantifying such benefits is something Microsoft has been working on with customers, in order to justify the greater expense of an M365 Copilot license.

Japan

Japan Launches its First Homegrown Quantum Computer (livescience.com) 2

Japan has launched its first entirely homegrown quantum computer, built with domestic superconducting qubits and components, and running on the country's own open-source software toolchain, OQTOPUS. "The system is now ready to take on workloads from its base at the University of Osaka's Center for Quantum Information and Quantum Biology (QIQB)," reports LiveScience. From the report: The system uses a quantum chip with superconducting qubits -- quantum bits derived from metals that exhibit zero electrical resistance when cooled to temperatures close to absolute zero (minus 459.67 degrees Fahrenheit, or minus 273.15 degrees Celsius). The quantum processing unit (QPU) was developed at the Japanese research institute RIKEN. Other components that make up the "chandelier" -- the main body of the quantum computer -- include the chip package, delivered by Seiken, the magnetic shield, infrared filters, bandpass filters, a low-noise amplifier and various cables.

These are all housed in a dilution refrigerator (a specialized cryogenic device that cools the quantum computing components) to allow for those extremely low temperatures. It also comes alongside a pulse tube refrigerator (which again cools various components in use), controllers and a low-noise power source. OQTOPUS, meanwhile, is a collection of open-source tools that include everything required to run quantum programs. It includes the core engine and cloud module, as well as graphical user interface (GUI) elements, and is designed to be built on top of a QPU and quantum control hardware.

Moon

Astronomers Plan Far Side of the Moon Satellite to Hear Billion-Year-Old Radio Waves (cosmosmagazine.com) 12

An anonymous reader shared this report from Cosmos magazine about a plan to "pick up those faint signals from billions of years ago." Astronomers are planning to launch a tiny spacecraft to the far side of the Moon to listen out for "ancient whispers" in a quest to uncover the secrets of the early universe. The mission will focus on understanding the 'Cosmic Dawn', a period in the early stages of the universe after the Big Bang but before the first stars and galaxies appeared.

One of the difficulties in studying this period of the universe is that silence is essential. With all the electronics and interference in our atmosphere, Earth becomes too loud, making it unsuitable for this kind of research... The proposed mission will utilise the Moon as a giant shield, blocking out the noise from Earth, in order to observe these signals...

The mission, known as CosmoCube, is a joint study between the UK's University of Portsmouth, University of Cambridge and RAL Space at the Rutherford Appleton Laboratory... CosmoCube's radio will operate at low frequencies (10-100 MHz), which should allow it to detect extremely faint signals. The team hope to reach lunar orbit before the end of the decade, with a roughly 5-year roadmap planned.
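The article doesn't spell out what the "ancient whispers" are, but Cosmic Dawn radio cosmology in this band typically targets the hydrogen 21-cm line (rest frequency about 1420.4 MHz) stretched to low frequencies by cosmic expansion. Under that assumption, a 10-100 MHz band maps to redshifts of roughly 13 to 141:

```python
F_21CM_MHZ = 1420.406  # rest frequency of the hydrogen 21-cm line

def redshift(observed_mhz: float) -> float:
    """Redshift z at which the 21-cm line appears at the observed frequency:
    1 + z = f_rest / f_observed."""
    return F_21CM_MHZ / observed_mhz - 1

for f in (10, 50, 100):
    print(f"{f} MHz -> z ~ {redshift(f):.0f}")
```

That range reaches well before the first stars, which is why the signal is so faint and why the radio-quiet lunar far side is attractive.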

The article includes this quote from Professor David Bacon, from the University of Portsmouth and CosmoCube researcher. "It's incredible how far these radio waves have travelled, now arriving with news of the universe's history.

"The next step is to go to the quieter side of the Moon to hear that news."

Security

Cybercriminals Are Hiding Malicious Web Traffic in Plain Sight (wired.com) 34

Cybercriminals have been increasingly turning to "residential proxy" services over the past two to three years to disguise malicious web traffic as everyday online activity, according to research presented at the Sleuthcon cybercrime conference. The shift represents a response to law enforcement's growing success in targeting traditional "bulletproof" hosting services, which previously allowed criminals to maintain anonymous web infrastructure.

Residential proxies route traffic through decentralized networks running on consumer devices like old Android phones and low-end laptops, providing real IP addresses assigned to homes and offices. This approach makes malicious activity extremely difficult to detect because it appears to originate from trusted consumer locations rather than suspicious server farms. The technology creates particular challenges when attackers appear to come from the same residential IP ranges as employees of target organizations.
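A toy example of why this defeats the usual defenses: reputation filters that flag traffic from known hosting-provider ranges pass residential-proxy traffic untouched. (The IP ranges below are reserved documentation blocks, used purely for illustration.)

```python
import ipaddress

# Toy reputation list: networks "known" to belong to hosting providers.
DATACENTER_NETS = [ipaddress.ip_network("203.0.113.0/24")]

def looks_suspicious(src_ip: str) -> bool:
    """Flag traffic from known datacenter ranges -- the pre-proxy-era heuristic."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in DATACENTER_NETS)

# Bulletproof-hosted traffic is caught...
assert looks_suspicious("203.0.113.7")
# ...but the same attack relayed through a residential proxy -- here an address
# in the same (made-up) home-ISP range as a target's employees -- sails through.
assert not looks_suspicious("198.51.100.23")
```

Once malicious requests share prefixes with legitimate home and office users, defenders lose IP reputation as a cheap signal and have to fall back on behavioral detection.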

Space

Dark Matter Formed When Fast Particles Slowed Down and Got Heavy, New Theory Says (phys.org) 38

Dartmouth researchers propose that dark matter originated from massless, light-like particles in the early universe that rapidly condensed into massive particles through a spin-based interaction. Phys.Org reports: [T]he study authors write that their theory is distinct because it can be tested using existing observational data. The extremely low-energy particles they suggest make up dark matter would have a unique signature on the cosmic microwave background, or CMB, the leftover radiation from the Big Bang that fills all of the universe. "Dark matter started its life as near-massless relativistic particles, almost like light," says Robert Caldwell, a professor of physics and astronomy and the paper's senior author. "That's totally antithetical to what dark matter is thought to be -- it is cold lumps that give galaxies their mass," Caldwell says. "Our theory tries to explain how it went from being light to being lumps."

Hot, fast-moving particles dominated the cosmos after the burst of energy known as the Big Bang that scientists believe triggered the universe's expansion 13.7 billion years ago. These particles were similar to photons, the massless particles that are the basic energy, or quanta, of light. It was in this chaos that extremely large numbers of these particles bonded to each other, according to Caldwell and Guanming Liang, the study's first author and a Dartmouth senior. They theorize that these massless particles were pulled together by the opposing directions of their spin, like the attraction between the north and south poles of magnets. As the particles cooled, Caldwell and Liang say, an imbalance in the particles' spins caused their energy to plummet, like steam rapidly cooling into water. The outcome was the cold, heavy particles that scientists think constitute dark matter.
The findings have been published in the journal Physical Review Letters.

Businesses

2025 Will Likely Be Another Brutal Year of Failed Startups, Data Suggests (techcrunch.com) 28

An anonymous reader quotes a report from TechCrunch: TechCrunch gathered data from several sources and found similar trends. In 2024, 966 startups shut down, compared to 769 in 2023, according to Carta. That's a 25.6% increase. One note on methodology: Those numbers are for U.S.-based companies that were Carta customers and left Carta due to bankruptcy or dissolution. There are likely other shutdowns that wouldn't be accounted for through Carta, estimates Peter Walker, Carta's head of insights. [...] Meanwhile, AngelList found that 2024 saw 364 startup winddowns, compared to 233 in 2023. That's a 56.2% jump. However, AngelList CEO Avlok Kohli has a fairly optimistic take, noting that winddowns "are still very low relative to the number of companies that were funded across both years."

Layoffs.fyi found a contrasting trend: 85 tech companies shut down in 2024, compared to 109 in 2023 and 58 in 2022. But as founder Roger Lee acknowledges, that data only includes publicly reported shutdowns "and therefore represents an underestimate." Of those 2024 tech shutdowns, 81% were startups, while the rest were either public companies or previously acquired companies that were later shut down by their parent organizations. So many companies got funded in 2020 and 2021 at heated valuations with famously thin diligence that it's only logical that up to three years later, an increasing number couldn't raise more cash to fund their operations. Taking investment at too high of a valuation increases the risk such that investors won't want to invest more unless business is growing extremely well. [...]
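The year-over-year jumps quoted above follow directly from the raw counts:

```python
def pct_increase(new: int, old: int) -> float:
    """Year-over-year percentage increase."""
    return (new - old) / old * 100

print(f"Carta shutdowns, 2024 vs 2023: {pct_increase(966, 769):.1f}%")
print(f"AngelList winddowns, 2024 vs 2023: {pct_increase(364, 233):.1f}%")
```

Both match the article's figures of 25.6% and 56.2%.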

Looking ahead, Walker also expects we'll continue to see more shutdowns in the first half of 2025, and then a gradual decline for the rest of the year. That projection is based mostly on a time-lag estimate from the peak of funding, which he estimates was the first quarter of 2022 in most stages. So by the first quarter of 2025, "most companies will have either found a new path forward or had to make this difficult choice."

"Tech zombies and a startup graveyard will continue to make headlines," said Dori Yona, CEO and co-founder of SimpleClosure. "Despite the crop of new investments, there are a lot of companies that have raised at high valuations and without enough revenue."
Science

'Snowball Earth' Evolution Hypothesis Gains New Momentum (quantamagazine.org) 42

The University of Colorado Boulder's magazine recently wrote: What happened during the "Snowball Earth" period is perplexing: Just as the planet endured about 100 million years of deep freeze, with a thick layer of ice covering most of Earth and with low levels of atmospheric oxygen, forms of multicellular life emerged. Why? The prevailing scientific view is that such frigid temperatures would slow rather than speed evolution. But fossil records from 720 to 635 million years ago show an evolutionary spurt preceding the development of animals...

Carl Simpson, a macroevolutionary paleobiologist at CU Boulder, has found evidence that cold seawater could have jump-started — rather than suppressed — evolution from single-celled to multicellular life forms.

That evidence is described in Quanta magazine: Simpson proposes an answer linked to a fundamental physical fact: As seawater gets colder, it gets more viscous, and therefore more difficult for very small organisms to navigate. Imagine swimming through honey rather than water... To test the idea, Simpson, a paleobiologist at the University of Colorado, Boulder, and his team conducted an experiment designed to see what a modern single-celled organism does when confronted with higher viscosity... In an enormous, custom-made petri dish, [grad student Andrea] Halling and Simpson created a bull's-eye target of agar gel — their own experimental gauntlet of viscosity. At the center, it was the standard viscosity used for growing these algae in the lab. [Green algae, which swim with a tail-like flagellum.] Moving outward, each concentric ring had higher and higher viscosity, finally reaching a medium with four times the standard level. The scientists placed the algae in the middle, turned on a camera, and left them alone for 30 days — enough time for about 70 generations of algae to live, swim around for nutrients and die...

After 30 days, the algae in the middle were still unicellular. As the scientists put algae from thicker and thicker rings under the microscope, however, they found larger clumps of cells. The very largest were wads of hundreds. But what interested Simpson the most were mobile clusters of four to 16 cells, arranged so that their flagella were all on the outside. These clusters moved around by coordinating the movement of their flagella, the ones at the back of the cluster holding still, the ones at the front wriggling.
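The "swimming through honey" intuition can be made concrete with the Reynolds number, Re = ρvL/μ, which compares inertial to viscous forces on a swimmer. The cell size, swimming speed, and viscosity values below are rough illustrative assumptions, not measurements from Simpson's experiment:

```python
# Illustrative Reynolds-number estimate for a micron-scale swimmer.
# All parameter values are rough assumptions chosen for illustration.

def reynolds(density, speed, length, viscosity):
    """Re = rho * v * L / mu (dimensionless)."""
    return density * speed * length / viscosity

rho = 1000.0   # seawater density, kg/m^3 (approximate)
v = 100e-6     # swimming speed, ~100 micrometers per second
L = 10e-6      # cell size, ~10 micrometers
mu_std = 1e-3  # standard-condition water viscosity, Pa*s

re_standard = reynolds(rho, v, L, mu_std)     # roughly 1e-3
re_viscous = reynolds(rho, v, L, 4 * mu_std)  # 4x viscosity, like the outermost ring

print(re_standard, re_viscous)
```

At Re far below 1, viscous drag dominates inertia entirely, so even a modest increase in viscosity makes locomotion markedly more costly for a lone cell — which is exactly the pressure the concentric-ring experiment applies.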

"One thing that you learn about small organisms from a physics point of view is that they don't experience the world the same way that we do, as larger-bodied organisms," Simpson says in the university's article. It says that instead unicellular organisms are specifically "affected by the viscosity, or thickness, of sea water," and Simpson adds that "basically, that would trigger the origin of animals, potentially."

Last year Simpson posted a preprint on biorxiv.org. (And he also co-authored an article on "physical constraints during Snowball Earth drive the evolution of multicellularity.")

There's a video showing algae in Simpson's lab clumping together in viscous water. "This observed behavior adds evidence to Simpson's hypothesis that single-celled organisms clumped together to their mutual advantage during the 'Snowball Earth' period," says the video's description, "thus adding momentum to the rise of multicellular organisms." But Simpson says in the university's article, "To actually see it empirically means there's something to this idea."

Simpson and colleagues have now received a $1 million grant to study grains of sand made from calcium carbonate and called ooids, since their diameter "could be a proxy measurement of Earth's temperature for the last 2.5 billion years," according to the university's article. Geologist Lizzy Trower says the research "can tell us something about the chemistry and water temperature in which they formed." And more importantly, "Does the fossil record agree with the predictions we would make based on this theory from this new record of temperature?" Trower and Simpson's work also has potential implications for the human quest to find life elsewhere in the universe, Trower said. If extremely harsh and cold environments can spur evolutionary change, "then that is a really different type of thing to look for in exoplanets (potentially life-sustaining planets in other solar systems), or think about when and where (life) would exist."
Programming

Open Source Maintainers Are Drowning in Junk Bug Reports Written By AI (theregister.com) 91

An anonymous reader shares a report: Software vulnerability submissions generated by AI models have ushered in a "new era of slop security reports for open source" -- and the devs maintaining these projects wish bug hunters would rely less on results produced by machine learning assistants. Seth Larson, security developer-in-residence at the Python Software Foundation, raised the issue in a blog post last week, urging those reporting bugs not to use AI systems for bug hunting.

"Recently I've noticed an uptick in extremely low-quality, spammy, and LLM-hallucinated security reports to open source projects," he wrote, pointing to similar findings from the Curl project in January. "These reports appear at first glance to be potentially legitimate and thus require time to refute." Larson argued that low-quality reports should be treated as if they're malicious.

As if to underscore the persistence of these concerns, a Curl project bug report posted on December 8 shows that nearly a year after maintainer Daniel Stenberg raised the issue, he's still confronted by "AI slop" -- and wasting his time arguing with a bug submitter who may be partially or entirely automated.

Open Source

Slashdot's Interview with Bruce Perens: How He Hopes to Help 'Post Open' Developers Get Paid (slashdot.org) 61

Bruce Perens, original co-founder of the Open Source Initiative, has responded to questions from Slashdot readers about "Post Open," a new alternative he's developing that he hopes will help developers get paid.

But first, "One of the things that's clear from the Slashdot patter is that people are not aware of what I've been doing, in general," Perens says. "So, let's start by filling that in..."

Read on for the rest of his wide-ranging answers....
Earth

Earth Began Absorbing More Sunlight in 2023, Climate Researchers Find (arstechnica.com) 56

Today a group of German scientists presented data suggesting Earth is absorbing more sunlight than in the past, reports Ars Technica, "largely due to reduced cloud cover." We can measure both the amount of energy the Earth receives from the Sun and how much energy it radiates back into space.... The new paper finds that the energy imbalance set a new high in 2023, with a record amount of energy being absorbed by the ocean/atmosphere system. This wasn't accompanied by a drop in infrared emissions from the Earth, suggesting it wasn't due to greenhouse gases, which trap heat by absorbing this radiation. Instead, it seems to be due to decreased reflection of incoming sunlight by the Earth....

Using two different data sets, the teams identify the areas most affected by this, and they're not at the poles, indicating that loss of snow and ice is unlikely to be the cause. Instead, the key contributor appears to be the loss of low-level clouds [particularly over the Atlantic Ocean]... The drop in low-level clouds had been averaging about 1.3 percent per decade. 2023 saw a slightly larger drop occur in just one year....

So, what could be causing the clouds to go away? The researchers list three potential factors. One is simply the variability of the climate system, meaning 2023 might have just been an extremely unusual year, and things will revert to trends in the ensuing years. The second is the impact of aerosols, which both we and natural processes emit in copious quantities. These can help seed clouds, so a reduction of aerosols (driven by things like pollution control measures) could potentially account for this effect. The most concerning potential explanation, however, is that there may be a feedback relationship between rising temperatures and low-level clouds. Meaning that, as the Earth warms, the clouds become sparse, enhancing the warming further. That would be bad news for our future climate, because it suggests that the lower range of warming estimates would have to be adjusted upward to account for it.

If the decline in reflectivity wasn't just caused by normal variability, the researchers warn, "the 2023 extra heat may be here to stay..."
The Military

Royal Navy Successfully Tests Quantum-Sensing Technology (royalnavy.mod.uk) 25

An anonymous reader quotes a report from the Royal Navy: The Royal Navy has successfully demonstrated the capabilities of ground-breaking cold atom technology. P2000 vessel HMS Pursuer hosted the trial, which unlocks new possibilities in areas such as covert monitoring that require precise signals for accurate positioning, navigation and timing. The Office of the Chief Technology Officer (OCTO) for the RN worked with UK quantum technology company Aquark Technologies. The trial involved the company's miniature cold atom systems, founded on Aquark's unique laser-cooling method, known as supermolasses.

This method to generate cold atoms does not need an applied magnetic field, therefore reducing the size, weight, power consumption and cost of sensors. A cold atom is an atom that has been laser-cooled to extremely low temperatures, typically near absolute zero (-273.15C). At these temperatures, the thermal motion of atoms is very slow, allowing their quantum mechanical properties to be precisely controlled. Quantum Sensing is an advanced sensor technology that detects changes in motion, and electric and magnetic fields, by collecting data at the atomic level.
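How slow "very slow" is can be estimated from the rms thermal speed of an ideal-gas atom, v = sqrt(3kT/m). The choice of rubidium-87 (a species commonly used in laser cooling) and the 100 µK cold-atom temperature are illustrative assumptions; the report doesn't specify which atom or temperature Aquark's systems use:

```python
import math

# RMS thermal speed of an ideal-gas atom: v = sqrt(3 * k_B * T / m).
K_B = 1.380649e-23                    # Boltzmann constant, J/K
M_RB87 = 86.909 * 1.66053906660e-27   # mass of Rb-87 in kg (illustrative species)

def rms_speed(temp_kelvin: float, mass_kg: float = M_RB87) -> float:
    return math.sqrt(3 * K_B * temp_kelvin / mass_kg)

# Room temperature vs. a typical laser-cooled temperature.
print(f"300 K:  ~{rms_speed(300):.0f} m/s")              # roughly 290 m/s
print(f"100 uK: ~{rms_speed(100e-6) * 100:.0f} cm/s")    # roughly 17 cm/s
```

Dropping the temperature by nine orders of magnitude slows the atoms from hundreds of meters per second to centimeters per second, which is what makes their quantum properties controllable enough for precision sensing.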

Commander Matthew Steele, who heads up Future Technology for OCTO, said: "Quantum technologies being developed in the UK will offer an alternative Position, Navigation and Timing (PNT) capability necessary to operate effectively in GPS denied or degraded environments."

"Over the next three years, the Navy seeks to accelerate the development of quantum technologies -- such as Aquarks -- through funding and sea trials, to secure the Royal Navy an opportunity to invest in a non-GPS-based PNT capability and to maintain its global operating advantage."
