The Internet

The Growth Rate For Mobile Internet Subscribers Has Stalled Across the World (restofworld.org) 41

An anonymous reader shares a report: A recent survey from GSMA Intelligence, the research wing of the Global System for Mobile Communications Association (GSMA), a U.K.-based organization that represents mobile operators around the world, found that 4.6 billion people across the globe are now connected to mobile internet -- or roughly 57% of the world's population. Now, the rate of new mobile internet subscriber growth is slowing. From 2015 to 2021, the survey consistently found over 200 million people coming online through mobile devices around the world each year. But in the last two years, that number has dropped to 160 million.

Rest of World analysis of that data found that a number of developing countries are plateauing in the number of mobile internet subscribers. That suggests that in countries like Pakistan, Bangladesh, Nigeria, and Mexico, the easiest populations to get online have already logged on, and getting the rest of the population on mobile internet will continue to be a challenge. GSMA collects data by surveying a nationally representative sample of people in each country, and then it correlates the results with similar studies.

[...] In countries including China, the U.S., and Singapore, a high share of the population is already connected to mobile internet -- 80%, 81%, and 93%, respectively. So it's no surprise that the rate of mobile internet subscriptions has slowed. But the rate of new users has also slowed in countries including Bangladesh, Nigeria, and Pakistan -- where only 37%, 34%, and 24% of the population currently use mobile internet.

Supercomputing

Microsoft, Atom Computing Leap Ahead On the Quantum Frontier With Logical Qubits (geekwire.com) 18

An anonymous reader quotes a report from GeekWire: Microsoft and Atom Computing say they've reached a new milestone in their effort to build fault-tolerant quantum computers that can show an advantage over classical computers. Microsoft says it will start delivering the computers' quantum capabilities to customers by the end of 2025, with availability via the Azure cloud service as well as through on-premises hardware. "Together, we are co-designing and building what we believe will be the world's most powerful quantum machine," Jason Zander, executive vice president at Microsoft, said in a LinkedIn posting.

Like other players in the field, Microsoft's Azure Quantum team and Atom Computing aim to capitalize on the properties of quantum systems -- where quantum bits, also known as qubits, can process multiple values simultaneously. That's in contrast to classical systems, which typically process ones and zeros to run algorithms. Microsoft has been working with Colorado-based Atom Computing on hardware that uses the nuclear spin properties of neutral ytterbium atoms to run quantum calculations. One of the big challenges is to create a system that can correct the errors that turn up during the calculations due to quantum noise. The solution typically involves knitting together "physical qubits" to produce an array of "logical qubits" that can correct themselves.
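The principle behind combining many noisy physical qubits into one self-correcting logical qubit can be illustrated, in heavily simplified classical form, by a repetition code with majority-vote decoding. This is an illustrative sketch only -- not Microsoft's actual qubit-virtualization scheme -- and the copy count and error rate below are arbitrary:

```python
import random

def encode(bit, n=5):
    """Encode one logical bit as n redundant physical copies (repetition code)."""
    return [bit] * n

def add_noise(physical, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in physical]

def decode(physical):
    """Majority vote recovers the logical bit if fewer than half the copies flipped."""
    return 1 if sum(physical) > len(physical) / 2 else 0

# The logical error rate lands far below the physical error rate:
random.seed(0)
trials = 10_000
p_flip = 0.05
errors = sum(decode(add_noise(encode(0), p_flip)) != 0 for _ in range(trials))
print(f"physical error rate: {p_flip}, logical error rate: {errors / trials}")
```

Real quantum error correction is much harder (errors must be detected without measuring, and destroying, the encoded state), but the payoff is the same: redundancy turns many unreliable components into one reliable unit.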

In a paper posted to the ArXiv preprint server, members of the research team say they were able to connect 256 noisy neutral-atom qubits using Microsoft's qubit-virtualization system in such a way as to produce a system with 24 logical qubits. "This represents the highest number of entangled logical qubits on record," study co-author Krysta Svore, vice president of advanced quantum development for Microsoft Azure Quantum, said today in a blog posting. "Entanglement of the qubits is evidenced by their error rates being significantly below the 50% threshold for entanglement." Twenty of the system's logical qubits were used to perform successful computations based on the Bernstein-Vazirani algorithm, which is used as a benchmark for quantum calculations. "The logical qubits were able to produce a more accurate solution than the corresponding computation based on physical qubits," Svore said. "The ability to compute while detecting and correcting errors is a critical component to scaling to achieve scientific quantum advantage."
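For context, the Bernstein-Vazirani problem asks for a hidden bit string s given only an oracle computing f(x) = s · x (mod 2): a quantum computer recovers s in a single oracle query, while a classical machine needs one query per bit. A minimal classical sketch of the problem (illustrative only, not the quantum circuit run on the Atom Computing hardware):

```python
def bv_oracle(secret):
    """Black box computing f(x) = secret · x (mod 2)."""
    def f(x):
        return sum(s & b for s, b in zip(secret, x)) % 2
    return f

def classical_recover(f, n):
    """Classically, recovering the n-bit secret takes n oracle queries,
    one per unit basis vector; the quantum algorithm needs only one."""
    secret = []
    for i in range(n):
        basis = [1 if j == i else 0 for j in range(n)]
        secret.append(f(basis))
    return secret

secret = [1, 0, 1, 1]
f = bv_oracle(secret)
print(classical_recover(f, len(secret)))  # → [1, 0, 1, 1]
```

Because the correct answer is known in advance, the algorithm makes a convenient benchmark: any deviation in the recovered string directly measures the hardware's error rate.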

Chrome

DOJ Wants Google To Sell Chrome To Break Search Monopoly (9to5google.com) 108

According to Bloomberg, the U.S. Justice Department wants Google to sell off its Chrome browser as part of its ongoing search monopoly case. The recommendations will be made official on Wednesday. 9to5Google reports: At the top of the list is having Google sell Chrome "because it represents a key access point through which many people use its search engine." There are many questions about how that works, including what the impact on the underlying Chromium codebase would be. Would Google still be allowed to develop the open-source project that many other browsers, such as Microsoft Edge, are built on? "The government has the option to decide whether a Chrome sale is necessary at a later date if some of the other aspects of the remedy create a more competitive market," reports Bloomberg. Google, which plans to appeal, previously said that "splitting off Chrome or Android would break them."

Bloomberg reports that "antitrust officials pulled back from a more severe option that would have forced Google to sell off Android." However, the government wants Google to "uncouple its Android smartphone operating system from its other products, including search and its Google Play mobile app store, which are now sold as a bundle." Meanwhile, other recommendations include licensing Google Search data and results, as well as allowing websites that are indexed for Search to opt out of AI training.

AI

Growth of AI Adoption Slows Among US Workers, Study Says (axios.com) 34

The percentage of workers in the U.S. who say they are using AI at work has remained largely flat over the last three months, according to a new study commissioned by Slack. From a report: If AI's rapid adoption curve slows or flattens, a lot of very rosy assumptions about the technology -- and very high market valuations tied to them -- could change. Slack said its most recent survey found 33% of U.S. workers say they are using AI at work, an increase of just a single percentage point. That represents a significant flattening of the rapid growth noted in prior surveys.

Global adoption of AI use at work, meanwhile, rose from 32% to 36%. Between the lines: Slack also found that globally, nearly half of workers (48%) said they were uncomfortable telling their managers they use AI at work. Among the top reasons cited were a fear of being seen as lazy, cheating or incompetent.

Books

Are America's Courts Going After Digital Libraries? (reason.com) 43

A new article at Reason.com argues that U.S. courts "are coming for digital libraries." In September, a federal appeals court dealt a major blow to the Internet Archive — one of the largest online repositories of free books, media, and software — in a copyright case with significant implications for publishers, libraries, and readers. The U.S. Court of Appeals for the 2nd Circuit upheld a lower court ruling that found the Internet Archive's huge, digitized lending library of copyrighted books was not covered by the "fair use" doctrine and infringed on the rights of publishers. Agreeing with the Archive's interpretation of fair use "would significantly narrow — if not entirely eviscerate — copyright owners' exclusive right to prepare derivative works," the 2nd Circuit ruled. "Were we to approve [Internet Archive's] use of the works, there would be little reason for consumers or libraries to pay publishers for content they could access for free."
Others disagree, according to some links shared in a recent email from the Internet Archive. Public Knowledge CEO Chris Lewis argues the court's logic renders the fair use doctrine "almost unusable". And that's just the beginning... This decision harms libraries. It locks them into an e-book ecosystem designed to extract as much money as possible while harvesting (and reselling) reader data en masse. It leaves local communities' reading habits at the mercy of curatorial decisions made by four dominant publishing companies thousands of miles away. It steers Americans away from one of the few remaining bastions of privacy protection and funnels them into a surveillance ecosystem that, like Big Tech, becomes more dangerous with each passing data breach.
But lawyer/librarian Kyle K. Courtney writes that the case "is specific only to the parties, and does not impact the other existing versions of controlled digital lending." Additionally, this decision is limited to the 2nd Circuit and is not binding anywhere else — in other words, it does not apply to the 47 states outside the 2nd Circuit's jurisdiction. In talking with colleagues in the U.S. this week and last, many are continuing their programs because they believe their digital loaning programs fall outside the scope of this ruling... Moreover, the court's opinion focuses on digital books that the court said "are commercially available for sale or license in any electronic text format." Therefore, there remains a significant number of materials in library collections that have not made the jump to digital, nor are likely to, meaning that there is no ebook market to harm — nor is one likely to emerge for certain works, such as those that are no longer commercially viable...

This case represents just one instance in an ongoing conversation about library lending in the digital age, and the possibility of appeal to the U.S. Supreme Court means the final outcome is far from settled.

Some more quotes from links shared by Internet Archive:
  • "It was clear that the only reason all the big publishers sued the Internet Archive was to put another nail in the coffin of libraries and push to keep this ebook licensing scheme grift going. Now the courts have helped." — TechDirt
  • "The case against the Internet Archive is not just a story about the ruination of an online library, but a grander narrative of our times: how money facilitates the transference of knowledge away from the public, back towards the few." — blogger Hannah Williams

Thanks to Slashdot reader fjo3 for sharing the news.


AMD

AMD's Desktop PC Market Share Skyrockets Amid Intel's Raptor Lake CPU Crashing Scandal (tomshardware.com) 33

An anonymous reader shares a report: AMD has gained a substantial 5.7 percentage points of share of the desktop x86 CPU market in the third quarter compared to Q2, the largest quarterly share gain since we began tracking the market share reports in 2016. It also represents an incredible ten percentage point improvement over the prior year. AMD also raked in a strong increase in revenue share, jumping 8.5 percentage points over the prior quarter, indicating that it is selling a strong mix of higher-end CPU models.

During the quarter, AMD launched its new Ryzen 9000-series family of processors amid a scandal related to stability issues with Intel's Raptor Lake chips, which generated a flood of negative press for Intel over the course of several months, as well as inventory adjustments by one of Intel's customers. AMD now commands 28.7% of the desktop processor market. AMD also continued to gain share in the laptop and server markets, though its gains on the desktop side of the business were the most impressive, according to Mercury Research.

Space

Nearly Three Years Since Launch, Webb Is a Hit Among Astronomers (arstechnica.com) 30

The James Webb Space Telescope has made groundbreaking discoveries, detecting the most distant galaxy yet and capturing an image of the closest directly-imaged exoplanet. "Judging by astronomers' interest in using Webb, there are many more to come," writes Ars Technica's Stephen Clark. With immense demand for observation time, Webb is set to explore a vast array of cosmic targets -- from early galaxies to exoplanet atmospheres -- offering insights that extend far beyond Hubble's reach. From the report: The Space Telescope Science Institute, which operates Webb on behalf of NASA and its international partners, said last week that it received 2,377 unique proposals from science teams seeking observing time on the observatory. The institute released a call for proposals earlier this year for the so-called "Cycle 4" series of observations with Webb. This volume of proposals represents around 78,000 hours of observing time with Webb, nine times more than the telescope's available capacity for scientific observations in this cycle. The previous observing cycle had a similar "oversubscription rate" but had less overall observing time available to the science community.

More than 600 scientists will review the proposals and select the most promising ones for time on Webb. The largest share of proposals would involve observing "high-redshift" galaxies among the first generation of galaxies that formed after the Big Bang. Galaxies this old and distant have their light stretched to longer wavelengths due to the expansion of the Universe. Exoplanet atmospheres, and stars and stellar populations, were the second- and third-most popular science categories in this cycle. [...] It seems astronomers have no shortage of ideas about where to look. Maybe one day, new super heavy-lift rockets or advancements in in-space assembly will make it possible to deploy space telescopes even more sensitive than Webb. Until then, we can be thankful that Webb is performing well and has a good shot of far outliving its original five-year design life. Let's continue enjoying the show.

Piracy

Google Asked To Remove 10 Billion 'Pirate' Search Results (torrentfreak.com) 23

An anonymous reader quotes a report from TorrentFreak: Rightsholders have asked Google to remove more than 10 billion 'copyright infringing' URLs from its search results. The search engine doesn't celebrate the milestone in any way, but the takedown notices document intriguing shifts in volume over time, as well as shifting takedown interests. [...] The path to 10 billion was turbulent. When Google first made DMCA details public it was processing a few million DMCA takedown requests in a year. That number swiftly increased to hundreds of millions and eventually reached a billion DMCA requests in 2016.

The exponential growth curve eventually flattened out and around 2017, the takedown volume started to decline. The decrease was in part due to various anti-piracy algorithms making pirated content less visible in search results. By downranking pirate sites, infringing content became harder to find. As a result, Google processed fewer takedown notices, a welcome change for both rightsholders and the search engine. Today, Google continues to make pirate sites less visible in search, but the reduction in takedown notices didn't last. On the contrary, over the past several months, Google search processed a record number of DMCA notices.

Last summer, the search giant recorded the 7 billionth takedown request and after that the numbers shot up, adding billions more in the year that followed. The company is now handling removal requests at a rate of roughly 2.5 billion per year; a new record. This represents about 50 million takedown requests per week and roughly 5,000 every minute. [...] While the 10 billionth reported URL is undoubtedly a milestone, this number is largely driven by a few rightsholders, reporting outfits, and domain names. The aforementioned takedown outfit Link-Busters, for example, accounts for roughly 15% of all reported links, nearly 1.5 billion. Similarly, the ten most prolific rightsholders, including the BPI, HarperCollins, and VIZ Media, are responsible for 40% of all reported links. These ten companies are only a tiny fraction of the 600,000 rightsholders that reported pirated links, however. A small group of domains also receives a disproportionate amount of attention. In total, 5,400,061 domains have been reported, with the top domains having tens of millions of flagged URLs each. However, most domains have only a few flagged links, some of which are erroneous.

Power

US Regulator Rejects Bid To Boost Nuclear Power To Amazon Data Center (thehill.com) 29

The Federal Energy Regulatory Commission (FERC) blocked Amazon's bid to access more power from the Susquehanna nuclear plant for its Pennsylvania data center, citing grid reliability and consumer cost concerns. The Hill reports: In a 2-1 decision, the FERC found the regional grid operator, PJM Interconnection, failed to prove that the changes to the transmission agreement with Susquehanna power plant were necessary. The regulator's two Republican commissioners, Mark Christie and Lindsay See, outvoted Democratic chair Willie Phillips. The chair's two fellow Democratic commissioners, David Rosner and Judy Chang, sat out the vote. "Co-location arrangements of the type presented here present an array of complicated, nuanced and multifaceted issues, which collectively could have huge ramifications for both grid reliability and consumer costs," Christie wrote in a concurring statement.

In a dissenting statement, Phillips argued the deal with Amazon "represents a 'first of its kind' co-located load configuration" and that Friday's decision is a "step backward for both electric reliability and national security." "We are on the cusp of a new phase in the energy transition, one that is characterized as much by soaring energy demand, due in large part to AI, as it is by rapid changes in the resource mix," Phillips wrote.

Amazon purchased a 960-megawatt data center next to the Susquehanna power plant for $650 million earlier this year. Following the announcement, PJM sought to increase the amount of power running directly to the co-located data center. However, the move faced pushback from regional utilities, including Exelon and American Electric Power (AEP).

Earth

California Inks Sustainable Aviation Fuel Deal With Major Airlines 65

California signed an agreement with major airlines to increase the use of sustainable aviation fuels, aiming to reach 200 million gallons by 2035 or about 40% of the state's air travel demand. The Hill reports: The California Air Resources Board (CARB) and Airlines for America (A4A) -- an industry trade group representing almost a dozen airlines -- pledged to increase the availability of sustainable aviation fuels statewide. Sustainable aviation fuels -- lower-carbon alternatives to petroleum-based jet fuels -- are typically made from nonpetroleum feedstocks, such as biomass or waste. At a San Francisco International Airport ceremony Wednesday, the partners committed (PDF) to using 200 million gallons of such fuels by 2035 -- an amount estimated to meet about 40 percent of travel demand within the state at that point, according to CARB. That quantity also represents a more than tenfold increase from current usage levels of these fuels, the agency added.

Among A4A member airlines are Alaska Airlines, American Airlines, Atlas Air Worldwide, Delta Air Lines, FedEx, Hawaiian Airlines, JetBlue Airways, Southwest Airlines, United Airlines and UPS, while Air Canada is an associate member. To achieve the 2035 goals, CARB and A4A said they plan to work together to identify, assess and prioritize necessary policy measures, such as incentivizing relevant investments and streamlining the permitting processes. A Sustainable Aviation Fuel Working Group, which will include government and industry stakeholders, will meet annually to both discuss progress and address barriers toward meeting these goals, the partners added. A public website will display updated information about the availability and use of conventional and sustainable fuels across California, while also providing details about state policies, according to the agreement.
Government

US Plans $825 Million Investment For New York Semiconductor R&D Facility (reuters.com) 26

The Biden administration is investing $825 million in a new semiconductor research and development facility in Albany, New York. Reuters reports: The New York facility will be expected to drive innovation in EUV technology, a complex process necessary to make semiconductors, the U.S. Department of Commerce and Natcast, operator of the National Semiconductor Technology Center (NSTC), said. The launch of the facility "represents a key milestone in ensuring the United States remains a global leader in innovation and semiconductor research and development," Commerce Secretary Gina Raimondo said. From the U.S. Department of Commerce press release: EUV lithography is essential for manufacturing smaller, faster, and more efficient microchips. As the semiconductor industry pushes the limits of Moore's Law, EUV lithography has emerged as a critical technology to enable the high-volume production of transistors beyond 7nm, previously unattainable. As the NSTC develops capabilities and programs, access to EUV lithography R&D is essential to meet its three primary goals: 1) extend U.S. technology leadership, 2) reduce the time and cost to prototype, and 3) build and sustain a semiconductor workforce ecosystem.
Businesses

Siemens To Buy Altair For $10.6 Billion In Digital Portfolio Push (yahoo.com) 10

An anonymous reader quotes a report from Reuters: Siemens will buy Altair Engineering for $10.6 billion, the American engineering software firm said on Wednesday, as the German company seeks to strengthen its presence in the fast-growing industrial software market. The offer price of $113 per share represents a premium of about 18.7% to Altair's closing price on Oct. 21, a day before Reuters first reported that the company was exploring a sale. The deal for Michigan-based Altair is Siemens's biggest acquisition since Siemens Healthineers bought medical device maker Varian Medical Systems for $16.4 billion in 2020. [...]

The transaction is anticipated to add to Siemens' earnings per share in about two years from the deal's closing, which is expected in the second half of 2025. It will also increase Siemens' digital business revenue by about 8%, adding approximately 600 million euros ($651.36 million) to the company's digital business revenue in fiscal 2023. The transaction would have a revenue impact of about $500 million per year in the mid-term and more than $1 billion per year in the long term, Siemens said.

Microsoft

Microsoft Calls Out Google For Running 'Shadow Campaigns' in Europe To Influence Regulators (cnbc.com) 25

Microsoft took the unusual step on Monday of publicly criticizing longtime rival Google for running "shadow campaigns" in Europe designed to discredit the software giant with regulators. CNBC: Microsoft lawyer Rima Alaily wrote in a blog post that Google hired a firm to recruit European cloud companies to represent the search company's case. "This week an astroturf group organized by Google is launching," Alaily wrote. "It is designed to discredit Microsoft with competition authorities, and policymakers and mislead the public. Google has gone through great lengths to obfuscate its involvement, funding, and control, most notably by recruiting a handful of European cloud providers, to serve as the public face of the new organization."

The conflict represents a fresh battle between two companies that compete in cloud infrastructure as well as online advertising and productivity software. The latest chapter surfaces as Google faces heightened regulatory pressure in Europe and in the U.S., where it's in the midst of its second antitrust trial against the Justice Department. Alaily suggested in Monday's post that Google hired advisory firm DGA Group to set up the Open Cloud Coalition. One company that opted not to participate in the group told Microsoft that the coalition would receive financial backing from Google and criticize Microsoft's practices in Europe, Alaily wrote.

Power

Researchers Develop New Lithium Extraction Method With 'Nearly Double the Performance' (pv-magazine.com) 21

PV Magazine reports: Researchers in Australia and China have developed an innovative technology enabling direct lithium extraction from difficult-to-process sources like saltwater, which they say represents a substantial portion of the world's lithium potential.

Until now, up to 75% of the world's lithium-rich saltwater sources have remained untapped because of technical limitations, but given predictions that global lithium supply could fall short of demand as early as 2025, the researchers believe they have a game-changing solution. Their technology is a type of nanofiltration system that uses ethylenediaminetetraacetic acid, or EDTA, as a chelating agent to selectively separate lithium from other minerals, especially magnesium, which is often present in brines and difficult to remove.

"With some predicting global lithium supply could fall short of demand as early as 2025, the innovative technology sets a new standard in lithium processing," writes SciTechDaily: The work, co-led by Dr Zhikao Li, from the Monash Suzhou Research Institute and the Department of Chemical and Biological Engineering, and Professor Xiwang Zhang from the University of Queensland, promises to meet the surging demand for lithium and paves the way for more sustainable and efficient extraction practices... "Our technology achieves 90 percent lithium recovery, nearly double the performance of traditional methods, while dramatically reducing the time required for extraction from years to mere weeks," Dr. Li said.

The technology also turns leftover magnesium into a valuable, high-quality product that can be sold, reducing waste and its impact on the environment. Beyond its advanced efficiency, the EALNF system brings innovation to address major environmental concerns associated with lithium extraction. Unlike conventional methods that deplete vital water resources in arid regions, the technology produces freshwater as a by-product.

Dr Li said the system was flexible and ready for large-scale use, meaning it can quickly expand from testing to full industrial operations. "This breakthrough is crucial for avoiding a future lithium shortage, making it possible to access lithium from hard-to-reach sources and helping power the shift to clean energy."

"Our scalable process minimizes environmental impact while maximizing resource utilization," according to the researchers' article in Nature Sustainability, "thereby catalysing the shift toward a more sustainable future."

Thanks to long-time Slashdot reader schwit1 for sharing the news.
Math

Former Nvidia Engineer Discovers 41-Million-Digit Prime (tomshardware.com) 29

Former Nvidia engineer Luke Durant, working with the Great Internet Mersenne Prime Search (GIMPS), recently discovered the largest known prime number: (2^136,279,841)-1 or M136279841 (where the number following the letter M represents the exponent). The achievement was detailed on Mersenne.org. Tom's Hardware reports: This is the largest prime number we've seen so far, with the last one, M82589933, being discovered six years prior. What makes this discovery particularly fascinating is that this is the first GIMPS discovery that used the power of data center GPUs. Mihai Preda was the first one to harness GPU muscle in 2017, says the GIMPS website, when he "wrote the GpuOwl program to test Mersenne numbers for primality, making his software available to all GIMPS users." When Luke joined GIMPS in 2023, they built the infrastructure needed to deploy Preda's software across several GPU servers available in the cloud.

While it took a year of testing, Luke's efforts finally bore fruit when an A100 GPU in Dublin, Ireland gave the M136279841 result last October 11. This was then corroborated by an Nvidia H100 located in San Antonio, Texas, which confirmed its primality with the Lucas-Lehmer test.
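The Lucas-Lehmer test mentioned above is simple enough to sketch: M_p = 2^p - 1 is prime exactly when the sequence s_0 = 4, s_{k+1} = s_k^2 - 2 reaches zero modulo M_p after p - 2 steps (for odd prime p). A minimal Python version, fine for small exponents, though verifying M136279841 takes far heavier machinery than this:

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: M_p = 2^p - 1 is prime iff s_{p-2} == 0 (mod M_p),
    where s_0 = 4 and s_{k+1} = s_k^2 - 2.  Valid for odd prime p."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Small Mersenne prime exponents (11 drops out: 2^11 - 1 = 2047 = 23 * 89):
print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])  # → [3, 5, 7, 13, 17, 19]
```

The entire cost is p - 2 modular squarings of a p-bit number, which is why GIMPS runs on FFT-based big-integer multiplication and, now, data center GPUs.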

Microsoft

Microsoft Bets on Latest 'Call of Duty' To Power Up Video Games Strategy (ft.com) 27

Microsoft is seeking to boost its video games business with the release of the latest instalment of the Call of Duty franchise on Friday, pushing to increase subscription revenues through the new game to offset falling Xbox console sales. Financial Times: Black Ops 6 is the first of the best-selling series to be launched on the tech giant's Game Pass subscription service. It represents the biggest test of the company's gaming strategy [non-paywalled link] since its $75bn deal to acquire Activision Blizzard -- makers of Call of Duty -- received sign-off from regulators last year. Microsoft hopes that the release will help achieve its target of reaching 110mn Game Pass subscribers by 2030, a substantial rise from 34mn in February this year.

The company has shifted its focus towards its subscription games service as hardware sales have slowed in recent years. Xbox hardware revenue fell 13 per cent year-on-year in Microsoft's fiscal 2024, which ended in June. For the first time this year, subscribers to Game Pass, who can already access a growing library of Xbox titles for as long as they keep paying a monthly fee, will be able to access the latest Call of Duty without having to pay a traditional price of $70 or more for the packaged game. Microsoft is still making the game available to buy on PlayStation, after concerns from regulators during the Activision merger probe that it might make the title exclusive to its own platform.

Businesses

Cable Companies Ask 5th Circuit To Block FTC's Click-to-Cancel Rule (arstechnica.com) 55

Cable companies, advertising firms, and newspapers are asking courts to block a federal "click-to-cancel" rule that would force businesses to make it easier for consumers to cancel services. From a report: Lawsuits were filed yesterday, about a week after the Federal Trade Commission approved a rule that "requires sellers to provide consumers with simple cancellation mechanisms to immediately halt all recurring charges."

Cable lobby group NCTA-The Internet & Television Association and the Interactive Advertising Bureau trade group sued the FTC in the conservative US Court of Appeals for the 5th Circuit. The lawsuit claims the 5th Circuit is a proper venue because a third plaintiff, the Electronic Security Association, has its principal offices in Dallas. That group represents security companies such as ADT.

Math

Physicist Reveals Why You Should Run in The Rain (sciencealert.com) 116

Theoretical physicist Jacques Treiner, from the University of Paris Cite, explains why you should run in the rain: ... Let p represent the number of drops per unit volume, and let a denote their vertical velocity. We'll denote Sh as the horizontal surface area of the individual (e.g., the head and shoulders) and Sv as the vertical surface area (e.g., the body). When you're standing still, the rain only falls on the horizontal surface, Sh. This is the amount of water you'll receive on these areas. Even if the rain falls vertically, from the perspective of a walker moving at speed v, it appears to fall obliquely, with the angle of the drops' trajectory depending on your speed. During a time period T, a raindrop travels a distance of aT. Therefore, all raindrops within a shorter distance will reach the surface: these are the drops inside a cylinder with a base of Sh and a height of aT, which gives:
p · Sh · a · T

As we have seen, as we move forward, the drops appear to be animated by an oblique velocity that results from the composition of velocity a and velocity v. The number of drops reaching Sh remains unchanged, since velocity v is horizontal and therefore parallel to Sh. However, the number of drops reaching surface Sv -- which was previously zero when the walker was stationary -- has now increased. This is equal to the number of drops contained within a horizontal cylinder with a base area of Sv and a length of v.T. This length represents the horizontal distance the drops travel during this time interval. In total, the walker receives a number of drops given by the expression:
p · (Sh · a + Sv · v) · T

Now we need to take into account the time interval during which the walker is exposed to the rain. If you're covering a distance d at constant speed v, the time you spend walking is d/v. Plugging this into the equation, the total amount of water you encounter is:
p · (Sh · a + Sv · v) · d/v = p · (Sh · a/v + Sv) · d
This equation proves that the faster you move, the less water hits your head and shoulders, but the amount of water hitting the vertical part of your body remains constant. To stay drier, it's best to move quickly and lean forward. However, you'll have to increase your speed to offset the exposed surface area caused by leaning.
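Plugging illustrative numbers into this expression confirms the conclusion: the water collected decreases with walking speed and approaches the constant p · Sv · d as v grows. A minimal sketch (all numeric values are assumptions for illustration, not from the article):

```python
def water_collected(v, p=1e-3, a=9.0, Sh=0.1, Sv=0.7, d=100.0):
    """Total water swept up while covering distance d at walking speed v.
    p: drops per unit volume, a: drop fall speed, Sh/Sv: horizontal and
    vertical body areas, d: distance. Implements p*(Sh*a/v + Sv)*d."""
    return p * (Sh * a / v + Sv) * d

for v in (1.0, 2.0, 5.0, 10.0):
    # the Sh term shrinks as 1/v; the Sv term is independent of speed
    print(f"v = {v:4.1f} -> water = {water_collected(v):.4f}")
```

Doubling your speed halves the water landing on your head and shoulders, while the vertical-surface term stays fixed, which is exactly the argument for moving quickly.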
Math

A Calculator's Most Important Button Has Been Removed (theatlantic.com) 108

Apple's latest iOS update has removed the "C" button from its Calculator app, replacing it with a backspace function. The change, part of iOS 18, has sparked debate among users accustomed to the traditional clear function. The removal of the "C" button represents a significant departure from decades-old calculator design conventions, The Atlantic writes. From the story: The "C" button's function is vestigial. Back when calculators were commercialized, starting in the mid-1960s, their electronics were designed to operate as efficiently as possible. If you opened up a desktop calculator in 1967, you might have found a dozen individual circuit boards to run and display its four basic mathematical functions. Among these would have been an input buffer or temporary register that could store an input value for calculation and display. The "C" button, which was sometimes labeled "CE" (Clear Entry) or "CI" (Clear Input), provided a direct interface to zero out -- or "clear" -- such a register. A second button, "AC" (All Clear), did the same thing, but for other parts of the circuit, including previously stored operations and pending calculations. (A traditional calculator's memory buttons -- "M+," "M-," "MC" -- would perform simple operations on a register.)

By 1971, Mostek and Texas Instruments had developed a "calculator on a chip," which condensed all of that circuitry into a single integrated circuit. Those chips retained the functions of their predecessors, including the ones engaged by the "C" and "AC" buttons. This design carried on into the era of pocket calculators, financial calculators, and even scientific calculators such as the ones you may have used in school. Some of the latter were, in essence, programmable pocket computers themselves, and they could have been configured with a backspace key. They were not.
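The distinction the story draws between "C"/"CE", "AC", and a backspace key can be sketched as a toy register model (a hypothetical illustration, not a reconstruction of any real calculator's circuitry or Apple's app):

```python
class InputRegister:
    """Toy model of a calculator's entry register and pending state."""

    def __init__(self):
        self.entry = ""       # digits currently being typed
        self.pending = []     # stored operations / pending calculations

    def press_digit(self, d):
        self.entry += d

    def clear_entry(self):    # "C" / "CE": zero out the current entry only
        self.entry = ""

    def all_clear(self):      # "AC": wipe the entry AND pending operations
        self.entry = ""
        self.pending = []

    def backspace(self):      # iOS 18 style: delete one digit at a time
        self.entry = self.entry[:-1]

r = InputRegister()
for d in "123":
    r.press_digit(d)
r.backspace()      # entry is now "12" -- a single digit removed
r.clear_entry()    # entry is now ""   -- the whole entry gone at once
```

The point of the complaint is visible in the last two lines: backspace undoes one keystroke, while "C" discards the entry in a single press.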

Encryption

Debunking Hype: China Hasn't Broken Military Encryption with Quantum (forbes.com) 43

An anonymous reader shared this report from Forbes: Recent headlines have proclaimed that Chinese scientists have hacked "military-grade encryption" using quantum computers, sparking concern and speculation about the future of cybersecurity. The claims, largely stemming from a recent South China Morning Post article about a Chinese academic paper published in May, were picked up by many more serious publications.

However, a closer examination reveals that while Chinese researchers have made incremental advances in quantum computing, the news reports are a huge overstatement. "Factoring a 50-bit number using a hybrid quantum-classical approach is a far cry from breaking 'military-grade encryption'," said Dr. Erik Garcell, Head of Technical Marketing at Classiq, a quantum algorithm design company. While advancements have indeed been made, the progress represents incremental steps rather than a paradigm-shifting breakthrough that renders current cryptographic systems obsolete. "This kind of overstatement does more harm than good," Dr. Garcell said. "Misrepresenting current capabilities as 'breaking military-grade encryption' is not just inaccurate — it's potentially damaging to the field's credibility...."

In fact, the Chinese paper in question, titled Quantum Annealing Public Key Cryptographic Attack Algorithm Based on D-Wave Advantage, does not mention military-grade encryption, which typically involves algorithms like the Advanced Encryption Standard (AES). Instead, the paper is about attacking RSA encryption (RSA stands for Rivest-Shamir-Adleman, named after its creators)... While factoring a 50-bit integer is an impressive technical achievement, it's important to note that RSA encryption commonly uses key sizes of 2048 bits or higher. The difficulty of factoring increases exponentially with the size of the number, meaning that the gap between 50-bit and 2048-bit integers is astronomically large.
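To see how unremarkable a 50-bit factorization is, note that composites of that size fall to ordinary classical algorithms in a fraction of a second. The sketch below factors a roughly 50-bit semiprime (an illustrative modulus, not the number from the paper) with Pollard's rho, using only the standard library:

```python
import math
import random

def pollard_rho(n):
    """Classical Pollard's rho: returns a nontrivial factor of composite n.
    For ~50-bit semiprimes this needs only a few thousand iterations."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n        # tortoise: one step
            y = (y * y + c) % n        # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                     # d == n means the cycle failed; retry
            return d

random.seed(1)                         # fixed seed for a reproducible run
n = 15485863 * 32452843                # ~49-bit semiprime of two known primes
f = pollard_rho(n)
print(f, n // f)                       # recovers the two prime factors
```

This finishes almost instantly on any laptop, whereas the 2048-bit moduli used in deployed RSA remain far beyond both this method and the hybrid quantum-classical approach described in the paper, since the difficulty grows exponentially with the bit length.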

Moreover, the methods used involve a hybrid approach that combines quantum annealing with classical computation. This means that the quantum annealer handles part of the problem, but significant processing is still performed by classical algorithms. The advances do not equate to a scalable method for breaking RSA encryption as it is used in practical applications today.

Duncan Jones, Head of Cybersecurity at Quantinuum, tells Forbes that if China had actually broken AES, it would keep that secret rather than publicize it in newspapers.
