United Kingdom

Why US Tech Giants Are Threatening to Leave the UK (bbc.com) 15

"It was difficult to maintain a poker face when the leader of a big US tech firm I was chatting to said there was a definite tipping point at which the firm would exit the UK," writes a BBC technology editor: Many of these companies are increasingly fed up. Their "tipping point" is UK regulation — and it's coming at them thick and fast. The Online Safety Bill is due to pass in the autumn. Aimed at protecting children, it lays down strict rules around policing social media content, with high financial penalties and prison time for individual tech execs if the firms fail to comply. One clause that has proved particularly controversial is a proposal that encrypted messages, which includes those sent on WhatsApp, can be read and handed over to law enforcement by the platforms they are sent on, if there is deemed to be a national security or child protection risk...

Currently messaging apps like WhatsApp, Proton and Signal, which offer this encryption, cannot see the content of these messages themselves. WhatsApp and Signal have both threatened to quit the UK market over this demand.

The Digital Markets Bill is also making its way through Parliament. It proposes that the UK's competition watchdog selects large companies like Amazon and Microsoft, gives them rules to comply with and sets punishments if they don't. Several firms have told me they feel this gives an unprecedented amount of power to a single body. Microsoft reacted furiously when the Competition and Markets Authority (CMA) chose to block its acquisition of the video game giant Activision Blizzard. "There's a clear message here — the European Union is a more attractive place to start a business than the United Kingdom," raged Microsoft president Brad Smith. The CMA has since re-opened negotiations with Microsoft. This is especially damning because the EU is also introducing strict rules in the same vein — but it is collectively a much larger and therefore more valuable market.

In the UK, proposed amendments to the Investigatory Powers Act, which would require tech firms to get Home Office approval for new security features before worldwide release, incensed Apple so much that it threatened to remove FaceTime and iMessage from the UK if they go through. Clearly the UK cannot, and should not, be held to ransom by US tech giants. But the services they provide are widely used by millions of people. And rightly or wrongly, there is no UK-based alternative to those services.

The article concludes that "It's a difficult line to tread. Big Tech hasn't exactly covered itself in glory with past behaviours — and lots of people feel regulation and accountability is long overdue."
Cloud

In Generative AI Market, Amazon Chases Microsoft and Google with Custom AWS Chips (cnbc.com) 6

An anonymous reader shared this report from CNBC: In an unmarked office building in Austin, Texas, two small rooms contain a handful of Amazon employees designing two types of microchips for training and accelerating generative AI. These custom chips, Inferentia and Trainium, offer AWS customers an alternative to training their large language models on Nvidia GPUs, which have been getting difficult and expensive to procure. "The entire world would like more chips for doing generative AI, whether that's GPUs or whether that's Amazon's own chips that we're designing," Amazon Web Services CEO Adam Selipsky told CNBC in an interview in June. "I think that we're in a better position than anybody else on Earth to supply the capacity that our customers collectively are going to want...."

In the long run, said Chirag Dekate, VP analyst at Gartner, Amazon's custom silicon could give it an edge in generative AI...

With millions of customers, Amazon's AWS cloud service "still accounted for 70% of Amazon's overall $7.7 billion operating profit in the second quarter," CNBC notes. But does that give them a competitive advantage?

A technology VP for the service tells them "It's a question of velocity. How quickly can these companies move to develop these generative AI applications is driven by starting first on the data they have in AWS and using compute and machine learning tools that we provide." In June, AWS announced a $100 million generative AI innovation "center."

"We have so many customers who are saying, 'I want to do generative AI,' but they don't necessarily know what that means for them in the context of their own businesses. And so we're going to bring in solutions architects and engineers and strategists and data scientists to work with them one on one," AWS CEO Selipsky said... For now, Amazon is only accelerating its push into generative AI, telling CNBC that "over 100,000" customers are using machine learning on AWS today. Although that's a small percentage of AWS's millions of customers, analysts say that could change.

"What we are not seeing is enterprises saying, 'Oh, wait a minute, Microsoft is so ahead in generative AI, let's just go out and let's switch our infrastructure strategies, migrate everything to Microsoft.' Dekate said. "If you're already an Amazon customer, chances are you're likely going to explore Amazon ecosystems quite extensively."

Space

Could Supermassive Black Holes Explain Our Universe's Gravitational-Wave 'Hum'? (space.com) 8

"Earlier this year, after 15 years of searching, scientists finally heard the background hum of low-frequency gravitational waves that fill our universe," writes Space.com.

"Now, the hard work of searching for the source of these ripples in spacetime can begin." Currently, the primary suspects in this case are pairings of supermassive black holes with masses millions, or even billions, of times that of the sun. However, that doesn't mean that there isn't room for a few unusual suspects, which could potentially point us toward new physics....

[G]ravitational waves detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) express wavelengths that are thousands of miles (or km) in length and oscillate with periods of milliseconds to seconds. The new gravitational waves detected by the North American Nanohertz Observatory for Gravitational Waves (NANOGrav), by contrast, have wavelengths on a scale of trillions of miles (or km). This is similar to the distance between the sun and its neighboring star, Proxima Centauri — a staggering distance of roughly 25 trillion miles, or about 4.2 light-years. Plus, NANOGrav gravitational waves have periods on scales of years instead of mere seconds. Practically, what this means is scientists needed to accumulate over 15 years of NANOGrav data to confirm a low-frequency gravitational wave detection.
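A quick sanity check (ours, not the article's) shows why a wavelength on that scale lands in the nanohertz band, using nothing more than f = c/λ:

```python
# Back-of-the-envelope check: a gravitational wave with a wavelength comparable
# to the Sun-Proxima Centauri distance oscillates at nanohertz frequencies.
C = 299_792_458.0                          # speed of light, m/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600
LIGHT_YEAR_M = C * SECONDS_PER_YEAR        # metres in one light-year
METERS_PER_MILE = 1609.344

wavelength_m = 4.2 * LIGHT_YEAR_M          # ~ Sun-Proxima Centauri distance
frequency_hz = C / wavelength_m            # f = c / lambda
period_years = (1.0 / frequency_hz) / SECONDS_PER_YEAR

print(f"wavelength ~ {wavelength_m / METERS_PER_MILE / 1e12:.0f} trillion miles")
print(f"frequency  ~ {frequency_hz * 1e9:.1f} nanohertz")
print(f"period     ~ {period_years:.1f} years")
```

The numbers come out to roughly 25 trillion miles, about 7.5 nanohertz, and a period of a little over four years, which is exactly the regime a pulsar timing array needs many years of data to see.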

But, when it happens, it's worth the wait. That's because these results have the capacity to point us toward new information about our universe... "The detection of low-frequency gravitational waves means they're from very different sources to the LIGO and Virgo sources, which are stellar mass black holes and neutron star mergers," Scott Ransom, a National Radio Astronomy Observatory astronomer and former chair of NANOGrav, told Space.com... Ransom is part of a collaboration of researchers who believe low-frequency gravitational waves, including those detected by NANOGrav, may originate from a pretty incredible source. They could come from, the team argues, hundreds of thousands of supermassive black hole pairings that, over the 13.8-billion-year course of cosmic history, came close enough together that they've merged...

"For many decades, theorists have hypothesized that supermassive black hole binaries should produce a signal with characteristics just like what NANOGrav and other pulsar timing arrays are seeing," Luke Zoltan Kelly, a Northwestern University theoretical astrophysicist and NANOGrav researcher, told Space.com. "For most of the community, supermassive black hole binaries are a natural best guess for what's producing the gravitational wave background...." Zoltan Kelley pointed out to Space.com that besides binaries, there are a number of new models in cosmology and in particle physics that, under the right circumstances, could also produce a similar gravitational wave background to that detected by NANOGrav. For example, axion or 'fuzzy' dark matter, cosmic strings, inflationary phase transitions, and many others," the Northwestern astrophysicist said.

"What's really exciting about these possibilities is that each of these models is an attempt to explain some of the biggest current mysteries of our universe."

AI

Stack Overflow 'Evolves', Previewing AI-Powered Answers and Chat Followups (stackoverflow.blog) 31

"Stack Overflow is adding artificial intelligence to its offerings," reports ZDNet (which notes traffic to the Q&A site has dropped 5% in the last year).

So in a video, Stack Overflow's CEO Prashanth Chandrasekar says that search and question-asking "will evolve to provide you with instant summarized solutions with citations to sources, aggregated by generative AI — plus the option to ask follow-up questions in a chat-like format."

The New Stack provides some context: As computer scientist Santiago Valdarrama remarked in a tweet, "I don't remember the last time I visited Stack Overflow. Why would I when tools like Copilot and ChatGPT answer my questions faster without making me feel bad for asking?" It's a problem Stack Overflow CEO Prashanth Chandrasekar acknowledges because, well, he encountered it too.

"When I first started using Stack Overflow, I remember my first experience was quite harsh, because I basically asked a fairly simple question, but the standard on the website is pretty high," Chandrasekar told The New Stack. "When ChatGPT came out, it was a lot easier for people to go and ask ChatGPT without anybody watching...."

But what may be of more interest to developers is that Stack Overflow is now offering an IDE (integrated development environment) extension for Visual Studio Code that will be powered by OverflowAI. This means that coders will be able to ask a conversational interface a question and find solutions from within the IDE.

Stack Overflow also is launching a GenAI Stack Exchange, where the community can post and share knowledge on prompt engineering, getting the most out of AI and similar topics.

And they're integrating it into other workflows as well. "Of course, AI isn't replacing humans any time soon," CEO Chandrasekar says in the video. "But it can help you draft a question to pose to our community..."

Signups for the OverflowAI preview are available now. "With your help, we'll be putting AI to work," CEO Chandrasekar says in the video.
Power

How Laser Sensors Could Improve America's Electric Grid (npr.org) 39

By 2035 America needs a 43% increase in its power-transmitting capacity, according to an analysis by the REPEAT project. But NPR reports there's another way to quickly improve capacity without building new transmission lines: That's where the laser sensors come in, says Jon Marmillo, co-founder of LineVision, the company that makes them. Sensors can help utilities get real-time data on their power lines, which can allow them to send more renewable electricity through the wires. This tech is part of a suite of innovations that could help the U.S. increase its grid capacity faster and cheaper than building new transmission lines...

At any given moment, utilities typically know how much power is going through their lines. But they aren't required to know the real-time conditions of those lines, like the wind speed or how hot the line is. Without that data, utilities have to use conservative standards for how much power can safely flow, says Jake Gentle, senior program manager for infrastructure security at Idaho National Laboratory. But when sensors gather information from the wires — about wind, temperature, and wire sag — that data allows utilities to go beyond their conservative standards and safely put more electricity through the wires... With this tech, called "dynamic line rating", utilities are able to increase the capacity of their lines — sometimes as much as 40%, says Gentle.
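To make "going beyond conservative standards" concrete, here is a deliberately simplified heat-balance sketch of dynamic line rating. It is a toy illustration with made-up constants, not LineVision's product or the full IEEE 738 model utilities actually use, but it captures the core idea: more wind means more cooling, which means more allowable current at the same conductor temperature limit.

```python
import math

# Toy steady-state heat balance for a bare overhead conductor (per metre of line).
# All constants below are illustrative placeholders, not real utility parameters.
def dynamic_line_rating(wind_speed_ms, ambient_c, conductor_max_c=75.0,
                        diameter_m=0.028, resistance_ohm_per_m=7e-5,
                        emissivity=0.8, absorptivity=0.8, solar_wm2=1000.0):
    sigma = 5.670e-8                       # Stefan-Boltzmann constant, W/m^2K^4
    t_c, t_a = conductor_max_c + 273.15, ambient_c + 273.15

    # Convective cooling: crude linear-in-wind approximation (illustrative only).
    h = 4.0 + 4.0 * wind_speed_ms          # toy heat-transfer coefficient, W/m^2K
    q_convection = h * math.pi * diameter_m * (t_c - t_a)

    # Radiative cooling and solar heating per metre of conductor.
    q_radiation = emissivity * sigma * math.pi * diameter_m * (t_c**4 - t_a**4)
    q_solar = absorptivity * solar_wm2 * diameter_m

    # Allowable current from I^2 * R = net cooling available at the temperature limit.
    net_cooling = q_convection + q_radiation - q_solar
    return math.sqrt(max(net_cooling, 0.0) / resistance_ohm_per_m)

# A 1 m/s breeze vs. a 5 m/s wind at 30 C ambient: the same wire can carry far more.
print(round(dynamic_line_rating(1.0, 30.0)), "A vs", round(dynamic_line_rating(5.0, 30.0)), "A")
```

With these toy numbers the rating jumps from roughly 700 A to well over 1,000 A as the wind picks up, which is the kind of headroom the article's 25 to 40 percent figures are describing.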

One Pittsburgh company using similar technology told NPR that "we found an average of 25% additional available capacity on transmission lines that were equipped with the sensors."
Firefox

Does Desktop Linux Have a Firefox Problem? (osnews.com) 97

OS News' managing editor calls Firefox "the single most important desktop Linux application," shipping in most distros (with some users later opting for a post-installation download of Chrome).

But "I'm genuinely worried about the state of browsers on Linux, and the future of Firefox on Linux in particular..." While both GNOME and KDE nominally invest in their own two browsers, GNOME Web and Falkon, their uptake is limited and releases few and far between. For instance, none of the major Linux distributions ship GNOME Web as their default browser, and it lacks many of the features users come to expect from a browser. Falkon, meanwhile, is updated only sporadically, often going years between releases. Worse yet, Falkon uses Chromium through QtWebEngine, and GNOME Web uses WebKit (which are updated separately from the browser, so browser releases are not always a solid metric!), so both are dependent on the goodwill of two of the most ruthless corporations in the world, Google and Apple respectively.

Even Firefox itself, even though it's clearly the browser of choice of distributions and Linux users alike, does not consider Linux a first-tier platform. Firefox is first and foremost a Windows browser, followed by macOS second, and Linux third. The love the Linux world has for Firefox is not reciprocated by Mozilla in the same way, and this shows in various places where issues fixed and addressed on the Windows side are ignored on the Linux side for years or longer. The best and most visible example of that is hardware video acceleration. This feature has been a default part of the Windows version since forever, but it wasn't enabled by default for Linux until Firefox 115, released only in early July 2023. Even then, the feature is only enabled by default for users of Intel graphics — AMD and Nvidia users need not apply. This lack of video acceleration was — and for AMD and Nvidia users, still is — a major contributing factor to Linux battery life on laptops taking a serious hit compared to their Windows counterparts... It's not just hardware accelerated video decoding. Gesture support has taken much longer to arrive on the Linux version than it did on the Windows version — things like using swipes to go back and forward, or pinch to zoom on images...
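For readers on AMD or Nvidia hardware who want to opt in manually, hardware video decoding can usually still be switched on through about:config. Below is a minimal sketch that appends the pref names commonly documented for VA-API on Linux to a profile's user.js; the pref names and the profile path are assumptions to verify against your own Firefox version.

```python
# Minimal sketch: append the about:config prefs commonly documented for
# enabling VA-API video decoding in Firefox on Linux. Pref names and the
# profile path are assumptions; check them against your Firefox version.
from pathlib import Path

profile = Path.home() / ".mozilla" / "firefox" / "PROFILE_DIR"  # replace with your profile folder

prefs = [
    'user_pref("media.ffmpeg.vaapi.enabled", true);',
    'user_pref("media.hardware-video-decoding.force-enabled", true);',
]

with open(profile / "user.js", "a", encoding="utf-8") as f:  # Firefox reads user.js at startup
    f.write("\n".join(prefs) + "\n")
```

After restarting Firefox, the media section of about:support should indicate whether hardware decoding is actually in use.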

I don't see anyone talking about this problem, or planning for the eventual possible demise of Firefox, what that would mean for the Linux desktop, and how it can be avoided or mitigated. In an ideal world, the major stakeholders of the Linux desktop — KDE, GNOME, the various major distributions — would get together and seriously consider a plan of action. The best possible solution, in my view, would be to fork one of the major browser engines (or pick one and significantly invest in it), and modify this engine and tailor it specifically for the Linux desktop. Stop living off the scraps and leftovers thrown across the fence from Windows and macOS browser makers, and focus entirely on making a browser engine that is optimised fully for Linux, its graphics stack, and its desktops. Have the major stakeholders work together on a Linux-first — or even Linux-only — browser engine, leaving the graphical front-end to the various toolkits and desktop environments....

I think it's highly irresponsible of the various prominent players in the desktop Linux community, from GNOME to KDE, from Ubuntu to Fedora, to seemingly have absolutely zero contingency plans for when Firefox enshittifies or dies...

AI

Will Quantum Computing Supercharge AI - and Then Transform Our Understanding of Reality? (scmp.com) 78

Quantum computing could turbo-charge AI into something "massively, universally transformative," argues the South China Morning Post, citing a quote from theoretical physicist Michio Kaku. "AI has the ability to learn new, complex tasks, and quantum computers can provide the computational muscle it needs..."

"AI will give us the ability to create learning machines that can begin to mimic human abilities, while quantum computers may provide the calculational power to finally create an intelligent machine." Where AI brings an ability to self-improve and learn from its mistakes, quantum computers add speed and power. Google CEO Sundar Pichai has said "AI can accelerate quantum computing, and quantum computing can accelerate AI...."

Complex calculations that would take classical supercomputers thousands of years to crunch could, in theory, be completed by quantum computers in minutes... In expectation of its advantages, the automotive industry is already collaborating with pioneers in the quantum-computing arena. Daimler has partnered with IBM, Volkswagen with D-Wave Systems (a Canadian quantum-computing firm) and Hyundai with IonQ. "If you can increase the energy density of your battery by another factor of two, three or four, then instead of 300 miles (480km), you can go 600 miles and 1,200 miles on [one] charge," says Kim. "That actually starts to cross the threshold where they become so much more attractive than fossil fuel. And then we can really make an impact on global warming and all these problems..."

Similarly, the mysteries of carbon sequestration could be unravelled by quantum computing, with clear benefits for the efforts to reverse global warming. Drug design at the molecular level could be revolutionised, opening up new avenues for vaccines and, for example, personalised cancer treatment. There's no doubt about it: with effective quantum computing our understanding of chemical processes could become godlike. Finance and investment, too, could be revolutionised by the qubit. The huge range of factors that produce market fluctuations allow for an almost infinite range of possible outcomes, and modelling these possibilities would be relatively simple for quantum computers. Forecasts of market movements would become far more accurate...

For many physicists and mathematicians, every step of the journey towards functional and world-changing quantum computers assumes acknowledgement of an even more profound goal: a greater understanding of the nature of reality. This could also mean that the very nature of understanding has to be reconsidered.

The article suggests we "occupy ourselves with the dawning realisation that something philosophically far-reaching has begun to percolate into our shared consciousness from the laboratories of the world's quantum pioneers."
iPhone

Judge Finally Clears Way for Apple's $500 Million iPhone Throttling Settlement (siliconvalley.com) 49

"Owners of some older iPhone models are expected to receive about $65 each," reports SiliconValley.com, "after a judge cleared the way for payments in a class-action lawsuit accusing Apple of secretly throttling phone performance." The Cupertino cell phone giant agreed in 2020 to pay up to $500 million to resolve a lawsuit alleging it had perpetrated "one of the largest consumer frauds in history" by surreptitiously slowing the performance of certain iPhone models to address problems with batteries and processors...

According to the lawsuit, filed in 2018, reports of unexplained iPhone shutdowns began to surface in 2015 and increased in the fall of 2016. Consumers complained their phones were shutting off even though the batteries showed a charge of more than 30%, the lawsuit claimed. The lawsuit claimed the shutdowns resulted from a mismatch between phones' hardware, including batteries and processing chips, and the ever-increasing demands of constantly updating operating systems. Apple tried to fix the problem with a software update, but the update merely throttled device performance to cut the number of shutdowns, the lawsuit claimed... In a 2019 court filing in the case, Apple argued that lithium-ion batteries become less effective with time, repeated charging, extreme temperatures and general use. Updating software, Apple asserted in the filing, entails trade-offs. "Providing more features also introduces complexity and can reduce speed, and increasing features or speed may adversely impact hardware lifespan," the company said.

Consumer grief over the shutdowns and alleged throttling also led to a 2020 lawsuit against Apple by the State of California and Alameda and Los Angeles counties. Apple, admitting to no wrongdoing, settled the case for $113 million.

About 3 million claims were received, the article notes, and two iPhone owners who'd objected to the settlement lost their appeal this week, "removing the final obstacle to the deal..."

"The phones at issue in the case were iPhone 6, 6 Plus, 6s, 6s Plus, and SE devices running operating systems iOS 10.2.1 or later before Dec. 21, 2017, and iPhone 7 and 7 Plus phones running iOS 11.2 or later before that date."
Crime

The Untold History of Today's Russian-Speaking Hackers (ft.com) 13

Monday sees the release of "The Billion Dollar Heist," a documentary about the theft of $81 million from the Bangladesh Bank, considered the biggest cyber-heist of all time. The film's executive producer wrote the book DarkMarket: How Hackers Became the New Mafia (and is also rector of the Institute for Human Sciences).

But he's also written an article for the Financial Times outlining the complicated background of Russian-speaking hacker gangs responsible for malware and ransomware, starting with "one of the most remarkable if little-known events in post-cold war history: the first and, to my knowledge, the last publicly organised conference of avowed criminals" in May, 2002.

The First Worldwide Carders Conference was the brainchild of the administrators of a landmark website, carderplanet.com. Known as "the family", this was a mixed group of young men, both Ukrainians and Russians, who had spent the previous 10 years growing up in a lively atmosphere of gangster capitalism. During the 1990s, conventional law and order in the former Soviet Union had broken down. The collapse of the communist system had left a vacuum in which new forms of economic activity were emerging...

Founded a year before the conference, CarderPlanet revolutionised web-based criminal activity, especially the lucrative trade in stolen or cloned credit card data, by solving the conundrum that until then had faced every bad guy on the web: how can I do business with this person, as I know he's a criminal, so he must be untrustworthy by definition? To obviate the problem, the CarderPlanet administrators created an escrow system for criminals. They would act as guarantor of any criminal sale of credit and debit card data — a disinterested party mediating between the vendor and the purchaser... The escrow system led to an explosion of credit card crime around the world in which many criminal fortunes were made....

Roman Stepanenko Vega, a Russian-speaking Ukrainian national who was one of the founders and administrators of CarderPlanet, explained to me how "two days before the conference's opening, we received a visit from an FSB [Federal Security Service] officer in Moscow. He explained that Moscow had no objections to us cloning credit cards or defrauding banks in Europe and the United States but anywhere within the CIS was off limits." In addition, the FSB officer let CarderPlanet know that if the Russian state ever required assistance from criminal gangs, it would be expected to co-operate...

Members of criminal gangs were later recruited into notorious state-backed hacking teams such as Advanced Persistent Threat 28.

A 2021 ransomware attack on Colonial Pipeline brought warnings of a U.S. counterattack, the article notes, after which "Russian police started arresting and imprisoning cyber criminal groups." Ransomware attacks now seem particularly focused on Europe, and "According to cyber-security experts, the Russian government is giving these criminal groups information on potential targets." But once more the hackers have been careful not to cross what the Americans consider red lines, as advised, presumably, by Russia's security services. Russia is probably confident that disrupting European businesses will be unlikely to provoke a cyber attack. But the U.S. — whether its government, municipalities or police — remains strictly off-limits.
Thanks to long-time Slashdot reader Geoffrey.landis for sharing the article.
Science

Why Was Silicon Valley So Obsessed with LK-99 Superconductor Claims? (msn.com) 66

What to make of the news that early research appears unable to duplicate the much-ballyhooed claims for the LK-99 superconductor?

"The episode revealed the intense appetite in Silicon Valley for finding the next big thing," argues the Washington Post, "after years of hand-wringing that the tech world has lost its ability to come up with big, world-changing innovations, instead channeling all its money and energy into building new variations of social media apps and business software..." [M]any tech leaders are nervous that the current focus on consumer and business software has led to stagnation. A decade ago, investors prophesied that self-driving cars would take over the roads by the mid-2020s — but they are still firmly in the testing phase, despite billions of dollars of investment. Cryptocurrencies and blockchain technology have had multiple hype cycles of their own, but have yet to fundamentally change any industry, besides crime and money laundering. Tech meant to help mitigate climate change, like carbon capture and storage, has lagged without major advances in years. Meanwhile, Big Tech companies used their huge cash hoards to snap up smaller competitors, with antitrust regulators only recently beginning to clamp down on consolidation. Over the last year, as higher interest rates have cut into the amount of venture capital and slowing growth has caused companies to pull back spending, a massive wave of layoffs has swept the industry, and companies such as Google that previously said they'd invest some of their profits in big, risky ideas have turned away from such "moonshots..."

Room-temperature superconductors would be especially relevant to the tech industry right now, which is busy burning billions of dollars on new computer chips and the energy costs to run them to train the AI models behind tools like ChatGPT and Google's Bard. For years, computer chips have gotten smaller and more efficient, but that progress has run up against the limits of the physical world as transistors get so small some are now just one atom thick.

Science

Common Alzheimer's Disease Gene May Have Helped Our Ancestors Have More Kids (science.org) 35

Science magazine reports: Roughly one in five people are born with at least one copy of a gene variant called APOE4 that makes them more prone to heart disease and Alzheimer's disease in old age. That the variant is so common poses an evolutionary mystery: If it decreases our fitness, why hasn't APOE4 been purged from the human population over time?

Now, a study of nearly 800 women in a traditional society in the Amazon finds that those with the disease-promoting variant had slightly more children. Such a fertility benefit may have allowed the gene to persist during human evolution despite its harmful effects for older people today...

The Tsimané data also allowed the team to home in on how APOE4 may boost fertility: Women carrying it were slightly heavier than those without it, started bearing children about 1 year earlier, and had their next child a few months sooner. That fits with being more resistant to parasites, says biological anthropologist Benjamin Trumble. "Being in a better immune state means that you can then devote more calories towards growing faster, and then you're able to reproduce faster."

Thanks to Slashdot reader sciencehabit for sharing the article.
Space

How to Turn an Asteroid into a Space Habitat (Using Self-Replicating Spider Robots) (sciencealert.com) 58

A retired Technical Fellow from Rockwell Collins "released a 65-page paper that details an easy-to-understand, relatively inexpensive, and feasible plan to turn an asteroid into a space habitat," reports Universe Today (in an article republished at Science Alert): Dr. David W. Jensen breaks the discussion into three main categories — asteroid selection, habitat style selection, and mission strategy to get there (i.e., what robots to use)... He eventually settled on a torus as the ideal habitat type and then dives into calculations about the overall station mass, how to support the inner wall with massive columns, and how to allocate floor space.

All important, but how exactly would we build such a massive behemoth? Self-replicating robots are Dr. Jensen's answer. The report's third section details a plan to utilize spider robots and a base station that can replicate themselves. He stresses the importance of only sending the most advanced technical components from Earth and using materials on the asteroid itself to build everything else, from rock grinders to solar panels...

With admittedly "back-of-the-envelope" calculations, Dr. Jensen estimates that the program would cost only $4.1 billion. That is far less than the $93 billion NASA plans to spend on the Artemis program. And the result would be a space habitat that provides 1 billion square meters of land that didn't exist before. That's a total cost of $4.10 per square meter to build land — in space. Possibly even more impressive is the timeline — Dr. Jensen estimates that the entire construction project could be done in as little as 12 years. However, it will still take longer to fill the habitat with air and water and start regulating its temperature.

AI

As Privacy Policies Get Harder to Understand, Many Allow Companies to Copy Your Content (themarkup.org) 25

An anonymous reader shared this investigative report from The Markup: Over the past quarter-century, privacy policies — the lengthy, dense legal language you quickly scroll through before mindlessly hitting "agree" — have grown both longer and denser. A study released last year found that not only did the average length of a privacy policy quadruple between 1996 and 2021, they also became considerably more difficult to understand. "Analyzing the content of privacy policies, we identify several concerning trends, including the increasing use of location data, increasing use of implicitly collected data, lack of meaningful choice, lack of effective notification of privacy policy changes, increasing data sharing with unnamed third parties, and lack of specific information about security and privacy measures," wrote De Montfort University Associate Professor Isabel Wagner, who used machine learning to analyze some 50,000 website privacy policies for the study...
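The study's full machine-learning pipeline is beyond a quick sketch, but the basic measurements behind claims like these, policy length and readability, take only a few lines to reproduce. A minimal illustration, assuming the third-party textstat package and a placeholder file name:

```python
# Minimal sketch (not the study's actual pipeline): measure how long and how
# hard to read a privacy policy is. Assumes `pip install textstat`; the input
# file name is a placeholder.
import textstat

with open("privacy_policy.txt", encoding="utf-8") as f:
    policy = f.read()

words = len(policy.split())
grade = textstat.flesch_kincaid_grade(policy)   # approximate US school grade level
ease = textstat.flesch_reading_ease(policy)     # higher score = easier to read

print(f"{words} words, reading ease {ease:.0f}, grade level {grade:.1f}")
```

Run against a handful of real policies, numbers like these make the "longer and denser" trend easy to see for yourself.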

To get a sense of what all of this means, I talked to Jesse Woo — a data engineer at The Markup who previously helped write institutional data use policies as a privacy lawyer. Woo explained that, while he can see why the language in Zoom's terms of service touched a nerve, the sentiment — that users allow the company to copy and use their content — is actually pretty standard in these sorts of user agreements. The problem is that Zoom's policy was written in a way where each of the rights being handed over to the company are specifically enumerated, which can feel like a lot. But that's also kind of just what happens when you use products or services in 2023 — sorry, welcome to the future!

As a point of contrast, Woo pointed to the privacy policy of the competing video-conferencing service Webex, which reads: "We will not monitor Content, except: (i) as needed to provide, support or improve the provision of the Services, (ii) investigate potential or suspected fraud, (iii) where instructed or permitted by you, or (iv) as otherwise required by law or to exercise or protect Our legal rights." That language feels a lot less scary, even though, as Woo noted, training AI models could likely be covered under a company taking steps to "support or improve the provision of the Services."

The article ends with a link to a helpful new guide showing "how to read any privacy policy and quickly identify the important/creepy/enraging parts."
Linux

Should There Be an 'Official' Version of Linux? (zdnet.com) 228

Why aren't more people using Linux on the desktop? Slashdot reader technology_dude shares one solution: Jack Wallen at ZDNet says establishing an "official" version of Linux may (or may not) help Linux on the desktop increase the number of users, mostly as someplace to point new users. It makes sense to me. What does Slashdot think and what would be the challenges, other than acceptance of a particular flavor?
Wallen argues this would also create a standard for hardware and software vendors to target, which "could equate to even more software and hardware being made available to Linux." (And an "official" Linux might also be more appealing to business users.) Wallen suggests it be "maintained and controlled by a collective of people from users, developers, and corporations (such as Intel and AMD) with a vested interest in the success of this project... There would also be corporate backing for things like marketing (such as TV commercials)." He also suggests basing it on Debian, and supporting both Snap and Flatpak...

In comments on the original submission, long-time Slashdot reader bobbomo points instead to kernel.org, arguing "There already is an official version of Linux called mainline. Everything else is backports." And jd (Slashdot user #1,658) believes that the official Linux is the Linux Standard Base. "All distributions, more-or-less, conform to the LSB, which gives you a pseudo 'official' Linux. About the one variable is the package manager. And there are ways to work around that."

Unfortunately, according to Wikipedia... The LSB standard stopped being updated in 2015 and current Linux distributions do not adhere to or offer it; however, the lsb_release command is sometimes still available. On February 7, 2023, a former maintainer of the LSB wrote, "The LSB project is essentially abandoned."
That post (on the lsb-discuss mailing list) argues the LSB approach was "partially superseded" by Snaps and Flatpaks (for application portability and stability). And of course, long-time Slashdot user menkhaura shares the obligatory XKCD comic...

It's not exactly the same thing, but days after ZDNet's article, CIQ, Oracle, and SUSE announced the Open Enterprise Linux Association, a new collaborative trade association to foster "the development of distributions compatible with Red Hat Enterprise Linux."

So where does that leave us? Share your own thoughts in the comments.

And should there be an "official" version of Linux?
Encryption

Google's Chrome Begins Supporting Post-Quantum Key Agreement to Shield Encryption Keys (theregister.com) 12

"Teams across Google are working hard to prepare the web for the migration to quantum-resistant cryptography," writes Chrome's technical program manager for security, Devon O'Brien.

"Continuing with our strategy for handling this major transition, we are updating technical standards, testing and deploying new quantum-resistant algorithms, and working with the broader ecosystem to help ensure this effort is a success." As a step down this path, Chrome will begin supporting X25519Kyber768 for establishing symmetric secrets in TLS, starting in Chrome 116, and available behind a flag in Chrome 115. This hybrid mechanism combines the output of two cryptographic algorithms to create the session key used to encrypt the bulk of the TLS connection:

X25519 — an elliptic curve algorithm widely used for key agreement in TLS today
Kyber-768 — a quantum-resistant Key Encapsulation Method, and NIST's PQC winner for general encryption

In order to identify ecosystem incompatibilities with this change, we are rolling this out to Chrome and to Google servers, over both TCP and QUIC and monitoring for possible compatibility issues. Chrome may also use this updated key agreement when connecting to third-party server operators, such as Cloudflare, as they add support. If you are a developer or administrator experiencing an issue that you believe is caused by this change, please file a bug.
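To make the hybrid construction concrete, here is a minimal sketch of the general idea (an illustration, not Chrome's or BoringSSL's actual X25519Kyber768 code): derive one session key from both an X25519 exchange and a post-quantum shared secret, so the result stays safe as long as either component holds. The Kyber-768 encapsulation is simulated with random bytes below, since a production stack would call a real Kyber implementation.

```python
# Sketch of a hybrid key agreement in the spirit of X25519Kyber768 (illustrative
# only; the Kyber-768 step is stubbed out with random bytes to stay self-contained).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# 1. Classical part: an ordinary X25519 Diffie-Hellman exchange.
client_x = X25519PrivateKey.generate()
server_x = X25519PrivateKey.generate()
x25519_secret = client_x.exchange(server_x.public_key())

# 2. Post-quantum part: in a real stack this comes from Kyber-768 encapsulation;
#    here it is simulated so the sketch runs without a Kyber library.
kyber_shared_secret = os.urandom(32)

# 3. Combine both secrets through a KDF: the session key remains secure as long
#    as at least one of the two underlying mechanisms is unbroken.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-tls-demo",
).derive(x25519_secret + kyber_shared_secret)

print(session_key.hex())
```

The design choice mirrors Google's stated reasoning: even if Kyber later turns out to be weaker than hoped, the combined key is no worse than plain X25519, and if quantum attacks eventually break X25519, the Kyber component still protects recorded traffic.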

The Register delves into Chrome's reasons for implementing this now: "It's believed that quantum computers that can break modern classical cryptography won't arrive for 5, 10, possibly even 50 years from now, so why is it important to start protecting traffic today?" said O'Brien. "The answer is that certain uses of cryptography are vulnerable to a type of attack called Harvest Now, Decrypt Later, in which data is collected and stored today and later decrypted once cryptanalysis improves." O'Brien says that while symmetric encryption algorithms used to defend data traveling on networks are considered safe from quantum cryptanalysis, the way the keys get negotiated is not. By adding support for a hybrid KEM, Chrome should provide a stronger defense against future quantum attacks...

Rebecca Krauthamer, co-founder and chief product officer at QuSecure, told The Register in an email that while this technology sounds futuristic, it's useful and necessary today... [T]he arrival of capable quantum computers should not be thought of as a specific, looming date, but as something that will arrive without warning. "There was no press release when the team at Bletchley Park cracked the Enigma code, either," she said.
