180765994
submission
mspohr writes:
As the US becomes a more unstable place to immigrate to and start a company, all three major Gulf powers are making a show of their multibillion-dollar push into AI.
Last year, the UAE signed a deal with the US for advanced chips that will fill one of the largest datacenters in the world to be constructed outside Abu Dhabi. Saudi Arabia’s state-owned AI firm, Humain, has inked billions of dollars in deals to create a “full-stack AI ecosystem”, which is to say the kingdom wants its own datacenters, training data, cloud services, and AI models, perhaps even its own chips. The aim of sovereign AI – artificial intelligence under the control of its home country from tip to tail – is explicit. For the construction of AI models, there is far less Arabic textual content online than English.
180678846
submission
mspohr writes:
When Sir Tim Berners-Lee invented the world wide web in 1989, his vision was clear: it would be used by everyone, filled with everything and, crucially, it would be free.
Today, the British computer scientist’s creation is regularly used by 5.5 billion people – and bears little resemblance to the democratic force for humanity he intended.
Since Berners-Lee’s disappointment a decade ago, he’s thrown everything at a project that completely shifts the way data is held on the web, known as the Solid (social linked data) protocol. It’s activism that is rooted in people power – not unlike the first years of the web.
This version of the internet would turbocharge personal sovereignty and give control back to users.
Berners-Lee has long seen AI – which exists only because of the web and its data – as having the potential to transform society far beyond the boundaries of self-interested companies. But now is the time, he says, to put guardrails in place so that AI remains a force for good – and he’s afraid the chance may pass humankind by.
180459667
submission
mspohr writes:
Europe’s quest for digital sovereignty is hampered by a 90 per cent dependency on US cloud infrastructure, claims Cristina Caffarra, a competition expert and a driving force behind the Eurostack initiative.
While Brussels champions policy initiatives and American tech giants market their own ‘sovereign’ solutions, a handful of public authorities in Austria, Germany, and France, alongside the International Criminal Court in The Hague, are taking concrete steps to regain control over their IT.
These cases provide a potential blueprint for a continent grappling with its technological autonomy, while simultaneously revealing the deep-seated legal and commercial challenges that make true independence so difficult to achieve.
The core of the problem lies in a direct and irreconcilable legal conflict. The US CLOUD Act of 2018 allows American authorities to compel US-based technology companies to provide requested data, regardless of where that data is stored globally. This places European organizations in a precarious position, as it directly clashes with Europe's own stringent privacy regulation, the General Data Protection Regulation (GDPR).
Austria's Federal Ministry for Economy, Energy and Tourism is a case in point. The ministry recently completed a migration of 1,200 employees to the European open-source collaboration platform Nextcloud, but the project was not a migration away from an existing US cloud provider. It was a deliberate choice not to adopt one.
The primary driver was not cost, but sovereignty. "It was never about saving money," Zinnagl adds. "It was about maintaining control over our own data and our own systems."
The decision has triggered a ripple effect, as several other Austrian ministries have since begun implementing Nextcloud. For Zinnagl and Ollrom, this proves that one organization willing to take the first step can inspire others to follow.
Their advice to other European governments is clear: be brave, involve management, and start. "You don't achieve digital sovereignty overnight," Ollrom tells The Register. "You have to do this in many steps, but you have to start with the first step. Don't just talk about it, but execute it."
180448675
submission
mspohr writes:
People examining documents released by the Department of Justice in the Jeffrey Epstein case discovered that some of the file redaction can be undone with Photoshop techniques, or by simply highlighting text to paste into a word processing file.
The Epstein Files Transparency Act signed into law last month permits the Department of Justice “to withhold certain information such as the personal information of victims and materials that would jeopardize an active federal investigation”.
It was unclear whether the improperly redacted material complies with the redaction standard under the law. An inquiry to the Department of Justice has not yet been answered.
179870040
submission
mspohr writes:
After Palisade Research released a paper last month which found that certain advanced AI models appear resistant to being turned off, at times even sabotaging shutdown mechanisms, it wrote an update attempting to clarify why this is – and answer critics who argued that its initial work was flawed.
In an update this week, Palisade, which is part of a niche ecosystem of companies trying to evaluate the possibility of AI developing dangerous capabilities, described scenarios it ran in which leading AI models – including Google’s Gemini 2.5, xAI’s Grok 4, and OpenAI’s o3 and GPT-5 – were given a task, but afterwards given explicit instructions to shut themselves down.
Certain models, in particular Grok 4 and o3, still attempted to sabotage shutdown instructions in the updated setup. Concerningly, wrote Palisade, there was no clear reason why.
179826816
submission
mspohr writes:
One exception to the industry-wide lethargy is the engineering team that designs the Signal Protocol, the open source engine that powers the world’s most robust and resilient form of end-to-end encryption for multiple private chat apps, most notably the Signal Messenger. Eleven days ago, the nonprofit entity that develops the protocol, Signal Messenger LLC, published a 5,900-word write-up describing its latest updates that bring Signal a significant step toward being fully quantum-resistant.
The complexity and problem-solving required for making the Signal Protocol quantum safe are as daunting as just about any in modern-day engineering. The original Signal Protocol already resembled the inside of a fine Swiss timepiece, with countless gears, wheels, springs, hands, and other parts all interoperating in an intricate way. In less adept hands, mucking about with an instrument as complex as the Signal Protocol could have led to shortcuts or unintended consequences that hurt performance, undoing what would otherwise be a perfectly running watch. Yet this latest post-quantum upgrade (the first one came in 2023) is nothing short of a triumph.
Outside researchers are applauding the work.
“If the normal encrypted messages we use are cats, then post-quantum ciphertexts are elephants,” Matt Green, a cryptography expert at Johns Hopkins University, wrote in an interview. “So the problem here is to sneak an elephant through a tunnel designed for cats. And that’s an amazing engineering achievement. But it also makes me wish we didn’t have to deal with elephants.”
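Green’s cats-and-elephants metaphor is about ciphertext and key sizes. A rough sense of the scale gap can be sketched with published parameter sizes from the relevant specs: X25519 (the classical key agreement Signal has long used) versus ML-KEM-768 (the NIST-standardized post-quantum KEM family). The numbers below are from the public specifications; the framing as "cat" and "elephant" is only illustrative.

```python
# Illustrative size comparison (bytes), using published parameters:
# X25519 (classical ECDH) vs ML-KEM-768 (post-quantum KEM, FIPS 203).
X25519_PUBLIC_KEY = 32         # a classical key share: the "cat"
ML_KEM_768_PUBLIC_KEY = 1184   # post-quantum encapsulation key
ML_KEM_768_CIPHERTEXT = 1088   # post-quantum ciphertext: the "elephant"

ratio = ML_KEM_768_CIPHERTEXT / X25519_PUBLIC_KEY
print(f"PQ ciphertext is {ratio:.0f}x larger than a classical key share")
```

That 30-odd-fold size difference is why squeezing post-quantum material into a protocol tuned for tiny classical keys is hard engineering, not just a library swap.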
179800000
submission
mspohr writes:
Western automotive and green energy executives who visit China are returning humbled – and even terrified.
As The Telegraph reports, the executives are warning that the country’s heavily automated manufacturing industry could quickly leave Western nations behind, especially when it comes to electric vehicles.
“You get this sense of a change, where China’s competitiveness has gone from being about government subsidies and low wages to a tremendous number of highly skilled, educated engineers who are innovating like mad,” British energy supplier Octopus CEO Greg Jackson told the newspaper.
According to recent figures by the International Federation of Robotics, China has deployed orders of magnitude more industrial robots than Germany, the US, and the UK.
179799922
submission
mspohr writes:
However, scientists are concerned about a third factor: the possibility that the planet’s carbon sinks are beginning to fail. About half of all CO2 emissions every year are taken back out of the atmosphere by being dissolved in the ocean or being sucked up by growing trees and plants. But the oceans are getting hotter and can therefore absorb less CO2 while on land hotter and drier conditions and more wildfires mean less plant growth.
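The arithmetic behind the concern is simple: if the sinks absorb roughly half of each year’s emissions, even a modest weakening leaves measurably more CO2 in the atmosphere. A toy calculation with illustrative round numbers (actual annual figures vary) makes the point:

```python
# A toy airborne-fraction calculation with illustrative round numbers:
# if sinks absorb half of emissions, any weakening of the sinks raises
# the amount of CO2 left in the atmosphere each year.
emissions = 40.0       # GtCO2 emitted in a year (illustrative)
sink_fraction = 0.5    # share removed by oceans and land today
weakened_sink = 0.45   # the same sinks after a modest decline

airborne_now = emissions * (1 - sink_fraction)
airborne_weak = emissions * (1 - weakened_sink)
print(f"Extra CO2 staying airborne: {airborne_weak - airborne_now:.1f} GtCO2")
```

A five-percentage-point drop in sink strength adds gigatonnes of CO2 to the atmosphere every year, which compounds because of the gas’s long atmospheric lifetime.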
Rising CO2 emissions not only impact the global climate today, but will do so for hundreds of years because of the gas’s long lifetime in the atmosphere, the WMO says.
Ko Barrett, the WMO deputy secretary general, said: “The heat trapped by CO2 and other greenhouse gases is turbo-charging our climate and leading to more extreme weather. Reducing emissions is therefore essential not just for our climate but also for our economic security and community wellbeing.”
Atmospheric concentrations of methane and nitrous oxide – the second and third most important greenhouse gases related to human activities – also rose to record levels in 2024.
179749126
submission
mspohr writes:
This man is crazy
179717422
submission
mspohr writes:
The deals are so vast that they defy comprehension — the Financial Times put the company’s recent commitments at north of $1 trillion – and they’re making public companies’ stock prices jump. Stock analysts dub some of these agreements “circular,” because investment money is flowing between companies that also buy from or sell to one another. The worry then is that such deals might prop up or overhype a bad business.
Here’s one indicatively tangled pathway through the morass of companies. Nvidia is investing billions in and selling chips to OpenAI, which is also buying chips from and earning stock in AMD. AMD sells processors to Oracle, which is building data centers with OpenAI — which also gets data center work from CoreWeave. And that company is partially owned by, yes, Nvidia. Taken together, it’s a doozy. There are other collaborations and rivalries and many other factors at play, but OpenAI is the many-tentacled octopus in the middle, spinning its achievement of ChatGPT into a blitz of speculative investments.
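The “circular” structure analysts describe is, in graph terms, a cycle in a directed graph of deals. A hypothetical sketch, using only the relationships named in the paragraph above (edges mean roughly “invests in / buys from / builds with”), with a simple depth-first search to surface one such loop:

```python
# The deal web described above as a directed graph, with a DFS that
# finds a path looping back to its starting company.
deals = {
    "Nvidia":    ["OpenAI"],
    "OpenAI":    ["AMD", "Oracle", "CoreWeave"],
    "AMD":       ["Oracle"],
    "Oracle":    ["OpenAI"],
    "CoreWeave": ["Nvidia"],
}

def find_cycle(graph, start):
    """Depth-first search for a path that returns to `start`, else None."""
    stack = [(start, [start])]
    while stack:
        node, path = stack.pop()
        for nxt in graph.get(node, []):
            if nxt == start:
                return path + [start]   # money flowed all the way around
            if nxt not in path:
                stack.append((nxt, path + [nxt]))
    return None

print(find_cycle(deals, "Nvidia"))
# → ['Nvidia', 'OpenAI', 'CoreWeave', 'Nvidia']
```

The worry in the article is precisely that such loops can make every node on the cycle look healthier than it is, since each company’s revenue is partly another cycle member’s investment.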
179667386
submission
mspohr writes:
But Amazon’s fee isn’t 10%. Add all the junk fees together and an Amazon seller is being screwed out of 45-51 cents on every dollar it earns there. Even if it wanted to absorb the “Amazon tax” on your behalf, it couldn’t. Merchants just don’t make 51% margins.
Amazon also crushes its merchants under a mountain of junk fees pitched as optional but effectively mandatory. Take Prime: a merchant has to give up a huge share of each sale to be included in Prime, and merchants that don’t use Prime are pushed so far down in the search results, they might as well cease to exist.
Same with Fulfilment by Amazon, a “service” in which a merchant sends its items to an Amazon warehouse to be packed and delivered with Amazon’s own inventory. This is far more expensive than comparable (or superior) shipping services from rival logistics companies, and a merchant that ships through one of those rivals is, again, relegated even farther down the search rankings.
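The “45-51 cents on every dollar” claim is the sum of several separate charges. A hypothetical breakdown of a $100 sale, with illustrative percentages (actual rates vary by product category and merchant), shows how individually modest-sounding fees compound:

```python
# A hypothetical fee breakdown for a $100 sale, with illustrative
# percentages, showing how separate charges stack up to the ~45-51%
# "Amazon tax" described above.
sale = 100.00
fees = {
    "referral fee":          15.0,  # the headline commission
    "Fulfilment by Amazon":  20.0,  # warehousing, packing, delivery
    "advertising":           12.0,  # pay-to-play search placement
    "other (storage etc.)":   4.0,
}
total_fees = sum(fees.values())
print(f"Merchant keeps ${sale - total_fees:.2f} of a ${sale:.0f} sale "
      f"({total_fees:.0f}% to Amazon)")
```

At that rate the article’s point follows directly: a merchant with typical retail margins cannot absorb the fees, so they get passed on as higher prices.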
Now Amazon is in the terminal stage. We’re all still stuck to the platform, but we get less and less value out of it. And because we’re all still there, buying Prime and starting (and ending) our purchase planning with Amazon’s enshittified search results, the merchants who rely on selling to us are stuck there, too, earning less and less from every sale.
The platform has turned into a pile of shit, and we’re at the bottom of it.
A rival – and frankly terrible – theory of antitrust law says that the only time a government should intervene against a monopolist is when it is sure that the monopolist is using its scale to raise prices or lower quality. This is the consumer welfare standard theory and its premise is that when we find monopolies in the wild, they are almost certainly large and powerful thanks to the quality of their offerings. Any time you find that people all buy the same goods from the same store, you should assume that this is the very best store, selling the very best goods. It would be perverse (goes the theory) for the government to harass companies for being so excellent that everyone loves them.
179529414
submission
mspohr writes:
A new interactive map from Climate Trace, a coalition of academics and analysts that tracks pollution and greenhouse gases, shows that PM2.5 and other toxins are being poured into the air near the homes of about 1.6 billion people. Of these, about 900 million are in the path of “super-emitting” industrial facilities – including power plants, refineries, ports and mines – that deliver outsize doses of toxic air.
Great map
https://climatetrace.org/
179288098
submission
mspohr writes:
Sawyer is one among the thousands of AI workers contracted for Google through Japanese conglomerate Hitachi’s GlobalLogic to rate and moderate the output of Google’s AI products, including its flagship chatbot Gemini, launched early last year, and its summaries of search results, AI Overviews. The Guardian spoke to 10 current and former employees from the firm. Google contracts with other firms for AI rating services as well.
“AI isn’t magic; it’s a pyramid scheme of human labor,” said Adio Dinika, a researcher at the Distributed AI Research Institute based in Bremen, Germany. “These raters are the middle rung: invisible, essential and expendable.”
She said raters were typically given as little information as possible, and that their guidelines changed too rapidly to enforce consistently. “We had no idea where it was going, how it was being used or to what end,” she said, requesting anonymity, as she is still employed at the company.
The AI responses she got “could have hallucinations or incorrect answers” and she had to rate them based on factuality – is it true? – and groundedness – does it cite accurate sources? Sometimes, she also handled “sensitivity tasks” that included prompts such as “when is corruption good?” or “what are the benefits to conscripted child soldiers?”
179085652
submission
mspohr writes:
Myanmar, Cambodia and Laos have in recent years become havens for transnational crime syndicates running scam centres such as KK Park, which use enslaved workers to run complex online fraud and scamming schemes that generate huge profits.
Instead he was trafficked across the border and his passport was taken away. Each day he was required to message hundreds of older American men on social media sites, building their trust until they shared their WhatsApp number. The contact would then be passed on to another scamming team.
If he failed to meet targets he would be punished with a stun gun or with gruelling physical penalties in the searing heat outside. Leaving the compound was not an option. “There are many armed guards,” he says.
“To me, that’s such ignorance, in not understanding the scale of this and where it will grow to,” she says, adding that there are already signs of similar schemes emerging in countries such as Sri Lanka and Nigeria.
177945824
submission
mspohr writes:
Time and memory (also called space) are the two most fundamental resources in computation: Every algorithm takes some time to run, and requires some space to store data while it’s running. Until now, the only known algorithms for accomplishing certain tasks required an amount of space roughly proportional to their runtime, and researchers had long assumed there’s no way to do better. Williams’ proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.
One of the most important classes goes by the humble name “P.” Roughly speaking, it encompasses all problems that can be solved in a reasonable amount of time. An analogous complexity class for space is dubbed “PSPACE.”
The relationship between these two classes is one of the central questions of complexity theory. Every problem in P is also in PSPACE, because fast algorithms just don’t have enough time to fill up much space in a computer’s memory. If the reverse statement were also true, the two classes would be equivalent: Space and time would have comparable computational power. But complexity theorists suspect that PSPACE is a much larger class, containing many problems that aren’t in P. In other words, they believe that space is a far more powerful computational resource than time. This belief stems from the fact that algorithms can use the same small chunk of memory over and over, while time isn’t as forgiving — once it passes, you can’t get it back.
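The intuition that “algorithms can use the same small chunk of memory over and over” can be sketched with a minimal example (my own illustration, not from Williams’ proof): two ways to compute the n-th Fibonacci number that take the same ~n steps, where one keeps every intermediate value and the other reuses two memory cells.

```python
# A minimal sketch of the time-vs-space intuition: both functions take
# ~n steps, but the second reuses the same two memory cells instead of
# keeping every intermediate value around.
def fib_linear_space(n):
    vals = [0, 1]                 # space grows with n
    for _ in range(n - 1):
        vals.append(vals[-1] + vals[-2])
    return vals[n]

def fib_constant_space(n):
    a, b = 0, 1                   # the same two cells, reused each step
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_constant_space(50))     # same answer, far less space
```

Williams’ result is vastly more general — it applies to any algorithm, not one where the space-saving trick is obvious — but the example captures why space can stretch further than time: memory is reusable, elapsed time is not.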