Linux

Linus Torvalds Tactfully Discusses Value of getrandom() Upgrade for Linux vDSO (phoronix.com) 86

Linux's vDSO (or virtual dynamic shared object) is "a small shared library that the kernel automatically maps into the address space of all user-space applications," according to its man page. "There are some system calls the kernel provides that user-space code ends up using frequently, to the point that such calls can dominate overall performance... due both to the frequency of the call as well as the context-switch overhead that results from exiting user space and entering the kernel."

But Linus Torvalds had a lot to say about a proposed getrandom() upgrade, reports Phoronix: This getrandom() work in the vDSO has been through 20+ rounds of review over the past 2+ years, but... Torvalds took some time out of his U.S. Independence Day to argue the merits of the patches on the Linux kernel mailing list. Torvalds kicked things off by writing:


Nobody has explained to me what has changed since your last vdso getrandom, and I'm not planning on pulling it unless that fundamental flaw is fixed. Why is this _so_ critical that it needs a vdso? Why isn't user space just doing it itself? What's so magical about this all?

This all seems entirely pointless to me still, because it's optimizing something that nobody seems to care about, adding new VM infrastructure, new magic system calls, yadda yadda. I was very sceptical last time, and absolutely _nothing_ has changed. Not a peep on why it's now suddenly so hugely important again. We don't add stuff "just because we can". We need to have a damn good reason for it. And I still don't see the reason, and I haven't seen anybody even trying to explain the reason.



And then he responded to himself, adding:


In other words, I want to see actual *users* piping up and saying "this is a problem, here's my real load that spends 10% of time on getrandom(), and this fixes it". I'm not AT ALL interested in microbenchmarks or theoretical "if users need high-performance random numbers". I need a real actual live user that says "I can't just use rdrand and my own chacha mixing on top" and explains why having a SSE2 chachacha in kernel code exposed as a vdso is so critical, and a magical buffer maintained by the kernel.


Torvalds also added in a third message:


One final note: the reason I'm so negative about this all is that the random number subsystem has such an absolutely _horrendous_ history of two main conflicting issues: people wanting reasonable usable random numbers on one side, and then the people that discuss what the word "entropy" means on the other side. And honestly, I don't want the kernel stuck even *more* in the middle of that morass....

Torvalds made additional comments. ("This smells. It's BS...") Advocating for the change was WireGuard developer Jason Donenfeld, and the exchange continues (40 messages and counting).

At one point, Torvalds relented: "Bah. I guess I'll have to walk through the patch series once again. I'm still not thrilled about it. But I'll give it another go..."
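For context, the syscall at the center of the debate is already exposed to user space through the standard library. A minimal sketch of a getrandom() consumer in Python (os.getrandom wraps the Linux getrandom(2) syscall; the os.urandom fallback here is an assumption for portability, not part of the kernel discussion):

```python
import os

def random_bytes(n: int) -> bytes:
    """Fetch n random bytes from the kernel CSPRNG.

    On Linux, os.getrandom() issues the getrandom(2) syscall -- the
    per-call transition into the kernel whose overhead the vDSO
    patches aim to eliminate for high-frequency callers. Elsewhere,
    fall back to the portable os.urandom().
    """
    if hasattr(os, "getrandom"):
        return os.getrandom(n)
    return os.urandom(n)

print(len(random_bytes(32)))  # 32
```

Each call above crosses the user/kernel boundary; the proposed vDSO implementation keeps a kernel-maintained buffer mapped into user space so most calls avoid that context switch entirely, which is exactly the machinery Torvalds questioned the need for.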
Python

Fedora 41 Finally Retires Python 2.7 (fedoraproject.org) 25

"Sixteen years after the introduction of Python 3, the Fedora project announces that Python 2.7, the last of the Python 2 series, will be retired," according to long-time Slashdot reader slack_justyb.

From the announcement on the Fedora changes page: The python2.7 package will be retired without replacement from Fedora Linux 41. There will be no Python 2 in Fedora 41+ other than PyPy. Packages requiring python2.7 on runtime or buildtime will have to deal with the retirement or be retired as well.
"This also comes with the announcement that GIMP 3 will be coming to Fedora 41 to remove any last Python 2 dependencies," adds slack_justyb. GIMP 2 was originally released on March 23, 2004. GIMP will be updated to GIMP 3 with Python 3 support. Python 2 dependencies of GIMP will be retired.
Python 2's end of life was originally 2015, but was extended to 2020. The Python maintainers close with this: The Python maintainers will no longer regularly backport security fixes to Python 2.7 in RHEL, due to the end of maintenance of RHEL 7 and the retirement of the Python 2.7 application stream in RHEL 8. We provided this obsolete package for 5 years beyond its retirement date and will continue to provide it until Fedora 40 goes end of life. Enough has been enough.
Open Source

FreeBSD Contributor Mocks Gloomy Predictions for the Open Source Movement (acm.org) 94

In Communications of the ACM, long-time FreeBSD contributor Poul-Henning Kamp mocks the idea that the free and open-source software movement has "come apart" and "will end in tears and regret." Economists and others focused on money — like my bank — have had a lot of trouble figuring out the free and open source software (FOSS) phenomenon, and eventually they seem to have reached the conclusion that it just makes no sense. So, they go with the flow. Recently, very serious people in the FOSS movement have started to write long and thoughtful opinion pieces about how it has all come apart and will end in tears and regret. Allow me to disagree...
What follows is a humorous history of how the Open Source movement bested a series of ill-conceived marketing failures starting after the "utterly bad" 1980s when IBM had an "unimaginably huge monopoly" — and an era of vendor lock-in from companies trying to be the next IBM: Out of that utter market failure came Minix, (Net/Free/Open)BSD, and Linux, at a median year of approximately 1991. I can absolutely guarantee that if we had been able to buy a reasonably priced and solid Unix for our 32-bit PCs — no strings attached — nobody would be running FreeBSD or Linux today, except possibly as an obscure hobby. Bill Gates would also have had a lot less of our money...
The essay moves on to when "that dot-com thing happened, fueled by the availability of FOSS operating systems, which did a much better job than any operating system you could buy — not just for the price, but in absolute terms of performance on any given piece of hardware. Thus, out of utter market failure, the FOSS movement was born."

And ultimately, the essay ends with our present day, and the phenomenon of companies that "make a business out of FOSS or derivatives thereof..." The "F" in FOSS was never silent. In retrospect, it seems clear that open source was not so much the goal itself as a means to an end, which is freedom: freedom to fix broken things, freedom from people who thought they could clutch the source code tightly and wield our ignorance of it as a weapon to force us all to pay for and run Windows Vista. But the FOSS movement has won what it wanted, and no matter how much oldsters dream about their glorious days as young revolutionaries, it is not coming back; the frustrations and anger of IT in 2024 are entirely different from those of 1991.

One very big difference is that more people have realized that source code is a liability rather than an asset. For some, that realization came creeping along the path from young teenage FOSS activists in the late 1990s to CIOs of BigCorp today. For most of us, I expect, it was the increasingly crushing workload of maintaining legacy code bases...

Privacy

New SnailLoad Attack Exploits Network Latency To Spy On Users' Web Activities (thehackernews.com) 13

Longtime Slashdot reader Artem S. Tashkinov shares a report from The Hacker News: A group of security researchers from the Graz University of Technology have demonstrated a new side-channel attack known as SnailLoad that could be used to remotely infer a user's web activity. "SnailLoad exploits a bottleneck present on all Internet connections," the researchers said in a study released this week. "This bottleneck influences the latency of network packets, allowing an attacker to infer the current network activity on someone else's Internet connection. An attacker can use this information to infer websites a user visits or videos a user watches." A defining characteristic of the approach is that it obviates the need for carrying out an adversary-in-the-middle (AitM) attack or being in physical proximity to the Wi-Fi connection to sniff network traffic. Specifically, it entails tricking a target into loading a harmless asset (e.g., a file, an image, or an ad) from a threat actor-controlled server, which then exploits the victim's network latency as a side channel to determine online activities on the victim system.

To perform such a fingerprinting attack and glean what video or website a user might be watching or visiting, the attacker conducts a series of latency measurements of the victim's network connection as the content is being downloaded from the server while they are browsing or viewing. It then involves a post-processing phase that employs a convolutional neural network (CNN) trained with traces from an identical network setup to make the inference with an accuracy of up to 98% for videos and 63% for websites. In other words, due to the network bottleneck on the victim's side, the adversary can deduce the transmitted amount of data by measuring the packet round trip time (RTT). The RTT traces are unique per video and can be used to classify the video watched by the victim. The attack is so named because the attacking server transmits the file at a snail's pace in order to monitor the connection latency over an extended period of time.
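The measurement loop at the heart of the technique is conceptually simple. A hedged Python sketch (the host, port, and sampling parameters are hypothetical; the real attack measures latency on the slow transfer the victim is already performing with the attacker's server, rather than opening fresh TCP connections as this simplification does):

```python
import socket
import time

def measure_rtt(host: str, port: int, timeout: float = 5.0) -> float:
    """Estimate one round-trip time as the duration of a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # handshake completed; we only wanted its timing
    return time.perf_counter() - start

def collect_trace(host: str, port: int, samples: int = 10,
                  interval: float = 0.1) -> list[float]:
    """Sample RTT at a fixed rate.

    When the victim's connection is busy (e.g. streaming a video),
    queueing at the last-mile bottleneck inflates these samples; the
    resulting time series is the kind of trace SnailLoad's CNN
    classifier is trained to match against known videos or sites.
    """
    trace = []
    for _ in range(samples):
        trace.append(measure_rtt(host, port))
        time.sleep(interval)
    return trace
```

The key observation from the paper is that no privileged network position is needed: the attacker only times packets on their own connection to the victim, and the shared bottleneck leaks the rest.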

Games

Minecraft Seeks New Revenue as Gaming Growth Slows (yahoo.com) 20

Mojang Studios, the creator of the globally popular video game Minecraft, is diversifying its revenue streams amid slowing growth in the gaming industry. Chief Executive Asa Bredin revealed in an interview that the company is exploring new partnerships in merchandising, education, and content streaming. The company is also venturing into film and television, with a Warner Bros. movie adaptation set to premiere in April and a Netflix series in development. From a report: Mojang's push follows repeated forays by Nintendo and Sony Group to broaden the appeal of their gaming properties at a time that spending in the industry has hit a lull. Nintendo is developing a live-action film based on the Legend of Zelda franchise, following the blockbuster success of The Super Mario Bros. Movie, while Sony has turned The Last of Us into an HBO series and created games based on the Spider-Man movies.
Apple

Epic Games Says Apple Stalling Launch of Its Game Store in Europe (reuters.com) 62

"Fortnite" maker Epic Games said on Friday Apple was impeding its attempts to set up a games store on iPhones and iPads in Europe, the latest escalation in a bitter feud over the technology giant's control of the iOS app ecosystem. From a report: Apple has twice rejected documents it submitted to launch the Epic Games Store because the design of certain buttons and labels was similar to those used by its App Store, the video-game publisher said. "We are using the same 'Install' and 'In-app purchases' naming conventions that are used across popular app stores on multiple platforms, and are following standard conventions for buttons in iOS apps," Epic said in a series of posts on X. "Apple's rejection is arbitrary, obstructive, and in violation of the DMA, and we've shared our concerns with the European Commission," it said. Under pressure from European regulators, Apple had in March cleared the way for Epic to put its own game store on iOS devices in Europe.
Piracy

Sony Music Goes After Piracy Portal 'Hikari-no-Akari' (torrentfreak.com) 15

An anonymous reader quotes a report from TorrentFreak: Hikari-no-Akari, a long-established and popular pirate site that specializes in Japanese music, is being targeted in U.S. federal court by Sony Music. [...] The music download portal, which links to externally hosted files, has been operating for well over a decade and currently draws more than a million monthly visits. In addition to the public-facing part of the site, HnA also has a private forum and Discord channel. [...] Apparently, Sony Music Japan has been keeping an eye on the unauthorized music portal. The company has many of its works shared on the site, including anime theme music, which is popular around the globe.

For example, a few weeks ago, HnA posted "Sayonara, Mata Itsuka!" from the Japanese artist Kenshi Yonezu, which is used as the theme song for the asadora series "The Tiger and Her Wings." Around the same time, PEACEKEEPER, a song by Japanese musician STEREO DIVE FOUNDATION, featured in the third season of the series "That Time I Got Reincarnated as a Slime", was shared on the site. Sony Music Japan is a rightsholder for both these tracks, as well as many others that were posted on the site. The music company presumably tried to contact HnA directly to have these listings removed and reached out to its CDN service Cloudflare too, asking it to take action. [...] They are a prerequisite for obtaining a DMCA subpoena, which Sony Music Japan requested at a California federal court this week.

Sony requested two DMCA subpoenas, both targeted at hikarinoakari.com and hnadownloads.co. The latter domain receives the bulk of its traffic from the first, which isn't a surprise considering the 'hnadownloads' name. Through the subpoena, the music company hopes to obtain additional information on the people behind these sites. That includes names, IP addresses, and payment info. Presumably, this will be used for follow-up enforcement actions. It's unclear whether Cloudflare will be able to hand over any usable information and for the moment, HnA remains online. Several of the infringing URLs that were identified by Sony have recently been taken down, including this one. However, others remain readily available. The same applies to private forum threads and Discord postings, of course.

United States

Supreme Court Ruling Will Likely Cause Cyber Regulation Chaos (csoonline.com) 408

An anonymous reader shares a report: The US Supreme Court has issued a decision that could upend all federal cybersecurity regulations, moving ultimate regulatory approval to the courts and away from regulatory agencies. A host of likely lawsuits could gut the Biden administration's spate of cyber incident reporting requirements and other recent cyber regulatory actions. [...] While the Court's decision has the potential to weaken or substantially alter all federal agency cybersecurity requirements ever adopted, a series of cyber regulatory initiatives implemented over the past four years could become the particular focus of legal challenges. Parties who previously objected to these initiatives but were possibly reluctant to fight due to the Chevron deference will likely be encouraged to challenge these regulations.

Although all existing regulations are still in effect, the upshot for CISOs is almost certainly some degree of uncertainty as the legal challenges get underway. A host of conflicting decisions across the various judicial circuits in the US could lead to confusion in compliance programs until the smoke clears. CISOs should expect some court cases to water down or eliminate many existing cybersecurity regulatory requirements. A host of recently adopted cyber regulations will likely be challenged following the Court's ruling, but some recent regulations stand out as leading candidates for litigation. Among these are:

Earth

Many Carbon Capture Projects Are Now Launching (yahoo.com) 93

The Los Angeles Times reports that multiple projects seeking to remove carbon dioxide from the air have been launched across Los Angeles County: When completed, Project Monarch and its wastewater component, Pure Water Antelope Valley, will purify up to 4.5 million gallons of water each day and capture 25,000 tons of atmospheric CO2 each year. (The typical gasoline-powered automobile spews 4.6 tons of carbon each year, according to the Environmental Protection Agency).... But the Palmdale project isn't the only new carbon-capture development in L.A. County. On Friday, officials from CarbonCapture Inc. gathered in Long Beach to introduce the first commercial-scale U.S. direct air capture, or DAC, system designed for mass production. The unit, which resembles a shipping container, can remove more than 500 tons of atmospheric CO2 per year... The L.A.-based company also announced that it will mass-produce up to 4,000 of its DAC modules annually at a new facility in Mesa, Arizona. It joins similar efforts from L.A.-based Captura, which is working to remove CO2 from the upper ocean; L.A.-based Avnos, which produces water while capturing carbon; and L.A.-based Equatic, which is working to remove atmospheric CO2 using the ocean...

[Equatic's] San Pedro facility pumps seawater through a series of electric plates that separate the water into hydrogen and oxygen as well as acidic and alkaline streams of liquid. The alkaline, or base, stream is exposed to the atmosphere, where it mineralizes CO2 into carbonates that are then dissolved and discharged back into the ocean for permanent storage, operators say. Additionally, the hydrogen produced by the process is carbon-negative, making it a source of renewable energy that can be used to fuel the CO2 removal process or sold to other users, said Edward Sanders, chief operating officer at Equatic.

Equatic announced this month that it will partner with a Canadian carbon removal project developer, Deep Sky, to build North America's first commercial-scale ocean-based CO2 removal plant in Quebec, following the success of its effort in Los Angeles as well as another facility in Singapore. While the San Pedro facility can capture about 40 tons of CO2 per year, the Quebec facility will capture about 100,000 tons per year, Sanders said.

Meanwhile, two new projects by direct air capture company Heirloom were announced this week in Louisiana. Those projects are "expected to remove hundreds of thousands of tons of carbon dioxide from the air per year," according to the Associated Press, "and store it deep underground..." They're part of "a slew of carbon removal and storage projects that have been announced in Louisiana." Heirloom estimates that they will eventually remove 320,000 tons of carbon dioxide each year... The company uses limestone, a natural absorbent, to extract carbon dioxide from the air. Heirloom's technology reduces the time it takes limestone to absorb carbon dioxide in nature from years to just three days, according to the company's press release. The carbon dioxide is then removed from the limestone material and stored permanently underground.
In May, the U.S. Department of Energy also announced $3.5 billion in funding for its carbon-capture program: four large-scale, regional direct air capture hubs "that each comprise a network of carbon dioxide removal projects..." The hubs will have the capacity to capture and then permanently store at least one million metric tons of CO2 from the atmosphere annually, either from a single unit or from multiple interconnected units.
And Shell Canada has a pair of carbon capture projects in Alberta it expects to have operational toward the end of 2028, according to the CBC: The Polaris project is designed to capture about 650,000 tonnes of carbon dioxide annually from the Scotford complex. That works out to approximately 40 per cent of Scotford's direct CO2 emissions from the refinery and 22 per cent of its emissions from the chemicals complex.
Sci-Fi

William Gibson's 'Neuromancer' to Become a Series on Apple TV+ 149

It's been adapted into a graphic novel, a videogame, a radio play, and an opera, according to Wikipedia — which also describes years of trying to adapt Neuromancer into a movie. "The landmark 1984 cyberpunk novel has been on Hollywood's wishlist for decades," writes Gizmodo, "with multiple filmmakers attempting to bring it to the big screen." (Back in 2010, Slashdot's CmdrTaco even posted an update with the headline "Neuromancer Movie In Your Future?" with a 2011 story promising the movie deal was "moving forward....")

But now Deadline reports it's becoming a 10-episode series on Apple TV+ (co-produced by Apple Studios) starring Callum Turner and Brianna Middleton: Created for television by Graham Roland and JD Dillard, Neuromancer follows a damaged, top-rung super-hacker named Case (Turner) who is thrust into a web of digital espionage and high-stakes crime with his partner Molly (Middleton), a razor-girl assassin with mirrored eyes, aiming to pull a heist on a corporate dynasty with untold secrets.
More from Gizmodo: "We're incredibly excited to be bringing this iconic property to Apple TV+," Roland and Dillard said in a statement. "Since we became friends nearly 10 years ago, we've looked for something to team up on, so this collaboration marks a dream come true. Neuromancer has inspired so much of the science fiction that's come after it and we're looking forward to bringing television audiences into Gibson's definitive 'cyberpunk' world."
The novel launched Gibson's "Sprawl" trilogy of novels (building on the dystopia in his 1982 short story "Burning Chrome"), also resurrecting the "Molly Millions" character from Johnny Mnemonic — an even earlier short story from 1981...
Education

ChatGPT Outperforms Undergrads In Intro-Level Courses, Falls Short Later (arstechnica.com) 93

Peter Scarfe, a researcher at the University of Reading's School of Psychology and Clinical Language Sciences, conducted an experiment testing the vulnerability of their examination system to AI-generated work. Using ChatGPT-4, Scarfe's team submitted over 30 AI-generated answers across multiple undergraduate psychology modules, finding that 94 percent of these submissions went undetected and nearly 84 percent received higher grades than human counterparts. The findings have been published in the journal PLOS One. Ars Technica reports: Scarfe's team submitted AI-generated work in five undergraduate modules, covering classes needed during all three years of study for a bachelor's degree in psychology. The assignments were either 200-word answers to short questions or more elaborate essays, roughly 1,500 words long. "The markers of the exams didn't know about the experiment. In a way, participants in the study didn't know they were participating in the study, but we've got necessary permissions to go ahead with that," Scarfe claims. Shorter submissions were prepared simply by copy-pasting the examination questions into ChatGPT-4 along with a prompt to keep the answer under 160 words. The essays were solicited the same way, but the required word count was increased to 2,000. Setting the limits this way, Scarfe's team could get ChatGPT-4 to produce content close enough to the required length. "The idea was to submit those answers without any editing at all, apart from the essays, where we applied minimal formatting," says Scarfe.

Overall, Scarfe and his colleagues slipped 63 AI-generated submissions into the examination system. Even with no editing or efforts to hide the AI usage, 94 percent of those went undetected, and nearly 84 percent got better grades (roughly half a grade better) than a randomly selected group of students who took the same exam. "We did a series of debriefing meetings with people marking those exams and they were quite surprised," says Scarfe. Part of the reason they were surprised was that most of those AI submissions that were detected did not end up flagged because they were too repetitive or robotic -- they got flagged because they were too good.

Out of five modules where Scarfe's team submitted AI work, there was one where it did not receive better grades than human students: the final module taken by students just before they left the university. "Large language models can emulate human critical thinking, analysis, and integration of knowledge drawn from different sources to a limited extent. In their last year at the university, students are expected to provide deeper insights and use more elaborate analytical skills. The AI isn't very good at that, which is why students fared better," Scarfe explained. All those good grades ChatGPT-4 got were in the first- and second-year exams, where the questions were easier. "But the AI is constantly improving, so it's likely going to score better in those advanced assignments in the future. And since AI is becoming part of our lives and we don't really have the means to detect AI cheating, at some point we are going to have to integrate it into our education system," argues Scarfe. He said the role of a modern university is to prepare the students for their professional careers, and the reality is they are going to use various AI tools after graduation. So, they'd be better off knowing how to do it properly.

Chrome

Google Cuts Ties With Entrust in Chrome Over Trust Issues (theregister.com) 12

Google is severing its trust in Entrust after what it describes as a protracted period of failures around compliance and general improvements. From a report: Entrust is one of the many certificate authorities (CA) used by Chrome to verify that the websites end users visit are trustworthy. From November 1 in Chrome 127, which recently entered beta, TLS server authentication certificates validating to Entrust or AffirmTrust roots won't be trusted by default.

Google pointed to a series of incident reports over the past few years concerning Entrust, saying they "highlighted a pattern of concerning behaviors" that have ultimately seen the security company fall down in Google's estimations. The incidents have "eroded confidence in [Entrust's] competence, reliability, and integrity as a publicly trusted CA owner," Google stated in a blog.
The move follows a May publication by Mozilla, which compiled a sprawling list of Entrust's certificate issues between March and May this year. Entrust -- after an initial PR disaster -- acknowledged its procedural failures and said it was treating the feedback as a learning opportunity.
Youtube

The Majority of Gen Z Describe Themselves as Video Content Creators (washingtonpost.com) 31

For the first two decades of the social internet, lurkers ruled. Among Gen Z, they're in the minority, according to survey data from YouTube. From a report: Tech industry insiders used to cite a rule of thumb stating that only one in ten of an online community's users generally post new content, with the masses logging on only to consume images, video or other updates. Now younger generations are flipping that divide, a survey by the video platform said. YouTube found that 65 percent of Gen Z, which it defined as people between the ages of 14 and 24, describe themselves as video content creators -- making lurkers a minority.

The finding came from responses from 350 members of Gen Z in the U.S., out of a wider survey that asked thousands of people about how they spend time online, including whether they consider themselves video creators. YouTube did the survey in partnership with research firm SmithGeiger, as part of its annual report on trends on the platform. YouTube's report says that after watching videos online, many members of Gen Z respond with videos of their own, uploading their own commentary, reaction videos, deep dives into content posted by others and more. This kind of interaction often develops in response to videos on pop culture topics such as "RuPaul's Drag Race" or the Fallout video game series. Fan-created content can win more watch time than the original source material, the report says.

Security

Remote Access Giant TeamViewer Says Russian Spies Hacked Its Corporate Network (techcrunch.com) 29

TeamViewer, the company that makes widely used remote access tools for companies, has confirmed an ongoing cyberattack on its corporate network. TechCrunch: In a statement Friday, the company attributed the compromise to government-backed hackers working for Russian intelligence, known as APT29 (and Midnight Blizzard). The Germany-based company said its investigation so far points to an initial intrusion on June 26 "tied to credentials of a standard employee account within our corporate IT environment."

TeamViewer said that the cyberattack "was contained" to its corporate network and that the company keeps its internal network and customer systems separate. The company added that it has "no evidence that the threat actor gained access to our product environment or customer data."
Martina Dier, a spokesperson for TeamViewer, declined to answer a series of questions from TechCrunch, including whether the company has the technical ability, such as logs, to determine what, if any, data was accessed or exfiltrated from its network.
Patents

Microsoft's Canceled Xbox Cloud Console Gets Detailed In New Patent (windowscentral.com) 4

Microsoft's canceled Xbox cloud console, codenamed Keystone, has been detailed in a new patent spotted by Windows Central's Zac Bowden. From the report: Back in 2021, Microsoft announced that it was working on a dedicated streaming device for Xbox Game Pass. That device was later revealed to be codenamed Keystone, which took the form of a streaming box that would sit under your TV, cost a fraction of the price of a normal Xbox, and enable the ability to play Xbox games via the cloud. Unfortunately, it appears Microsoft has since scrapped plans to ship Xbox Keystone due to an inability to bring the price down to a level where it made sense for customers. Xbox CEO Phil Spencer is on record saying the device should have cost around $99 or $129, but the company was unable to achieve this.

Thanks to a patent discovered by Windows Central, we can finally take a closer look at the box Microsoft had conjured up internally. First up, the patent reveals that the console took the form of an even square with a circle shape on top, similar to the black circular vent on an Xbox Series S. The front of the box had the Xbox power button and a USB-A port. Around the back, there were three additional ports: HDMI, Ethernet, and power. On the right side of the console there appears to be an Xbox controller pairing button, and the underside featured a circular "Hello from Seattle" plate that the console sat on, similar to the Xbox Series X. This patent was filed in June 2022, which was around the time when the first details of Xbox Keystone were being revealed.

China

China Becomes First Country To Retrieve Rocks From the Moon's Far Side (nytimes.com) 55

China brought a capsule full of lunar soil [non-paywalled link] from the far side of the moon down to Earth on Tuesday, achieving the latest success in an ambitious schedule to explore the moon and other parts of the solar system. From a report: The sample, retrieved by the China National Space Administration's Chang'e-6 lander after a 53-day mission, highlights China's growing capabilities in space and notches another win in a series of lunar missions that started in 2007 and have so far been executed almost without flaw. "Chang'e-6 is the first mission in human history to return samples from the far side of the moon," Long Xiao, a planetary geologist at China University of Geosciences, wrote in an email. "This is a major event for scientists worldwide," he added, and "a cause for celebration for all humanity."

Such sentiments and the prospects of international lunar sample exchanges highlighted the hope that China's robotic missions to the moon and Mars will serve to advance scientific understanding of the solar system. Those possibilities are contrasted by views in Washington and elsewhere that Tuesday's achievement is the latest milestone in a 21st-century space race with geopolitical overtones. In February, a privately operated American spacecraft landed on the moon. NASA is also pursuing the Artemis campaign to return Americans to the lunar surface, although its next mission, a flight by astronauts around the moon, has been delayed because of technical issues. China, too, is looking to expand its presence on the moon, landing more robots there, and eventually human astronauts, in the years to come.

EU

China and EU To Hold Talks On Electric Car Tariffs (bbc.com) 47

Top officials from the European Union and China agreed to negotiate a planned series of import taxes on Chinese electric vehicles. "The call marks the first time the two sides have agreed to negotiate since the EU threatened China with electric vehicle (EV) tariffs of up to 38%," reports the BBC. From the report: The EU said Chinese EVs were unfairly subsidised by its government. In response, China accused the EU of protectionism and trade rule breaches. An EU spokesperson told the BBC the call between Trade Commissioner Valdis Dombrovskis and his Chinese counterpart Wang Wentao was "candid and constructive." They said the two sides would "continue to engage at all levels in the coming weeks." However, the spokesperson also doubled down on the EU's opposition to how the Chinese EV industry is funded. They said "any negotiated outcome" to the proposed tariffs must address the "injurious subsidisation" of Chinese EVs.

China released a similar statement on Saturday and made clear it still disagreed with the EU. As well as its call with the EU, Mr Wang met German Vice-Chancellor and Federal Minister for Economic Affairs and Climate Action Robert Habeck on Saturday. In a Facebook post about the meeting, China's Ministry of Commerce said it had told Mr Habeck about its "firm opposition" to the tariffs. It repeated its threat to file a lawsuit with the World Trade Organization (WTO) "to firmly defend its legitimate rights and interests."

Germany has also expressed criticism of the tariffs. When the EU first proposed them last week following its investigation of Chinese EVs in the trading bloc, Germany's Transport Minister, Volker Wissing, said the move risked a "trade war" with Beijing. "The European Commission's punitive tariffs hit German companies and their top products," he wrote on X, formerly known as Twitter, at the time. The European car industry has been critical too. Stellantis - which owns Citroen, Peugeot, Vauxhall, Fiat, and several other brands - said it did not support measures that "contribute to the world fragmentation [of trade]."

Netscape

Slashdot Asks: What Do You Remember About the Web in 1994? (fastcompany.com) 171

"The Short Happy Reign of the CD-ROM" was just one article in a Fast Company series called 1994 Week. As the week rolled along, they also revisited Yahoo, Netscape, and how the U.S. Congress "forced the videogame industry to grow up."

But another article argues that it's in web pages from 1994 that "you can start to see in those weird, formative years some surprising signs of what the web would be, and what it could be." It's hard to say precisely when the tipping point was. Many point to September '93, when AOL users first flooded Usenet. But the web entered a new phase the following year. According to an MIT study, at the start of 1994, there were just 623 web servers. By year's end, it was estimated there were at least 10,000, hosting new sites including Yahoo!, the White House, the Library of Congress, Snopes, the BBC, sex.com, and something called The Amazing FishCam. The number of servers globally was doubling every two months. No one had seen growth quite like that before. According to a press release announcing the start of the World Wide Web Consortium that October, this network of pages "was widely considered to be the fastest-growing network phenomenon of all time."
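As a back-of-the-envelope check (ours, not the article's), those two figures are consistent with each other: doubling every two months compounds to six doublings in a year.

```python
# Hypothetical illustration: starting from the MIT study's count of 623 web
# servers at the start of 1994, "doubling every two months" means six
# doublings over the year.
servers = 623
for month in range(2, 13, 2):  # one doubling per two-month period
    servers *= 2
    print(f"month {month:2d}: ~{servers:,} servers")
# Six doublings: 623 * 2**6 = 39,872 -- comfortably past the year-end
# estimate of "at least 10,000".
```

In other words, the "at least 10,000" year-end estimate was, if anything, conservative relative to the quoted growth rate.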

As the year began, Web pages were by and large personal and intimate, made by research institutions, communities, or individuals, not companies or brands. Many pages embodied the spirit, or extended the presence, of newsgroups on Usenet, or "User's Net." (Snopes and the Internet Movie Database, which landed on the Web in 1993, began as crowd-sourced projects on Usenet.) But a number of big companies, including Microsoft, Sun, Apple, IBM, and Wells Fargo, established their first modest Web outposts in 1994, a hint of the shopping malls and content farms and slop factories and strip mines to come. 1994 also marked the start of banner ads and online transactions (a CD, pizzas), and the birth of spam and phishing...

[B]ack in '94, the salesmen and oilmen and land-grabbers and developers had barely arrived. In the calm before the storm, the Web was still weird, unruly, unpredictable, and fascinating to look at and get lost in. People around the world weren't just writing and illustrating these pages, they were coding and designing them. For the most part, the design was non-design. With a few eye-popping exceptions, formatting and layout choices were simple, haphazard, personal, and — in contrast to most of today's web — irrepressibly charming. There were no table layouts yet; cascading style sheets, though first proposed in October 1994 by Norwegian programmer Håkon Wium Lie, wouldn't arrive until December 1996... The highways and megalopolises would come later, courtesy of some of the world's biggest corporations and increasingly peopled by bots, but in 1994 the internet was still intimate, made by and for individuals... Soon, many people would add "under construction" signs to their Web pages, like a friendly request to pardon our dust. It was a reminder that someone was working on it — another indication of the craft and care that was going into this never-ending quilt of knowledge.

The article includes screenshots of Netscape in action from browser-emulating site OldWeb.Today (albeit without a 14.4 kbps modem). "Look in and think about how and why this web grew the way it did, and what could have been. Or try to imagine what life was like when the web wasn't worldwide yet, and no one knew what it really was."

Slashdot reader tedlistens calls it "a trip down memory lane," offering "some telling glimpses of the future, and some lessons for it too." The article revisits 1994 sites like Global Network Navigator, Time-Warner's Pathfinder, and Wired's online site HotWired as well as 30-year-old versions of the home pages for Wells Fargo and Microsoft.

What did they miss? Share your own memories in the comments.

What do you remember about the web in 1994?

United States

Why Washington's Mount Rainier Still Makes Volcanologists Worry (cnn.com) 71

It's been 1,000 years since there was a significant volcanic eruption from Mount Rainier, CNN reminds readers. It's a full 60 miles from Tacoma, Washington — and 90 miles from Seattle. Yet "more than Hawaii's bubbling lava fields or Yellowstone's sprawling supervolcano, it's Mount Rainier that has many U.S. volcanologists worried."

"Mount Rainier keeps me up at night because it poses such a great threat to the surrounding communities," said Jess Phoenix, a volcanologist and ambassador for the Union of Concerned Scientists, on an episode of CNN's series "Violent Earth With Liev Schreiber." The sleeping giant's destructive potential lies not with fiery flows of lava, which, in the event of an eruption, would be unlikely to extend more than a few miles beyond the boundary of Mount Rainier National Park in the Pacific Northwest. And the majority of volcanic ash would likely dissipate downwind to the east, away from population centers, according to the US Geological Survey. Instead, many scientists fear the prospect of a lahar — a swiftly moving slurry of water and volcanic rock originating from ice or snow rapidly melted by an eruption, which picks up debris as it flows through valleys and drainage channels.

"The thing that makes Mount Rainier tough is that it is so tall, and it's covered with ice and snow, and so if there is any kind of eruptive activity, hot stuff ... will melt the cold stuff and a lot of water will start coming down," said Seth Moran, a research seismologist at USGS Cascades Volcano Observatory in Vancouver, Washington. "And there are tens, if not hundreds of thousands of people who live in areas that potentially could be impacted by a large lahar, and it could happen quite quickly." The deadliest lahar in recent memory was in November 1985, when Colombia's Nevado del Ruiz volcano erupted. Just a couple of hours after the eruption started, a river of mud, rocks, lava and icy water swept over the town of Armero, killing over 23,000 people in a matter of minutes... Bradley Pitcher, a volcanologist and lecturer in Earth and environmental sciences at Columbia University, said in an episode of CNN's "Violent Earth" that Mount Rainier has about eight times the amount of glaciers and snow as Nevado del Ruiz had when it erupted. "There's the potential to have a much more catastrophic mudflow...."

Lahars typically occur during volcanic eruptions but also can be caused by landslides and earthquakes. Geologists have found evidence that at least 11 large lahars from Mount Rainier have reached into the surrounding area, known as the Puget Lowlands, in the past 6,000 years, Moran said.

Two major U.S. cities — Tacoma and South Seattle — "are built on 100-foot-thick (30.5-meter) ancient mudflows from eruptions of Mount Rainier," the volcanologist said on CNN's "Violent Earth" series.

CNN's article adds that the US Geological Survey already set up a lahar detection system at Mount Rainier in 1998, "which since 2017 has been upgraded and expanded. About 20 sites on the volcano's slopes and the two paths identified as most at risk of a lahar now feature broadband seismometers that transmit real-time data and other sensors including trip wires, infrasound sensors, web cameras and GPS receivers."

Space

Tuesday SpaceX Launches a NOAA Satellite to Improve Weather Forecasts for Earth and Space (space.com) 20

Tuesday a SpaceX Falcon Heavy rocket will launch a special satellite — a state-of-the-art weather-watcher from America's National Oceanic and Atmospheric Administration.

It will complete a series of four GOES-R satellite launches that began in 2016. Space.com drills down into how these satellites have changed weather forecasts: More than seven years later, with three of the four satellites in the series orbiting the Earth, scientists and researchers say they are pleased with the results and how the advanced technology has been a game changer. "I think it has really lived up to its hype in thunderstorm forecasting. Meteorologists can see the convection evolve in near real-time and this gives them enhanced insight on storm development and severity, making for better warnings," John Cintineo, a researcher from NOAA's National Severe Storms Laboratory, told Space.com in an email.

"Not only does the GOES-R series provide observations where radar coverage is lacking, but it often provides a robust signal before radar, such as when a storm is strengthening or weakening. I'm sure there have been many other improvements in forecasts and environmental monitoring over the last decade, but this is where I have most clearly seen improvement," Cintineo said. In addition to helping predict severe thunderstorms, each satellite has collected images and data on heavy rain events that could trigger flooding, detected low clouds and fog as they form, and made significant improvements to forecasts and services used during hurricane season. "GOES provides our hurricane forecasters with faster, more accurate and detailed data that is critical for estimating a storm's intensity, including cloud top cooling, convective structures, specific features of a hurricane's eye, upper-level wind speeds, and lightning activity," Ken Graham, director of NOAA's National Weather Service, told Space.com in an email.

Instruments such as the Advanced Baseline Imager offer three times more spectral channels, four times the image quality, and five times the imaging speed of the previous GOES satellites. The Geostationary Lightning Mapper, the first instrument of its kind in orbit on the GOES-R series, allows scientists to view lightning 24/7 — both strikes that make contact with the ground and flashes from cloud to cloud. "GOES-U and the GOES-R series of satellites provides scientists and forecasters weather surveillance of the entire western hemisphere, at unprecedented spatial and temporal scales," Cintineo said. "Data from these satellites are helping researchers develop new tools and methods to address problems such as lightning prediction, sea-spray identification (sea-spray is dangerous for mariners), severe weather warnings, and accurate cloud motion estimation. The instruments from GOES-R also help improve forecasts from global and regional numerical weather models, through improved data assimilation."

The final satellite, launching Tuesday, includes a new sensor — the Compact Coronagraph — "that will monitor weather outside of Earth's atmosphere, keeping an eye on what space weather events are happening that could impact our planet," according to the article.

"It will be the first near real time operational coronagraph that we have access to," Rob Steenburgh, a space scientist at NOAA's Space Weather Prediction Center, told Space.com on the phone. "That's a huge leap for us because up until now, we've always depended on a research coronagraph instrument on a spacecraft that was launched quite a long time ago."
