Open Source

China's Lead in Open-Source AI Jolts Washington and Silicon Valley (msn.com) 71

China has established a lead in open-source AI, a development that is reportedly sending jolts through both Washington policymakers and Silicon Valley. From the report: The overall performance of China's best open-weight model has surpassed the American open-source champion since November, according to research firm Artificial Analysis. The firm, which rates the ability of models in math, coding and other areas, found a version of Alibaba's Qwen3 beat OpenAI's gpt-oss.

However, the Chinese model is almost twice as big as OpenAI's, suggesting that for simpler tasks, Qwen might consume more computing power to do the same job. OpenAI said its open-source model outperformed rivals of similar size on reasoning tasks and delivered strong performance at low cost.

Python

How Python is Fighting Open Source's 'Phantom' Dependencies Problem (blogspot.com) 32

Since 2023 the Python Software Foundation has had a Security Developer-in-Residence (sponsored by the Open Source Security Foundation's vulnerability-finding "Alpha-Omega" project). And he's just published a new 11-page white paper about open source's "phantom dependencies" problem — suggesting a way to solve it.

"Phantom" dependencies aren't tracked with packaging metadata, manifests, or lock files, which makes them "not discoverable" by tools like vulnerability scanners or compliance and policy tools. So Python security developer-in-residence Seth Larson authored a recently-accepted Python Enhancement Proposal offering an easy way for packages to provide metadata through Software Bills of Materials (SBOMs). From the whitepaper: Python Enhancement Proposal 770 is backwards compatible and can be enabled by default by tools, meaning most projects won't need to manually opt in to begin generating valid PEP 770 SBOM metadata. Python is not the only software package ecosystem affected by the "Phantom Dependency" problem. The approach using SBOMs for metadata can be remixed and adopted by other packaging ecosystems looking to record ecosystem-agnostic software metadata...
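Once packages ship SBOM documents alongside their metadata, scanners can walk them generically. A rough sketch of that idea (the `sboms/` directory name and the CycloneDX-style `components` field here are illustrative assumptions, not the PEP's exact layout):

```python
import json
from pathlib import Path


def find_sbom_components(site_packages: Path):
    """Scan installed packages for bundled SBOM documents and list the
    components they declare -- including non-Python ("phantom")
    dependencies that a requirements file would never mention."""
    components = []
    # Hypothetical layout: each package ships JSON SBOMs inside its
    # .dist-info directory under an sboms/ subdirectory.
    for sbom_file in site_packages.glob("*.dist-info/sboms/*.json"):
        doc = json.loads(sbom_file.read_text())
        for comp in doc.get("components", []):
            components.append((comp.get("name"), comp.get("version")))
    return components
```

A vulnerability scanner could then match these (name, version) pairs against an advisory database without knowing anything Python-specific.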

Within Endor Labs' [2023 dependencies] report, Python is named as one of the most affected packaging ecosystems by the "Phantom Dependency" problem. There are multiple reasons that Python is particularly affected:

- There are many methods for interfacing Python with non-Python software, such as through the C-API or FFI. Python can "wrap" and expose an easy-to-use Python API for software written in other languages like C, C++, Rust, Fortran, Web Assembly, and more.

- Python is the premier language for scientific computing and artificial intelligence, meaning many high-performance libraries written in system languages need to be accessed from Python code.

- Finally, Python packages have a distribution type called a "wheel", which is essentially a zip file that is "installed" by being unzipped into a directory, meaning there is no compilation step allowed during installation. This is great for being able to inspect a package before installation, but it means that all compiled languages need to be pre-compiled into binaries before installation...
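Because a wheel is just a zip archive, the "inspect before installation" property is easy to demonstrate. A minimal standard-library sketch that lists the pre-compiled binaries bundled inside a wheel, i.e. exactly the payloads that become phantom dependencies:

```python
import zipfile


def bundled_binaries(wheel_path: str):
    """List compiled artifacts inside a wheel without installing it.

    A wheel is a plain zip file, so vendored shared libraries and
    extension modules (.so / .pyd / .dll) appear as ordinary archive
    members and can be enumerated before anything touches disk."""
    with zipfile.ZipFile(wheel_path) as wheel:
        return [name for name in wheel.namelist()
                if name.endswith((".so", ".pyd", ".dll"))]
```

The file extensions checked here are only the common cases; a real scanner would inspect file contents rather than trust names.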


When designing a new package metadata standard, one of the top concerns is reducing the amount of effort required from the mostly volunteer maintainers of packaging tools and the thousands of projects being published to the Python Package Index... By defining PEP 770 SBOM metadata as using a directory of files, rather than a new metadata field, we were able to side-step all the implementation pain...

We'll be working to submit issues on popular open source SBOM and vulnerability scanning tools, and gradually, Phantom Dependencies will become less of an issue for the Python package ecosystem.

The white paper "details the approach, challenges, and insights into the creation and acceptance of PEP 770 and adopting Software Bill-of-Materials (SBOMs) to improve the measurability of Python packages," explains an announcement from the Python Software Foundation. And the white paper ends with a helpful note.

"Having spoken to other open source packaging ecosystem maintainers, we have come to learn that other ecosystems have similar issues with Phantom Dependencies. We welcome other packaging ecosystems to adopt Python's approach with PEP 770 and are willing to provide guidance on the implementation."
AI

Initiative Seeks AI Lab to Build 'American Truly Open Models' (ATOM) (msn.com) 16

"Benchmarking firm Artificial Analysis found that only five of the top 15 AI models are open source," reports the Washington Post, "and all were developed by Chinese AI companies...."

"Now some American executives, investors and academics are endorsing a plan to make U.S. open-source AI more competitive." A new campaign called the ATOM Project, for American Truly Open Models, aims to create a U.S.-based AI lab dedicated to creating software that developers can freely access and modify. Its blueprint calls for access to serious computing power, with upward of 10,000 of the cutting-edge GPU chips used to power corporate AI development. The initiative, which launched Monday, has gathered signatures of support from more than a dozen industry figures. They include veteran tech investor Bill Gurley; Clement Delangue, CEO of Hugging Face, a repository for open-source AI models and datasets; Stanford professor and AI investor Chris Manning; chipmaker Nvidia's director of applied research, Oleksii Kuchaiev; Jason Kwon, chief strategy officer for OpenAI; and Dylan Patel, CEO and founder of research firm SemiAnalysis...

The lack of progress in open-source AI underscores the case for initiatives like ATOM: The U.S. has not produced a major new open-source AI release since Meta's launch of its Llama 4 model in April, which disappointed some AI experts... "A lot of it is a coordination problem," said ATOM's creator, Nathan Lambert, a senior research scientist at the nonprofit Allen Institute for AI who is launching the project in a personal capacity... Lambert said the idea was to develop much more powerful open-source AI models than existing U.S. efforts such as Bloom, an AI language model from Hugging Face, Pythia from EleutherAI, and others. Those groups were willing to take on more legal risk in the name of scientific progress but suffered from underfunding, said Lambert, who has worked at Google's DeepMind AI lab, Facebook AI Research and Hugging Face.

The other problem? The hefty cost of top-performing AI. Lambert estimates that getting access to 10,000 state-of-the-art GPUs will cost at least $100 million. But the funding must be found if American efforts are to stay competitive, he said.

The initiative's web page is seeking signatures, but also asks visitors to the site to "consider how your expertise or resources might contribute to building the infrastructure America needs."
China

Facing US Chip Restrictions, China Pitches Global Cooperation on AI (msn.com) 13

In Shanghai at the World Artificial Intelligence Conference (which ran until Tuesday), the Chinese government "announced an international organization for AI regulation and a 13-point action plan aimed at fostering global cooperation to ensure the technology's beneficial and responsible development," reports the Washington Post.

The theme of the conference was "Global Solidarity in the AI Era," the article notes, and "the expo is one part of Beijing's bid to establish itself as a responsible AI leader for the international community."

CNN points out that China's announcement comes "just days after the United States unveiled its own plan to promote U.S. dominance." Chinese Premier Li Qiang unveiled China's vision for future AI oversight at the World AI Conference, an annual gathering in Shanghai of tech titans from more than 40 countries... While Li did not directly refer to the U.S. in his speech, he alluded to the ongoing trade tensions between the two superpowers, which include American restrictions on exports of advanced semiconductors, components vital for powering and training AI that are now in short supply in China. "Key resources and capabilities are concentrated in a few countries and a few enterprises," said Li in his speech on Saturday. "If we engage in technological monopoly, controls and restrictions, AI will become an exclusive game for a small number of countries and enterprises...."

Secretary-General of the Association of Southeast Asian Nations, Dr. Kao Kim Hourn, also called for "robust governance" of artificial intelligence to mitigate potential threats, including misinformation, deepfakes, and cybersecurity threats... Former Google CEO Eric Schmidt reiterated the call for international collaboration, explicitly calling on the U.S. and China to work together... "We have a vested interest to keep the world stable, keep the world not at war, to keep things peaceful, to make sure we have human control of these tools."

China's plan "called for establishing an international open-source community," reports the Wall Street Journal, "through which AI models can be freely deployed and improved by users." Industry participants said that plan "showed China's ambition to set global standards for AI and could undermine the U.S., whose leading models aren't open-source... While the world's best large language model is still American, the best model that everyone can use free is now Chinese."

"The U.S. should commit to ensuring that powerful models remain openly available," argues an opinion piece in The Hill by Stability AI's former head of public policy. Ubiquity is a matter of national security: retreating behind paywalls will leave a vacuum filled by strategic adversaries. Washington should treat open technology not as a vector for Chinese Communist Party propaganda but as a vessel to transmit U.S. influence abroad, molding the global ecosystem around U.S. industry. If DeepSeek is China's open-source "Sputnik moment," we need a legislative environment that supports — not criminalizes — an American open-source Moon landing.
Security

CISA Open-Sources Thorium Platform For Malware, Forensic Analysis (bleepingcomputer.com) 7

CISA has publicly released Thorium, a powerful open-source platform developed with Sandia National Labs that automates malware and forensic analysis at massive scale. According to BleepingComputer, the platform can "schedule over 1,700 jobs per second and ingest over 10 million files per hour per permission group." From the report: Security teams can use Thorium for automating and speeding up various file analysis workflows, including but not limited to:

- Easily import and export tools to facilitate sharing across cyber defense teams,
- Integrate command-line tools as Docker images, including open-source, commercial, and custom software,
- Filter results using tags and full-text search,
- Control access to submissions, tools, and results with strict group-based permissions,
- Scale with Kubernetes and ScyllaDB to meet workload demands.

Defenders can find installation instructions and get their own copy of Thorium from CISA's official GitHub repository.

Open Source

Google's New Security Project 'OSS Rebuild' Tackles Package Supply Chain Verification (googleblog.com) 13

This week Google's Open Source Security Team announced "a new project to strengthen trust in open source package ecosystems" — by reproducing upstream artifacts.

It includes automation to derive declarative build definitions, new "build observability and verification tools" for security teams, and even "infrastructure definitions" to help organizations rebuild, sign, and distribute provenance by running their own OSS Rebuild instances. (And as part of the initiative, the team also published SLSA Provenance attestations "for thousands of packages across our supported ecosystems.") Our aim with OSS Rebuild is to empower the security community to deeply understand and control their supply chains by making package consumption as transparent as using a source repository. Our rebuild platform unlocks this transparency by utilizing a declarative build process, build instrumentation, and network monitoring capabilities which, within the SLSA Build framework, produces fine-grained, durable, trustworthy security metadata. Building on the hosted infrastructure model that we pioneered with OSS Fuzz for memory issue detection, OSS Rebuild similarly seeks to use hosted resources to address security challenges in open source, this time aimed at securing the software supply chain... We are committed to bringing supply chain transparency and security to all open source software development. Our initial support for the PyPI (Python), npm (JS/TS), and Crates.io (Rust) package registries — providing rebuild provenance for many of their most popular packages — is just the beginning of our journey...

OSS Rebuild helps detect several classes of supply chain compromise:

- Unsubmitted Source Code: When published packages contain code not present in the public source repository, OSS Rebuild will not attest to the artifact.

- Build Environment Compromise: By creating standardized, minimal build environments with comprehensive monitoring, OSS Rebuild can detect suspicious build activity or avoid exposure to compromised components altogether.

- Stealthy Backdoors: Even sophisticated backdoors like xz often exhibit anomalous behavioral patterns during builds. OSS Rebuild's dynamic analysis capabilities can detect unusual execution paths or suspicious operations that are otherwise impractical to identify through manual review.
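The core check behind rebuild-based verification can be sketched in a few lines. OSS Rebuild's real pipeline is far more involved (it derives build definitions and normalizes benign build nondeterminism), but conceptually it refuses to attest whenever an independent rebuild from the public source doesn't match what the registry serves:

```python
import hashlib
from pathlib import Path


def attest_if_reproduced(published: Path, rebuilt: Path) -> bool:
    """Toy model of rebuild verification: attest to a package artifact
    only when rebuilding it from the public source repository yields a
    byte-identical result. A mismatch means the published artifact
    contains something the source does not explain."""
    def digest(p: Path) -> str:
        return hashlib.sha256(p.read_bytes()).hexdigest()
    return digest(published) == digest(rebuilt)
```

In practice a bit-for-bit match is the easy case; the hard engineering is deciding which differences (timestamps, archive ordering) are benign.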


For enterprises and security professionals, OSS Rebuild can...

- Enhance metadata without changing registries by enriching data for upstream packages. No need to maintain custom registries or migrate to a new package ecosystem.

- Augment SBOMs by adding detailed build observability information to existing Software Bills of Materials, creating a more complete security picture...

- Accelerate vulnerability response by providing a path to vendor, patch, and re-host upstream packages using our verifiable build definitions...


The easiest (but not only!) way to access OSS Rebuild attestations is to use the provided Go-based command-line interface.

"With OSS Rebuild's existing automation for PyPI, npm, and Crates.io, most packages obtain protection effortlessly without user or maintainer intervention."
AI

Nvidia's CUDA Platform Now Supports RISC-V (tomshardware.com) 20

An anonymous reader quotes a report from Tom's Hardware: At the 2025 RISC-V Summit in China, Nvidia announced that its CUDA software platform will be made compatible with the RISC-V instruction set architecture (ISA) on the CPU side of things. The news was confirmed during a presentation at the event. This is a major step toward enabling RISC-V-based CPUs in performance-demanding applications. The announcement makes it clear that RISC-V can now serve as the main processor for CUDA-based systems, a role traditionally filled by x86 or Arm cores. While few expect RISC-V in hyperscale datacenters any time soon, RISC-V can be used on CUDA-enabled edge devices, such as Nvidia's Jetson modules. However, it looks like Nvidia does indeed expect RISC-V to be in the datacenter.

Nvidia's profile on RISC-V seems to be quite high as the keynote at the RISC-V Summit China was delivered by Frans Sijsterman, who appears to be Vice President of Hardware Engineering at Nvidia. The presentation outlined how CUDA components will now run on RISC-V. A diagram shown at the session illustrated a typical configuration: the GPU handles parallel workloads, while a RISC-V CPU executes CUDA system drivers, application logic, and the operating system. This setup enables the CPU to orchestrate GPU computations fully within the CUDA environment. Given Nvidia's current focus, the workloads must be AI-related, yet the company did not confirm this. However, there is more.

Also featured in the diagram was a DPU handling networking tasks, rounding out a system consisting of GPU compute, CPU orchestration, and data movement. This configuration clearly suggests Nvidia's vision of heterogeneous compute platforms where a RISC-V CPU can be central to managing workloads while Nvidia's GPUs, DPUs, and networking chips handle the rest. Yet again, there is more. Even with this low-profile announcement, Nvidia is essentially bridging its proprietary CUDA stack to an open architecture, one that seems to be developing fast in China. And, being unable to ship its flagship GB200 and GB300 offerings to China, the company has to find ways to keep CUDA thriving there.

Open Source

NVIDIA Makes More Hopper, Blackwell Header Files Open-Source (phoronix.com) 4

NVIDIA has released additional open-source header files for its Blackwell and Hopper GPU architectures, continuing its effort to support open-source drivers like Nouveau/NVK and the NOVA Rust driver. Phoronix reports: Last week NVIDIA open-sourced 12k lines of C header files for Blackwell GPUs to help in the open-source driver efforts, namely for Nouveau / NVK and the in-development NOVA Rust driver. On Friday they made public some additional header files for helping in the Blackwell and Hopper open-source driver enablement.

Following the previously-covered open-source header activity, on Friday this commit was pushed to their open-source documentation repository that provides Hopper and Blackwell DMA-copy class header files. [...] In turn the code has already been imported into Mesa Git.

Open Source

Jack Dorsey Pumps $10M Into a Nonprofit Focused on Open Source Social Media (techcrunch.com) 20

Twitter co-founder/Block CEO Jack Dorsey isn't just vibe coding new apps like Bitchat and Sun Day. He's also "invested $10 million in an effort to fund experimental open source projects and other tools that could ultimately transform the social media landscape," reports TechCrunch, funding the projects through an online collective formed in May called "andOtherStuff": [T]he team at "andOtherStuff" is determined not to build a company but is instead operating like a "community of hackers," explains Evan Henshaw-Plath [who handles UX/onboarding and was also Twitter's first employee]. Together, they're working to create technologies that could include new consumer social apps as well as various experiments, like developer tools or libraries, that would allow others to build apps for themselves.

For instance, the team is behind an app called Shakespeare, which is like the app-building platform Lovable, but specifically for building Nostr-based social apps with AI assistance. The group is also behind heynow, a voice note app built on Nostr; Cashu wallet; private messenger White Noise; and the Nostr-based social community +chorus, in addition to the apps Dorsey has already released. Developments in AI-based coding have made this type of experimentation possible, Henshaw-Plath points out, in the same way that technologies like Ruby on Rails, Django, and JSON helped to fuel an earlier version of the web, dubbed Web 2.0.

Related to these efforts, Henshaw-Plath sat down with Dorsey for the debut episode of his new podcast, revolution.social with @rabble... Dorsey believes Bluesky faces the same challenges as traditional social media because of its structure — it's funded by VCs, like other startups. Already, it has had to bow to government requests and faced moderation challenges, he points out. "I think [Bluesky CEO] Jay [Graber] is great. I think the team is great," Dorsey told Henshaw-Plath, "but the structure is what I disagree with ... I want to push the energy in a different direction, which is more like Bitcoin, which is completely open and not owned by anyone from a protocol layer...."

Dorsey's initial investment has gotten the new nonprofit up and running, and he worked on some of its initial iOS apps. Meanwhile, others are contributing their time to build Android versions, developer tools, and different social media experiments. More is still in the works, says Henshaw-Plath.

"There are things that we're not ready to talk about yet that'll be very exciting," he teases.

Open Source

Intel Kills Clear Linux OS As Support Ends Without Warning (nerds.xyz) 95

BrianFagioli shares a report from NERDS.xyz: Intel has quietly pulled the plug on Clear Linux OS, officially ending support for the once-promising Linux distribution that it had backed for nearly a decade. Effective immediately, the company says it will no longer provide any updates, security patches, or maintenance for the operating system. In a final blow, the Clear Linux OS GitHub repository is now archived in read-only mode.

The move was announced with little fanfare, and for users still relying on Clear Linux OS, there's no sugarcoating it... you need to move on. Intel is urging everyone to migrate to an actively maintained Linux distribution as soon as possible to avoid running unpatched software.
"Rest assured that Intel remains deeply invested in the Linux ecosystem, actively supporting and contributing to various open-source projects and Linux distributions to enable and optimize for Intel hardware," the company said in a statement. "A heartfelt thank you to every developer, user, and contributor who helped shape Clear Linux OS over the last 10 years. Your feedback and contributions have been invaluable."
Open Source

Linux Reaches 5% On Desktop (ostechnix.com) 150

Longtime Slashdot reader bobdevine shares a report from OSTechNix: For the first time, Linux has officially broken the 5% desktop market share barrier in the United States of America! It's a huge milestone for open-source and our fantastic Linux community. While many might think of Linux as a niche choice, this new data shows a significant shift is happening.

According to the latest StatCounter Global Stats for June 2025, Linux now holds 5.03% of the desktop operating system market share in the United States of America. This is fantastic news! [...] One truly satisfying detail for me? Linux has finally surpassed the "Unknown" category in the USA! It shows that our growth is clear and recognized.
"It took eight years to go from 1% to 2% (by April 2021), then just 2.2 years to reach 3% (June 2023), and a mere 0.7 years to hit 4% (February 2024)," notes the report. "Now, here we are, at over 5% in the USA! This exponential growth suggests that we're on a promising upward trend."
Bitcoin

LibreOffice Lands Built-In Support For Bitcoin As Currency (phoronix.com) 32

An anonymous reader quotes a report from Phoronix: Merged yesterday into the latest development code for the LibreOffice open-source office suite is recognition of Bitcoin "BTC" as a supported currency for use within the Calc spreadsheet program and elsewhere within this cross-platform free software office suite. Stemming from a recent bug report requesting Bitcoin as an official currency option within LibreOffice Calc, the necessary additions are now in place so it's a built-in preset like USD and EUR, making it easier to manage Bitcoin transactions and the like from within LibreOffice Calc.
Software

Blender 4.5 LTS Released (nerds.xyz) 11

BrianFagioli shares a report from NERDS.xyz: Blender 4.5 has arrived and it's a long-term support release. That means users get two full years of updates and bug fixes, making it a smart choice for anyone looking for stability in serious projects. Whether you're a solo artist or part of a studio pipeline, this version is built to last. Here's a list of key features and changes in this release:

- Vulkan backend replaces OpenGL (faster, smoother UI)
- Adaptive subdivision up to 14x faster with multithreading
- New Geometry Nodes: Camera Info, Instance Bounds
- GPU-accelerated compositor nodes with standardized inputs
- New Boolean solver: Manifold (cleaner, faster mesh operations)
- UV maps visible in Object Mode + improved selection behavior
- Grease Pencil render pass and Geometry Nodes integration
- Improved file import support: PLY, OBJ, STL, CSV, VDB
- Deprecations: Collada, Big Endian, legacy .blend, Intel Mac support
- Cycles OptiX now requires NVIDIA driver v535+
- New shader variants for add-on developers (POLYLINES_*, POINT_*)
- ~500 bug fixes across all major systems
Graphics

Blender Studio Releases Free New Game 'Dogwalk' to Showcase Its Open Source Godot Game Engine (notebookcheck.net) 25

"Steam quietly welcomed another indie game this week, but this one is distinctly different for a lot of reasons," writes Notebookcheck: Dogwalk, which debuted on July 11, is the kind of short, gentle experience that almost forces you to smile. Developed by Blender Studio, the game introduces players to a gorgeous winter landscape. You play as a cute, fluffy dog, with a small child in tow...

What's particularly interesting here is that Dogwalk is more than just another charming indie project. It's Blender Studio's showcase for what's possible using fully open-source tools. The entire project — assets, animations, and code — is made with Blender and the popular Godot Game Engine. Unlike industry giants such as Unity or Unreal, Godot is completely open source, meaning it doesn't require developers to pay royalties or follow strict licensing agreements. This should make it great for small studios and independent creators, as it lowers the entry barrier to game creation.

Dogwalk is 100% free, which fits neatly into its open-source philosophy.

AI

Linux Foundation Adopts A2A Protocol To Help Solve One of AI's Most Pressing Challenges 38

An anonymous reader quotes a report from ZDNet: The Linux Foundation announced at the Open Source Summit in Denver that it will now host the Agent2Agent (A2A) protocol. Initially developed by Google and now supported by more than 100 leading technology companies, A2A is a crucial new open standard for secure and interoperable communication between AI agents. In his keynote presentation, Mike Smith, a Google staff software engineer, told the conference that the A2A protocol has evolved to make it easier to add custom extensions to the core specification. Additionally, the A2A community is working on making it easier to assign unique identities to AI agents, thereby improving governance and security.

The A2A protocol is designed to solve one of AI's most pressing challenges: enabling autonomous agents -- software entities capable of independent action and decision-making -- to discover each other, securely exchange information, and collaborate across disparate platforms, vendors, and frameworks. Under the hood, A2A does this work by creating an AgentCard. An AgentCard is a JavaScript Object Notation (JSON) metadata document that describes its purpose and provides instructions on how to access it via a web URL. A2A also leverages widely adopted web standards, such as HTTP, JSON-RPC, and Server-Sent Events (SSE), to ensure broad compatibility and ease of integration. By providing a standardized, vendor-neutral communication layer, A2A breaks down the silos that have historically limited the potential of multi-agent systems.
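The discovery step can be illustrated with a hypothetical sketch (the field names below are simplified placeholders, not the exact A2A schema; consult the specification for the real AgentCard format): an AgentCard advertises an agent's purpose, endpoint, and skills, and a client picks an agent by the skill it needs.

```python
import json

# A simplified AgentCard: JSON metadata describing an agent's purpose
# and how to reach it. The endpoint URL is a made-up example.
hotel_agent_card = {
    "name": "hotel-booking-agent",
    "description": "Books hotel rooms and manages reservations.",
    "url": "https://agents.example.com/hotels",
    "skills": [{"id": "book-room", "description": "Reserve a hotel room"}],
}


def find_agent(cards, skill_id):
    """Discovery: return the endpoint of the first agent whose card
    advertises the requested skill, or None if no agent matches."""
    for card in cards:
        if any(s["id"] == skill_id for s in card.get("skills", [])):
            return card["url"]
    return None


# Because the card is plain JSON, it can be served over HTTP and
# consumed by any framework or vendor -- the point of the standard.
serialized = json.dumps(hotel_agent_card)
```

A real client would fetch the card from a well-known URL and then speak JSON-RPC to the endpoint; this sketch only shows the skill-matching idea.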

For security, A2A comes with enterprise-grade authentication and authorization built in, including support for JSON Web Tokens (JWTs), OpenID Connect (OIDC), and Transport Layer Security (TLS). This approach ensures that only authorized agents can participate in workflows, protecting sensitive data and agent identities. While the security foundations are in place, developers at the conference acknowledged that integrating them, particularly authenticating agents, will be a hard slog.
Antje Barth, an Amazon Web Services (AWS) principal developer advocate for generative AI, explained what the adoption of A2A will mean for IT professionals: "Say you want to book a train ride to Copenhagen, then a hotel there, and look maybe for a fancy restaurant, right? You have inputs and individual tasks, and A2A adds more agents to this conversation, with one agent specializing in hotel bookings, another in restaurants, and so on. A2A enables agents to communicate with each other, hand off tasks, and finally brings the feedback to the end user."

Jim Zemlin, executive director of the Linux Foundation, said: "By joining the Linux Foundation, A2A is ensuring the long-term neutrality, collaboration, and governance that will unlock the next era of agent-to-agent powered productivity." Zemlin expects A2A to become a cornerstone for building interoperable, multi-agent AI systems.
Open Source

The Open-Source Software Saving the Internet From AI Bot Scrapers (404media.co) 33

An anonymous reader quotes a report from 404 Media: For someone who says she is fighting AI bot scrapers just in her free time, Xe Iaso seems to be putting up an impressive fight. Since she launched it in January, Anubis, a program "designed to help protect the small internet from the endless storm of requests that flood in from AI companies," has been downloaded nearly 200,000 times, and is being used by notable organizations including GNOME, the popular open-source desktop environment for Linux; FFmpeg, the open-source software project for handling video and other media; and UNESCO, the United Nations organization for education, science, and culture. [...]

"Anubis is an uncaptcha," Iaso explains on her site. "It uses features of your browser to automate a lot of the work that a CAPTCHA would, and right now the main implementation is by having it run a bunch of cryptographic math with JavaScript to prove that you can run JavaScript in a way that can be validated on the server." Essentially, Anubis verifies that any visitor to a site is a human using a browser as opposed to a bot. One of the ways it does this is by making the browser do a type of cryptographic math with JavaScript or other subtle checks that browsers do by default but bots have to be explicitly programmed to do. This check is invisible to the user, and most browsers since 2022 are able to complete this test. In theory, bot scrapers could pretend to be users with browsers as well, but the additional computational cost of doing so on the scale of scraping the entire internet would be huge. This way, Anubis creates a computational cost that is prohibitively expensive for AI scrapers that are hitting millions and millions of sites, but marginal for an individual user who is just using the internet like a human.
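The economics described above can be illustrated with a toy hash-based proof-of-work, in the spirit of (but not identical to) Anubis's JavaScript challenge: producing an answer costs many hash attempts, while checking it costs one.

```python
import hashlib


def solve(challenge: str, difficulty: int) -> int:
    """Client side: burn CPU until the hash has `difficulty` leading
    zero hex digits. Negligible for one page view, ruinous when
    multiplied across millions of scraped pages."""
    counter = 0
    while True:
        h = hashlib.sha256(f"{challenge}{counter}".encode()).hexdigest()
        if h.startswith("0" * difficulty):
            return counter
        counter += 1


def verify(challenge: str, counter: int, difficulty: int) -> bool:
    """Server side: a single hash suffices to check the answer."""
    h = hashlib.sha256(f"{challenge}{counter}".encode()).hexdigest()
    return h.startswith("0" * difficulty)
```

Raising `difficulty` by one hex digit multiplies the client's expected work by 16 while the server's verification cost stays constant, which is the asymmetry that makes mass scraping expensive.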

Anubis is free, open source, lightweight, can be self-hosted, and can be implemented almost anywhere. It also appears to be a pretty good solution for what we've repeatedly reported is a widespread problem across the internet, which helps explain its popularity. But Iaso is still putting a lot of work into improving it and adding features. She told me she's working on a non-cryptographic challenge so it taxes users' CPUs less, and also thinking about a version that doesn't require JavaScript, which some privacy-minded users disable in their browsers. The biggest challenge in developing Anubis, Iaso said, is finding the balance. "The balance between figuring out how to block things without people being blocked, without affecting too many people with false positives," she said. "And also making sure that the people running the bots can't figure out what pattern they're hitting, while also letting people that are caught in the web be able to figure out what pattern they're hitting, so that they can contact the organization and get help. So that's like, you know, the standard, impossible scenario."

Programming

Microsoft Open Sources Copilot Chat for VS Code on GitHub (nerds.xyz) 18

"Microsoft has released the source code for the GitHub Copilot Chat extension for VS Code under the MIT license," reports BleepingComputer. This provides the community access to the full implementation of the chat-based coding assistant, including the implementation of "agent mode," what contextual data is sent to large language models (LLMs), and the design of system prompts. The GitHub repository hosting the code also details telemetry collection mechanisms, addressing long-standing questions about data transparency in AI-assisted coding tools...

As the VS Code team explained previously, shifts in the AI tooling landscape, like the rapid growth of the open-source AI ecosystem and a more level playing field for all, have reduced the need for secrecy around prompt engineering and UI design. At the same time, increased targeting of development tools by malicious actors has increased the need for crowdsourced contributions to rapidly pinpoint problems and develop effective fixes. Essentially, openness is now considered superior from a security perspective.

"If you've been hesitant to adopt AI tools because you don't trust the black box behind them, this move offers something rare these days: transparency," writes Slashdot reader BrianFagioli. "Now that the extension is open source, developers can audit how agent mode actually works. You can also dig into how it manages your data, customize its behavior, or build entirely new tools on top of it. This could be especially useful in enterprise environments where compliance and control are non-negotiable.

It is worth pointing out that the backend models powering Copilot remain closed source. So no, you won't be able to self-host the whole experience or train your own Copilot. But everything running locally in VS Code is now fair game. Microsoft says it is planning to eventually merge inline code completions into the same open-source package too, which would make Copilot Chat the new hub for both chat and suggestions.

X

X11 Fork XLibre Released For Testing On Systemd-Free Artix Linux (webpronews.com) 134

An anonymous reader shared this report from WebProNews: The Linux world is abuzz with news of XLibre, a fork of the venerable X11 window display system, which aims to be an alternative to X11's successor, Wayland.

Much of the Linux world is working to adopt Wayland, the successor to X11. Wayland has been touted as the superior option, providing better security and performance. Despite Fedora and Ubuntu both going Wayland-only, the newer display protocol still lags behind X11 in terms of functionality, especially in areas such as accessibility, screen recording, and session restore. In addition, despite the promise of improved performance, many users report performance regressions compared to X11.

While progress is being made, it has been slow going, especially for a project that is more than 17 years old. To make matters worse, Wayland is largely being improved by committee, with the various desktop environment teams trying to work together to further the protocol. Progress is further hampered by the fact that the GNOME developers often object to the implementation of some functionality that doesn't fit with their vision of what a desktop should be — despite those features being present and needed in every other environment.

In response, developer Enrico Weigelt has forked X11 into the XLibre project. Weigelt was already one of the most prolific X11 contributors at a time when little to no improvements or new features were being added to the aging window system... Weigelt has wasted no time releasing the inaugural version of XLibre, XLibre 25.0. The release includes a slew of improvements.

MrBrklyn (Slashdot reader #4,775) adds that Artix Linux, a rolling-release distro based on Arch Linux which does not use systemd, now offers XLibre ISO images and packages for testing and use. They're all non-systemd based, and "It's a decent undertaking by the Artix development team. The ISO is considered to be testing, but it is quickly moving to the regular repos for broad public use."

EU

'The Year of the EU Linux Desktop May Finally Arrive' (theregister.com) 71

Steven J. Vaughan-Nichols writes in an opinion piece for The Register: Microsoft, tacitly admitting it has failed at talking all the Windows 10 PC users into moving to Windows 11 after all, is -- sort of, kind of -- extending Windows 10 support for another year. For most users, that means they'll need to subscribe to Microsoft 365. This, in turn, means their data and meta-information will be kept in a US-based datacenter. That isn't sitting so well with many European Union (EU) organizations and companies. It doesn't sit that well with me or a lot of other people either.

A few years back, I wrote in these very pages that Microsoft didn't want you so much to buy Windows as subscribe to its cloud services and keep your data on its servers. If you wanted a real desktop operating system, Linux would be almost your only choice. Nothing has changed since then, except that folks are getting a wee bit more concerned about their privacy now that President Donald Trump is in charge of the US. You may have noticed that he and his regime love getting their hands on other people's data.

Privacy isn't the only issue. Can you trust Microsoft to deliver on its service promises under American political pressure? Ask the EU-based International Criminal Court (ICC): after it issued arrest warrants for Israeli Prime Minister Benjamin Netanyahu for war crimes, Trump imposed sanctions on the ICC. Soon afterward, the ICC's chief prosecutor, Karim Khan, was reportedly locked out of his Microsoft email accounts. Coincidence? Some think not. Microsoft denies it had anything to do with this.

Peter Ganten, chairman of the German-based Open-Source Business Alliance (OSBA), opined that these sanctions ordered by the US, which he alleged had been implemented by Microsoft, "must be a wake-up call for all those responsible for the secure availability of state and private IT and communication infrastructures." Microsoft chairman and general counsel Brad Smith had promised that the company would stand behind its EU customers against political pressure. In the aftermath of the ICC reports, Smith declared Microsoft had not been "in any way [involved in] the cessation of services to the ICC." In the meantime, if you want to reach Khan, you'll find him on the privacy-first Swiss email provider, ProtonMail.

In short, besides all the other good reasons for switching to the Linux desktop - security, ease of use, and, thanks to Steam, serious gaming - privacy has become much more critical. That's why several EU governments have decided that moving to the Linux desktop makes a lot of sense... Besides, all these governments know that switching from Windows 10 to 11 isn't cheap. While finances also play a role, and I always believe in "following the money" when it comes to such software decisions, there's no question that Europe is worried about just how trustworthy America and its companies are these days. Do you blame them? I don't.

The shift to the Linux desktop is "nothing new," as Vaughan-Nichols notes. Munich launched its LiMux project back in 2004 and, despite ending it in 2017, reignited its open-source commitment by establishing a dedicated program office in 2024. In France, the gendarmerie now operates over 100,000 computers on a custom Ubuntu-based OS (GendBuntu), while the city of Lyon is transitioning to Linux and PostgreSQL.

More recently, Denmark announced it is dropping Windows and Office in favor of Linux and LibreOffice, citing digital sovereignty. The German state of Schleswig-Holstein is following suit, also moving away from Microsoft software. Meanwhile, a pan-European Linux OS (EU OS) based on Fedora Kinoite is being explored, with Linux Mint and openSUSE among the alternatives under consideration.

Open Source

Magic Lantern Software for Canon Cameras Is Back (petapixel.com) 11

Magic Lantern, the popular open-source suite of software enhancements for Canon DSLR cameras, has returned under new leadership. The revived project aims to offer regular updates and support for additional models, including compatibility for Canon's newer mirrorless cameras equipped with DIGIC X processors. PetaPixel reports: The new lead developer, names_are_hard, announced Magic Lantern's return yesterday on Magic Lantern's forums, seen by Reddit r/cinematography users and confirmed on the official Magic Lantern website. "It's been a long journey, but official Magic Lantern builds return, for all cameras," names_are_hard writes. They add that this means there will be new, regular releases for all supported cameras, and that new cameras will be supported. As of now, the supported cameras are almost entirely DSLR models, save for tools for the original EOS M mirrorless camera.

However, one of the members of the core Magic Lantern team, which comprises developers g3ggo, kitor, and WalterSchulz, says the team is looking at supporting cameras with DIGIC X processors, which includes mirrorless EOS R models. "It would be awesome if they start supporting new cameras. Imagine unlocking Open Gate on the R5/R6 lines, or RAW on cameras that don't have it (like R6, R7, etc.)," writes Redditor user machado34. "I believe it will be possible. They say they're exploring up to DIGIC X," adds 3dforlife. "In fact we are," developer kitor replies. "Just DIGIC 8 is stubborn and X adds some new (undocumented) hardware on top of that." Kitor is listed as the chief DIGIC 8 and DIGIC X hacker on Magic Lantern's forums, plus kitor is chiefly in charge of the revived website and Magic Lantern's social media presence. If the team can crack mirrorless cameras, it would be a boon. [...]

The new Magic Lantern core team of devs, plus many other key players who are involved to various degrees in bringing Magic Lantern back to life, have built a new repo, formalized the code base, and developed a new, efficient build system. "Around 2020, our old lead dev, a1ex, after years of hard work, left the project. The documentation was fragmentary. Nobody understood the build system. A very small number of volunteers kept things alive, but nothing worked well. Nobody had deep knowledge of Magic Lantern code," names_are_hard writes. "Those that remained had to learn how everything worked, then fix it. Then add support for new cams without breaking the old ones."

"We have an updated website. We have a new repo. We have new supported models. We have a new build system. We have cleaner, faster, smaller code." The team is now using Git, building on modern operating systems with contemporary tools, and compiling clean. "This was a lot of work, and invisible to users, but very useful for devs. It's easier than ever to join as a dev." Alongside the exciting return, Magic Lantern has added support for numerous new Canon DSLR cameras, including the 200D, 6D Mark II, 750D, and 7D Mark II.