Amiga

Can You Run Linux On a Commodore 64? (github.com)

llvm-mos adapts the popular LLVM compiler to target the MOS 6502 processor (the 1980s microprocessor used in early home computing devices like the Apple II and the Commodore 64). So developer Onno Kortman used it to cross-compile semu, a "minimalist RISC-V system emulator capable of running Linux the kernel and corresponding userland." And by the end of the day, Kortman had Linux running on a Commodore 64.

Long-time Slashdot reader johnwbyrd shared the link to Kortman's repository. Some quotes: "But does it run Linux?" can now be finally and affirmatively answered for the Commodore C64...!

It runs extremely slowly and it needs a RAM Expansion Unit (REU), as there is no chance to fit it all into just 64KiB.

It even emulates virtual memory with an MMU....

The screenshots took VICE a couple of hours in "warp mode" (activate it with Alt-W) to generate. So, as is, a real C64 should be able to boot Linux within a week or so.

The compiled 6502 code is not really optimized yet, and it might be realistic to squeeze a factor 10x of performance out of this. Maybe even a simple form of JIT compilation? It should also be possible to implement starting a checkpointed VM (quickly precomputed on x86-64) to avoid the lengthy boot process...

I also tested a minimal micropython port (I can clean it up and post it on github if there is interest), that one does not use the MMU and is almost barely remotely usable with lots of optimism at 100% speed.

A key passage: I have not tested it on real hardware yet, that's the next challenge .. for you. So please send me a link to a timelapse video of an original unit with REU booting Linux :D
Its GitHub repository has build and run instructions...
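For readers wondering what a "RISC-V system emulator" actually spends its time doing, here is a toy sketch of the fetch-decode-execute loop at the heart of any such emulator. It is purely illustrative (Python rather than semu's C, and only two RV32I instructions), but the instruction encodings are real:

```python
# Minimal RV32I interpreter core: fetch, decode, execute.
# Real emulators like semu wrap this loop with an MMU, CSRs, traps, and devices.

def sext(val, bits):
    """Sign-extend a `bits`-wide field to a Python int."""
    return val - (1 << bits) if val & (1 << (bits - 1)) else val

def step(regs, mem, pc):
    insn = int.from_bytes(mem[pc:pc + 4], "little")    # fetch
    opcode, funct3 = insn & 0x7F, (insn >> 12) & 0x7   # decode
    rd, rs1, rs2 = (insn >> 7) & 0x1F, (insn >> 15) & 0x1F, (insn >> 20) & 0x1F
    if opcode == 0x13 and funct3 == 0:                 # ADDI rd, rs1, imm
        regs[rd] = (regs[rs1] + sext(insn >> 20, 12)) & 0xFFFFFFFF
    elif opcode == 0x33 and funct3 == 0:               # ADD rd, rs1, rs2
        regs[rd] = (regs[rs1] + regs[rs2]) & 0xFFFFFFFF
    else:
        raise NotImplementedError(hex(insn))
    regs[0] = 0                                        # x0 is hard-wired to zero
    return pc + 4                                      # advance the program counter

regs, pc = [0] * 32, 0
prog = b"".join(i.to_bytes(4, "little") for i in (
    0x00500093,   # addi x1, x0, 5
    0x00700113,   # addi x2, x0, 7
    0x002081B3))  # add  x3, x1, x2
while pc < len(prog):
    pc = step(regs, prog, pc)
print(regs[3])    # -> 12
```

Every guest instruction costs many host instructions even in optimized C; multiply that out on a roughly 1 MHz 6502 and the week-long boot Kortman describes is no surprise.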
AI

DHS Has Spent Millions On an AI Surveillance Tool That Scans For 'Sentiment and Emotion' (404media.co)

New submitter Slash_Account_Dot shares a report from 404 Media, a new independent media company founded by technology journalists Jason Koebler, Emanuel Maiberg, Samantha Cole, and Joseph Cox: Customs and Border Protection (CBP), part of the Department of Homeland Security, has bought millions of dollars worth of software from a company that uses artificial intelligence to detect "sentiment and emotion" in online posts, according to a cache of documents obtained by 404 Media. CBP told 404 Media it is using technology to analyze open source information related to inbound and outbound travelers who the agency believes may threaten public safety, national security, or lawful trade and travel. The company in question, Fivecast, also offers "AI-enabled" object recognition in images and video, and detection of "risk terms and phrases" across multiple languages, according to one of the documents.

Marketing materials promote the software's ability to provide targeted data collection from big social platforms like Facebook and Reddit, but also specifically name smaller communities like 4chan, 8kun, and Gab. To demonstrate its functionality, Fivecast promotional materials explain how the software was able to track social media posts and related Persons-of-Interest starting with just "basic bio details" from a New York Times Magazine article about members of the far-right paramilitary Boogaloo movement. 404 Media also obtained leaked audio of a Fivecast employee explaining how the tool could be used against trafficking networks or propaganda operations. The news signals CBP's continued use of artificial intelligence in its monitoring of travelers and targets, which can include U.S. citizens. It shows that CBP has deployed multiple AI-powered systems, and provides insight into what exactly these tools claim to be capable of while raising questions about their accuracy and utility.
"CBP should not be secretly buying and deploying tools that rely on junk science to scrutinize people's social media posts, claim to analyze their emotions, and identify purported 'risks,'" said Patrick Toomey, deputy director of the ACLU's National Security Project. "The public knows far too little about CBP's Counter Network Division, but what we do know paints a disturbing picture of an agency with few rules and access to an ocean of sensitive personal data about Americans. The potential for abuse is immense."
The Internet

Repair Ship Bound for Cut Cables Off Africa's West Coast as Internet Interrupted (bloomberg.com)

Fiber-optic cables that were damaged by a rockfall in an undersea canyon, resulting in slow internet connections in some parts of Africa, should be repaired next month by a specialized vessel, according to telecommunication companies. From a report: The West Africa Cable System that runs about 16,000 kilometers (9,950 miles) along the sea floor from Europe to southern Africa was damaged along with other lines earlier this month. The 40-year-old cable-layer vessel Leon Thevenin, named after a French telegraph engineer, was moored in Cape Town this week, according to tracking data compiled by Bloomberg. It's capable of working in extreme conditions and in shallow or deep water, according to owner Orange Marine, a submarine telecommunications company. All South African networks are currently experiencing disruptions due to the damaged lines, said Anne-Caroline Tanguy, a spokeswoman at Cloudflare, a company that provides load balancing and analysis. The repairs are expected to be finished in September.
Sony

Sony's Portable PlayStation Portal Launches Later This Year For $200 (theverge.com)

Sony is officially launching its portable PlayStation, the PlayStation Portal remote player, later this year. The handheld device will stream PS5 games over Wi-Fi and features an eight-inch LCD screen running at 1080p resolution and 60fps. Sony says the PlayStation Portal will be priced at $199.99. From a report: "PlayStation Portal will connect remotely to your PS5 over Wi-Fi, so you'll be able to swiftly jump from playing on your PS5 to your PlayStation Portal," says Hideaki Nishino, senior vice president of platform experience at Sony Interactive Entertainment. "PlayStation Portal can play supported games that are installed on your PS5 console and use the DualSense controller." The PlayStation Portal features prominent controllers on each side that look very much like Sony's PS5 DualSense controllers. They support adaptive triggers and haptic feedback, so PS5 games will feel similar to playing with a dedicated DualSense controller. The PlayStation Portal will also be capable of playing media, since its home screen mirrors your PS5 and has a dedicated section for it. You won't be able to run anything locally, though, so if you don't have Wi-Fi, you're out of luck.
United States

America's Farmers Are Bogged Down by Data (wsj.com)

A decade after data analytics promised to revolutionize agriculture, most farmers still aren't using data tools or specialized software, and of those who do, many are swamped in a deluge of data. From a report: In 2013, seed and pesticide giant Monsanto acquired agriculture-data firm Climate Corporation for $1 billion, helping spur the industry's mania for data-driven farming. The hope was that by outfitting farmers with software and tools capable of ingesting and analyzing troves of data on everything from weather patterns to soil conditions, they could more efficiently use their land. Many are still waiting for the technology to pay off. In the U.S., less than half of farmers surveyed by consulting firm McKinsey are using farm management software, and 25% are using remote-sensing and precision agriculture hardware. That software is a foundational technology in enabling the autonomous machinery and AI-enabled equipment of the future, analysts say, and unless farmers start using it, some will be left behind in the next decade of farm innovation. At the moment, 3% of American farmers say they plan to adopt software or precision agriculture hardware over the next two years, according to McKinsey.

Certain tools can automatically gather data from internet-connected farm equipment, but others require farmers to manually enter the information. For a specific field, for instance, that could total over a dozen crop-protection products and multiple seeds. Even those who are using the tech say they can find it difficult to draw useful conclusions from it. "We're collecting so much data that you're almost paralyzed with having to analyze it all," said David Emmert, a corn and soybean farmer in West Central Indiana who works about 4,300 acres. [...] The first generation of digital farming tools also wasn't easy for farmers to use. Software was slow, interfaces were complex and difficult to manage. "The industry does need to step up a little bit on continuing to improve the customer experience," said David Fiocco, a McKinsey partner focused on agriculture. In recent years, big tech vendors like Microsoft, Amazon and Google have begun tailoring their cloud-computing, data and artificial-intelligence services to agriculture, bringing along expertise that could help address complications that have long plagued farm data management and analytics.

AI

AI Unlikely To Destroy Most Jobs, But Clerical Workers at Risk, UN Study Says (reuters.com)

Generative AI probably will not take over most people's jobs entirely but will instead automate a portion of their duties, freeing them up to do other tasks, a U.N. study said on Monday. From a report: It warned, however, that clerical work would likely be the hardest hit, potentially affecting female employment disproportionately, given women's over-representation in this sector, especially in wealthier countries. An explosion of interest in generative AI and its chatbot applications has sparked fears over job destruction, similar to those that emerged when the moving assembly line was introduced in the early 1900s and after mainframe computers arrived in the 1950s.

However, the study produced by the International Labour Organization concludes that: "Most jobs and industries are only partially exposed to automation and are thus more likely to be complemented rather than substituted by AI." This means that "the most important impact of the technology is likely to be of augmenting work," it adds. The occupation likely to be most affected by GenAI -- capable of generating text, images, sounds, animation, 3D models and other data -- is clerical work, where about a quarter of tasks are highly exposed to potential automation, the study says.

Windows

Windows 11 Has Made the 'Clean Windows Install' an Oxymoron (arstechnica.com)

An anonymous reader shares a column: You can still do a clean install of Windows, and it's arguably easier than ever, with official Microsoft-sanctioned install media easily accessible and Windows Update capable of grabbing most of the drivers that most computers need for basic functionality. The problem is that a "clean install" doesn't feel as clean as it used to, and unfortunately for us, it's an inside job -- it's Microsoft, not third parties, that is primarily responsible for the pile of unwanted software and services you need to decline or clear away every time you do a new Windows install.

The "out-of-box experience" (OOBE, in Microsoft parlance) for Windows 7 walked users through the process of creating a local user account, naming their computer, entering a product key, creating a "Homegroup" (a since-discontinued local file- and media-sharing mechanism), and determining how Windows Update worked. Once Windows booted to the desktop, you'd find apps like Internet Explorer and the typical in-box Windows apps (Notepad, Paint, Calculator, Media Player, Wordpad, and a few other things) installed. Keeping that baseline in mind, here's everything that happens during the OOBE stage in a clean install of Windows 11 22H2 (either Home or Pro) if you don't have active Microsoft 365/OneDrive/Game Pass subscriptions tied to your Microsoft account:

(Mostly) mandatory Microsoft account sign-in.
Setup screen asking you about data collection and telemetry settings.
A (skippable) screen asking you to "customize your experience."
A prompt to pair your phone with your PC.
A Microsoft 365 trial offer.
A 100GB OneDrive offer.
A $1 introductory PC Game Pass offer.

This process is annoying enough the first time, but at some point down the line, you'll also be offered what Microsoft calls the "second chance out-of-box experience," or SCOOBE (not a joke), which will try to get you to do all of this stuff again if you skipped some of it the first time. This also doesn't account for the numerous one-off post-install notification messages you'll see on the desktop for OneDrive and Microsoft 365. (And it's not just new installs; I have seen these notifications appear on systems that have been running for months even if they're not signed in to a Microsoft account, so no one is safe). And the Windows desktop, taskbar, and Start menu are no longer the pristine places they once were. Due to the Microsoft Store, you'll find several third-party apps taking up a ton of space in your Start menu by default, even if they aren't technically downloaded and installed until you run them for the first time. Spotify, Disney+, Prime Video, Netflix, and Facebook Messenger all need to be removed if you don't want them (this list can vary a bit over time).

Supercomputing

Can Computing Clean Up Its Act? (economist.com)

Long-time Slashdot reader SpzToid shares a report from The Economist: "What you notice first is how silent it is," says Kimmo Koski, the boss of the Finnish IT Centre for Science. Dr Koski is describing LUMI -- Finnish for "snow" -- the most powerful supercomputer in Europe, which sits 250km south of the Arctic Circle in the town of Kajaani in Finland. LUMI, which was inaugurated last year, is used for everything from climate modeling to searching for new drugs. It has tens of thousands of individual processors and is capable of performing up to 429 quadrillion calculations every second. That makes it the third-most-powerful supercomputer in the world. Powered by hydroelectricity, and with its waste heat used to help warm homes in Kajaani, it even boasts negative emissions of carbon dioxide. LUMI offers a glimpse of the future of high-performance computing (HPC), both on dedicated supercomputers and in the cloud infrastructure that runs much of the internet. Over the past decade the demand for HPC has boomed, driven by technologies like machine learning, genome sequencing and simulations of everything from stockmarkets and nuclear weapons to the weather. It is likely to carry on rising, for such applications will happily consume as much computing power as you can throw at them. Over the same period the amount of computing power required to train a cutting-edge AI model has been doubling every five months. All this has implications for the environment.

HPC -- and computing more generally -- is becoming a big user of energy. The International Energy Agency reckons data centers account for between 1.5% and 2% of global electricity consumption, roughly the same as the entire British economy. That is expected to rise to 4% by 2030. With its eye on government pledges to reduce greenhouse-gas emissions, the computing industry is trying to find ways to do more with less and boost the efficiency of its products. The work is happening at three levels: that of individual microchips; of the computers that are built from those chips; and the data centers that, in turn, house the computers. [...] The standard measure of a data centre's efficiency is the power usage effectiveness (PUE), the ratio between the data centre's overall power consumption and how much of that is used to do useful work. According to the Uptime Institute, a firm of IT advisers, a typical data centre has a PUE of 1.58. That means that about two-thirds of its electricity goes to running its computers while a third goes to running the data centre itself, most of which will be consumed by its cooling systems. Clever design can push that number much lower.

Most existing data centers rely on air cooling. Liquid cooling offers better heat transfer, at the cost of extra engineering effort. Several startups even offer to submerge circuit boards entirely in specially designed liquid baths. Thanks in part to its use of liquid cooling, Frontier boasts a PUE of 1.03. One reason LUMI was built near the Arctic Circle was to take advantage of the cool sub-Arctic air. A neighboring computer, built in the same facility, makes use of that free cooling to reach a PUE rating of just 1.02. That means 98% of the electricity that comes in gets turned into useful mathematics. Even the best commercial data centers fall short of such numbers. Google's, for instance, have an average PUE value of 1.1. The latest numbers from the Uptime Institute, published in June, show that, after several years of steady improvement, global data-centre efficiency has been stagnant since 2018.
The report notes that the U.S., Britain and the European Union, among others, are considering new rules that "could force data centers to become more efficient." Germany has proposed the Energy Efficiency Act that would mandate a maximum PUE of 1.5 by 2027, and 1.3 by 2030.
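The PUE arithmetic in the passage is easy to check yourself. A quick sketch using only the numbers quoted above:

```python
# PUE = total facility power / power delivered to the IT equipment.
# A PUE of 1.0 would mean every watt drawn does useful computation.
def it_fraction(pue: float) -> float:
    """Fraction of total facility power that reaches the computers."""
    return 1.0 / pue

for label, pue in [("typical (Uptime Institute)", 1.58),
                   ("Google fleet average", 1.10),
                   ("Frontier", 1.03),
                   ("LUMI's neighbor, same facility", 1.02)]:
    print(f"{label}: PUE {pue} -> {it_fraction(pue):.0%} to IT")
# The typical case gives ~63%, the "about two-thirds" in the article;
# at PUE 1.02, ~98% of incoming electricity does useful work.
```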
AI

Top Physicist Says Chatbots Are Just 'Glorified Tape Recorders' (cnn.com)

In an interview with CNN, Michio Kaku, professor of theoretical physics at City College of New York and CUNY Graduate Center, said chatbots like OpenAI's ChatGPT are just "glorified tape recorders." From the report: "It takes snippets of what's on the web created by a human, splices them together and passes it off as if it created these things," he said. "And people are saying, 'Oh my God, it's a human, it's humanlike.'" However, he said, chatbots cannot discern true from false: "That has to be put in by a human." According to Kaku, humanity is in its second stage of computer evolution. The first was the analog stage, "when we computed with sticks, stones, levers, gears, pulleys, string." After that, around World War II, he said, we switched to electricity-powered transistors. It made the development of the microchip possible and helped shape today's digital landscape. But this digital landscape rests on the idea of two states like "on" and "off," and uses binary notation composed of zeros and ones.

"Mother Nature would laugh at us because Mother Nature does not use zeros and ones," Kaku said. "Mother Nature computes on electrons, electron waves, waves that create molecules. And that's why we're now entering stage three." He believes the next technological stage will be in the quantum realm. Quantum computing is an emerging technology utilizing the various states of particles like electrons to vastly increase a computer's processing power. Instead of using computer chips with two states, quantum computers use various states of vibrating waves. It makes them capable of analyzing and solving problems much faster than normal computers. But beyond business applications, Kaku said quantum computing could also help advance health care. "Cancer, Parkinson's, Alzheimer's disease -- these are diseases at the molecular level. We're powerless to cure these diseases because we have to learn the language of nature, which is the language of molecules and quantum electrons."

Encryption

Google's Chrome Begins Supporting Post-Quantum Key Agreement to Shield Encryption Keys (theregister.com)

"Teams across Google are working hard to prepare the web for the migration to quantum-resistant cryptography," writes Chrome's technical program manager for security, Devon O'Brien.

"Continuing with our strategy for handling this major transition, we are updating technical standards, testing and deploying new quantum-resistant algorithms, and working with the broader ecosystem to help ensure this effort is a success." As a step down this path, Chrome will begin supporting X25519Kyber768 for establishing symmetric secrets in TLS, starting in Chrome 116, and available behind a flag in Chrome 115. This hybrid mechanism combines the output of two cryptographic algorithms to create the session key used to encrypt the bulk of the TLS connection:

X25519 — an elliptic curve algorithm widely used for key agreement in TLS today
Kyber-768 — a quantum-resistant Key Encapsulation Method, and NIST's PQC winner for general encryption

In order to identify ecosystem incompatibilities with this change, we are rolling this out to Chrome and to Google servers, over both TCP and QUIC and monitoring for possible compatibility issues. Chrome may also use this updated key agreement when connecting to third-party server operators, such as Cloudflare, as they add support. If you are a developer or administrator experiencing an issue that you believe is caused by this change, please file a bug.
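To make the "hybrid mechanism" concrete: the session secret is derived from the outputs of both algorithms, so an attacker would have to break both X25519 and Kyber-768 to recover it. Below is a minimal sketch of that combining idea, not Chrome's actual TLS code. The X25519 half uses the real Python cryptography package; the Kyber-768 shared secret is a random placeholder, since Kyber is not in the standard toolchain:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: a real X25519 key agreement between two parties.
client_priv, server_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
x25519_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: stand-in for the Kyber-768 encapsulated shared secret.
# (In real TLS this comes from the Kyber KEM, not from os.urandom.)
kyber_secret = os.urandom(32)

# Hybrid: derive the session key from the concatenation of both secrets.
# If either input remains secret, the derived key remains secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"hybrid-kem-demo").derive(x25519_secret + kyber_secret)
print(session_key.hex())
```

The design point is that the hybrid costs only a larger handshake: the classical security of X25519 is retained in full even if Kyber-768 turns out to be weaker than hoped, and vice versa.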

The Register delves into Chrome's reasons for implementing this now: "It's believed that quantum computers that can break modern classical cryptography won't arrive for 5, 10, possibly even 50 years from now, so why is it important to start protecting traffic today?" said O'Brien. "The answer is that certain uses of cryptography are vulnerable to a type of attack called Harvest Now, Decrypt Later, in which data is collected and stored today and later decrypted once cryptanalysis improves." O'Brien says that while symmetric encryption algorithms used to defend data traveling on networks are considered safe from quantum cryptanalysis, the way the keys get negotiated is not. By adding support for a hybrid KEM, Chrome should provide a stronger defense against future quantum attacks...

Rebecca Krauthamer, co-founder and chief product officer at QuSecure, told The Register in an email that while this technology sounds futuristic, it's useful and necessary today... [T]he arrival of capable quantum computers should not be thought of as a specific, looming date, but as something that will arrive without warning. "There was no press release when the team at Bletchley Park cracked the Enigma code, either," she said.

Space

Planetary Defense Test Deflected An Asteroid But Unleashed a Boulder Swarm (ucla.edu)

A UCLA-led study of NASA's DART mission found that the collision launched a cloud of boulders from the asteroid's surface. "The boulder swarm is like a cloud of shrapnel expanding from a hand grenade," said David Jewitt, lead author of the study and a UCLA professor of earth and planetary sciences. "Because those big boulders basically share the speed of the targeted asteroid, they're capable of doing their own damage." From a news release: In September 2022, NASA deliberately slammed a spacecraft into the asteroid Dimorphos to knock it slightly off course. NASA's objective was to evaluate whether the strategy could be used to protect Earth in the event that an asteroid was headed toward our planet. Jewitt said that given the high speed of a typical impact, a 15-foot boulder hitting Earth would deliver as much energy as the atomic bomb that was dropped on Hiroshima. Fortunately, neither Dimorphos nor the boulder swarm have ever posed any danger to Earth. NASA chose Dimorphos because it was about 6 million miles from Earth and measured just 581 feet across -- close enough to be of interest and small enough, engineers reasoned, that the half-ton Double Asteroid Redirection Test, or DART, planetary defense spacecraft would be able to change the asteroid's trajectory.

When it hurtled into Dimorphos at 13,000 miles per hour, DART slowed Dimorphos' orbit around its twin asteroid, Didymos, by a few millimeters per second. But, according to images taken by NASA's Hubble Space Telescope, the collision also shook off 37 boulders, each measuring from 3 to 22 feet across. None of the boulders is on a course to hit Earth, but if rubble from a future asteroid deflection were to reach our planet, Jewitt said, it would hit at the same speed the asteroid was traveling -- fast enough to cause tremendous damage. The research, published in the Astrophysical Journal Letters, found that the rocks were likely knocked off the surface by the shock of the impact. A close-up photograph taken by DART just two seconds before the collision shows a similar number of boulders sitting on the asteroid's surface -- and of similar sizes and shapes -- to the ones that were imaged by the Hubble telescope. The boulders that the scientists studied, among the faintest objects ever seen within the solar system, are observable in detail thanks to the powerful Hubble telescope.
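Jewitt's Hiroshima comparison survives a back-of-the-envelope check. The sketch below assumes a solid rock sphere of density 3,000 kg/m³ and a typical Earth-impact speed of 20 to 30 km/s; both are illustrative assumptions, not figures from the study:

```python
import math

DIAMETER_M = 15 * 0.3048   # 15-foot boulder, in meters (~4.6 m)
DENSITY = 3000             # kg/m^3, assumed solid rock
HIROSHIMA_J = 6.3e13       # ~15 kilotons of TNT

volume = (4 / 3) * math.pi * (DIAMETER_M / 2) ** 3
mass = DENSITY * volume    # ~1.5e5 kg, about 150 tonnes

for v_kms in (20, 25, 30):  # plausible Earth-impact speeds
    ke = 0.5 * mass * (v_kms * 1000) ** 2   # kinetic energy in joules
    print(f"{v_kms} km/s: {ke:.1e} J = {ke / HIROSHIMA_J:.2f} Hiroshimas")
# At typical impact speeds the kinetic energy is indeed of the same
# order as the Hiroshima bomb, as Jewitt says.
```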

Earth

America's Offshore Wind Potential is Huge but Untapped (theverge.com)

A new analysis "shows that over 4,000 gigawatts (GW) of offshore wind potential is available along the U.S. coastline," capable of fulfilling up to 25% of U.S. energy demand in 2050. (And it could also add $1.8 trillion in economy-boosting investment, while employing up to 390,000 workers.)

This new analysis comes from Berkeley researchers, who worked with nonprofit clean energy research firm GridLab and climate policy think tank Energy Innovation, reports the Verge: The Biden administration has committed to halving the nation's emissions by the end of the decade and has plans to source electricity completely from carbon pollution-free energy by 2035. Adding to that urgency, U.S. electricity demand is forecast to nearly triple by 2050, according to the Berkeley report. On top of a growing economy, the clean energy transition means electrifying more vehicles and homes — all of which put more stress on the power grid unless more power supply comes online at a similar pace.

To meet that demand and hit its climate goals, the report says the U.S. has to add 27 gigawatts of offshore wind and 85 GW of land-based wind and solar each year between 2035 and 2050. That timeline might still seem far away, but it's a big escalation of the Biden administration's current goal of deploying 30 GW of offshore wind by 2030. Europe, with an electricity grid about 70% the size of the U.S., already has about as much offshore wind capacity as the Biden administration hopes to build up by the end of the decade. Right now, wind energy makes up just over 10% of the U.S. electricity mix, and nearly all of that comes from land-based turbines...
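Using only the figures in the article, the implied build-out is easy to total up (treating the 2035 to 2050 window as a flat 15 years, which is my simplification):

```python
POTENTIAL_GW = 4000        # estimated U.S. offshore wind technical potential
OFFSHORE_GW_PER_YEAR = 27  # offshore additions needed each year, 2035-2050
LAND_GW_PER_YEAR = 85      # land-based wind and solar additions each year
YEARS = 2050 - 2035

offshore_total = OFFSHORE_GW_PER_YEAR * YEARS
print(f"Offshore wind built by 2050: {offshore_total} GW "
      f"({offshore_total / POTENTIAL_GW:.0%} of the 4,000 GW potential)")
print(f"Land-based wind and solar built by 2050: {LAND_GW_PER_YEAR * YEARS} GW")
# ~405 GW offshore: only about a tenth of the technical potential, yet more
# than thirteen times the current 30 GW-by-2030 federal goal.
```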

For now, the U.S. has just two small wind farms off the coasts of Rhode Island and Virginia. Construction started on the foundations for the nation's first commercial-scale wind farm off Martha's Vineyard, Massachusetts, in June... Project costs have gone up with higher interest rates and rising prices for key commodities like steel, Heatmap reports. That's led to power purchase agreements falling through for some projects in early development, including plans in Rhode Island for an 884-megawatt wind farm that alone would have added more than 20 times as much generation capacity as the U.S. has today from offshore wind. Developers are struggling to make projects profitable without passing costs on to consumers...

The study found a modest 2 to 3 percent increase in wholesale electricity costs with ambitious renewable energy deployment. But renewable energy costs have fallen so dramatically in the past that the researchers think those costs could wind up being smaller over time.

The report points out that wind energy complements solar, producing the most power right when demand peaks (in the summertime on the West Coast, and during the winter on the East Coast).
Red Hat Software

Jon 'maddog' Hall Defends Red Hat's Re-Licensing of RHEL (lpi.org)

In February of 1994 Jon "maddog" Hall interviewed a young Linus Torvalds (then just 24). Nearly three decades later — as Hall approaches his 73rd birthday — he's shared a long essay looking back, but also assessing today's controversy about Red Hat's licensing of RHEL. A (slightly condensed) excerpt: [O]ver time some customers developed a pattern of purchasing a small number of RHEL systems, then using the "bug-for-bug" compatible version of Red Hat from some other distribution. This, of course, saved the customer money, however it also reduced the amount of revenue that Red Hat received for the same amount of work. This forced Red Hat to charge more for each license they sold, or lay off Red Hat employees, or not do projects they might have otherwise funded. So recently Red Hat/IBM made a business decision to limit their customers to those who would buy a license from them for every single system that would run RHEL and only distribute their source-code and the information necessary on how to build that distribution to those customers. Therefore the people who receive those binaries would receive the sources so they could fix bugs and extend the operating system as they wished... this was, and is, the essence of the GPL.

Most, if not all, of the articles I have read have said something along the lines of "IBM/Red Hat seem to be following the GPL... but... but... but... the community!"

Which community? There are plenty of distributions for people who do not need the same level of engineering and support that IBM and Red Hat offer. Red Hat, and IBM, continue to send their changes for GPLed code "upstream" to flow down to all the other distributions. They continue to share ideas with the larger community. [...]

I now see a lot of people coming out of the woodwork and beating their breasts and saying how they are going to protect the investment of people who want to use RHEL for free [...] So far I have seen four different distributions saying that they will continue the production of "not RHEL", generating even more distributions for the average user to say "which one should I use"? If they really want to do this, why not just work together to produce one good one? Why not make their own distributions a RHEL competitor? How long will they keep beating their breasts when they find out that they can not make any money at doing it? SuSE said that they would invest ten million dollars in developing a competitor to RHEL. Fantastic! COMPETE. Create an enterprise competitor to Red Hat with the same business channels, world-wide support team, etc. etc. You will find it is not inexpensive to do that. Ten million may get you started.

My answer to all this? RHEL customers will have to decide what they want to do. I am sure that IBM and Red Hat hope that their customers will see the value of RHEL and the support that Red Hat/IBM and their channel partners provide for it. The rest of the customers who just want to buy one copy of RHEL and then run a "free" distribution on all their other systems no matter how it is created, well it seems that IBM does not want to do business with them anymore, so they will have to go to other suppliers who have enterprise capable distributions of Linux and who can tolerate that type of customer. [...]

I want to make sure people know that I do not have any hate for people and companies who set business conditions as long as they do not violate the licenses they are under. Business is business.

However I will point out that as "evil" as Red Hat and IBM have been portrayed in this business change there is no mention at all of all the companies that support Open Source "Permissive Licenses", which do not guarantee the sources to their end users, or offer only "Closed Source" Licenses... who do not allow and have never allowed clones to be made... these people and companies do not have any right to throw stones (and you know who you are).

Red Hat and IBM are making their sources available to all those who receive their binaries under contract. That is the GPL.

For all the researchers, students, hobbyists and people with little or no money, there are literally hundreds of distributions that they can choose, and many that run across other interesting architectures that RHEL does not even address.

Hall answered questions from Slashdot users in 2000 and again in 2013.

Further reading: Red Hat CEO Jim Whitehurst answering questions from Slashdot readers in 2017.

Medicine

A New Mode of Cancer Treatment

As detailed in a paper published in Cell Chemical Biology, researchers have developed a "cancer-killing pill" capable of destroying solid tumors while leaving healthy cells unaffected. The new drug has been in development for 20 years and is now undergoing pre-clinical research in the U.S. Derek Lowe, a medicinal chemist and freelance writer on science and pharmaceutical topics, writes about the new paper via Science Magazine: It's about a molecule designated AOH1996, which seems to have a unique mode of action in tumor cells, one that might make it more selective for those as compared to normal ones. The key target here is a protein called PCNA (from its old name of "proliferating cell nuclear antigen"). [...] The current molecule is a traditional direct small molecule binder that is selective for caPCNA over the regular type, which is a very attractive advantage to explore. The team behind it has been working on it for several years now to validate that mechanism, and the new paper linked first above is their report of going all the way into animal models. AOH1996 is a very unremarkable-looking molecule - to be honest, it looks like the sort of stuff that you used to see in old combinatorial chemistry libraries in the late 90s and early 2000s, a couple of aryl-rich groups strung together with amide bonds. It's certainly not going to be the most soluble stuff in the world, but they seem to have been able to formulate it. But I'm definitely not going to make fun of any chemical structure that works! [...]

The new paper shows preclinical toxicity testing in two species (mice and dogs), which is what you need to get to human trials. It seems to pass those very well, with no signs of trouble at 6x the effective dose in either species. And if you were throwing DSBs (DNA double-strand breaks) all over the place in normal tissues, believe me, you'd see tox. It is clean in an Ames test, for example. As for efficacy, in cell assays the concentration needed for 50% growth inhibition across 70 different cancer cell lines averaged around 300nM, while it showed no toxic effects on various non-cancer lines up to 10 micromolar (at least a 30x window). The affected cells show cell-cycle arrest, replication stress, apoptosis, and so on. And application of AOH1996 along with other known chemotherapy agents made the cells much more sensitive to those, presumably because they couldn't deal with those on top of the problems that AOH1996 was already causing.

It also shows growth arrest in xenograft tumors in mouse models, with a no-effect dose at least six times its effective dose, and combination therapy with a topoisomerase inhibitor showed even more significant effects. The compound has entered a Phase I trial in humans on the basis of the above data, and I very much look forward to seeing it advance to Phase II, where it will doubtless be used in combination with several existing therapies. I hope that human cancers will prove vulnerable to this new mode of attack in the clinic, and that they are not able to mutate around it with new forms of caPCNA too quickly, either. The comparison with the peptide agent mentioned above will be especially interesting, too. There's only one way to find out - good luck to everyone involved!
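The selectivity window Lowe quotes is simple arithmetic, but worth making explicit:

```python
ic50_cancer_nM = 300   # average conc. for 50% growth inhibition, 70 cancer lines
no_effect_nM = 10_000  # 10 micromolar: no toxicity seen in non-cancer lines

print(f"Selectivity window: at least {no_effect_nM / ic50_cancer_nM:.0f}x")
# -> ~33x, which is where the "at least a 30x window" figure comes from
```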
Open Source

Meta Releases AudioCraft AI Tool To Create Music From Text

Meta on Wednesday introduced its open-source AI tool called AudioCraft that will help users to create music and audio based on text prompts. Reuters reports: The AI tool is bundled with three models, AudioGen, EnCodec and MusicGen, and works for music, sound, compression and generation, Meta said. MusicGen is trained using company-owned and specifically licensed music, it added. From Meta's press release: The AudioCraft family of models are capable of producing high-quality audio with long-term consistency, and they're easy to use. With AudioCraft, we simplify the overall design of generative models for audio compared to prior work in the field -- giving people the full recipe to play with the existing models that Meta has been developing over the past several years while also empowering them to push the limits and develop their own models.

AudioCraft works for music, sound, compression, and generation -- all in the same place. Because it's easy to build on and reuse, people who want to build better sound generators, compression algorithms, or music generators can do it all in the same code base and build on top of what others have done. Having a solid open source foundation will foster innovation and complement the way we produce and listen to audio and music in the future. With even more controls, we think MusicGen can turn into a new type of instrument -- just like synthesizers when they first appeared.
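For anyone who wants to try MusicGen, the sketch below follows the usage pattern published in Meta's audiocraft repository at release; treat the model name and helper functions as assumptions to verify against the current README:

```python
# pip install audiocraft  (github.com/facebookresearch/audiocraft)
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

model = MusicGen.get_pretrained("small")   # other sizes: "medium", "melody", "large"
model.set_generation_params(duration=8)    # seconds of audio per clip
wavs = model.generate([                    # one clip per text prompt
    "upbeat synthwave with a driving bassline",
    "slow acoustic folk guitar, warm and intimate",
])
for idx, one_wav in enumerate(wavs):
    # Writes clip_0.wav, clip_1.wav with loudness normalization.
    audio_write(f"clip_{idx}", one_wav.cpu(), model.sample_rate, strategy="loudness")
```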
Security

Could NIST Delays Push Post-Quantum Security Products Into the Next Decade? (esecurityplanet.com)

Slashdot reader storagedude writes: A quantum computer capable of breaking public-key encryption is likely years away. Unfortunately, so are products that support post-quantum cryptography.

That's the conclusion of an eSecurity Planet article by Henry Newman. With the second round of NIST's post-quantum algorithm evaluations — announced last week — expected to take "several years" and the FIPS product validation process backed up, Newman notes that it will be some time before products based on post-quantum standards become available.

"The delay in developing quantum-resistant algorithms is especially troubling given the time it will take to get those products to market," Newman writes. "It generally takes four to six years with a new standard for a vendor to develop an ASIC to implement the standard, and it then takes time for the vendor to get the product validated, which seems to be taking a troubling amount of time.

"I am not sure that NIST is up to the dual challenge of getting the algorithms out and products validated so that vendors can have products that are available before quantum computers can break current technology. There is a race between quantum technology and NIST vetting algorithms, and at the moment the outcome is looking worrisome."

And as encrypted data stolen now can be decrypted later, the potential for "harvest now, decrypt later" attacks "is a quantum computing security problem that's already here."
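That timing risk is often reduced to a simple inequality (sometimes called Mosca's rule): if the years your data must stay secret plus the years migration will take exceed the years until a cryptographically relevant quantum computer exists, you are already exposed. A sketch with purely illustrative numbers:

```python
def mosca_at_risk(shelf_life_yrs, migration_yrs, years_to_quantum):
    """True if data encrypted today could still need secrecy
    after a quantum computer can decrypt it."""
    return shelf_life_yrs + migration_yrs > years_to_quantum

# Illustrative inputs only: a 10-year data shelf life, plus a migration
# taking ~6 years once validated products exist (in line with the 4-6 year
# ASIC-plus-validation timeline the article cites).
print(mosca_at_risk(shelf_life_yrs=10, migration_yrs=6, years_to_quantum=15))
# -> True: even a 15-year quantum horizon leaves harvested data exposed.
```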

Space

How Astronomers Discovered an Unusual Object Pulsing Radio Waves in Space for Decades (cnn.com)

In 2018 a doctoral student spotted "a spinning celestial space object," reports CNN. "The unfamiliar object released giant bursts of energy and beamed out radiation three times per hour."

But that was just the beginning... In those moments, it became the brightest source of radio waves viewable from Earth through radio telescopes, acting like a celestial lighthouse. Researchers thought the phenomenon might be a remnant of a collapsed star — either a dense neutron star or a dead white dwarf star — with a strong magnetic field. Or perhaps the object was something else entirely... "We were stumped," said Dr. Natasha Hurley-Walker, senior lecturer at the Curtin University node of ICRAR, in a statement. "So we started searching for similar objects to find out if it was an isolated event or just the tip of the iceberg." The team observed the sky using the Murchison Widefield Array, a radio telescope on Wajarri Yamaji Country in outback Western Australia, between July and September 2022. The scientists discovered an object 15,000 light-years from Earth in the Scutum constellation. The object, dubbed GPM J1839-10, released radio waves every 22 minutes. The bursts of energy lasted up to five minutes.

Astronomers believe it could be a magnetar, or a rare type of star with extremely strong magnetic fields that is capable of releasing powerful, energetic bursts. But if the object is a magnetar, it defies description because all known magnetars release energy in a matter of seconds, or a few minutes at the most. A study detailing the discovery was published Wednesday in the journal Nature. "This remarkable object challenges our understanding of neutron stars and magnetars, which are some of the most exotic and extreme objects in the Universe," said Hurley-Walker, who was the lead author of the new report...

"Assuming it's a magnetar, it shouldn't be possible for this object to produce radio waves. But we're seeing them. And we're not just talking about a little blip of radio emission. Every 22 minutes, it emits a five-minute pulse of radio wavelength energy, and it's been doing that for at least 33 years. Whatever mechanism is behind this is extraordinary."

The astronomers "searched through the archives of radio telescopes that have been operational for decades," the article points out — and ultimately confirmed the existence of the phenomenon. "It showed up in observations by the Giant Metrewave Radio Telescope in India, and the Very Large Array in the USA had observations dating as far back as 1988," Hurley-Walker said.

"That was quite an incredible moment for me. I was five years old when our telescopes first recorded pulses from this object, but no one noticed it, and it stayed hidden in the data for 33 years. They missed it because they hadn't expected to find anything like it."
Supercomputing

Cerebras To Enable 'Condor Galaxy' Network of AI Supercomputers

Cerebras Systems and G42 have introduced the Condor Galaxy project, a network of nine interconnected supercomputers designed for AI model training with a combined performance of 36 FP16 ExaFLOPs. The first supercomputer, CG-1, located in California, offers 4 ExaFLOPs of FP16 performance and 54 million cores, focusing on Large Language Models and Generative AI without the need for complex distributed programming languages. AnandTech reports: CG-2 and CG-3 will be located in the U.S. and will follow in 2024. The remaining systems will be located across the globe and the total cost of the project will be over $900 million. The CG-1 supercomputer, situated in Santa Clara, California, combines 64 Cerebras CS-2 systems into a single user-friendly AI supercomputer, capable of providing 4 ExaFLOPs of dense, systolic FP16 compute for AI training. Based around Cerebras's 2.6 trillion transistor second-generation wafer scale engine processors, the machine is designed specifically for Large Language Models and Generative AI. It supports up to 600 billion parameter models, with configurations that can be expanded to support up to 100 trillion parameter models. Its 54 million AI-optimized compute cores and massive fabric network bandwidth of 388 Tb/s allow for nearly linear performance scaling from 1 to 64 CS-2 systems, according to Cerebras. The CG-1 supercomputer also offers inherent support for long sequence length training (up to 50,000 tokens) and does not require any complex distributed programming languages, which are common with GPU clusters.

This supercomputer is provided as a cloud service by Cerebras and G42 and since it is located in the U.S., Cerebras and G42 assert that it will not be used by hostile states. CG-1 is the first of three 4 FP16 ExaFLOP AI supercomputers (CG-1, CG-2, and CG-3) created by Cerebras and G42 in collaboration and located in the U.S. Once connected, these three AI supercomputers will form a 12 FP16 ExaFLOP, 162 million core distributed AI supercomputer, though it remains to be seen how efficient this network will be. In 2024, G42 and Cerebras plan to launch six additional Condor Galaxy supercomputers across the world, which will increase the total compute power to 36 FP16 ExaFLOPs delivered by 576 CS-2 systems. The Condor Galaxy project aims to democratize AI by offering sophisticated AI compute technology in the cloud.
"Delivering 4 exaFLOPs of AI compute at FP16, CG-1 dramatically reduces AI training timelines while eliminating the pain of distributed compute," said Andrew Feldman, CEO of Cerebras Systems. "Many cloud companies have announced massive GPU clusters that cost billions of dollars to build, but that are extremely difficult to use. Distributing a single model over thousands of tiny GPUs takes months of time from dozens of people with rare expertise. CG-1 eliminates this challenge. Setting up a generative AI model takes minutes, not months and can be done by a single person. CG-1 is the first of three 4 ExaFLOP AI supercomputers to be deployed across the U.S. Over the next year, together with G42, we plan to expand this deployment and stand up a staggering 36 exaFLOPs of efficient, purpose-built AI compute."
Supercomputing

Tesla Starts Production of Dojo Supercomputer To Train Driverless Cars (theverge.com)

An anonymous reader quotes a report from The Verge: Tesla says it has started production of its Dojo supercomputer to train its fleet of autonomous vehicles. In its second quarter earnings report for 2023, the company outlined "four main technology pillars" needed to "solve vehicle autonomy at scale: extremely large real-world dataset, neural net training, vehicle hardware and vehicle software." "We are developing each of these pillars in-house," the company said in its report. "This month, we are taking a step towards faster and cheaper neural net training with the start of production of our Dojo training computer."

The automaker already has a large Nvidia GPU-based supercomputer that is one of the most powerful in the world, but the new Dojo custom-built computer is using chips designed by Tesla. In 2019, Tesla CEO Elon Musk gave this "super powerful training computer" a name: Dojo. Previously, Musk has claimed that Dojo will be capable of an exaflop, or 1 quintillion (10^18) floating-point operations per second. That is an incredible amount of power. "To match what a one exaFLOP computer system can do in just one second, you'd have to perform one calculation every second for 31,688,765,000 years," Network World wrote.
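Network World's figure checks out if you divide a quintillion operations by the number of seconds in a (tropical) year:

```python
OPS_PER_SECOND = 10**18                  # one exaFLOP, sustained
TROPICAL_YEAR_S = 365.2422 * 24 * 3600   # ~31,556,926 seconds

print(f"{OPS_PER_SECOND / TROPICAL_YEAR_S:,.0f} years")
# -> ~31.7 billion years, matching the quoted figure to within rounding
```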

Transportation

Green Energy Tycoon To Launch UK's First Electric Airline (theguardian.com)

Dale Vince, the green energy tycoon and founder of Ecotricity, is planning to launch Britain's first electric airline, called Ecojet. The Guardian reports: Ecojet, styled as a "flag carrier for green Britain," will launch early next year with a 19-seater plane flying a route between Edinburgh and Southampton. The planes will run on kerosene-based fuel for the first year, before being retrofitted with engines that convert green hydrogen into electricity. The airline will launch with several green-striped 19-seater planes capable of traveling 300 miles. Vince hopes to expand the number of routes to cover all of Britain's big cities. Staff will wear environmentally friendly uniforms and serve plant-based meals.

A second phase, 18 months later, will result in 70-seater planes capable of flying to Europe being introduced. The company is in the process of applying for a license from the Civil Aviation Authority and securing takeoff and landing slots at airports. However, the process of launching an airline is regarded as slow, and Ecojet will not launch as an electric plane operator, starting by using kerosene-based fuel instead. [...] Vince said Ecojet would "price match" existing airlines on air fares and was intended to attract a mass market, beyond environment-conscious consumers. He said he would invest one million pounds initially but plans to raise further funds next year.
