Businesses

French Antitrust Regulators Preparing Nvidia Charges (reuters.com) 28

French antitrust regulators are preparing to charge Nvidia for allegedly anti-competitive practices, Reuters reported Monday, citing sources. From the report: The French so-called statement of objections or charge sheet would follow dawn raids in the graphics cards sector in September last year which sources said targeted Nvidia. The world's largest maker of chips used both for artificial intelligence and for computer graphics has seen demand for its chips jump following the release of the generative AI application ChatGPT, triggering regulatory scrutiny on both sides of the Atlantic.
AI

Is AI's Demand for Energy Really 'Insatiable'? (arstechnica.com) 56

Bloomberg and The Washington Post "claim AI power usage is dire," writes Slashdot reader NoWayNoShapeNoForm. But Ars Technica "begs to disagree with those speculations."

From Ars Technica's article: The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI's "insatiable" demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous "some [people]" in reporting that "some worry whether there will be enough electricity to meet [the power demands] from any source." Digging into the best available numbers and projections, though, it's hard to see AI's current and near-future environmental impact in such a dire light... While the headline focus of both Bloomberg and The Washington Post's recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet "data centers" as a whole...

Bloomberg asks one source directly "why data centers were suddenly sucking up so much power" and gets back a blunt answer: "It's AI... It's 10 to 15 times the amount of electricity." Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like Dall-E and ChatGPT largely set off the industry's current mania for generative AI. If you squint at Bloomberg's graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI.

Ars Technica first cites Dutch researcher Alex de Vries's estimate that in a few years the AI sector could use between 85 and 134 TWh of electricity per year. But another study estimated in 2018 that PC gaming already accounted for 75 TWh of electricity use per year, while "the IEA estimates crypto mining ate up 110 TWh of electricity in 2022." More to the point, de Vries's AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA's recent report. The vast majority of all that data center power will still be going to more mundane Internet infrastructure that we all take for granted (and which is not nearly as sexy of a headline bogeyman as "AI").
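For context, the figures quoted above can be combined into a rough lower and upper bound on AI's slice of projected data center power. A back-of-envelope sketch using only the numbers cited in the article:

```python
# Back-of-envelope bounds on AI's share of projected data center power,
# using only the figures quoted above (all in TWh per year).
ai_low, ai_high = 85, 134      # de Vries's estimate for the AI sector
dc_low, dc_high = 620, 1050    # IEA projection for all data centers by 2026

# Smallest plausible share: low AI estimate against high data center total.
min_share = ai_low / dc_high   # ~0.08
# Largest plausible share: high AI estimate against low data center total.
max_share = ai_high / dc_low   # ~0.22

print(f"AI's share of projected data center power: {min_share:.0%} to {max_share:.0%}")
```

Even taking the extremes of both ranges, AI accounts for very roughly a tenth to a fifth of projected data center usage, which is the article's core point.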
The future is also hard to predict, the article concludes. "If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs..."
Unix

X Window System Turns 40 52

Ancient Slashdot reader ewhac writes: On June 19, 1984, Robert Scheifler announced on MIT's Project Athena mailing list a new graphical windowing system he'd put together. Having cribbed a fair bit of code from the existing windowing toolkit called W, Scheifler named his new system X, thus giving birth to the X Window System. Scheifler prophetically wrote at the time, "The code seems fairly solid at this point, although there are still some deficiencies to be fixed up."

The 1980s and 1990s saw tremendous activity in the development of graphical displays and user interfaces, and X was right in the middle of it all, alongside Apple, Sun, Xerox, Apollo, Silicon Graphics, NeXT, and many others. Despite the fierce, well-funded competition, and heated arguments about how many buttons a mouse should have, X managed to survive, due in large part to its Open Source licensing and its flexible design, allowing it to continue to work well even as graphical hardware rapidly advanced. As such, it was ported to dozens of platforms over the years (including a port to the Amiga computer by Dale Luck in the late 1980s). 40 years later, despite its warts, inconsistencies, age, and Wayland promising for the last ten years to be coming Real Soon Now, X remains the windowing system for UNIX-like platforms.
KDE

KDE Plasma 6.1 Released (kde.org) 42

"The KDE community announced the latest release of their popular desktop environment: Plasma 6.1," writes longtime Slashdot reader jrepin. From the announcement: While Plasma 6.0 was all about getting the migration to the underlying Qt 6 frameworks correct, Plasma 6.1 is where developers start implementing the features that will take your desktop to a new level. In this release, you will find features that go far beyond subtle changes to themes and tweaks to animations (although there are plenty of those too). Among the new features in this release you will find improved remote desktop support with a new built-in server, an overhauled and streamlined desktop edit mode, restoration of open applications from the previous session on Wayland, synchronization of keyboard LED colors with the desktop accent color, an option to make the mouse cursor bigger and easier to find by shaking it, edge barriers (a sticky area for the mouse cursor near the edges between screens), explicit sync support that eliminates flickering and glitches for NVIDIA graphics card users on Wayland, and triple buffering support for smoother animations and screen rendering. The changelog for Plasma 6.1 is available here.
XBox (Games)

Upcoming Games Include More Xbox Sequels - and a Medieval 'Doom' (polygon.com) 32

Announced during Microsoft's Xbox Games Showcase, Doom: The Dark Ages is id Software's next foray back into hell. [Also available for PS5 and PC.] Doom: The Dark Ages is a medieval spin on the Doom franchise, taking the Doom Slayer back to the beginning. It's coming to Xbox Game Pass on day one, sometime in 2025.

Microsoft's first trailer for Doom: The Dark Ages shows the frenetic, precision gameplay we've come to expect from the franchise — there's a lot of blasting and shooting and a chainsaw. Oh, and the Doom Slayer can ride a dragon?

"Before he became a hero he was the super weapon of gods and kings," says the trailer (which showcases the game's crazy-good graphics...). The 2020 game Doom Eternal sold 3 million copies in its first month, according to Polygon, with its game director telling the site in 2021 that "our hero is somewhat timeless — I mean, literally, he's immortal. So we could tell all kinds of stories..."

Other upcoming Xbox games were revealed too. Engadget is excited about the reboot of the first-person shooter Perfect Dark (first released in 2000, but now set in the near future). There's also Gears of War: E-Day, Indiana Jones and the Great Circle, State of Decay 3, and Assassin's Creed Shadows, according to Xbox.com — plus "the announcement of three new Xbox Series X|S console options." [Engadget notes it's the first time Microsoft has offered a cheaper all-digital Xbox Series X with no disc drive.] "And on top of all that, we also brought the gameplay reveal of a brand-new Call of Duty game with Call of Duty: Black Ops 6."

Meanwhile, Friday's Summer Game Fest 2024 featured Star Wars Outlaws footage (which according to GamesRadar takes place between Empire Strikes Back and Return of the Jedi, featuring not just card games with Lando Calrissian but also Jabba the Hutt and a frozen Han Solo.) Engadget covered all the announcements from Game Fest, including the upcoming game Mixtape, which Engadget calls a "reality-bending adventure" with "a killer '80s soundtrack" about three cusp-of-adulthood teenagers who "Skate. Party. Avoid the law. Make out. Sneak out. Hang out..." for Xbox/PS5/PC.
Graphics

Nvidia Takes 88% of the GPU Market Share (xda-developers.com) 83

As reported by Jon Peddie Research, Nvidia now holds 88% of the GPU market after its market share jumped 8% in its most recent quarter. "This jump shaves 7% off of AMD's share, putting it down to 12% total," reports XDA Developers. "And if you're wondering where that extra 1% went, it came from all of Intel's market share, squashing it down to 0%." From the report: Dr. Jon Peddie, president of Jon Peddie Research, mentions how the GPU market hasn't really looked "normal" since the 2007 recession. Ever since then, everything from the crypto boom to COVID has messed with the usual patterns. Usually, the first quarter of a year shows a bit of a dip in GPU sales, but because of AI's influence, it may seem like that previous norm may be forever gone: "Therefore, one would expect Q2'24, a traditional quarter, to also be down. But, all the vendors are predicting a growth quarter, mostly driven by AI training systems in hyperscalers. Whereas AI trainers use a GPU, the demand for them can steal parts from the gaming segment. So, for Q2, we expect to see a flat to low gaming AIB result and another increase in AI trainer GPU shipments. The new normality is no normality."
Ubuntu

Canonical Launches Ubuntu Core 24 (ubuntu.com) 5

Canonical, the company behind Ubuntu, has released Ubuntu Core 24, a version of its operating system designed for edge devices and the Internet of Things (IoT). The new release comes with a 12-year Long Term Support commitment and features that enable secure, reliable, and efficient deployment of intelligent devices.

Ubuntu Core 24 introduces validation sets for custom image creation, offline remodelling for air-gapped environments, and new integrations for GPU operations and graphics support. It also offers device management integrations with Landscape and Microsoft Azure IoT Edge. The release is expected to benefit various industries, including automation, healthcare, and robotics, Canonical said.
AMD

AMD Unveils Ryzen AI and 9000 Series Processors, Plus Radeon PRO W7900 Dual Slot (betanews.com) 41

The highlight of AMD's presentation Sunday at Computex 2024 was "the introduction of AMD's Ryzen AI 300 Series processors for laptops and the Ryzen 9000 Series for desktops," writes Slashdot reader BrianFagioli (sharing his report at Beta News): AMD's Ryzen AI 300 Series processors, designed for next-generation AI laptops, come with AMD's latest XDNA 2 architecture. This includes a Neural Processing Unit (NPU) that delivers 50 TOPS of AI processing power, significantly enhancing the AI capabilities of laptops. Among the processors announced were the Ryzen AI 9 HX 370, which features 12 cores and 24 threads with a boost frequency of 5.1 GHz, and the Ryzen AI 9 365 with 10 cores and 20 threads, boosting up to 5.0 GHz...

In the desktop segment, the Ryzen 9000 Series processors, based on the "Zen 5" architecture, demonstrated an average 16% improvement in IPC performance over their predecessors built on the "Zen 4" architecture. The Ryzen 9 9950X stands out with 16 cores and 32 threads, reaching up to 5.7 GHz boost frequency and equipped with 80MB of cache... AMD also reaffirmed its commitment to the AM4 platform by introducing the Ryzen 9 5900XT and Ryzen 7 5800XT processors. These models are compatible with existing AM4 motherboards, providing an economical upgrade path for users.

The article adds that AMD also unveiled its Radeon PRO W7900 Dual Slot workstation graphics card — priced at $3,499 — "further broadening its impact on high-performance computing...

"AMD also emphasized its strategic partnerships with leading OEMs such as Acer, ASUS, HP, Lenovo, and MSI, who are set to launch systems powered by these new AMD processors." And there's also a software collaboration with Microsoft, reportedly "to enhance the capabilities of AI PCs, thus underscoring AMD's holistic approach to integrating AI into everyday computing."
Hardware

Arm Says Its Next-Gen Mobile GPU Will Be Its Most 'Performant and Efficient' (theverge.com) 29

IP core designer Arm announced its next-generation CPU and GPU designs for flagship smartphones: the Cortex-X925 CPU and Immortalis G925 GPU. Both are direct successors to the Cortex-X4 and Immortalis G720 that currently power MediaTek's Dimensity 9300 chip inside flagship smartphones like the Vivo X100 and X100 Pro and Oppo Find X7. From a report: Arm changed the naming convention for its Cortex-X CPU design to highlight what it says is a much faster CPU design. It claims the X925's single-core performance is 36 percent faster than the X4 (when measured in Geekbench). Arm also says it increased AI workload performance (time to token) by 41 percent with up to 3MB of private L2 cache. The Cortex-X925 brings a new generation of Cortex-A microarchitectures ("little" cores) with it, too: the Cortex-A725, which Arm says has 35 percent better performance efficiency than last-gen's A720, and a 15 percent more power-efficient Cortex-A520.

Arm's new Immortalis G925 GPU is its "most performant and efficient GPU" to date, it says. It's 37 percent faster in graphics applications than the last-gen G720, with ray-tracing performance on intricate objects improved by 52 percent and AI and ML workloads improved by 34 percent -- all while using 30 percent less power. For the first time, Arm will offer "optimized layouts" of its new CPU and GPU designs that it says will be easier for device makers to "drop" or implement into their own system-on-chip (SoC) layouts. Arm says this new physical implementation solution will help other companies get their devices to market faster, which, if true, means we could see more devices with the Arm Cortex-X925 and/or Immortalis G925 than the few that shipped with its last-gen designs.

Nintendo

Ubuntu 24.04 Now Runs on the Nintendo Switch (Unofficially) (omgubuntu.co.uk) 6

"The fact it's possible at all is a credit to the ingenuity of the open-source community," writes the blog OMG Ubuntu: Switchroot is an open-source project that allows Android and Linux-based distros like Ubuntu to run on the Nintendo Switch — absolutely not something Nintendo approves of, much less supports, endorses, or encourages, etc! I covered the loophole that made this possible back in 2018. Back then the NVIDIA Tegra X1-powered Nintendo Switch was still new and Linux support for much of the console's internal hardware was in a formative state (a polite way to say 'not everything worked'). But as the popularity of Nintendo's handheld console ballooned (to understate it), so the 'alternative OS' Switch scene grew, and before long Linux support for Switch hardware was in full bloom...

A number of Linux for Switchroot (L4S) distributions have since been released, designated as Linux for Tegra (L4T) builds. As these can boot from a microSD card it's even possible to dual-boot the Switch OS with Linux, which is neat! Recently, a fresh set of L4T Ubuntu images were released based on the newest Ubuntu 24.04 LTS release. These builds work on all Switch versions, from the OG (exploit-friendly) unit through to newer, patched models (where a modchip is required)...

I'm told all of the Nintendo Switch's internal hardware now works under Linux, including Wi-Fi, Bluetooth, sleep mode, accelerated graphics, the official dock... Everything, basically. And despite being a 7-year-old ARM device, the performance is said to remain decent.

"Upstream snafus have delayed the release of builds with GNOME Shell..."
Businesses

Nvidia Reports a 262% Jump In Sales, 10-1 Stock Split (cnbc.com) 11

Nvidia reported fiscal first-quarter earnings surpassing expectations with strong forecasts, indicating sustained demand for its AI chips. Following the news, the company's stock rose over 6% in extended trading. Nvidia also said it was splitting its stock 10 to 1. CNBC reports: Nvidia said it expected sales of $28 billion in the current quarter. Wall Street was expecting earnings per share of $5.95 on sales of $26.61 billion, according to LSEG. Nvidia reported net income for the quarter of $14.88 billion, or $5.98 per share, compared with $2.04 billion, or 82 cents, in the year-ago period. [...] Nvidia said its data center category rose 427% from the year-ago quarter to $22.6 billion in revenue. Nvidia CFO Colette Kress said in a statement that it was due to shipments of the company's "Hopper" graphics processors, which include the company's H100 GPU.

Nvidia also highlighted strong sales of its networking parts, which are increasingly important as companies build clusters of tens of thousands of chips that need to be connected. Nvidia said that it had $3.2 billion in networking revenue, primarily its InfiniBand products, which was over three times higher than last year's sales. Nvidia, before it became the top supplier to big companies building AI, was known primarily as a company making hardware for 3D gaming. The company's gaming revenue was up 18% during the quarter to $2.65 billion, which Nvidia attributed to strong demand.

The company also sells chips for cars and chips for advanced graphics workstations, which remain much smaller than its data center business. The company reported $427 million in professional visualization sales, and $329 million in automotive sales. Nvidia said it bought back $7.7 billion worth of its shares and paid $98 million in dividends during the quarter. Nvidia also said that it's increasing its quarterly cash dividend from 4 cents per share to 10 cents on a pre-split basis. After the split, the dividend will be a penny a share.
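The dividend arithmetic above checks out. A minimal sanity check using the figures reported in the story:

```python
# Sanity check on Nvidia's split-adjusted dividend, per the figures above.
pre_split_dividend = 0.10   # $0.10 per share per quarter, pre-split
split_ratio = 10            # 10-for-1 split: each old share becomes ten

# Each new share is entitled to one tenth of the old per-share payout.
post_split_dividend = pre_split_dividend / split_ratio

print(f"${post_split_dividend:.2f} per share")  # $0.01 -- i.e. "a penny a share"
```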

Graphics

Microsoft Paint Is Getting an AI-Powered Image Generator (engadget.com) 41

Microsoft Paint is getting a new image generator tool called Cocreator that can generate images based on text prompts and doodles. Engadget reports: During a demo at its Surface event, the company showed off how Cocreator combines your own drawings with text prompts to create an image. There's also a "creativity slider" that allows you to control how much you want AI to take over compared with your original art. As Microsoft pointed out, the combination of text prompts and your own brush strokes enables faster edits. It could also help provide a more precise rendering than what you'd be able to achieve with DALL-E or another text-to-image generator alone.
Ubuntu

Ubuntu 24.10 to Default to Wayland for NVIDIA Users (omgubuntu.co.uk) 76

An anonymous reader shared this report from the blog OMG Ubuntu: Ubuntu first switched to using Wayland as its default display server in 2017 before reverting the following year. It tried again in 2021 and has stuck with it since. But while Wayland is what most of us now log into after installing Ubuntu, anyone doing so on a PC or laptop with an NVIDIA graphics card present instead logs into an Xorg/X11 session.

This is because NVIDIA's proprietary graphics drivers (which many, especially gamers, opt for to get the best performance, access to full hardware capabilities, etc) have not supported Wayland as well as they could've. Past tense as, thankfully, things have changed in the past few years. NVIDIA's warmed up to Wayland (partly because it has no choice, given that Wayland is now the standard rather than a 'maybe one day' solution, and partly because it wants to: opportunities/benefits/security).

With the NVIDIA + Wayland situation now in a better state than before — but not perfect — Canonical's engineers say they feel confident enough in the experience to make the Ubuntu Wayland session the default for NVIDIA graphics card users in Ubuntu 24.10.

Supercomputing

Defense Think Tank MITRE To Build AI Supercomputer With Nvidia (washingtonpost.com) 44

An anonymous reader quotes a report from the Washington Post: A key supplier to the Pentagon and U.S. intelligence agencies is building a $20 million supercomputer with buzzy chipmaker Nvidia to speed deployment of artificial intelligence capabilities across the U.S. federal government, the MITRE think tank said Tuesday. MITRE, a federally funded, not-for-profit research organization that has supplied U.S. soldiers and spies with exotic technical products since the 1950s, says the project could improve everything from Medicare to taxes. "There's huge opportunities for AI to make government more efficient," said Charles Clancy, senior vice president of MITRE. "Government is inefficient, it's bureaucratic, it takes forever to get stuff done. ... That's the grand vision, is how do we do everything from making Medicare sustainable to filing your taxes easier?" [...] The MITRE supercomputer will be based in Ashburn, Va., and should be up and running late this year. [...]

Clancy said the planned supercomputer will run 256 Nvidia graphics processing units, or GPUs, at a cost of $20 million. This counts as a small supercomputer: The world's fastest supercomputer, Frontier in Tennessee, boasts 37,888 GPUs, and Meta is seeking to build one with 350,000 GPUs. But MITRE's computer will still eclipse Stanford's Natural Language Processing Group's 68 GPUs, and will be large enough to train large language models to perform AI tasks tailored for government agencies. Clancy said all federal agencies funding MITRE will be able to use this AI "sandbox." "AI is the tool that is solving a wide range of problems," Clancy said. "The U.S. military needs to figure out how to do command and control. We need to understand how cryptocurrency markets impact the traditional banking sector. ... Those are the sorts of problems we want to solve."

Hardware

Apple Announces M4 With More CPU Cores and AI Focus (arstechnica.com) 66

An anonymous reader quotes a report from Ars Technica: In a major shake-up of its chip roadmap, Apple has announced a new M4 processor for today's iPad Pro refresh, barely six months after releasing the first MacBook Pros with the M3 and not even two months after updating the MacBook Air with the M3. Apple says the M4 includes "up to" four high-performance CPU cores, six high-efficiency cores, and a 10-core GPU. Apple's high-level performance estimates say that the M4 has 50 percent faster CPU performance and four times as much graphics performance. Like the GPU in the M3, the M4 also supports hardware-accelerated ray-tracing to enable more advanced lighting effects in games and other apps. Due partly to its "second-generation" 3 nm manufacturing process, Apple says the M4 can match the performance of the M2 while using just half the power.

As with so much else in the tech industry right now, the M4 also has an AI focus; Apple says it's beefing up the 16-core Neural Engine (Apple's equivalent of the Neural Processing Unit that companies like Qualcomm, Intel, AMD, and Microsoft have been pushing lately). Apple says the M4 runs up to 38 trillion operations per second (TOPS), considerably ahead of Intel's Meteor Lake platform, though a bit short of the 45 TOPS that Qualcomm is promising with the Snapdragon X Elite and Plus series. The M3's Neural Engine is only capable of 18 TOPS, so that's a major step up for Apple's hardware. Apple's chips since 2017 have included some version of the Neural Engine, though to date, those have mostly been used to enhance and categorize photos, perform optical character recognition, enable offline dictation, and do other oddities. But it may be that Apple needs something faster for the kinds of on-device large language model-backed generative AI that it's expected to introduce in iOS and iPadOS 18 at WWDC next month.
A separate report from the Wall Street Journal says Apple is developing a custom chip to run AI software in datacenters. "Apple's server chip will likely be focused on running AI models, also known as inference, rather than in training AI models, where Nvidia is dominant," reports Reuters.

Further reading: Apple Quietly Kills the Old-school iPad and Its Headphone Jack
Games

Veteran PC Game 'Sopwith' Celebrates 40th Anniversary (github.io) 42

Longtime Slashdot reader sfraggle writes: Biplane shoot-'em-up Sopwith is celebrating 40 years today since its first release back in 1984. The game is one of the oldest PC games still in active development today, originating as an MS-DOS game for the original IBM PC. The 40th anniversary site has a detailed history of how the game was written as a tech demo for the now-defunct Imaginet networking system. There is also a video interview with its original authors. "The game involves piloting a Sopwith biplane, attempting to bomb enemy buildings while avoiding fire from enemy planes and various other obstacles," reads the Wiki page. "Sopwith uses four-color CGA graphics and music and sound effects use the PC speaker. A sequel with the same name, but often referred to as Sopwith 2, was released in 1985."

You can play Sopwith in your browser here.
Ubuntu

Ubuntu 24.04 Yields a 20% Performance Advantage Over Windows 11 On Ryzen 7 Framework Laptop (phoronix.com) 63

Michael Larabel reports via Phoronix: With the Framework 16 laptop, one of the performance pieces I've been meaning to carry out has been seeing how Linux performs against Microsoft Windows 11 for this AMD Ryzen 7 7840HS powered modular/upgradeable laptop. Recently getting around to it in my benchmarking queue, I also compared the performance of Ubuntu 23.10 to the near-final Ubuntu 24.04 LTS on this laptop, up against a fully-updated Microsoft Windows 11 installation. The Framework 16 review unit, as a reminder, was configured with the 8-core / 16-thread AMD Ryzen 7 7840HS Zen 4 SoC with Radeon RX 7700S graphics, a 512GB SN810 NVMe SSD, MediaTek MT7922 WiFi, and a 2560 x 1600 display.

In the few months of testing the Framework 16, predominantly under Linux, it's been working out very well. Since the laptop also shipped from Framework with a Windows 11 partition, updating that install made for an interesting comparison against the Ubuntu 23.10 and Ubuntu 24.04 performance. The same Framework 16 AMD laptop was used throughout all of the testing for looking at the out-of-the-box performance across Microsoft Windows 11, Ubuntu 23.10, and the near-final state of Ubuntu 24.04. [...]

Out of 101 benchmarks carried out on all three operating systems with the Framework 16 laptop, Ubuntu 24.04 was the fastest in 67% of those tests, the prior Ubuntu 23.10 led in 22% (typically with slim margins to 24.04), and then Microsoft Windows 11 was the front-runner just 10% of the time... If taking the geomean of all 101 benchmark results, Ubuntu 23.10 was 16% faster than Microsoft Windows 11 while Ubuntu 24.04 enhanced the Ubuntu Linux performance by 3% to yield a 20% advantage over Windows 11 on this AMD Ryzen 7 7840HS laptop. Ubuntu 24.04 is looking very good in the performance department and will see its stable release next week.
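The "geomean" mentioned above is the geometric mean, the standard way to summarize many benchmark ratios, since it treats a 2x speedup and a 2x slowdown symmetrically. A minimal sketch with hypothetical per-benchmark ratios (illustrative numbers only, not Phoronix's actual data):

```python
import math

def geomean(values):
    # nth root of the product, computed in log space for numeric stability
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-benchmark results, each normalized so Windows 11 = 1.0
# (higher is better; these are illustrative, not the actual Phoronix data).
ubuntu_vs_windows = [1.25, 1.10, 1.32, 0.98, 1.40]

advantage = geomean(ubuntu_vs_windows)
print(f"Overall advantage: {advantage - 1:.0%}")  # -> Overall advantage: 20%
```

Note that one slower-than-Windows result (0.98) doesn't cancel the overall lead; the geometric mean simply weighs each benchmark's ratio equally, which is why Phoronix can report a single headline percentage across 101 heterogeneous tests.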

PlayStation (Games)

Sony's PS5 Pro is Real and Developers Are Getting Ready For It (theverge.com) 25

Sony is getting ready to release a more powerful PS5 console, possibly by the end of this year. After reports of leaked PS5 Pro specifications surfaced recently, The Verge has obtained a full list of specs for the upcoming console. From the report: Sources familiar with Sony's plans tell me that developers are already being asked to ensure their games are compatible with this upcoming console, with a focus on improving ray tracing. Codenamed Trinity, the PlayStation 5 Pro model will include a more powerful GPU and a slightly faster CPU mode. All of Sony's changes point to a PS5 Pro that will be far more capable of rendering games with ray tracing enabled or hitting higher resolutions and frame rates in certain titles. Sony appears to be encouraging developers to use graphics features like ray tracing more with the PS5 Pro, with games able to use a "Trinity Enhanced" (PS5 Pro Enhanced) label if they "provide significant enhancements."

Sony expects GPU rendering on the PS5 Pro to be "about 45 percent faster than standard PlayStation 5," according to documents outlining the upcoming console. The PS5 Pro GPU will be larger and use faster system memory to help improve ray tracing in games. Sony is also using a "more powerful ray tracing architecture" in the PS5 Pro, where the speed here is up to three times better than the regular PS5. "Trinity is a high-end version of PlayStation 5," reads one document, with Sony indicating it will continue to sell the standard PS5 after this new model launches. Sony is expecting game developers to have a single package that will support both the PS5 and PS5 Pro consoles, with existing games able to be patched for higher performance.

Google

With Vids, Google Thinks It Has the Next Big Productivity Tool For Work (theverge.com) 56

For decades, work has revolved around documents, spreadsheets, and slide decks. Word, Excel, PowerPoint; Pages, Numbers, Keynote; Docs, Sheets, Slides. Now Google is proposing to add another to that triumvirate: an app called Vids that aims to help companies and consumers make collaborative, shareable video more easily than ever. From a report: Google Vids is very much not an app for making beautiful movies... or even not-that-beautiful movies. It's meant more for the sorts of things people do at work: make a pitch, update the team, explain a complicated concept. The main goal is to make everything as easy as possible, says Kristina Behr, Google's VP of product management for the Workspace collaboration apps. "The ethos that we have is, if you can make a slide, you can make a video in Vids," she says. "No video production is required."

Based on what I've seen of Vids so far, it appears to be roughly what you'd get if you transformed Google Slides into a video app. You collect assets from Drive and elsewhere and assemble them in order -- but unlike the column of slides in the Slides sidebar, you're putting together a left-to-right timeline for a video. Then, you can add voiceover or film yourself and edit it all into a finished video. A lot of those finished videos, I suspect, will look like recorded PowerPoint presentations or Meet calls or those now-ubiquitous training videos where a person talks to you from a small circle in the bottom corner while graphics play on the screen. There will be lots of clip art-heavy product promos, I'm sure. But in theory, you can make almost anything in Vids. You can either do all this by yourself or prompt Google's Gemini AI to make a first draft of the video for you. Gemini can build a storyboard; it can write a script; it can read your script aloud with text-to-speech; it can create images for you to use in the video. The app has a library of stock video and audio that users can add to their own Vids, too.

AMD

AMD To Open Source Micro Engine Scheduler Firmware For Radeon GPUs 23

AMD plans to document and open source its Micro Engine Scheduler (MES) firmware for GPUs, giving users more control over Radeon graphics cards. From a report: It's part of a larger effort AMD confirmed earlier this week about making its GPUs more open source at both a software level in respect to the ROCm stack for GPU programming and a hardware level. Details were scarce with this initial announcement, and the only concrete thing it introduced was a GitHub tracker.

However, yesterday AMD divulged more details, specifying that one of the things it would be making open source was the MES firmware for Radeon GPUs. AMD says it will be publishing documentation for MES around the end of May, and will then release the source code some time afterward. For one George Hotz and his startup, Tiny Corp, this is great news. Throughout March, Hotz had agitated for AMD to make MES open source in order to fix issues he was experiencing with his RX 7900 XTX-powered AI server box. He had talked several times to AMD representatives, and even the company's CEO, Lisa Su.
