
Chrome 94 Beta Adds WebGPU API With Support For Apple's Metal (9to5mac.com)

An anonymous reader quotes a report from 9to5Mac, written by Filipe Esposito: Google this week announced the beta release of Chrome 94, the next update to Google's desktop web browser. In addition to general improvements, the update adds support for the new WebGPU API, which is intended to replace WebGL and can even use Apple's Metal API. As described by Google in a blog post, WebGPU is a new, more advanced graphics API for the web that can access GPU hardware, resulting in better performance when rendering interfaces in websites and web apps.

For those unfamiliar, Metal is an API introduced by Apple in 2014 that provides low-level access to GPU hardware for iOS, macOS, and tvOS apps. In other words, apps can access the GPU without overloading the CPU, which is one of the limitations of older APIs like OpenGL. Google says WebGPU is not expected to be enabled by default for all Chrome users until early 2022. The final release of Chrome 94 should enable WebCodecs for everyone, another API designed to improve the encoding and decoding of streaming video.
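For readers who have not touched the API, the entry point is the navigator.gpu object. Here is a minimal sketch in TypeScript (assuming the @webgpu/types definitions are available; in Chrome 94 the feature still sat behind a flag or origin trial, so details may have shifted):

```typescript
// Minimal WebGPU bootstrap sketch: ask the browser for an adapter
// (a handle to a GPU backend such as Metal, Vulkan, or D3D12),
// then for a logical device to submit work to.
async function initWebGPU(): Promise<GPUDevice | null> {
  if (!('gpu' in navigator)) {
    // Browser was built without WebGPU, or the flag/origin trial is off.
    return null;
  }
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) return null;

  // With no options, the device is created with the default limits and
  // no optional features enabled.
  const device = await adapter.requestDevice();
  device.lost.then((info) => {
    console.warn('WebGPU device lost:', info.message);
  });
  return device;
}
```

The returned device is what a page then uses to create buffers, pipelines, and command encoders.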

This discussion has been archived. No new comments can be posted.


Comments:
  • by waspleg ( 316038 ) on Tuesday August 31, 2021 @09:08AM (#61748171) Journal

If I were a malware crypto ad miner...

    • This is exactly what I was thinking as well.

Probably also a pretty good avenue for gathering unique identifying data for fingerprinting.

      • I'm having a hard time thinking of any reason I'd want a website to have direct access to my hardware. I'm trying, but so far I've only been able to find some of my old ActiveDesktop nightmares and other dark memories about what hells used to be inflicted upon us.
    • by AmiMoJo ( 196126 )

      It does look like they screwed this one up already. Here is the section of the spec that deals with security: https://gpuweb.github.io/gpuwe... [github.io]

Note that they didn't even consider crypto miner type attacks. It seems that as long as the miner doesn't exceed the default resource limits, which they describe as being set fairly high so that most apps won't need to raise them, it will run just fine with no user interaction or confirmation.

WebGL is already widely used for tracking and fingerprinting, and is often disabled.
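For context on the "default resource limits" mentioned above, here is a rough sketch of how a page sees and raises those limits under the draft spec (TypeScript; the specific limit names are taken from the draft at the time and should be treated as assumptions):

```typescript
// Sketch: inspect what the adapter supports, then request a device.
// If requiredLimits is omitted entirely, the device simply gets the
// spec's defaults -- no prompt or other user interaction is involved.
async function deviceWithLimits(): Promise<GPUDevice | undefined> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) return undefined;

  // adapter.limits reports what the hardware/driver can actually do...
  console.log('max 2D texture size:', adapter.limits.maxTextureDimension2D);

  // ...while requiredLimits only raises specific limits above the defaults.
  // Nothing here gates sustained GPU compute (e.g. a shader that mines).
  return adapter.requestDevice({
    requiredLimits: {
      maxStorageBufferBindingSize: adapter.limits.maxStorageBufferBindingSize,
    },
  });
}
```

Nothing in this flow involves a permission prompt, which is the point being made above.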

      • What exactly do you want them to do about it?

      • Note that they didn't even consider crypto miner type attacks.

        What should they do about that, exactly?

        Crypto miners can already run in JavaScript.

        • by AmiMoJo ( 196126 )

Well, all modern browsers detect and limit background CPU usage of JavaScript on pages, especially on mobile. It's not perfect.

          I'd say make WebGPU always require a per-site opt-in. If WebGL is anything to go by it will be abused otherwise.

          • by gweihir ( 88907 )

Well, all modern browsers detect and limit background CPU usage of JavaScript on pages, especially on mobile. It's not perfect.

            I'd say make WebGPU always require a per-site opt-in. If WebGL is anything to go by it will be abused otherwise.

            For "ease of use" this will of course always work, no opt-in. Also, do you really believe people are competent to know when to opt-in and when not to?

        • Comment removed based on user account deletion
      • Part of me wants to argue with you out of habit, but damn! I see some "mays" in there that really need to be "MUSTS". Like "In order to prevent the shaders from accessing GPU memory an application doesn’t own, the WebGPU implementation may enable a special mode (called "robust buffer access") in the driver that guarantees that the access is limited to buffer bounds." Now, can anyone tell me how the hell that isn't "MUST enable a special mode..."?

        And going back to the memory issue you pointed out

The only other legitimate use for this is what, I wonder? Playing browser games with graphics advanced enough to need GPU processing... on Apple hardware.... The segment seems too small to have been worth implementing, considering the security holes it could create.

The only other legitimate use for this is what, I wonder?

        Simple: Making WebGL possible on Apple.

        Playing browser games with graphics advanced enough to need GPU processing... on Apple hardware....

Or running all those websites that do CAD in your browser.

        (for example)

  • I don’t know why web browsers even bother to add new features anymore apart from entrenching advertising tech and making it harder for ad blockers to detect them.
    • Why, what do you think this will be used on?

Aside from hijacking that functionality for mining cryptocurrency, of course.

      • It's another data point for fingerprinting and tracking you.

        I would be willing to bet that this API will allow the developer to query the underlying hardware and its capabilities.

So now the fingerprinting tech adds that as another piece of information that, when aggregated with a hundred other data points about your browser and computer capabilities, creates a unique profile that can be used to identify you later or elsewhere.
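To illustrate the concern, here is a hedged sketch (TypeScript, assuming the draft WebGPU API) of the kind of capability data a script could read and fold into a fingerprint; exactly which properties get exposed was still being debated in the spec:

```typescript
// Sketch: collect GPU capability data points a tracker could mix into
// a browser fingerprint. Nothing here requires user interaction.
async function gpuFingerprintBits(): Promise<string | null> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) return null;

  const bits = {
    // Optional features vary by GPU/driver (e.g. texture compression formats).
    features: Array.from(adapter.features).sort(),
    // Limits vary by hardware tier.
    maxTextureDimension2D: adapter.limits.maxTextureDimension2D,
    maxBindGroups: adapter.limits.maxBindGroups,
    maxComputeWorkgroupSizeX: adapter.limits.maxComputeWorkgroupSizeX,
  };
  // In practice these would be hashed together with dozens of other signals.
  return JSON.stringify(bits);
}
```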

  • by gweihir ( 88907 ) on Tuesday August 31, 2021 @09:25AM (#61748227)

Because low-level hardware access in a sandbox is such a great idea. I guess this will need yet another iteration to finally become secure, as this one is getting botched again.

    • It already exists: It's called "WebGL" and it's how all your browsers and dozens of other programs do 3D graphics.

      Do a search for "libGLESv2.dll" on a PC and see how many copies it can find.

      (or "libgles.so" on Linux)

  • by WaffleMonster ( 969671 ) on Tuesday August 31, 2021 @10:31AM (#61748417)

    Chrome 95 should provide an interface to /dev/mem over https only of course. This would significantly enhance user experience.

Are you insane?!?!?! That's not nearly good enough. Direct access to firmware or we should just give up on the web altogether.
  • provide access for all.

    • While Apple does have a reality distortion field, it sadly does not do time travel as Metal was released more than 2 years before Vulkan.
      • Comment removed based on user account deletion
        • by dgatwood ( 11270 )

          Apple didn't stick with OpenGL. They deprecated it long, long ago, because Apple doesn't believe in open standards anymore.

          The latest OpenGL version is 4.6 (2017). The latest version available on any Mac is 4.1 (2010). Apple's OpenGL implementation was four years out of date by the time Metal happened on iOS, and five years out of date by the time the Metal API became available on the Mac.

          And now, WebGL is finally being ported to that API six years later. Yikes.

It seems like the Mac platform has degenerated.

Apple has had WebGL support for ages. But Chrome did not use it. Hint: the story is about Chrome, not Apple.

            • by dgatwood ( 11270 )

No, not really. Chrome has supported WebGL on Mac since the very beginning. They just built it on top of OpenGL, not Metal.

              This story is about Chrome adopting the Metal API for its WebGL implementation, which gives Chrome the ability to support modern chipsets' capabilities beyond what Apple supported in OpenGL way back in 2010, which is the last time Apple updated their decrepit OpenGL support libraries.

Safari has supported WebGL for ages.
So I'm not really sure what your point is.

                • by dgatwood ( 11270 )

Safari has supported WebGL for ages. So I'm not really sure what your point is.

Safari's WebGL is half-baked. They still don't support WebGL 2.0 (2017) or WebRTC (2018).
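For what it's worth, the WebGL 2.0 gap described above is easy to probe from a page with standard feature detection, roughly what helpers in popular libraries do; a minimal sketch in TypeScript:

```typescript
// Sketch: detect WebGL 2.0 support by asking for a webgl2 context.
function hasWebGL2(): boolean {
  const canvas = document.createElement('canvas');
  // Returns null in browsers/engines that only expose WebGL 1.x.
  return canvas.getContext('webgl2') !== null;
}

console.log('WebGL 2.0 available:', hasWebGL2());
```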

          • Apple didn't stick with OpenGL. They deprecated it long, long ago, because Apple doesn't believe in open standards anymore.

By "long, long ago" you mean four years ago, in 2017, which was the last year OpenGL had any updates?

  • more web!

    Until it works in webChrome in webLinux running in Chrome on Linux, I'm not taking it! It needs at least *four* layers of operating systems!
    Fuck it, go five layers! I'm gonna code up webWebGPU right now! An entire OS running inside your webGPU! Of course it will be webChromeOS! MWAHAHAHAHAHA!! *gets carried away by the nice men with the funny jacket*

    I'M WHATWG! I'M NOT IN HERE WITH YOU! YOU ARE IN HERE WITH ME!!
