Intel Beats AMD and Nvidia with Arc GPU's Full AV1 Support (neowin.net)

Neowin notes growing support for the "very efficient, potent, royalty-free video codec" AV1, including Microsoft adding hardware-accelerated AV1 support to Windows.

But AV1 even turned up in Intel's announcement this week of the Arc A-series, a new line of discrete GPUs, Neowin reports: Intel has been quick to respond, and the company has become the first discrete GPU vendor to have full AV1 support on its newly launched Arc GPUs. While AMD and Nvidia both offer AV1 decoding on their newest GPUs, neither offers AV1 encoding.

Intel says that hardware AV1 encoding on its new Arc GPUs is 50 times faster than software-only solutions. It also adds that the efficiency of AV1 encoding with Arc is 20% better than HEVC's. With this feature, Intel hopes to capture at least some of the streaming and video-editing market: users looking for a more robust AV1 encoding option than CPU-based software approaches.

From Intel's announcement: Intel Arc A-Series GPUs are the first in the industry to offer full AV1 hardware acceleration, including both encode and decode, delivering faster video encode and higher quality streaming while consuming the same internet bandwidth. We've worked with industry partners to ensure that AV1 support is available today in many of the most popular media applications, with broader adoption expected this year. The AV1 codec will be a game changer for the future of video encoding and streaming.
    • by AbRASiON ( 589899 ) * on Sunday April 03, 2022 @04:17PM (#62413940) Journal

      Not really. Some of us follow codecs closely, especially open ones, which are good for everyone.

      We should all want AV1 to succeed.

      • I encode all my video using mpeg-1, you insensitive clod!

      • I love that it is open; I hate that it is slow and processor-intensive as fuck. AV1 is its own worst enemy when it comes to adoption.
        • by Junta ( 36770 )

          But is that due to fundamental design issues with AV1, or just due to not having hardware designed around it, like the other codecs enjoy?

        • by ShanghaiBill ( 739463 ) on Sunday April 03, 2022 @10:34PM (#62414778)

          I hate that it is slow and processor-intensive as fuck

          That is fixed with hardware acceleration, which is exactly what TFA is about.

          AV1 is its own worst enemy when it comes to adoption.

          You just described 100% of codecs to ever hit the market. I remind you that v1.0 of AV1 is only a couple of years old. When H.265 was a couple of years old, it was painfully slow to encode and had no hardware codecs available. Nearly 2 decades ago, H.264 was painfully slow to encode and had no hardware codecs available. Hell, I still remember playing with this experimental thing called MP3 when it first came out; it took close to 5 hours to encode a single album. On that same computer I was able to encode an album in 40 minutes only 2 years later.

          • > Hell, I still remember playing with this experimental thing called MP3 when it first came out; it took close to 5 hours to encode a single album. On that same computer I was able to encode an album in 40 minutes only 2 years later.

            Kind of uncharitable to be blaming your 486 for your slow MP3 encodes. Geez, it never took me that long to rip a CD. (And, in that era, I was running AMD K5 CPUs.)

        • One of AV1's goals was to have better compression ratios than HEVC (I think; maybe it was another state-of-the-art codec). I'm not sure you can do that and still end up with a codec that is less computationally expensive.
    • by Kisai ( 213879 )

      Not really. Both NVIDIA and AMD made mistakes on their last gen hardware by not having both AV1 encode and decode. AMD had the gall to put out a budget card with NO encoder at all and still ask a premium.

      Now, the problem is likely that Intel is overselling the capability. Existing Intel QuickSync falls over on 4K H.264 and H.265 encoding, and it will not surprise me if it falls over on AV1 as well. QuickSync underperforms, just as Intel GPU capabilities always do. So take this with a grain of salt that even though Intel i

      • Huh?

        How does framebuffer size have any bearing on whether a video codec works or not?

        • Memory bandwidth can become a performance issue with uncompressed video frames, especially at higher resolutions, whether it's copying back and forth between RAM and the GPU or between NUMA nodes on multi-socket systems.
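          A rough back-of-the-envelope calculation shows the scale (a sketch; 8-bit YUV 4:2:0 assumed at 1.5 bytes per pixel, and the resolutions/framerates are just illustrative):

              # Uncompressed video bandwidth at 1.5 bytes/pixel (8-bit YUV 4:2:0)
              def raw_bandwidth_mb_s(width, height, fps, bytes_per_pixel=1.5):
                  return width * height * bytes_per_pixel * fps / 1e6

              print(raw_bandwidth_mb_s(1920, 1080, 60))  # ~187 MB/s for 1080p60
              print(raw_bandwidth_mb_s(3840, 2160, 60))  # ~746 MB/s for 4K60

          Moving hundreds of MB/s of raw frames around is where the pressure comes from; the compressed bitstream itself is tiny by comparison.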

    • ... right?

      Every story about any hardware ever is a Slashvertisement. If you don't want to hear about new hardware developments, have you considered going to Amishdot? You send them a telegram (via the post office) and you get a daily printout of all the comments about how technology is evil and everything is just an advertisement.

      • by ceoyoyo ( 59147 )

        have you considered going to Amishdot? You send them a telegram (via the post office) and you get a daily printout of all the comments about how technology is evil and everything is just an advertisement.

        Sounds like Slashdot on any story for something you can't yet buy at Walmart.

      • I am intrigued by your Amish website and wish to subscribe to its newsletter - please expect a telegram in the next week or so, my good sir.

    • I hate to break this to you, but a lot of technical news comes from companies which sell things. So it's kind of silly to shrilly screech "Slashvertisement!!" any time news about a commercial product is posted.
  • Hardware vs Software (Score:3, Informative)

    by Greeneland ( 598616 ) on Sunday April 03, 2022 @04:46PM (#62414008)
    Long ago, I used to have an HD antenna with 6 tuners connected to a server. One client was a PC that used an MPEG software stack to decode and render the stream; another was a small set-top box that had only a low-power CPU and an MPEG decoder chip.

    The PC used to crash all the time due to occasional corruption in the broadcast stream (especially during thunderstorms), whereas the set-top box plowed right through it and kept going.

    Something to think about.
    • The only thing I thought about reading your post is that you're under the delusion that hardware is magically free from bugs, and you blame your PC crashing on what, software with bugs existing?

    • Long ago, I used to have an HD antenna with 6 tuners connected to a server. One client was a PC that used an MPEG software stack to decode and render the stream; another was a small set-top box that had only a low-power CPU and an MPEG decoder chip.

      The PC used to crash all the time due to occasional corruption in the broadcast stream (especially during thunderstorms), whereas the set-top box plowed right through it and kept going.

      Something to think about.

      Here's a nickel, son. Get yourself a better software decoder.

    • by Anonymous Coward

      If your decoder is crashing due to the input data, then that is a serious security flaw and an indication of very poorly written software, considering how easy it is to sanitize something like video input data.

    • by AmiMoJo ( 196126 )

      What you describe is called a TV headend. You can buy commercial ones, or make one with a Raspberry Pi. Because TV is digital now, you don't need a lot of computing power; just capture the stream off-air and relay it over the network.

      WiFi can be a bit hit-and-miss bandwidth-wise, but especially if you have Ethernet everywhere, it's a good alternative to running coax. You can also use the Pi as a DVR.

      • by Megane ( 129182 )

        Way to completely miss the point of what you were replying to. He was talking about problems with decoding, not about his rig. Everything else was just for context, and you only focused on the context.

        I've also seen the MythTV client crash due to a really mangled MPEG2 stream, but I wouldn't put it as simply as "hardware decoding is why the set top box didn't crash". It's still going to have software to parse the streams, and that could crash if badly written.

  • by Catvid-22 ( 9314307 ) on Sunday April 03, 2022 @04:47PM (#62414012)

    My experience with hardware encoding and decoding is that, while they're indeed much faster, they tend to be inferior to software codecs in terms of quality. So the question is: is the Intel implementation sufficiently superior to the hardware HEVC and H.264 encoders already available? Videophiles will probably tweak their software-encoded files to death.

    FWIW, using FFmpeg, I see noticeable improvements in text quality (better anti-aliasing perhaps?) and reduction in color artifacts in AV1 when compared to FFmpeg-encoded HEVC.

    Maybe the FFmpeg implementation of HEVC sucks. However, I was using the dog-slow beta version of the AV1 codec (hours-long encodes) when I started doing my unscientific comparisons. The significantly faster release version (1.0?) of the AV1 codec maintained the quality advantage, although it's still several times slower than the HEVC encoder (minutes vs. a fraction of a minute).

    Note: my use case involved raw JPEG image frames used to produce half-minute time-lapse videos. I'm not a big fan of signal-to-noise stats, so I rely mostly on visual inspection.
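    For anyone wanting to repeat this kind of unscientific comparison, a minimal sketch of the two encodes (assuming an ffmpeg build with libaom-av1 and libx265; the frame pattern and quality values are illustrative placeholders, not tuned settings):

        import subprocess

        # Encode the same JPEG time-lapse frames with AV1 (libaom-av1) and HEVC (libx265)
        frames = ["-framerate", "30", "-i", "frame_%04d.jpg"]
        subprocess.run(["ffmpeg", "-y", *frames,
                        "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0", "av1.mkv"], check=True)
        subprocess.run(["ffmpeg", "-y", *frames,
                        "-c:v", "libx265", "-crf", "28", "hevc.mkv"], check=True)

    Then eyeball the two outputs side by side, which is essentially what the comparison above did.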

    • Quality should only matter for encoding.

      For decoding, either you implement the spec or you don't.

    • I totally agree about encoding in software being a better choice. I would be suspicious of whether there is a comparable amount of control available with Intel's AV1 implementation.
      • I totally agree about encoding in software being a better choice.

        Define "better". In many video use cases software AV1 encoding flat out is not possible. Without a hardware encoder right now it's not possible to encode in software in realtime for instance.

        Then it's also a question of whether you bother. Honestly, since 10-bit HEVC encoding was introduced in NVENC, I've not run a software encode, period. There comes a point where good enough counts, and the speed boost is far more important.

        • My late '90s lab bench was set up next to a realtime MPEG-2 encoder. It was a very expensive full height 19" rack cabinet that we kept powered down when not directly in use, due to fan noise and power cost. Now, everyone's phone has a better encoder running on battery power. It took a while and a succession of technology/market evolutions to get there.
        • I totally agree about encoding in software being a better choice.

          Define "better". In many video use cases software AV1 encoding flat out is not possible. Without a hardware encoder right now it's not possible to encode in software in realtime for instance.

          Then it's also a question of whether you bother. Honestly, since 10-bit HEVC encoding was introduced in NVENC, I've not run a software encode, period. There comes a point where good enough counts, and the speed boost is far more important.

          To be fair, I don't see a use case at the moment for real-time AV1 encoding. Hardware HEVC is good enough, and software AV1 encoding is fast enough for any PC faster than a two-year-old Pentium. I'm sure AV1 hardware encoding will be good enough, eventually. For now, I'll let the hardware vendors iron out the bugs that might make AV1 just as fugly or good as HEVC.

          • I don't see a use case at the moment for real-time AV1 encoding. Hardware HEVC is good enough

            The use case has nothing to do with quality and everything to do with breaking the MPEG-LA stranglehold on the video industry. AV1 isn't supposed to be a generation ahead of HEVC, it was just supposed to supplant it and give a foundation that is royalty free from which to develop future codecs. There's a reason the people most heavily invested in AV1 are those who transmit the most video via the internet (Netflix, Google, etc).

            and software AV1 encoding is fast enough for any PC faster than a two-year-old Pentium.

            Please link me to your magic software encoder. Just for fun I decided to do a quick test now with a 4K video on a Ryzen 9 5900X and clocked in at an incredible 7 fps, and at 1080p it didn't even manage to spike to 20 fps. So no, there's nothing "fast enough" about AV1 encoding when a high-end and recent processor can't match the source framerate even on lower-quality video, to say nothing of the expectation of streaming content. The only way to describe this is "completely unsuitable for many applications".

            • AV1 isn't supposed to be a generation ahead of HEVC, it was just supposed to supplant it and give a foundation that is royalty free from which to develop future codecs.

              That's a pleasant surprise. Using the FFmpeg defaults for HEVC and both beta and release AV1, I found AV1 to be perceptually better. Again, I'm no fan of PSNR and all that jazz.

              and software AV1 encoding is fast enough for any PC faster than a two-year-old Pentium.

              Please link me to your magic software encoder. Just for fun I decided to do a quick test now with a 4K video on a Ryzen 9 5900X and clocked in at an incredible 7 fps, and at 1080p it didn't even manage to spike to 20 fps. So no, there's nothing "fast enough" about AV1 encoding when a high-end and recent processor can't match the source framerate even on lower-quality video, to say nothing of the expectation of streaming content. The only way to describe this is "completely unsuitable for many applications".

              Ah, but I mainly encode 480p email-friendly videos on my pre-pandemic Pentium N3700, and I can finish the encode at 1/4 real-time. So I assume that upgrading to something more modern, like a 2022 Pentium Silver (Jasper Lake NUC), would give me a real-time encode.
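              A quick way to sanity-check the "fast enough" claim on any given box is to time an encode of a known number of frames (a sketch; test.mp4, the frame count, and the encoder settings are placeholders):

                  import subprocess, time

                  # Time a software AV1 encode and report the achieved frames per second
                  NUM_FRAMES = 600  # e.g. 20 s of 30 fps source; adjust for your clip
                  start = time.time()
                  subprocess.run(["ffmpeg", "-y", "-i", "test.mp4", "-frames:v", str(NUM_FRAMES),
                                  "-c:v", "libaom-av1", "-crf", "30", "-b:v", "0", "out.mkv"], check=True)
                  print(f"{NUM_FRAMES / (time.time() - start):.1f} fps achieved")

              If the printed figure beats the source framerate, that machine can encode in realtime at those settings.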

    • they tend to be inferior to software codecs in terms of quality

      Indeed. The H.264/H.265 encoding quality of Intel's QuickSync on Skylake was abysmal, with quality like a RealVideo codec at bitrates that would choke a fibre connection. Things improved in later CPUs, but it does make me think this might be their "test" hardware that won't be fixable, with a usable encoder coming RealSoonNow.

      I believe AMD put an FPGA on their cards so they don't have to bake in the wrong approach. But I'm not aware of any plans to actually use that to gift owners with support for a "new" cod

  • Safari (Score:4, Insightful)

    by backslashdot ( 95548 ) on Sunday April 03, 2022 @04:56PM (#62414034)

    Can someone ask Cook-boy to support AV1 on the iPhone/iOS? I mean, what is their excuse? Apple is a founding member of the Alliance for Open Media, but its browsers don't support the codec yet. What the heck, man? Fuck you. It is the year 2022 and we still don't have a universally supported royalty-free video format.

    • Re: (Score:2, Troll)

      by c-A-d ( 77980 )

      Their excuse is likely that they'd have to spend money to implement it in silicon and software, which would cut into their profit margin. Plus, they'd then have no excuse for their retarded child web-browser called Safari.

      • Their excuse is likely that they'd have to spend money to implement it in silicon and software, which would cut into their profit margin.

        Unlikely, since they design their own chips and have them made by TSMC. Since they do not sell their AX or M1 chips to anyone, there is no profit motive. There is a cost in terms of die space and engineering, but there is no profit margin involved.

        Plus, they'd then have no excuse for their retarded child web-browser called Safari.

        So which browser do you recommend they use... Edge, which is controlled by MS, or Chrome, which is controlled by Google?

        • WebKit/Safari is fine, but then what does that have to do with them lagging behind on supporting AV1?

          • If I were to guess, the main reason is that it is relatively new. VP9 and HEVC (H.265) were both released in 2013. HEVC is still supported and has had several revisions, but Google moved on to VP10, which was then incorporated into AV1. AV1 was released as a spec in 2018 and is still at 1.0.0, with the last release being in 2019. At best it is a 4-year-old codec and has only had content from some content providers for the last several years. It is probably the same reason why Nvidia and AMD have only hardware dec
          • Because it's done in silicon and not software.

      • Their excuse is likely that they'd have to spend money to implement it in silicon and software, which would cut into their profit margin.

        Yeah, what an excuse, given that the M1 supports AV1 hardware decode. You're quite a lame troll.

    • It's a chicken-and-egg problem. Most media is still streamed in H.264 because that's what the majority of clients support. For end users, none of this stuff matters anyway. Codec licensing isn't a major part of the cost of hardware. Hell, with most modern TVs some sort of streaming device is already built-in.

      Most likely, what's really going on here is that Intel's FPS numbers in AAA titles aren't keeping up with the incumbents, so they're tooting their horn about how great their GPUs will be for nic

    • Can someone ask Cook-boy to support AV1 on the iPhone/iOS? I mean, what is their excuse? Apple is a founding member of the Alliance for Open Media, but its browsers don't support the codec yet. What the heck, man? Fuck you. It is the year 2022

      They probably didn't put hardware support for it in their homespun chipset, and if you decode it in software, it drains the battery at 100% CPU usage.

    • Jeesh, aren't we needy. Codec adoption takes time. There was precisely zero browser support for any video (or audio) codec in the 1-2 years after v1.0 of a format came out. The internet at large currently does not use AV1, so there's no point rushing support, especially before the hardware is there to ensure it doesn't demolish battery life. Apple only just released hardware AV1 decoder support, and prior to that there was no point in supporting it in software.

      This shit is new, give it time. Stop p

    • by AmiMoJo ( 196126 )

      They don't want to make the web experience as good as the app, because then you might not use the app, and Apple loses its cut of app revenue and its leverage over Google, which needs to update said app.

      They made sure Safari was crippled to prevent people from making web apps that circumvent their App Store rules and 30% cut.

  • by Anonymous Coward

    What else is a laptop going to do than play videos on an aeroplane?

    At least for the demographic where intel sees the money, anyway.

    • Point taken, but I think the "news" here is that Intel added hardware encode as well. Perhaps in case you wanted to encode a video on the plane instead of just watching one. Dunno, I don't encode videos for a living.
  • by Anonymous Coward
    Decades of experience with Intel-supplied benchmarks shows that they overstate the capabilities of the system under test and often tweak things in ways independent testers cannot replicate.
    • AV1 is not from Intel, and independent tests have shown AV1 is about 30% more efficient than H.265. Of course you can question those independent results if you want, but this is not an Intel codec.
      • by Anonymous Coward
        Yep, but it requires >10 times the processing power to encode. You save a little on space for a massive increase in processing overhead. That is why AV1 hasn't taken off; the cost-vs-benefit really isn't there. H.265 is so much easier to deal with. The real benefit of AV1 is the licensing.
        • by Rhipf ( 525263 )

          Is the amount of processing power needed to encode something really a big deal? I guess if it outstrips the power of your system it is, but other than that, is it really a problem? You only need to encode the video once, after all.
          The 30% storage savings, on the other hand, is realized each time a copy of the file is stored. If you have a 1000 MB video file encoded in H.265, it would be only 700 MB (saving 300 MB) in AV1. If you store 100 copies of the file, you are saving 30,000 MB (~30 GB) of storage. This advantage becomes ev

        • The fact that it costs so much to encode starts to matter less when a video is going to be encoded once and played back millions of times. That's why YouTube encodes very popular videos in AV1. It's also why Netflix is going to be a major beneficiary of AV1: encode once, save tons of bandwidth when millions of people play back your shows.
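          Putting rough numbers on the encode-once/play-many argument (a sketch; the 30% figure is the one quoted above, and the stream size and view count are invented):

              # Bandwidth saved by a 30% more efficient codec, amortized over many playbacks
              hevc_gb_per_hour = 3.0   # illustrative HEVC stream size
              savings = 0.30           # AV1 vs. HEVC efficiency gain quoted above
              views = 1_000_000
              print(f"{hevc_gb_per_hour * savings * views:,.0f} GB saved per hour of content")  # 900,000 GB

          One expensive encode against nearly a petabyte of saved transfer is an easy trade.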
  • by Maury Markowitz ( 452832 ) on Sunday April 03, 2022 @06:35PM (#62414308) Homepage

    "It also adds that the efficiency of AV1 encode with Arc is 20% better compared to HEVC"

    Does that mean it is 20% better at compressing AV1 than it is at compressing HEVC?

    If so, that is weird, right?

    Or does it mean AV1 is 20% better than HEVC?

    If so, isn't that lower than expected? Wasn't AV1 supposed to be 50% better or more?

    • does it mean AV1 is 20% better than HEVC?

      Yes.

    • Surely it means that an encode that was 100 MB with HEVC is 80 MB with AV1. Also, don't confuse the goals of AV1 itself with a hardware maker's goals for their hardware. To date, all hardware encoding has sucked anyway, and I feel no optimism that this encoder will suddenly be the one that is usable.

    • If so, isn't that lower than expected? Wasn't AV1 supposed to be 50% better or more?

      By which metric? By which version? There's no single benchmark for quality, and there's no single fixed version of a codec. Software AV1 may be 50% better than HEVC, but that doesn't mean a hardware encoder does as good a job (it usually doesn't, trading the ability to finely tune the encoder for a ginormous speed boost). And if you compare Intel's AV1 encoder to NVENC's HEVC encoder, you'll get a different number than when comparing against libx265.

    • It likely means files AV1-encoded with Arc are 20% smaller than files HEVC-encoded on Arc, while maintaining the same visual quality.

      While AV1 may theoretically be "50% better" than HEVC, the hardware encode implementation likely does not support all the features needed to reach that level. For example, hardware encode implementations for MPEG-2, MPEG-4, HEVC, etc. all had/have certain limitations (single pass, fixed keyframe interval, no lookahead, etc.).

      So you likely can, as with most other encod
    • by AmiMoJo ( 196126 )

      It means that on average, AV1 requires 20% fewer bits than HEVC for equivalent visual quality. The file size will be 20% smaller, and the stream will need 20% less bandwidth.

      They measure quality with an algorithm that compares compressed frames to the original, using a model of the human visual system that weights artefacts by how noticeable they are. Blind tests are used to confirm.
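      VMAF (mentioned elsewhere in this discussion) is one such metric, and it's easy to run yourself. Assuming an ffmpeg build with libvmaf, a minimal sketch to score an encode against its source:

          import subprocess

          # Score the compressed file against the original; VMAF prints ~0-100, higher is better
          subprocess.run(["ffmpeg", "-i", "compressed.mkv", "-i", "original.mkv",
                          "-lavfi", "libvmaf", "-f", "null", "-"], check=True)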

  • I personally find hardware encoding useless when you don't have access to all encoding parameters. I find no gold in encoding something quickly that looks bad. This is what has bugged me about NVENC since its inception: I don't care how much work using NVENC offloads from the CPU if the video stream looks like ass. I'm not encoding for some big streaming service or big company; I'm simply encoding my files for my own use. And all the experimenting I've done with NVENC has shown it to be a waste of silicon, at least for me.

    I sure wish companies spending time and money on developing hardware encoders would allow us to get results that mirror what we can get from x264 and x265, for example. Let me get results that match x264/x265 bit-for-bit, and then your hardware encoder becomes useful to me. When all it delivers is vastly inferior results, you've wasted all that R&D time and money, not to mention silicon.

    Give us encoders that can produce everything from ultrafast to veryslow (hell, placebo, too) in x264/x265 in hardware, and similarly the entire range in AV1, and then you've got something worth talking about. And more importantly, worth using. Subpar results via hardware encoding are only useful in so many scenarios. Giving us hardware encoders that perform just as well as the software encoders is a much better goal. Yes, I'll pay more for that silicon. I want my money back on all the silicon we've got up until now, heh.

    • by xlsior ( 524145 )
      My AMD RX 480 has AMD's VCE 3.0 (Video Coding Engine; v3.0 supports H.265/HEVC), which can encode video dozens of times faster than CPU/software encoding through HandBrake -- minutes vs. hours. But the end result simply isn't worth it. There are a TON of artifacts and noise in the final video; visually it looks noticeably worse than whatever the software encoders crank out at their "fast" setting (let alone the better-looking "slow" settings), and the final file size is larger.

      No idea wha
    • Agreed. If this hardware AV1 encoding is anywhere near, quality-wise, what H.265 was from Nvidia, who wants that? Of course it's faster; it's faster because it's garbage. I want to be able to encode 1080p or higher and still have it look like 1080p or higher.
    • Well, I do encode video in H.264/H.265 for professional broadcast use.
      It is QCed by rather picky people at times, and they are not afraid to complain.
      We happily use NVENC; it is very tunable if you know what you are doing and gives great results.

      So, for whatever reason, you are blaming the wrong thing.

      • Metrics don't lie. I prefer VMAF, but pick whichever one you wish. The NVENC results suck. Artifacts galore, particularly in the chroma. The people looking at your encodes must be blind and/or ignorant of compression.

      • Though I do invite you to share some settings with NVENC that you feel should yield great results, and I'll be happy to give it a shot and compare it to unaltered x265 presets, which I'm pretty sure will yield better results at smaller filesizes.
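        For anyone who wants to actually run that comparison, a starting-point sketch (assuming an NVENC-capable ffmpeg build; the presets and quality values are placeholders to tune, not claimed "great settings"):

            import subprocess

            # Produce one NVENC HEVC and one libx265 candidate from the same source,
            # then compare them on size and a metric such as VMAF
            src = ["-y", "-i", "source.mkv"]
            subprocess.run(["ffmpeg", *src, "-c:v", "hevc_nvenc", "-preset", "slow",
                            "-rc", "vbr", "-cq", "22", "nvenc.mkv"], check=True)
            subprocess.run(["ffmpeg", *src, "-c:v", "libx265", "-preset", "slow",
                            "-crf", "22", "x265.mkv"], check=True)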

        • >which I'm pretty sure will yield better results at smaller filesizes.

          DING DING DING and here we have it!

          Those artifacts you were bitching about? Yeah, over-compression without tuning the encoder, and then trying to stuff the video stream into as small a space as you can. Stop bit-starving your video, or learn to better tune whatever encoder is using NVENC.

          Here's a hint - it's 2022: disk space is cheap as dirt, wireless can connect at 300 Mbps+ speeds, and failing that, pretty much anything and everything tha

          • Your reply doesn't even make sense. If I'm using bitrates that result in files much larger than x265 files, how exactly am I bit-starving it? Disk space is cheap as dirt? Yeah, it is, which is precisely why I've got 47 TB sitting in my little file server. I'm not sitting here trying to use NVENC to get a movie to fit on a floppy. Heh.

            "x265 does it better with less bits" != "I'm not using enough bitrate for NVENC"

            You can use bitrates with NVENC that are far, far, FAR beyond what anyone would ever actually

            • It has now dawned on me, after I already submitted that reply, that you somehow misunderstood what I said and you think I meant "NVENC sucks at smaller filesizes." In fact, what I said was, given any quality level provided by NVENC, x265 will provide the same or better quality at a smaller filesize than the NVENC file. This does not mean I am using filesizes/bitrates that are too small when using NVENC. It means x265 does better with less bits than the NVENC file. Take a 50 Mbps NVENC file as a random e

    • Sometimes I wonder whether hardware vendors should create a video encoding accelerator module that works like OpenCL/CUDA. Software encoders such as x264 and x265 would then be able to hook into it to accelerate their encoding speed without sacrificing any quality. Such a module would probably be more general and wouldn't need a chip upgrade to start supporting newer formats.
    • I find no gold in encoding something quickly that looks bad.

      Cool story. But hardware encoding doesn't look "bad"; it just doesn't look as good as software at the same bitrate. If your modern NVENC encode looks "bad", you're doing something very wrong.

      You'll find the vast majority of the world has different priorities than you. A codec lives and dies on hardware support in industry. People are very conscious of hardware-accelerated video encoding for pretty much everything except an archival master.

  • by presearch ( 214913 ) on Sunday April 03, 2022 @08:31PM (#62414590)

    Q1/Y1 - Buggy driver and hideous interface shipped with initial hardware
    Q3/Y1 - Low hanging fruit bugs fixed. Lime green trim added to interface.
    Q4/Y1 - Hardware patch shipped. Interface always crashes on exit.
    Q2/Y2 - Forced compatibility driver released. Software always hangs on certain files.
    Q3/Y2 - Project disbanded. Frequent random crashes on startup. Dialog links go nowhere.
    Q4/Y4 - New groundbreaking initiative announced by interim Intel CEO

  • ...because Intel has never exaggerated their performance results before.
