AMD Reveals The Radeon RX 6000 Series

Preparing to close out a major month of announcements -- and to open the door to the next era of architectures across the company -- AMD wrapped up its final keynote presentation of the month by announcing the Radeon RX 6000 series of video cards. From a report: Hosted once more by AMD CEO Dr. Lisa Su, AMD's hour-long keynote revealed the first three parts in AMD's new RDNA2 architecture video card family: the Radeon RX 6800 ($579), 6800 XT ($649), and 6900 XT ($999). With the core of its new high-end video card lineup, AMD means to do battle with the best of the best from arch-rival NVIDIA. And we'll get to see first-hand whether AMD can retake the high-end market on November 18th, when the first two cards hit retail shelves. AMD's forthcoming video card launch has been a long time coming for the company, and one it has been teasing particularly heavily. For AMD, the Radeon RX 6000 series represents the culmination of efforts from across the company, as everyone from the GPU architecture team and the semi-custom SoC team to the Zen CPU team has played a role in developing AMD's latest GPU technology. All the while, these new cards are AMD's best chance in at least half a decade to finally catch up to NVIDIA at the high end of the video card market. So understandably, the company is jazzed -- and in more than just a marketing manner -- about what the RX 6000 means.

Anchoring the new cards is AMD's RDNA2 GPU architecture. RDNA2 is launching near-simultaneously across consoles and PC video cards next month, where it will be the backbone of some 200 million video game consoles plus countless AMD GPUs and APUs to come. Accordingly, AMD has pulled out all the stops in designing it, assembling an architecture that's on the cutting edge of technical features like ray tracing and DirectX 12 Ultimate support, all the while leveraging the many things they've learned from their successful Zen CPU architectures to maximize RDNA2's performance. RDNA2 is also unusual in that it isn't being built on a new manufacturing process; coming from AMD's earlier RDNA architecture and associated video cards, AMD is relying on architectural improvements to deliver virtually all of its performance gains. Truly, it's AMD's RDNA2 architecture that's going to make or break their new cards.
  • by Swift Kick ( 240510 ) on Wednesday October 28, 2020 @12:45PM (#60659078)

    Is there a link somewhere to this report?

    • Don't care how much AMD is proud of their shit. How does it stack up against Nvidia's 5900?

      • by nojayuk ( 567177 )

        First-person (i.e. AMD did the testing) gaming benchmarks suggest the AMD 6900 XT card performs about the same as the nVidia RTX 3090 while consuming a little less power. The AMD card has less RAM than the nVidia flagship GPU card, but it implements an AMD-only fast memory transfer technology (Smart Access Memory) which needs an AMD 5000-series Ryzen CPU and a 500-series motherboard (B550 or X570) to match or beat the RTX 3090. The AMD 6900 XT's RRP is $999, two-thirds the RRP of the RTX 3090.

        What I didn't see in the keynote was any mention of Crossfire or multi-GPU support.

        • by Luthair ( 847766 )
          Note the 6900 XT has less RAM than the 3090, but at the other price points the AMD cards have more RAM
          • The extra memory isn't necessary at 4k which is the resolution they showed the benchmarks at. The 3090's extra memory is mostly there to make 8k possible although that's little more than a novelty at this stage since 8k displays are few and outrageously expensive.

            • Not having looked at GPU prices for about a decade ... $1000 is outrageously expensive already! Let alone $1500!
              Like I would buy a GPU for over $200-250, are you kidding me?? And yes, I mean for a top-end card that is the fastest on the market. For mid-range, $150 or fuck off and die.

              • Top end GPUs haven't been that cheap in 15 years. Inflation is a bitch, but the cards have also gotten much more complicated and expensive to build. Nvidia has definitely milked the market somewhat, and AMD being competitive again should help bring prices down, but $350 flagship GPUs will never come back. If you can't hang, then go buy a console. I don't know what else to tell you. Most adult hobbies cost a few grand a year to maintain; I remember reading once that $2k or so is average. Enthusiast PCs are no different.

              • Zen APUs are for you: AMD Vega graphics are more than good enough for today's gaming needs...
            • The extra memory isn't necessary at 4k which is the resolution they showed the benchmarks at. The 3090's extra memory is mostly there to make 8k possible although that's little more than a novelty at this stage since 8k displays are few and outrageously expensive.

              It's more there for raytracing support. You can't simply cull out geometry that isn't in the view frustum when doing raytracing because you need reflections and light contributions from offscreen geometry and materials. Obviously this requires more memory as you need more of the scene in memory at a given time.
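
              To make that concrete, here's a toy Python sketch (made-up structures, not any engine's real API) of why a rasterizer can skip offscreen objects entirely while a ray tracer has to keep the whole scene resident:

                def outside_frustum(center, radius, planes):
                    # planes: (normal, d) pairs with inward-facing unit normals;
                    # a bounding sphere entirely behind any plane can be culled.
                    for (nx, ny, nz), d in planes:
                        dist = nx * center[0] + ny * center[1] + nz * center[2] + d
                        if dist < -radius:
                            return True
                    return False

                # Toy scene: one sphere in front of the camera, one behind it.
                scene = [
                    {"name": "onscreen",  "center": (0.0, 0.0, -5.0), "radius": 1.0},
                    {"name": "offscreen", "center": (0.0, 0.0,  5.0), "radius": 1.0},
                ]
                planes = [((0.0, 0.0, -1.0), -0.1)]  # near plane only; a real frustum has six

                visible = [o["name"] for o in scene
                           if not outside_frustum(o["center"], o["radius"], planes)]
                print(visible)  # ['onscreen'] -- yet reflection rays can still hit 'offscreen',
                                # so a ray tracer needs both objects (and their textures) in VRAM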

            • by Luthair ( 847766 )
              It's not just a frame buffer though; the more RAM you have, the more textures/models you can store, not simply more detailed textures/models.
            • The 3090's extra memory is mostly there to make 8k possible although that's little more than a novelty at this stage since 8k displays are few and outrageously expensive.

              Pricey, but you can always render at 8K and downsample to your 4K monitor (basically 4x SSAA).

              Useful for games where the vanilla image quality is so-so.
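
              For the curious, that 8K-to-4K downsample is just a 2x2 box filter: average each block of four rendered pixels into one output pixel. A minimal Python sketch (toy nested-list images, assuming even dimensions):

                def downsample_2x(img):
                    # img: rows of (r, g, b) tuples at render resolution
                    out = []
                    for y in range(0, len(img), 2):
                        row = []
                        for x in range(0, len(img[0]), 2):
                            block = [img[y][x], img[y][x + 1],
                                     img[y + 1][x], img[y + 1][x + 1]]
                            row.append(tuple(sum(p[c] for p in block) // 4
                                             for c in range(3)))
                        out.append(row)
                    return out

                hi_res = [[(255, 0, 0), (0, 0, 0)],
                          [(0, 0, 0), (255, 0, 0)]]  # a 2x2 "8K" checker
                print(downsample_2x(hi_res))         # [[(127, 0, 0)]] -- one smoothed "4K" pixel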

        • One thing to keep in mind is that they didn't show anything with ray tracing or upscaling. So it's likely that AMD is roughly competitive in raster performance, but I don't expect great things on the RT front if they didn't show it off.

          At this point though, DLSS seems like a must-have. It's great that the 6800 XT is similar to the 3080 in a benchmark they set up, but IRL you can turn on DLSS and get a >40% boost in many games for free (and get great AA). Oops.

          • by nojayuk ( 567177 )

            The keynote did mention ray-tracing acceleration. It's not EXACTLY the same as DLSS but it's likely to be competitive-slash-comparable with nVidia's RTX ray-tracing system. It may need patches for current games to enable it though. Remember that AMD's RDNA2 core technology is the same that's going into both the new Xbox and PlayStation consoles and they're not going to sell well without ray-tracing, especially at 4k.

            AMD is collaborating with game partners to provide dev support for this capability and ther

            • The keynote did mention ray-tracing acceleration. It's not EXACTLY the same as DLSS but it's likely to be competitive-slash-comparable with nVidia's RTX ray-tracing system.

              ??? DLSS is Deep Learning Super Sampling, a technique that uses deep learning to do high-quality resolution upscaling. Raytracing isn't related to DLSS any more than rasterization is.

              • DLSS is important to raytracing because it allows passable image quality with fewer rays. The downside is, when it glitches, it glitches badly.

              • AMD did list Super Resolution as one of its six core features of FidelityFX (Contrast Adaptive Sharpening, Denoiser, Variable Shading, Ambient Occlusion, Screen Space Reflections, and Super Resolution), but they didn't go into much detail on how it works or how well it works compared to DLSS.

                From an Anandtech article [anandtech.com]:

                AMD is also continuing to work on its FidelityFX suite of graphics libraries. Much of this is held over from AMD’s existing libraries and features, such as contrast adaptive sharpening, but AMD does have one notable addition in the works: Super Resolution.

                Intended to be AMD’s answer to NVIDIA’s DLSS technology, Super Resolution is a similar intelligent upscaling technique that is designed to upscale lower resolution images to higher resolutions while retaining much of their sharpness and clarity. Like DLSS, don’t expect the resulting faux-detailed images to completely match the quality of a native, full-resolution image. However as DLSS has proven, a good upscaling solution can provide a reasonable middle ground in terms of performance and image quality.

                As things currently stand, Super Resolution is still under development, so it won’t be available to game developers (or gamers) at the time of the RX 6000 series launch. But once it is, like AMD’s other FidelityFX graphics libraries, it will be released as an open source project on GPUOpen, and AMD is explicitly noting that it’s being designed as a cross-platform solution. Game devs will also be happy to hear that AMD is aiming to make it easier to implement across games than DLSS, which would make the tech more accessible and able to be implemented into a larger number of games.
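
                For a sense of what these upscalers compete against: the naive baseline is plain bilinear filtering, which just blends the four nearest low-res pixels and looks soft. A toy Python sketch of that baseline only (DLSS and, per the article above, Super Resolution swap this fixed filter for smarter, learned ones):

                  def bilinear_upscale(img, scale):
                      # img: rows of grayscale floats; returns img enlarged by `scale`
                      h, w = len(img), len(img[0])
                      out = []
                      for oy in range(h * scale):
                          row = []
                          for ox in range(w * scale):
                              fx, fy = ox / scale, oy / scale
                              x0, y0 = int(fx), int(fy)
                              x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
                              tx, ty = fx - x0, fy - y0
                              top = img[y0][x0] * (1 - tx) + img[y0][x1] * tx
                              bot = img[y1][x0] * (1 - tx) + img[y1][x1] * tx
                              row.append(top * (1 - ty) + bot * ty)
                          out.append(row)
                      return out

                  low = [[0.0, 1.0]]               # a 1x2 "low-res" image
                  print(bilinear_upscale(low, 2))  # [[0.0, 0.5, 1.0, 1.0], [0.0, 0.5, 1.0, 1.0]]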

            • by Bengie ( 1121981 )
              One of the many features of the newer versions of DLSS is that it can work with any game that uses TAA (Temporal Anti-Aliasing). DLSS can use AI to zoom-and-enhance, which allows for high-performance resolution upscaling. I saw a video of someone running a game at 240p and upscaling to 1080p, and it really wasn't "that" bad. Worse than 720p, but significantly better than 240p. The real magic starts happening when upscaling from 1080p+ to 4K. The way I see it, it allows you to render at 1440p and upscale to 4K with a gr
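
              The temporal hook those TAA-based techniques rely on is simple at its core: blend each new frame into a running history buffer. A stripped-down Python sketch (real TAA adds motion-vector reprojection and history clamping on top of this):

                def taa_accumulate(history, current, alpha=0.1):
                    # Exponential moving average over frames (grayscale pixels).
                    return [h * (1 - alpha) + c * alpha
                            for h, c in zip(history, current)]

                history = [0.0, 1.0]
                new_frame = [1.0, 1.0]                     # next frame resolves some noise
                print(taa_accumulate(history, new_frame))  # [0.1, 1.0]
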
            • Yeah, they have an RT core for each compute unit, and supposedly there's something called Super Resolution which might be an upscaler like DLSS.

              But they didn't show any benchmarks, so either it doesn't work yet, or it works but is too slow to be competitive. Anyway, it'd be dumb to buy anything before third-party testing is available, so we'll see how things work out soon enough.

              • by jiriw ( 444695 )

                It may be slower, yes, but it definitely already works. The clips of the games they showed were with RT features enabled, as mentioned.

        • Crossfire is really low on AMD's list of priorities. Partly because game engine flows that aren't designed from the ground up for it don't tend to partition into two streams well at all, so it's no magic bullet, but mainly because the market size for it is just way too small. Two GPUs won't fit in many mainstream boxes, and the vast majority of power supplies would be overwhelmed. Then there is the additional fan noise, power bill, and heat, all just for bragging rights.

      • by Entrope ( 68843 )

        It is almost 3000 more than Nvidia's current cards. Doesn't that mean it's twice as good?

      • Seeing how the nVidia 5900 was launched in 2003 on a 130nm process, I'm assuming it will destroy it...
      • and they've been accurate in the past. The performance seems to be slightly better than Nvidia's current offerings at the same price point. Not that it matters much given what scalpers are doing right now.

        Also Nvidia usually just leapfrogs AMD when these announcements happen. They seem pretty far ahead, which is a shame as the GPU market could use a bit more competition. At the mid range ($200-$240) there just isn't a lot of reason to go with AMD.

        Also, AMD seems to be pretty sensitive to memory timings.
        • One thing that steers me heavily toward Nvidia is their ancillary software. For instance, RTX Voice is perfect for my needs. It makes my voice quite acceptable even if I have an air conditioner, air purifier, and lawnmower running in the background. Strangely enough, it is completely worthless for improving incoming audio, but whatever. Does AMD offer similar programs that I haven't heard publicized?

          • AMD doesn't have anywhere near the resources NVidia does. TBH, the only real reasons to go AMD are cost, availability, and FreeSync. AMD does have one other thing going for them: their board partner Sapphire makes some of the best-made graphics cards in the industry (from a build-quality standpoint). But for those kinds of ancillary features, they just don't have the cash to fund them.
        • Remember how they were a bit behind Intel at first.
          Then about the same.
          Now ahead.
          All released in quite a short timescale.

          With GPUs, they are now very slightly ahead already.
          If nVidia releases their new range in a few months, AMD will already be at the ready for the next iteration. And from there on, things will play out differently than before.
          At least if they follow the same timing strategy as with their CPUs.

        • Same or better performance than a 3090 at 2/3 the price seems like a win for AMD that NVidia won't leapfrog that easily.

          • and nvidia just dropped the price on the 2060 RTX. Same thing with the 5500XT vs the 1660 Super.

            I love my 580, but I bought it cheap after the bitcoin bubble burst. If I was going to buy a card today it would be one of the 1660s.
    • I like to pretend that msmash and BeauHD have some kind of bet about who can feign ineptness and draw the most commenter ire, and both are constantly upping the ante by posting increasingly more inane and banal articles, dupes, or pulling other hijinks such as this.
  • There's a quote, but I'm not seeing any links. What's the source?

  • by Major_Disorder ( 5019363 ) on Wednesday October 28, 2020 @01:06PM (#60659178)
    Slightly off topic, but which way are people going, AMD or NVIDIA, for gaming on Linux? I know the closed source drivers are an issue on NVIDIA, but I am not sure how good the open source AMD drivers are. Guessing wrong on which GPU to get is an expensive error.
    • by Anonymous Coward

      AMD Linux drivers have been great as of late.

      • That's more useful news to me than anything else!

        AMD's drivers have been utter crap in the past. On every OS. Which was my main reason to go with nVidia. (I don't like monopolism. Like CUDA. Call me crazy.)

        • from what I can tell the main issue is that their cards are more sensitive to bad overclocks. Especially if you push cheap RAM too hard via XMP settings. Yes, it's supposed to work "out of the box" but the AMD forums are littered with horror stories of folks who enabled their XMP overclock and immediately got crashes.

          What I *think* is going on is that nVidia is detecting crashes and throttling their GPUs to avoid them, while AMD just assumes your hardware can handle anything they throw at it.
    • by malkavian ( 9512 )

      Expensive for some. If you're in the real high end, with specific requirements and competitive gaming, then yes, getting it wrong can be expensive.
      For the majority of people, the extra bang you get isn't worth the bother, and it's not an issue. I just acquired a new system, with new cards (actually, had to do that as the old one died on me, right at the start of the COVID outbreak, where nobody had much stock in anything, so I went with what I could scavenge) and it's no problem. I could have spen

    • Even AMD's open source drivers require closed source firmware for full functionality, but the AMD drivers have had a lot of improvement over the last year or two.

    • by An Ominous Coward ( 13324 ) on Wednesday October 28, 2020 @03:37PM (#60659924)

      Five years ago, I swore I wouldn't use AMD GPUs any more because their Linux drivers were utter crap compared to the Nvidia blob, and the radeon driver was a performance joke. Now I say the exact opposite: no more buggy Nvidia proprietary garbage, only AMD GPU with the open source drivers. Works so much better, 2D and 3D performance is awesome, no painful hoops to jump through when upgrading kernels. What a turn around for them on both the CPU and GPU front. Happily running an inexpensive AMD+AMD Linux-only rig now, it'll become a media PC next year when I build a Ryzen 5000 + Big Navi machine for my main PC.

    • by vyvepe ( 809573 )
      AMD, but do not buy a top line AMD card. Buy a cheaper widely used AMD card. E.g. Radeon VII Linux drivers are buggy and not being improved.
      • AMD, but do not buy a top line AMD card. Buy a cheaper widely used AMD card. E.g. Radeon VII Linux drivers are buggy and not being improved.

        I never buy top of the line anything. In fact my current new PC is the first all NEW PC I have built in at least 20 years. I am a proud cheap bastard!

        • I'm guessing you don't play games then. At least not anything from the past 5 years. My previous build lasted just over 12 years (2007-2020) and that was really pushing it. Mostly I did it by not caring about/not having time for new games.

          Unless you have some weird definition of "build"... I had the same motherboard, case, and power supply. Something in the PSU ended up dying and took the board out with it. Probably didn't dust it enough. It was caked in there pretty heavy.

          • I'm guessing you don't play games then. At least not anything from the past 5 years. My previous build lasted just over 12 years (2007-2020) and that was really pushing it.

            I was away from PC gaming for a very long time. Just recently got back into it with a cheap used PC. Then decided that I was quite enjoying it, and decided to build an all new gaming rig. Most of the games I play are pretty old, but I want a decent GPU so I am not stuck buying something better when my needs evolve.

    • Thanks to Proton, I've played steam Windows games on Linux that I never would have bothered with. There's the gaming client Lutris which is also worth checking out. https://lutris.net/about/ [lutris.net]

      • Thanks to Proton, I've played steam Windows games on Linux that I never would have bothered with. There's the gaming client Lutris which is also worth checking out. https://lutris.net/about/ [lutris.net]

        Thanks to Proton and Steam I have been able to install and play GTA Vice City, a game that I hear is almost impossible to install or play on Windows 10. I love this game. :)

  • for a 6900XT. And Christmas bonus coming in. Temptation!
  • This will be interesting. In the past few generations, AMD has presented great options, but only for low-end video cards. I'm excited about the possibility of AMD once again competing with the top end of NVIDIA. There was absolutely no serious competitor from AMD for the RTX 2070S and up, which is a major shame.

    Now if they can also launch hard and fast and take advantage of the NVIDIA supply issues, they can finally be a real contender in the GPU space once more.

    I should have bought AMD shares years ago. From t

  • What's the state of ray tracing in games now anyway? I'm assuming it hasn't reached maturity since so few games are using it. (Are there any?)

    I'm assuming it will be like pixel shaders in the mid-2000s, where you bought a card that could do some basic shading, then 2 years later the new games didn't even support Pixel Shader 1.1 or whatever your card did anymore. This time, maybe even more brutal, since it's such a fundamental change to how the graphics are done.

    How many years do we need to wait to get a "f

    • Re:Ray tracing (Score:4, Informative)

      by UnknownSoldier ( 67820 ) on Wednesday October 28, 2020 @05:04PM (#60660262)

      TL;DR: Hardware ray tracing is still hideously expensive at run-time. Maybe ~5 years from now we'll see games rendered 100% with it, probably closer to 8; currently it is only being used to augment the rendering pipeline with effects such as reflections and shadows, diffuse illumination, and ambient occlusion.

      The long answer:

      Was your question only referring to hardware ray-tracing on the GPU? Or was it ANY ray tracing in general?

      It may seem pedantic, but I'm not sure if you were aware that the UE5 demo showcasing Unreal's Nanite technology [youtube.com] on the PS5 had over 1 billion source triangles. Nanite was drawing 20 million triangles per frame with hybrid software + hardware rasterization at a dynamic 1440p [eurogamer.net] resolution.

      UE5 actually uses multiple rasterization techniques (a toy sketch of the software path follows this list):

      * Software rasterization
      * Compute shaders
      * Hardware rasterization as a fast path using primitive shaders
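
      To make the "software rasterization" bullet concrete, here's a toy CPU triangle fill using the classic edge-function test (nothing like Nanite's actual implementation, just the core idea):

        def edge(ax, ay, bx, by, px, py):
            # > 0 when point (px, py) lies to the left of edge (a -> b)
            return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

        def raster_triangle(w, h, v0, v1, v2):
            grid = [["." for _ in range(w)] for _ in range(h)]
            for y in range(h):
                for x in range(w):
                    px, py = x + 0.5, y + 0.5  # sample at pixel centers
                    if (edge(*v0, *v1, px, py) >= 0 and
                            edge(*v1, *v2, px, py) >= 0 and
                            edge(*v2, *v0, px, py) >= 0):
                        grid[y][x] = "#"
            return "\n".join("".join(row) for row in grid)

        print(raster_triangle(20, 10, (1, 1), (18, 2), (6, 9)))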

      Last year, Crytek demoed the amazing visuals of Neon Noir: Crytek's Software Ray Tracing! How Does It Work? How Well Does It Run? [youtube.com], showcasing software ray tracing. With high thread counts this is becoming feasible.

      Yes, ray-tracing hardware is still in its infancy but 2nd gen hardware is here. Both AMD RX 6000 and Nvidia RTX 3000 have native hardware ray-tracing now.

      BVH (Bounding Volume Hierarchy) acceleration has been heavily researched for decades on the CPU, but it only recently came to the GPU via techniques like GigaVoxels and Voxel Cone Tracing. GPUs finally [anandtech.com] got dedicated hardware to speed up BVH traversal just 2 years ago.
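
      For the curious, the hot loop that this BVH hardware accelerates is the ray-vs-bounding-box "slab" test, evaluated at every node while walking the tree. A toy Python version of just that test:

        def ray_hits_aabb(origin, inv_dir, box_min, box_max):
            # Slab test: the ray hits the box iff its parametric intervals
            # against the x, y, and z slabs all overlap.
            tmin, tmax = 0.0, float("inf")
            for i in range(3):
                t1 = (box_min[i] - origin[i]) * inv_dir[i]
                t2 = (box_max[i] - origin[i]) * inv_dir[i]
                tmin = max(tmin, min(t1, t2))
                tmax = min(tmax, max(t1, t2))
            return tmin <= tmax

        # Ray from the origin along +x (1/0 approximated by a huge number):
        print(ray_hits_aabb((0, 0, 0), (1.0, 1e30, 1e30),
                            (2, -1, -1), (4, 1, 1)))  # True -- the box straddles the ray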

      Ray Tracing is NOT the only solution. Sonic Ether implemented Path Tracing Global Illumination [sonicether.com] in Minecraft on non-ray-tracing GPUs! While Path Tracing can be done [madebyevan.com] on pre-ray-tracing hardware, THE biggest problems with both Ray-tracing and Path-tracing are:

      1. Noise.
      There has been a LOT of progress in "denoisers" in the last year. See Nvidia's Minecraft With RTX: Crafting a Real-Time Path-Tracer for Gaming [youtube.com] for some state of the art techniques.

      2. Performance.
      Whether it be in Minecraft RTX [youtu.be] or Quake 2 RTX [youtu.be], ray tracing is a HUGE performance hit compared to traditional polygon rasterization. While you can download Quake 2 RTX [nvidia.com], which is 100% ray-traced, it barely gets 60 FPS at 1440p and 23 FPS [youtu.be] at 4K on an RTX 2080 Ti. This is a game from 1997!

      It is rumored that Nvidia's RTX 3000 has better ray-tracing performance than AMD's RX 6000, but we'll have to wait for hardware to compare the two.

      3. Price
      The problem isn't so much maturity -- since D3D12 has a native ray-tracing API, we shouldn't see the fragmentation that we saw with Pixel Shaders -- the third problem is the cost of entry. Not everyone is going to spend $500 on a GPU for ray tracing. Which leads into your next question:

      How many years do we have to wait? If I were to estimate, I'd say AT LEAST ~5 more years for ray tracing to be used 100% in games, but probably closer to 8-10 years. When sub-$100 and $200 cards support ray tracing at decent framerates, developers will switch to it. Until then, traditional polygon rendering with either Forward+ or Deferred rendering is here to stay.

      Hope this helps!

      • Great comment!!! It has been a while since I have seen someone putting in the effort to actually provide good information. Thank you for that!
        • You're welcome! I'm doing my small part to keep /. interesting / informative. :-)

          You'll definitely want to watch this very cool making of [youtube.com] video of the Teardown [steampowered.com] game. (It is still Early Access but currently available on Steam for $20.)

          It uses ray tracing for lighting but does NOT require a ray-tracing GPU, although it does require a NVIDIA GeForce GTX 1060 or better.

  • I've been switching and flipping between nVidia and AMD (formerly ATI) video cards for two decades, and it has always been a chase of the best bang-for-the-buck price-to-performance (min-maxing) ratio, since I would jump on the AMD video card bandwagon anytime they offered affordable mid-range cards without the super high premium pricing of nVidia cards.

    The previous time around 2015 I went with an AMD ATI Radeon 6950/6970 (ROM dip-switch unlock thanks to Sapphire) and was super happy with that card for many y

  • Does AMD have anything going on for hardware-accelerated video encoding like Nvidia's NVEnc?

    Years ago I used to pretty much hang in the camp of perf vs. $$, but AMD/ATI hasn't really been competitive in the ranges I've been looking at for probably close to a decade.

    I'm really happy to see competition in the space now, but I'm pretty reliant on NVEnc for H.265 encoding, and if AMD doesn't have something similar (that gains some wide support), I don't see myself shifting back over to AMD for primary GPUs any
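
    (For reference: AMD's counterpart is its VCE/VCN encode block, exposed through AMF on Windows or VAAPI on Linux. A hedged Python sketch using ffmpeg, whose hevc_nvenc and hevc_amf encoders both exist in stock builds; actual availability depends on your ffmpeg build and drivers, and the file paths below are placeholders:)

      import subprocess

      def hw_encode_hevc(src, dst, vendor="nvidia"):
          # Select the vendor's hardware H.265 encoder by its ffmpeg name.
          encoder = {"nvidia": "hevc_nvenc", "amd": "hevc_amf"}[vendor]
          subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", encoder, dst],
                         check=True)

      # hw_encode_hevc("input.mp4", "output.mp4", vendor="amd")  # placeholder paths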
