AMD

AMD Announces Radeon VII, Its Next-Generation $699 Graphics Card (theverge.com) 145

An anonymous reader shares a report: AMD has been lagging behind Nvidia for years in the high-end gaming graphics card race, to the point that it's primarily been pushing bang-for-the-buck cards like the RX 580 instead. But at CES, the company says it has a GPU that's competitive with Nvidia's RTX 2080. It's called the Radeon VII ("Seven"), and it uses the company's first 7nm graphics chip, which had been teased previously. It'll ship on February 7th for $699, according to the company. That's the same price as a standard Nvidia RTX 2080. [...] AMD says the second-gen Vega architecture offers 25 percent more performance at the same power as previous Vega graphics, and the company showed it running Devil May Cry 5 here at 4K resolution, ultra settings, and frame rates "way above 60 fps." AMD says it has a terabyte per second of memory bandwidth.
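
As a rough sanity check on that terabyte-per-second figure, here is a minimal back-of-the-envelope sketch in Python. It assumes the card's widely reported 4,096-bit HBM2 interface running at an effective 2.0 Gbps per pin; neither number appears in the summary above, so treat them as illustrative assumptions rather than quoted specs.

# Back-of-the-envelope check of the "terabyte per second" bandwidth claim.
# Assumes a 4096-bit HBM2 interface at an effective 2.0 Gbps per pin
# (widely reported figures for Radeon VII, not taken from the summary above).
bus_width_bits = 4096        # 4 HBM2 stacks x 1024 bits each
pin_rate_gbps = 2.0          # assumed effective data rate per pin

bandwidth_gbytes_per_s = bus_width_bits * pin_rate_gbps / 8  # bits -> bytes
print(f"{bandwidth_gbytes_per_s:.0f} GB/s")  # -> 1024 GB/s, i.e. about 1 TB/s
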
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Good news! (Score:5, Insightful)

    by DaMattster ( 977781 ) on Wednesday January 09, 2019 @03:37PM (#57932694)
    Yes, AMD is lagging behind, but I will still go with AMD graphics over NVIDIA because NVIDIA has an anti-open-source stance. It's good news that AMD's graphics chipsets are getting better.
    • Comment removed based on user account deletion
      • Re:Good news! (Score:4, Insightful)

        by vyvepe ( 809573 ) on Wednesday January 09, 2019 @03:53PM (#57932856)
        NVidia is not an option if you need longer-term Linux support.
        • by Anonymous Coward

          The parts of the Nvidia driver that are loaded into the kernel and need to be compiled against it are open source. The fact that the driver then loads a binary blob does not alter the long-term support and open-source nature of the part that needs to be compiled against the kernel.

          In addition, there are both short-term and long-term drivers:
          https://www.nvidia.com/object/unix.html

          • The part that needs to be compiled against the kernel isn't the part Steam checks for. Fuck your astroturfing.

        • NVidia is not an option if you need longer-term Linux support.

          If you buy a modern and mainstream Nvidia card, you can be fairly sure that it will be supported on Linux from a time near its release (maybe on time, maybe not) to a time some years later. However, some features supported on other platforms will not be supported, and the open source driver will not support all of the functionality and/or performance of the proprietary driver for a long period, if ever. It used to be the obvious choice, but now it's an obviously flawed one. If the AMD platform OSS drivers a

      • Re:Good news! (Score:5, Insightful)

        by WaffleMonster ( 969671 ) on Wednesday January 09, 2019 @04:25PM (#57933074)

        Come on, don't give them a pass. It's not a very good value proposition, is it? It's for the fanboys only. You can buy a 2080 for $699 and get RT and Tensor cores (ray tracing, DLSS, etc.).

        I watched the Nvidia CES presentation, and the whole RT/Tensor thing felt like one giant scam.

        DLSS, as near as I can tell, is basically just an upscaler using a substantially similar "AI" database approach to Sony's X-Reality ASIC. This technology has been around for years. While it's nice, it sure as heck doesn't produce magical outcomes that are anywhere near rendering at native resolution.

        Then there was the gratuitous use of TAA throughout the demos as a reference, which would be hilarious if they were not serious. TAA is only state of the art in blurry mess technology... using that as a basis for comparison, especially given the effective resolution of the window as it was viewable in the CES demo, was basically a scam.

        Personally, if the 2080 can't deliver high-frame-rate ray tracing at 4K, what does it matter? Modern shader hacks for dynamic lighting are quite realistic, so is it really worth cranking the resolution down so much just for slightly more realistic lighting? Would that really produce a better overall image? Personally, I'm more impressed by 1 TB/s of memory bandwidth than by ray tracing at this point.

        No doubt RT will win out in the future, but right now, making a buying decision based on it... I personally don't see the value.

        • TAA is only state of the art in blurry mess technology

          Yup! It fucking appalls me that this is seen as a good standard to measure up against and, worse, that people prefer it!

        • that made the programming easier. Right now AAA games cost a fortune and they're kind of simplistic. Compare any modern game to Deus Ex. The stupid complexity of modern graphics is a big part of that. Having to hand-code shaders for every little look and effect gets really pricey really fast....
          • Far Cry 5 dropped from $60 to $25 on steam. You just have to wait.

          • that made the programming easier. Right now AAA games cost a fortune and they're kind of simplistic. Compare any modern game to Deus Ex. The stupid complexity of modern graphics is a big part of that. Having to hand-code shaders for every little look and effect gets really pricey really fast....

            These days, it's more likely that they're not spending as much time on game design / gameplay to rival Deus Ex, because they've spent that time on microtransaction systems instead. For example, why should Bungle go through all of that effort to make a good game when they can just put bullet-sponge enemies into Destiny 2 that force you to go grinding for better loot to kill them? Max out your character with the loot and they just need to raise the enemy health count so you can repeat the grind for new loot to kill stronger enemies.

            • Max out your character with the loot and they just need to raise the enemy health count so you can repeat the grind for new loot to kill stronger enemies.

              Repeat forever, or until the community gets bored.

              Slightly OT, but you can kill a game that way. Skyforge for instance. It is nicely made and even gets new content from time to time, but every two months there is a new invasion where the level cap is increased by 10, while the mobs gain proportionally in health. Then it is grinding time again just to keep your effective power level.

              By now Skyforge is down in the Steam charts to around 170 average players and 300 peak players. I wonder if MyCom still make any profit from this. I stopped playing myself last

        • by Kjella ( 173770 )

          Personally, if the 2080 can't deliver high-frame-rate ray tracing at 4K, what does it matter? Modern shader hacks for dynamic lighting are quite realistic, so is it really worth cranking the resolution down so much just for slightly more realistic lighting? Would that really produce a better overall image?

          Well... it's 50% shader hacks and 50% avoiding the situations where the flaws are obvious. There's a reason most games avoid shiny reflective surfaces, mirrors, and translucent materials, and that you don't get proper shadows from dynamic elements like leaves blowing in the wind, or the right reflections from muzzle fire or an explosion. But that also means that until it's a commodity, you'll continue to avoid the situations where ray tracing makes the most sense. And with the current performance drop I'd probably

          • Watched a bunch of RTX demos. I like it and appreciate the big step up from screen space hacks, but I suspect I'm in the minority. Scenes have to be pretty contrived before you really notice, like explosions reflected in shiny car paint in Battlefield V. Like, who polished the wrecked car to a mirror finish in the middle of a war? I appreciate the more subtle global lighting in Metro Exodus much more, but again I'm in the minority. Most gamers won't know or care that it lights up the dark corners of a room

            • by Kjella ( 173770 )

              The standard hack has always been to put an arbitrary ambient light in the scene, most viewers won't notice the difference.

              I think it's absolutely noticeable... but so are a lot of other obvious clues that you're running around in a game world; you wouldn't exactly confuse it with a live video stream. Heck, they're still struggling with that in all-CGI movies, though I must admit they're getting pretty good at it.

        • Comment removed based on user account deletion
      • AMD has traditionally excelled at compute. I don't expect RT acceleration on 7nm Vega, but INT8 performance of 58.9 TOPS on the MI60 could be competitive with Tensor cores. Especially with a PCIe 4.0 option available combined with the HBM2 (at least on the data-center side of things). Gaming performance (Radeon cards) probably isn't going to be outstanding, but it should still be pretty good. I don't think the lack of accelerated RT is going to hurt them, as NVidia can't make it perform adequately even wit

        • Comment removed based on user account deletion
          • They improved performance by being a lot more selective about where and how it was actually used. If you have to hand-tune the engine and game for it, then it doesn't say a lot of good things about the current state of the tech. Yes, a few developers always experiment, but at this point most of them think the effort is better spent elsewhere. 3-5 years down the line is maybe a different story, but only if the hardware to do it is a lot more common.

    • Lets compromise on second tear technology, because the other superior product doesn't oblige with your favorite license agreement.

      NVidia isn't anti-open source; if that were the case, they wouldn't be providing Linux drivers at all. They are just not pro-open source. They are not trying to put a stop to it; they just do not want to participate.

      • Lets compromise on second tier technology, because the other superior product doesn't oblige with your favorite license agreement.

        I agree completely, though I don't care about a specific license agreement, just that it be OSI compliant.

      • Comment removed (Score:4, Insightful)

        by account_deleted ( 4530225 ) on Wednesday January 09, 2019 @04:48PM (#57933242)
        Comment removed based on user account deletion
      • by epine ( 68316 )

        Lets compromise on second tear technology, because the other superior product doesn't oblige with your favorite license agreement.

        Let's compromise on second-tier literacy while we're at it.

        Here's a better metaphor: both Nvidia and AMD are fancy hotels, but for the last five or ten years, Nvidia has had a posh penthouse bridal suite, and AMD hasn't. For a good while, concerning posh penthouse bridal suites, there was only one game in town.

        Frasier: Why would I stay across the street in a shitty hotel that doesn'

    • Yes, AMD is lagging behind

      So why is it that bitcoin miners universally voted Vega the most profitable mining GPU? Maybe because they have actual money riding on the results, as opposed to GPU review sites, which reportedly get considerable pressure from Nvidia to pick and choose benchmarks and engage in even slimier manipulation?

      AMD lagging is an Nvidia-created myth. AMD not owning the high end, that's true. But AMD not delivering the best performance/value equation, that's Nvidia's FUD.

      • So why is it that bitcoin miners universally voted Vega the most profitable mining GPU?

        Because it was the cheapest way to get HBM2 in your system with good power consumption. Miners don't necessarily have the same needs as gamers.

        My next GPU will probably come from AMD, anyway, since by that time I probably won't run Windows on the bare metal any more.

  • But I'd say they really have to deliver on that promise of being competitive... does that include raytracing?

    Now I have no reference on the performance DMC demands but "way above 60 fps" doesn't sound THAT impressive.

    Also if shadow.tech keeps its promises, I'm not sure I'm gonna build a gaming rig anytime soon anyway.

  • This is rather disappointing, really. It's just a Vega refresh that offers ~30% improved performance in most workloads or frame rates, but at 40% additional cost compared to the Vega 64. I suppose it's nice if you need more than 8GB of memory, but this isn't anything to get excited about as far as I'm concerned.

    At least the sneak peek at the new Ryzen CPUs looked promising.
    • Vega was a bit cheaper because of stiff competition from Nvidia, but Nvidia isn't all that competitive right now except in power utilization.

      That's what's got me interested. There are reviews of the RX 590 where folks found it was throttling on a 500-watt power supply and they had to put a 600-watt one in to fix it. As an adult I pay for all that power, and it does add up. So for me the question is: are they competitive with Nvidia on power consumption now?

      Oh, and DMC at 4k/60fps? It's a beat'em'up/spectacle br
      • I'm all for efficiency, but if money is your concern, let's look at the numbers... 100W difference * 365 days/year * what, an average of 5 hours per day of use? = ~183kWh/year difference. Times the U.S. average of $0.12/kWh = $22/year.

        Yes, it does add up - but it's going to have to add up for a long time before it's more than a minor factor in the total cost of ownership. And if you're buying a cutting-edge video card today, you're probably going to buy a replacement long before the difference in power cos
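
For what it's worth, the arithmetic in the comment above is easy to rerun with your own numbers. Here is a minimal Python sketch using the same assumed figures (a 100W gap, 5 hours of use per day, $0.12/kWh); none of these are measured values.

# Rough annual cost of a 100 W difference in power draw,
# using the comment's assumptions (5 h/day of use, $0.12/kWh).
watts_difference = 100
hours_per_day = 5
days_per_year = 365
price_per_kwh = 0.12  # assumed U.S. average, per the comment

kwh_per_year = watts_difference * hours_per_day * days_per_year / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, about ${cost_per_year:.0f}/year")
# -> 182.5 kWh/year, about $22/year
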

        • is so you don't have to upgrade. There are videos on YouTube of folks benchmarking 7-year-old flagships and still hitting 60fps. So spend $700 now, which works out to roughly $100/yr for a card, or spend $300 every 2-3 years and hit somewhere around $800-$900. Plus the flagships tend to hold their value better, so you'll probably get $200 for it in 7 years when you sell it.

          Also, if you replace it with something just as power-hungry, that kinda defeats the point...

          Thing is, if I keep a card 4 years (which I usually do) and save $1
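
A minimal sketch of the total-cost comparison being made here, using only the comment's own ballpark figures (a $700 flagship kept about 7 years with ~$200 resale value, versus a ~$300 card every 2-3 years); these are the commenter's assumptions, not measured data.

# Ballpark 7-year comparison from the comment's own figures (not measured data).
years = 7

flagship_price = 700
flagship_per_year = flagship_price / years     # ~$100/yr, as the comment says
flagship_net = flagship_price - 200            # minus ~$200 resale after 7 years

midrange_price = 300
upgrade_interval_years = 2.5                   # "every 2-3 years"
midrange_total = midrange_price * (years / upgrade_interval_years)  # ~$840

print(f"flagship: ~${flagship_per_year:.0f}/yr, ${flagship_net} net after resale")
print(f"mid-range cycle: ~${midrange_total:.0f} over {years} years")
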
          • Plus the flagships tend to hold their value better

            Only recently has this even been partly true. The cryptocurrency bubble created insane demand which artificially inflated video card prices. Case in point: I bought an RX580 almost two years ago for about $250. Less than a year ago that same card was going for almost $350 on Amazon. Only recently has it finally fallen back to what I originally paid for it.

            What *used* to happen was cards were rapidly made obsolete by advances in video card tech. A $1000 card would be worth half that in a year and be wo

      • Slightly OT:
        If power efficiency is important to you, then a Vega 56 might be a better choice than the RX 590. Slightly lower TBP and slightly better performance at the same time. It is more expensive, though.

  • Realistically it will be 5 to 7 percent if history is any indicator.
  • I mean, if it is supposedly "competitive" with the Nvidia RTX 2080, which means.. almost as fast and the same price as the Nvidia RTX 2080, why would you not buy the Nvidia RTX 2080?

    • Because maybe you want open-source drivers? Or you want to be able to install it freely in a DC without stupid NVIDIA restrictions, etc. Nvidia is producing good stuff, but as a company they really are behaving like MS in the bad years.
    • I mean, if it is supposedly "competitive" with the Nvidia RTX 2080, which means.. almost as fast and the same price as the Nvidia RTX 2080, why would you not buy the Nvidia RTX 2080?

      Easy: because Nvidia. I'll take a shitty onboard Intel over the best Nvidia card. There are only 3 companies that I will NEVER give one penny of my money to: Sony, Apple, and Nvidia. All 3 could and should die, and the world would be a better place for it.

      • Why? What is so bad about them relative to, say, Intel?

        It would be hard to argue their cards are crap, considering it's not possible to buy something faster.

      • Isn't it strange when you explain that to somebody in words of one syllable and they still don't get it?

    • by AHuxley ( 892839 )
      More memory can be of use.
      For games, get the RTX 2080.
      • For games, get the RX 580 if you want best value, or Vega VII if you want prosumer and regard NVidia as too disgusting to give your money to.

      • Also get the Vega VII if you want your rig to run cool and quiet.

    • by Kuruk ( 631552 )
      For professional work, this will kick Nvidia in the pants. 16GB of HBM2. Just look at the OpenCL graph.
    • Because:
      16GB vs 8GB
      3 games
      not nVidia

  • by Joe_Dragon ( 2206452 ) on Wednesday January 09, 2019 @04:28PM (#57933098)

    apple mac pro price $999

  • I'm all for dedicated graphics chips as long as they cost less than $100.

    • by Luckyo ( 1726890 )

      That's probably about what mid-range costs. Just the dedicated chip, of course. The board is extra.

  • ....I guess I'm still waiting for the "glut" of Nvidia top-end cards to hit the market; somehow I can't comprehend how Nvidia is sitting on thousands and thousands of cards in inventory and that hasn't impacted their prices.

    • by Luthair ( 847766 )
      If they flood the market with old stock at a discount that doesn't perform much worse than the new cards, who is going to buy the new, expensive ones?
      • Right. NVidia might end up stuck with a bunch of 1080 overstock that is only good for scrap

        • They should let at least a portion of those cards out just to keep stringing people along. Odds are they can't produce the new cards rapidly enough to meet demand even at their ridiculous prices.
