AMD Announces Radeon VII, Its Next-Generation $699 Graphics Card (theverge.com) 145
An anonymous reader shares a report: AMD has been lagging behind Nvidia for years in the high-end gaming graphics card race, to the point that it's primarily been pushing bang-for-the-buck cards like the RX 580 instead. But at CES, the company says it has a GPU that's competitive with Nvidia's RTX 2080. It's called the Radeon VII ("Seven"), and it uses the company's first 7nm graphics chip, which had been teased previously. It'll ship on February 7th for $699, according to the company. That's the same price as a standard Nvidia RTX 2080. [...] AMD says the second-gen Vega architecture offers 25 percent more performance at the same power as previous Vega graphics, and the company showed it running Devil May Cry 5 here at 4K resolution, ultra settings, and frame rates "way above 60 fps." AMD says the card has a terabyte per second of memory bandwidth.
Good news! (Score:5, Insightful)
Re: (Score:1)
Re:Good news! (Score:4, Insightful)
Re: (Score:1)
The parts of the Nvidia driver that are loaded in the kernel and need to be compiled against the kernel are open source. The fact that the driver then loads a binary blob does not alter the long term support and open source nature of the part that needs to be compiled against the kernel.
In addition there are both short term and long term drivers:
https://www.nvidia.com/object/unix.html
Re: (Score:1)
With the open source "shim", support might even last longer than for Windows, because you can keep the old blob and only adapt the interface kernel-side. But obviously, you will be stuck with the functionality of the last blob that Nvidia released, including whatever bugs it still had at that point.
Besides, with about 30% more "raw" computing power and twice the memory, the Vega VII might actually beat the RTX 2080 long-term. Short term I agree that the RTX 2080 will probably bring slightly higher FPS, but A
Re: (Score:2)
No, it doesn't self-destruct, but it does bitrot. You'll never get Wayland running on the deprecated cards,
Re: (Score:1)
The part that needs to be compiled against the kernel isn't the part Steam checks for. Fuck your astroturfing.
Re: (Score:2)
NVidia is not an option if you need longer-term Linux support.
If you buy a modern and mainstream nvidia card, you can be fairly sure that it will be supported on linux from a time near its release (maybe on time, maybe not) to a time some years later. However, some features supported on other platforms will not be supported, and the open source driver will not support all of the functionality and/or performance of the proprietary driver for a long period, if ever. It used to be the obvious choice, but now it's an obviously flawed one. If the AMD platform OSS drivers a
Re:Good news! (Score:5, Insightful)
Come on, don't give them a pass. It's not a very good value proposition, is it? It's for the fanboys only. You can buy a 2080 for $699 and you get RT and Tensor cores (ray tracing, DLSS, etc.).
I watched the Nvidia CES presentation, and the whole RT/Tensor thing felt like one giant scam.
DLSS, as near as I can tell, is basically just an upscaler using a substantially similar "AI" database approach to Sony's X-Reality ASIC. This technology has been around for years. While it's nice, it sure as heck doesn't produce magical outcomes that are anywhere near rendering at native resolution.
Then there was gratuitous use of TAA throughout the demos as a reference, which would be hilarious if they were not serious. TAA is only state of the art in blurry mess technology... using that as a basis for comparisons, especially given the effective resolution of the window as it was viewable in the CES demo, was basically a scam.
Personally, if the 2080 can't deliver high-frame-rate ray tracing at 4K, what does it matter? Modern shader hacks for dynamic lighting are quite realistic... so is it really worth cranking the resolution down so much just for slightly more realistic lighting? Would that really produce a better overall quality image? Personally, I'm more impressed by 1TB/s of memory bandwidth than I am with ray tracing at this point.
No doubt RT will win out in the future, but right now, making a buying decision based on it... I personally don't see the value.
Re: (Score:2)
TAA is only state of the art in blurry mess technology
Yup! It fucking appalls me that this is seen as a good standard to measure up against and, worse, that people prefer it!
Re: (Score:3)
I play warframe at 120fps lock. TAA is fucking awful. It's like vaseline smeared all over your screen. I suspect the only people who like it are the people who really like motion blur. It produces a sorta kinda similar look.
What I'd like to see is tech (Score:2)
Re: (Score:2)
Far Cry 5 dropped from $60 to $25 on steam. You just have to wait.
Re: (Score:2)
Lol. Get woke, go broke.
Less broke, by $35.
Re: (Score:2)
that made the programming easier. Right now AAA games cost a fortune and they're kind of simplistic. Compare any modern game to Deus Ex. The stupid complexity of modern graphics is a big part of that. Having to hand-code shaders for every little look and effect gets really pricey really fast....
These days, it's more likely that they're not spending as much time on game design / gameplay to rival Deus Ex, because they've spent that time on microtransaction systems instead. For example, why should Bungle go through all of that effort to make a good game when they can just put bullet sponge enemies into Destiny 2 that force you to go grinding for better loot to kill them. Max out your character with the loot and they just need to raise the enemy health count so you can repeat the grind for new loot to kill stronger enemies.
Re: (Score:1)
Max out your character with the loot and they just need to raise the enemy health count so you can repeat the grind for new loot to kill stronger enemies.
Repeat forever, or until the community gets bored.
Slightly OT, but you can kill a game that way. Skyforge for instance. It is nicely made and even gets new content from time to time, but every two months there is a new invasion where the level cap is increased by 10, while the mobs gain proportionally in health. Then it is grinding time again just to keep your effective power level.
By now Skyforge is down in the Steam charts to around 170 average players and 300 peak players. I wonder if MyCom still makes any profit from this. I stopped playing myself last
Re: (Score:2)
Personally, if the 2080 can't deliver high-frame-rate ray tracing at 4K, what does it matter? Modern shader hacks for dynamic lighting are quite realistic... so is it really worth cranking the resolution down so much just for slightly more realistic lighting? Would that really produce a better overall quality image?
Well... it's 50% shader hacks and 50% avoiding the situations where the flaws are obvious. There's a reason most games avoid shiny reflective surfaces, mirrors, and translucent materials, and a reason you don't get proper shadows from dynamic elements like leaves blowing in the wind, or the right reflections from a muzzle flash or explosion. But that also means that until it's a commodity, you'll continue to avoid the situations where ray tracing makes the most sense. And with the current performance drop I'd probably
Re: (Score:2)
Watched a bunch of RTX demos. I like it and appreciate the big step up from screen space hacks, but I suspect I'm in the minority. Scenes have to be pretty contrived before you really notice, like explosions reflected in shiny car paint in Battlefield V. Like, who polished the wrecked car to a mirror finish in the middle of a war? I appreciate the more subtle global lighting in Metro Exodus much more, but again I'm in the minority. Most gamers won't know or care that it lights up the dark corners of a room
Re: (Score:2)
The standard hack has always been to put an arbitrary ambient light in the scene, most viewers won't notice the difference.
I think it's absolutely noticeable... but so are a lot of other obvious clues that you're running around in a game world; you wouldn't exactly confuse it with a live video stream. Heck, they're still struggling with that in all-CGI movies, though I must admit they're getting pretty good at it.
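For anyone who hasn't poked at a renderer, the "arbitrary ambient light" hack is literally one extra term in the classic local shading equation. Here's a minimal Python sketch (the constants and vectors are made up for illustration, not from any real engine):

    # Classic local (non-ray-traced) shading: a constant ambient term
    # papers over the missing bounced/global illumination.
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def shade(normal, light_dir, ambient=0.15, diffuse=0.85):
        # ambient: light "from nowhere", standing in for indirect light
        # diffuse: Lambertian term from the one explicit light source
        return ambient + diffuse * max(0.0, dot(normal, light_dir))

    # A surface facing away from the light still gets the ambient floor,
    # which is why dark corners never go fully black with this hack.
    print(shade(normal=(0.0, 0.0, 1.0), light_dir=(0.0, 0.0, -1.0)))  # 0.15

Ray-traced global illumination replaces that constant with actual light bounced around the scene, which is exactly the subtle difference being discussed here.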
Re: (Score:1)
Re: (Score:3)
AMD has traditionally excelled at compute. I don't expect RT acceleration on Vega 7nm, but the 58.9 TOPS of INT8 performance on the MI60 could be competitive with tensor cores, especially with a PCI-e 4.0 option available combined with the HBM2 (at least on the data-center side of things). Gaming performance (Radeon cards) probably isn't going to be outstanding, but it should still be pretty good. I don't think the lack of accelerated RT is going to hurt them, as NVidia can't make it perform adequately even wit
Re: (Score:1)
Re: (Score:2)
They improved performance by being a lot more selective about where and how it was actually used. If you have to hand-tune the engine and game for it, that doesn't say a lot of good things about the current state of the tech. Yes, a few developers always experiment, but at this point most of them think the effort is better spent elsewhere. 3-5 years down the line is maybe a different story, but only if the hardware to do it is a lot more common.
Re: (Score:1)
Re: (Score:1)
Lets compromise on second tear technology, because the other superior product doesn't oblige with your favorite license agreement.
NVidia isn't anti-open source; if that were the case, they wouldn't provide Linux drivers at all. They are just not pro-open source. They are not trying to put a stop to it, they just do not want to participate.
Re: (Score:2)
Lets compromise on second tier technology, because the other superior product doesn't oblige with your favorite license agreement.
I agree completely, though I don't care about a specific license agreement, just that it be OSI compliant.
Comment removed (Score:4, Insightful)
Re: (Score:3)
Let's compromise on second-tier literacy while we're at it.
Here's a better metaphor: both Nvidia and AMD are fancy hotels, but for the last five or ten years, Nvidia has a posh penthouse bridal suite, and AMD doesn't. For a good while, concerning posh penthouse bridal suites, there was only one game in town.
Frasier: Why would I stay across the street in a shitty hotel that doesn'
Re: (Score:2)
Yes, AMD is lagging behind
So why is it that bitcoin miners universally voted Vega the most profitable mining GPU? Maybe because they have actual money riding on the results, as opposed to GPU review sites, which reportedly get considerable pressure from Nvidia to pick and choose benchmarks and engage in even slimier manipulation?
AMD lagging is an Nvidia-created myth. AMD not owning the high-end space, that's true. But AMD not delivering the best performance/value equation, that's Nvidia's FUD.
Re: (Score:2)
So why is it that bitcoin miners universally voted Vega the most profitable mining GPU?
Because it was the cheapest way to get GDDR5 in your system with good power consumption. Miners don't necessarily have the same needs as gamers.
My next GPU will probably come from AMD, anyway, since by that time I probably won't run Windows on the bare metal any more.
er, not gddr5 (Score:2)
But anyway, the point wasn't about gaming performance, but about bandwidth.
Re: (Score:2)
AMD is not lagging in performance/value, I thought I made that clear. Whether steam gamers realize that or not is a different question. Obviously, 15% of them do.
Re: (Score:2)
You must be fun at parties
Re: (Score:1)
The game shown in the keynote was a new, unreleased version of Devil May Cry (DMC 5). That makes it hard to make comparisons. I could not find any other source that shows off DMC5 performance. I guess we will know more in a month, when the Vega VII is released and testers can compare it to the RTX 2080 in existing games.
Okay, I may be pessimistic (Score:1)
But I'd say they really have to deliver on that promise of being competitive... does that include raytracing?
Now, I have no reference for the performance DMC demands, but "way above 60 fps" doesn't sound THAT impressive.
Also if shadow.tech keeps its promises, I'm not sure I'm gonna build a gaming rig anytime soon anyway.
Re: Okay, I may be pessimistic (Score:2)
DMC5 isn't even out yet. So the fact that it runs on ultra at above 60 fps, in 4K at that, is a good thing.
Ray tracing isn't really ready for prime time (Score:2)
Re: (Score:2)
And the 2080 is basically the same as the 1080 Ti.
And that's if you believe AMD's benchmarks. Sure, it'll win in some Vulkan games if you compare stock clocks to stock clocks, but it'll lose or barely keep up in everything else. And if you ever overclock, the Nvidia option will pull miles ahead while still using less power. This is with AMD having a node shrink advantage! Vega is trash.
Re: (Score:3)
It's almost like you haven't seen what a new 1080 Ti or RTX 2080 costs right now
Re: (Score:3)
I see the 1080 Ti going from almost $800 to almost $1,680 (Newegg). I can find some Nvidia RTX 2080s for as cheap as $699.99, up to $1,699.00.
I fail to see what your issue is.
Disappointing (Score:2)
At least the sneak peek at the new Ryzen CPUs looked promising.
New stuff's always expensive at launch (Score:2)
That's what's got me interested. There are reviews of the 590 where folks found it was throttling on a 500-watt power supply and they had to put in a 600-watt one to fix it. As an adult I pay for all that power, and it does add up. So for me the question is: are they competitive with Nvidia on power consumption now?
Oh, and DMC at 4k/60fps? It's a beat'em'up/spectacle br
Re: (Score:2)
I'm all for efficiency, but if money is your concern, let's look at the numbers... 100W difference * 365 days/year * what, an average of 5 hours per day of use? = ~183 kWh/year difference. Times the U.S. average of $0.12/kWh = ~$22/year.
Yes, it does add up - but it's going to have to add up for a long time before it's more than a minor factor in the total cost of ownership. And if you're buying a cutting-edge video card today, you're probably going to buy a replacement long before the difference in power cos
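If you want to rerun the grandparent's arithmetic yourself, here's a throwaway Python sketch (the wattage delta, hours of use, and electricity price are the post's assumptions, not measurements):

    # Yearly cost of a 100W difference in GPU power draw.
    extra_watts = 100        # assumed draw difference between two cards
    hours_per_day = 5        # assumed average gaming time
    usd_per_kwh = 0.12       # rough U.S. average electricity price

    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    print(f"{kwh_per_year:.1f} kWh/year -> ${kwh_per_year * usd_per_kwh:.2f}/year")
    # 182.5 kWh/year -> $21.90/year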
One reason to buy a high end card (Score:2)
Also if you replace it with something just as power hungry that kinda defeats the point...
Thing is, if I keep a card 4 years (which I usually do) and save $1
Re: (Score:1)
Plus the flagships tend to hold their value better
Only recently has this even been partly true. The cryptocurrency bubble created insane demand which artificially inflated video card prices. Case in point: I bought an RX580 almost two years ago for about $250. Less than a year ago that same card was going for almost $350 on Amazon. Only recently has it finally fallen back to what I originally paid for it.
What *used* to happen was cards were rapidly made obsolete by advances in video card tech. A $1000 card would be worth half that in a year and be wo
Re: (Score:1)
Slightly OT:
If power efficiency is important to you, then a Vega 56 might be a better choice than the RX590. Slightly lower TBP and slightly better performance at the same time. It is more expensive though.
Re: (Score:3)
Yeah, real time ray tracing isn't really a thing yet. They haven't really built video cards powerful enough to enable it without a huge performance hit.
Maybe the next gen cards will support it in 2020... but it's not gonna happen now.
Re: (Score:2)
Yeah, there are a few games out there that support it. Turning it on often cuts the game performance by half, though. It's basically the same impact as jumping from 1080p to 4K resolution at the moment.
Re: (Score:2)
Re: (Score:2)
The graphics card is NOT the device that controls when the new frame starts. Not if you want anything to work, that is.
If you use a format or timing the display doesn't support, you'll get nothing or you'll get a blinking (and possibly scrolling), unusable mess.
If you just vblank/vsync/rsync/whatever willy nilly on an LCD, it will laugh at you.
Variable refresh rates have been in the standards for ages, sure. But they were not implemented in any way that someone could drive a continuous, uninterrupted video
Re: (Score:2)
Except that's not how it works. Screen updates are not instantaneous, and vsync originates with the monitor, not the video card - you can think of it as the monitor saying "okay, I'm drawing the next frame, start feeding me the data NOW", and the video card is expected to comply and start sending whatever is in the frame buffer, because the monitor is already busy displaying it.
What Gsync and Freesync essentially do is let the computer say back "Wait, not yet... okay now", so that if your rendering engine i
Re: (Score:3)
There's a bit of a terminology snafu here:
v(ertical)sync's original meaning is the once-per-frame signal between monitor and video card that lets them stay coordinated and display a stable image (in combination with h(orizontal)sync, which happens once per line)
Then there's the more modern usage where you "turn on vsync" in a game. But what that's actually turning on is "vsynced rendering", or perhaps better expressed "sync the framebuffer update to the monitor vsync signal"
And yes,"turning on vsync" does
Re: (Score:2)
Okay, so I'm specifically talking about rendering *without* tearing - which means that you can never alter the front framebuffer while it's active, only during the brief pause between when the monitor finishes displaying one frame and begins displaying the next. That includes no swapping buffers. Do so, and the screen will tear, displaying the top part of the old frame buffer, and the bottom part of the new one.
No buffering means that the only way to avoid tearing is to render the entire frame during that
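To make that concrete, here's a toy Python model of strict double-buffered vsync on a 60 Hz panel (all numbers hypothetical): a swap can only happen on a vblank boundary, so a frame that takes even slightly longer than one refresh interval waits for the next one, and the effective frame rate snaps from 60 down to 30, then 20, and so on:

    import math

    REFRESH_MS = 1000 / 60   # one refresh interval on a 60 Hz display

    def effective_fps(render_ms):
        # Number of whole vblank intervals the frame occupies.
        intervals = max(1, math.ceil(render_ms / REFRESH_MS))
        return 1000 / (intervals * REFRESH_MS)

    for render_ms in (10, 16, 17, 25, 34):
        print(f"render {render_ms} ms -> {effective_fps(render_ms):.0f} fps")
    # render 10 ms -> 60 fps
    # render 16 ms -> 60 fps
    # render 17 ms -> 30 fps   <- barely missed the deadline
    # render 25 ms -> 30 fps
    # render 34 ms -> 20 fps

That quantization is exactly the problem Gsync/Freesync-style variable refresh is meant to remove.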
Re: (Score:2)
Well, time wasted on my reply then I guess.
And yes, triple buffering is no longer super memory intensive, though it used to be. 24MB for three 32-bit 1080p frames, 96MB for 3x4k... I've gone through a LOT of video cards that had far less RAM than that.
Triple buffering does add a variable amount of lag, though - which can futz with your reflexes. Just look at the time between starting to render a frame (last input actions processed), and the frame being displayed (visual feedback received)
Double buff
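The parent's memory numbers check out, for what it's worth. A quick sketch:

    # Memory for N framebuffers of 32-bit (4 bytes/pixel) frames.
    def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
        return width * height * bytes_per_pixel * buffers / 2**20

    print(f"1080p x3: {framebuffer_mb(1920, 1080):.0f} MB")  # ~24 MB
    print(f"4K x3:    {framebuffer_mb(3840, 2160):.0f} MB")  # ~95 MB

Trivial next to the 8-16GB on current cards, but a real chunk of the 32-128MB cards of yesteryear.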
25 percent gains? (Score:1)
Re: (Score:1)
Even the 580 was more than 7% faster than the 480, and here you have a major clock bump.
Why buy one? (Score:2)
I mean, if it is supposedly "competitive" with the Nvidia RTX 2080, which means.. almost as fast and the same price as the Nvidia RTX 2080, why would you not buy the Nvidia RTX 2080?
Re: (Score:2)
Re: (Score:2)
I mean, if it is supposedly "competitive" with the Nvidia RTX 2080, which means.. almost as fast and the same price as the Nvidia RTX 2080, why would you not buy the Nvidia RTX 2080?
Easy: because Nvidia. I'll take a shit onboard Intel over the best Nvidia card. There are only 3 companies that I will NEVER give one penny of my money to: Sony, Apple, and Nvidia. All 3 could and should die, and the world would be a better place for it.
Re: (Score:2)
Why? What is so bad about them relative to say, Intel?
It would be hard to argue their cards are crap, considering it's not possible to buy something faster.
Re: (Score:2)
Isn't it strange when you explain that to somebody in words of one syllable and they still don't get it.
Re: (Score:2)
For games get the RTX 2080.
Re: (Score:2)
For games, get the RX 580 if you want best value, or Vega VII if you want prosumer and regard NVidia as too disgusting to give your money to.
Re: (Score:2)
Also get the Vega VII if you want your rig to run cool and quiet.
Re: (Score:1)
Re: (Score:2)
1 TB/s of memory bandwidth is legendary
16GB, 3 games, not nVidia (Score:1)
Because:
16GB vs 8GB
3 games
not nVidia
Re: (Score:2)
AMD still doesn't release drivers as frequently as nvidia
That should be a positive. Frequent driver updates are indicative of shitty quality and a broken system where hardware manufacturers bend over backwards to add in game-specific hacks to deal with shitty code put out by developers / engines. Yes, occasionally the games are running into flaws in the driver (or hardware) and the updated drivers fix / work around that. For those instances, see the prior point regarding shitty quality.
Re: (Score:2)
That's straight up marketing BS. Reality is, unfixed bugs stay across multiple driver releases with remarkable consistency. In some cases, they persist for many months, even when they're so well documented that nvidia has to literally copy/paste it in documentation errata in each release for months, like the firefox cursor corruption issue.
Re: (Score:2)
That's straight up marketing BS. Reality is, unfixed bugs stay across multiple driver releases with remarkable consistency.
I think you're both right, or at least, in my experience both things are true. nvidia does have more releases that fix my problems, and also in the past I have needed DnA (patched) drivers to get my ATI/AMD GPUs to work correctly.
Re: (Score:2)
I completely agree that in the past, some games required patches from AMD to run properly. The same is true in reverse, where they needed patches from nvidia to run properly. This is mainly linked to each manufacturer's "best played on our cards" campaigns, which tend to block the other GPU manufacturer's access to the title to make relevant driver adjustments ahead of time.
As AMD's program is notably smaller than nvidia's, it tends to run into this problem more than nvidia.
Re: (Score:2)
"I completely agree that in the past, some games required patches from AMD to run properly."
Let's be clear here: DnA drivers were third-party patched. The fact is that AMD was so bad at writing working drivers that they literally could not do it, and it was necessary to get drivers which had been modified by someone else to get the GPU working correctly. Meanwhile Nvidia was putting out fairly regular releases with their own fixes.
Everyone says that AMD is much better now, but most of their customers claimed
Re: (Score:2)
"No marketing gimmicks, and drivers that are released when updates are relevant".
Where's the downside?
(Before you pretend that releasing drivers often lets them fix problems, tell us: how many driver releases did they kick the firefox cursor corruption bug down the road again?)
Re: (Score:1)
The 2080 costs around $800. If you compare based on price/performance, the Vega VII is more attractive if it can match the performance of the 2080. The 16 GB might also be more future-proof with constantly increasing memory requirements in gaming.
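Back-of-the-envelope, under the parent's price assumptions and the big "if" of equal performance:

    # Relative price/performance at the parent's assumed prices and the
    # (unverified) assumption that the two cards perform identically.
    rtx_2080_price = 800    # street price claimed in the parent post
    radeon_vii_price = 699  # AMD's announced MSRP

    print(f"2080 costs {rtx_2080_price / radeon_vii_price - 1:.0%} more per frame")
    # 2080 costs 14% more per frame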
apple mac pro price $999 (Score:3)
apple mac pro price $999
Re: (Score:2)
"Navi" should be the new one, presumably on 7nm and GDDR6. Why not now?
Because the Navi team is working on PS5 [extremetech.com]. When that's done, they'll work on the PC Navi hardware.
Also related: Everyone talking about ray tracing is playing right into NVidia's hands; they would want any talking points regarding AMD to include "but where's the ray tracing?" even though it's a vendor-specific thing that not everyone will use, even if they have the hardware, because using it drops your frame rate.
Meanwhile, the game developers are going to primarily target AMD based GPUs for most of their wor
Re: (Score:2)
The raytracing Nvidia is touting is DXR, or DirectX Raytracing. It's a standard AMD can support, rather than a proprietary Nvidia-only thing. Honestly I don't care about the raytracing that much (in this gen of video cards) because it's too slow and poorly-supported. However, if I were dropping $700 on a new top-end video card, I'd question paying the same money for fewer features. If the Radeon 7 were a couple hundred cheaper, I might decide to skip DXR for a few years.
Another thing to keep in mind is that
Re: (Score:2)
Argument: AMD wouldn't want to support DXR right now because they have no properly DXR-compliant HW, and their next targets are PS5 and Apple (Navi), where they will be focusing on Vulkan and Metal
Will they have coupons? (Score:2)
I'm all for dedicated graphics chips as long as they cost less than $100.
Re: (Score:2)
That's probably about what mid range costs. Just the dedicated chip of course. The board is extra.
Sticker shock (Score:2)
....I guess I'm still waiting for the "glut" of Nvidia top-end cards to hit the market; somehow I can't comprehend how Nvidia is sitting on thousands and thousands of cards in inventory and that hasn't impacted their prices.
Re: (Score:2)
Re: (Score:2)
Right. NVidia might end up stuck with a bunch of 1080 overstock that is only good for scrap
Re: (Score:2)
They should let at least a portion of those cards out just to keep stringing people along. Odds are they can't produce the new cards rapidly enough to meet demand even at their ridiculous prices.