AMD Hardware Technology

AMD 2nd Gen Ryzen Processors Launched and Benchmarked (hothardware.com) 106

MojoKid writes: AMD launched its 2nd Generation Ryzen processors today, based on a refined update to the company's Zen architecture, dubbed Zen+. The chips offer higher clocks, lower latencies, and a more intelligent Precision Boost 2 algorithm that improves performance, system responsiveness, and power efficiency characteristics. These new CPUs still leverage the existing AM4 infrastructure and are compatible with the same socket, chipsets, and motherboards as AMD's first-generation products, with a BIOS/UEFI update.

There are four processors arriving today: AMD's Ryzen 7 2700X, the Ryzen 7 2700, the Ryzen 5 2600X, and the Ryzen 5 2600. Ryzen 7 chips are still 8-core CPUs with 20MB of cache but now top out at 4.3GHz, while Ryzen 5 chips offer 6 cores with 19MB of cache and peak at 4.2GHz. AMD claims 2nd Gen Ryzen processors offer reductions in L1, L2, and L3 cache latencies of approximately 13%, 34%, and 16%, respectively. Memory latency is reportedly reduced by about 11%, and all of those improvements result in an approximate 3% increase in IPC (instructions per clock). The processors also gain official support for faster DDR4-2933 memory. In the benchmarks, 2nd Gen Ryzen CPUs outpaced AMD's first-gen chips across the board with better single- and multithreaded performance, closing the gap even further versus Intel, often with better or similar performance at lower price points. AMD 2nd Gen Ryzen processors, and the new X470 chipset motherboards that support them, are available starting today, and the CPUs range from $199 to $299.

  • by Anonymous Coward on Thursday April 19, 2018 @09:31PM (#56468799)

    I hope AMD can keep this ball rolling.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      It would be nice to see a return to the Athlon days, when AMD chips were strong competitors. From a quick skim of the benchmarks, it looks like for gamers and average users there's no real performance benefit to the Intel chips, and they cost around $100 more. If I were AMD, I would be pounding that fact hard in the press.

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        The benchmarks were run before the patches that fixed Intel's shit security. Check out the Anandtech one for something a bit more realistic.

      • Plenty of benefits for Intel chips precisely for gaming.
        That's where they are leading.

        Rather than putting it the way you do, the truth is that you don't need the fastest CPU unless you have a beast of a graphics card or run at settings that maximize frames per second, because otherwise you'll be GPU-limited anyway.

        But if you aren't, then Intel has a lead in games.

        • by Anonymous Coward on Thursday April 19, 2018 @11:28PM (#56469153)

          Plenty of benefits for Intel chips precisely for gaming.
          That's where they are leading.

          Rather than putting it the way you do, the truth is that you don't need the fastest CPU unless you have a beast of a graphics card or run at settings that maximize frames per second, because otherwise you'll be GPU-limited anyway.

          But if you aren't, then Intel has a lead in games.

          Actually, if you turn on XFR2 and apply the Meltdown patches on Intel (which happens automatically with Windows updates), the AMD processors are now faster for gaming.

          Check out the Anandtech article here: https://www.anandtech.com/show/12625/amd-second-generation-ryzen-7-2700x-2700-ryzen-5-2600x-2600 [anandtech.com]

          The AMD 2700X absolutely kills the top-of-the-line Intel in gaming.

          • by aliquis ( 678370 )

            No, they most likely aren't.

            I know a bunch of people speculate completely out of their asses that that would be the reason Anandtech's results are different from most others', but that's unlikely to be the case, because any serious tester would use the latest Windows updates, BIOS, and drivers.

            You just claim that out of nothing. You shouldn't do that. You don't know that's the case. Just a bunch of idiots claim it as if it were the truth.

            https://www.sweclockers.com/te... [sweclockers.com]
            720p medium GTX 1080Ti:
            BF1: 8700K 25% better avg. 19%

      • by MachineShedFred ( 621896 ) on Thursday April 19, 2018 @11:36PM (#56469175) Journal

        The "Athlon" days when AMD was ahead in performance were also when Intel had their head wedged so far up their ass they had to cut in switchback trails to find it. The Pentium 4 architecture was fucking horrible, and had the albatross of Rambus around its neck. When they corrected that, they blasted right back in front and stayed there.

        The good news for AMD: it appears that Intel once again has their head wedged so far up their ass that the suction from extraction just might kill them. If AMD were ever to give Intel another crotch-punch in benchmarking, now's the time.

        • by ShanghaiBill ( 739463 ) on Friday April 20, 2018 @01:09AM (#56469473)

          The Pentium 4 architecture was fucking horrible, and had the albatross of Rambus around its neck.

          Intel also had most of their best people working on Itanium, and only the B-team working on x86.

          Once the Athlon iceberg had sunk the "Itanic", Intel put the A-team back to work on x86.

          • Re: (Score:2, Informative)

            by Anonymous Coward

            You know jack shit about this.

          • by Megol ( 3135005 )

            You think Intel only had two teams? The P4 had incredibly skilled people working on it, but the project started with some assumptions that turned out wrong in the end: out-of-order execution could in fact be improved and made deeper/wider without excessive power draw, and process improvements didn't make a true speed-demon the best-performing design.

        • Well, that, and they froze AMD out of the market entirely, so AMD had no revenue stream to commit to R&D. Combine that with the fact that AMD pushed for its own foundry too early, and AMD was fucked.

        • by Lonewolf666 ( 259450 ) on Friday April 20, 2018 @05:56AM (#56470089)

          It is a bit more complex than that; the performance leadership changed hands a few times.
          At first, the Pentium 4 was badly outclassed, unless you bought the really expensive RDRAM memory from Rambus.

          Then Intel released the Northwood series with support for DDR RAM and eventually two memory channels, which helped the bandwidth-hungry Pentium 4 architecture a lot. I guess that is what you meant by "blasting right back in front," and for a while the reworked Pentium 4 was in fact faster than AMD's Athlon XP.

          Enter the Athlon 64. For a while, AMD was leading the race again; the Pentium 4 had run into a clock-speed limit Intel had not foreseen (that was the age of the extremely hot-running Prescott).

          In 2006, Intel countered with the Core 2, which brought them in front again for several years. AMD's efforts in that time varied between "inferior" and "competitive but not ahead" (Phenom II). AMD kept itself afloat with aggressive pricing, at the expense of meager financial results.

          Now there is Ryzen, which started out competitive a year ago and appears to pull ahead of Intel with the new models. We seem to be back in a new "Athlon age," which pleases me no end as someone who dislikes Intel's business methods. I hasten to add that this is not meant to disparage Intel's engineering team:
          Their processors are pretty good; at worst there has been a bit of stagnation lately. It just seems that AMD can do even better these days :)

           

  • Call me when they get "official support" for this.

    • by Anonymous Coward

      Check out the conditions for "officially supported" memory speeds on the motherboard manufacturers' sites for the X470 boards. Either they have not updated the sites, or even the 2933 support is mostly just speculation. Who wants to use one single-rank stick of memory with their APU? I'm still hoping that the memory manufacturers will release 2933 single-rank sticks and that all the motherboard manufacturers will at least support a dual-channel, single-rank configuration at that speed. ECC support seems to

  • by UnknownSoldier ( 67820 ) on Thursday April 19, 2018 @10:50PM (#56469041)

    I have zero plans to let Microsoft's spyware, Windows 10, on my systems.

    • by Anonymous Coward

      You could run Windows 7 in a virtual machine and even use GPU pass-through to play games (but you'd need two GPUs to pull this off).

      • Ah, yes, a VM would work.

        How's the performance of GPU pass through for 3D graphics / games? (This is the first time I've heard of it.)

        • by UnknownSoldier ( 67820 ) on Thursday April 19, 2018 @11:33PM (#56469165)

          For anyone interested, found this article about GPU passthrough, Linux, and Windows:

          GPU passthrough: gaming on Windows on Linux [davidyat.es]

        • Re: (Score:3, Informative)

          by Anonymous Coward

          If you're doing it right, the GPU is dedicated to the VM. This has been a thing for a few years now - Intel calls it VT-d, AMD calls theirs AMD-Vi [wikipedia.org].

          Most hypervisors that aren't ancient also support it, but your mileage may vary with GPUs made by twats that specifically disable this in their firmware in order to get you to buy a much more expensive version of the same card (Nvidia).
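          If you want to sanity-check whether your platform exposes IOMMU groups at all before buying that second card, a rough sketch along these lines (purely illustrative, not from the thread; it only reads the standard /sys/kernel/iommu_groups layout the Linux kernel exposes) will list which PCI devices share a group with your GPU:

#!/usr/bin/env python3
# Rough sketch: list IOMMU groups from sysfs so you can check whether the
# GPU (and its HDMI audio function) sit in a group of their own before
# attempting passthrough. Illustrative only; adapt the output as needed.
import os

GROUPS = "/sys/kernel/iommu_groups"

if not os.path.isdir(GROUPS):
    print("No IOMMU groups found - enable VT-d/AMD-Vi in firmware and boot "
          "with intel_iommu=on or amd_iommu=on.")
else:
    for group in sorted(os.listdir(GROUPS), key=int):
        devices_dir = os.path.join(GROUPS, group, "devices")
        for dev in sorted(os.listdir(devices_dir)):
            print(f"IOMMU group {group}: {dev}")

          If the GPU is alone in its group (apart from its own audio function), passthrough is usually straightforward; if it shares a group with other devices, you may need a different slot or an ACS override.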

          • It's still going to be a serious hit to modern games.

            Just because the GPU passes through doesn't mean the CPU doesn't take a hit. Unless, perhaps, you use some sort of ultra-thin "dedicated VM host" OS instead of hosting on Linux or Windows.

            There are various benchmarks and stats to read online if you're so inclined.

            • by Anonymous Coward

              Actually, if you're using KVM on hardware with the appropriate extensions, your CPU hit is almost negligible. Expect something in the 2-5% range.
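              If you want to eyeball that overhead yourself, a crude comparison is enough: run the same CPU-bound loop on the bare-metal host and inside the guest and compare the times. The snippet below is only an illustrative sketch (the workload and iteration counts are arbitrary), not a rigorous benchmark:

#!/usr/bin/env python3
# Crude sketch: time an identical CPU-bound workload on the host and in the
# VM; the ratio of the two results gives a ballpark for virtualization
# overhead. Workload choice and iteration counts are arbitrary/illustrative.
import hashlib
import time

def cpu_work(rounds: int = 2_000_000) -> float:
    """Hash repeatedly and return elapsed wall-clock seconds."""
    data = b"ryzen"
    start = time.perf_counter()
    for _ in range(rounds):
        data = hashlib.sha256(data).digest()
    return time.perf_counter() - start

if __name__ == "__main__":
    best = min(cpu_work() for _ in range(3))
    print(f"best of 3 runs: {best:.3f} s")

              Games also stress memory, interrupts, and the scheduler, so treat a single-threaded number like this as a rough sanity check, not a guarantee.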

        • by Khyber ( 864651 )

          "How's the performance of GPU pass through for 3D graphics / games?"

          If you're going the RemoteFX route, it's utter shit on "modern" Windows nowadays compared to Win 7 / Server 2008 R2. Performance is literally halved because of some Microsoft fuckery.

    • Re: (Score:3, Insightful)

      by Gravis Zero ( 934156 )

      If you actually gave a damn about Microsoft spying on you then you wouldn't be using Microsoft products because (surprise!) they all spy on you.

      The most common excuse for not using Linux comes from lazy users who insist on 100% feature parity.

      • by MrL0G1C ( 867445 )

        My excuse is that I'm a gamer who doesn't want to spend endless hours trying to get games to work rather than simply playing them. Using an OS should not be hard work.

        • Mr. Logic, here is some logic you may want to think about: you may have to learn something to use something that's new (to you). Unless you're planning on using PCIe passthrough, gaming on Linux isn't hard at all, considering most games for Linux come from Steam. The main reason people want to switch to Linux is that Windows has pissed them off, be it crashes, spying, forced upgrades; you name it, Microsoft has pissed off hundreds of users with it. Considering the upsides you get with Linux, Stability, Securi

          • by MrL0G1C ( 867445 )

            I just watched a video benchmarking Linux Wine vs. Windows in a VM, and the benchmarker couldn't get some of the benchmarks to work and had to do what sounds like a lot of research to get the stuff that did work working.

            Contrast that with my Windows games (Win7, updates individually picked), where I download a game and play it 99% of the time without any difficulty. My Windows machine doesn't crash any more than a Linux machine, it doesn't force down updates, and I haven't allowed the spyware updates.

            I'm a gamer, I want to

            • You're conflating Wine with playing games on Linux. Steam has thousands of games with native Linux support: no Wine, no VM. There are tons of indie games available for the platform. You're correct that you most likely won't be playing AAA games on Linux. However, with PCIe passthrough, once you get it set up, it's done; it's just like having a Windows desktop on bare metal. Also, once Microsoft no longer supports Windows 7, what will you do? How do you plan to play the latest and greatest games then? And just because one YouTube

              • by MrL0G1C ( 867445 )

                And for every game with native Linux support there are 10 games without it, including most of the 2,000+ games I own.

            • and I haven't allowed the spyware updates.

              Last I heard, Microsoft was bundling all the spyware updates with the security updates. Is this not true?

    • Re: (Score:2, Redundant)

      Do you seriously think MS didn't inject any new spying or similar code into one of their numerous W7 updates since 2009?
    • by Anonymous Coward

      Windows 7 End of Extended Support: January 14, 2020.

      IMO, if you're planning to still be using Windows 7 after Jan 14, 2020, then it's time to turn in your geek card.

    • I have zero plans to let Microsoft's spyware, Windows 10, on my systems.

      Better: Linux is supported and it runs like a dream. Except, of course, for those pre-week-24 segfaulting parts, which AMD RMA'd without fuss. They sent the new part with a nicer cooler too, with programmable LEDs :)

    • by AmiMoJo ( 196126 )

      I've noticed that AMD's GPUs now only support Windows 7 and Windows 10. They dropped support for 8.1! The last good version before the spyware came in...

  • by jwhyche ( 6192 ) on Friday April 20, 2018 @12:06AM (#56469269) Homepage

    I'm in the market for a new cpu. AMD's timing couldn't be more perfect.

    • Re:How sweet (Score:5, Interesting)

      by AmiMoJo ( 196126 ) on Friday April 20, 2018 @05:47AM (#56470067) Homepage Journal

      Really liking AMD's offerings too. Great CPUs, great chipsets and a socket that won't be obsolete in a few months.

      For a workstation I'd save up for a Threadripper though. It's not just the threads, it's the fact that you get so many more PCIe lanes. Loads of PCIe lanes effectively future-proof you, because you will have enough expansion capability to add that 30GB/sec SSD or USB 4.0 controller. Also, the IOMMU support is good, so you can run Windows in a VM with near-native GPU performance on a Linux host.

    • AMD's timing couldn't be more perfect.

      You're not kidding. I planned out a new PC build yesterday (Thursday) and ordered all the parts. It wasn't until today that I realized that my "order confirmation" email from Micro Center was for a pre-order. That explained why I couldn't find any benchmark results.

      • by jwhyche ( 6192 )

        I planned out a new PC build yesterday

        Ahh, nothing like the thrill and the dread of first power-up on a new build. Lovingly you put all the parts in, rechecking everything. Then you flip the switch and... nothing. After a few frantic moments of rechecking everything, you realize it would work better if you plugged the damn thing in.

        For a workstation I'd save up for a Threadripper though. It's not just the threads, it's the fact that you get so many more PCIe lanes

        I was disappointed by the number of PCIe lanes in this chip. Good call on the future expansion capability.

  • Anyone remember this image [anandtech.com] from just a few months ago? AMD was throwing stones at Intel's 'mere' 8% average annual IPC improvements, implying they would do much better than that. And then they drop a chip with 3% better IPC than last year's. Hard not to feel disappointed. When the best thing a review can say is "it's faster than last year's chip in every benchmark" that's damning with faint praise.

    I still think I'm gonna wait to build a new rig until PCIe 4.0 mobos are out. AMD and Intel are dragging their
