New CPU Performance Testing Concludes AMD Beats Intel (tomshardware.com) 115

An anonymous reader quotes Tom's Hardware: If you're looking for the best gaming CPU or the best CPU for desktop applications, there are only two choices to pick from: AMD and Intel. That fact has spawned an almost religious following for both camps, and the resulting flame wars make it tricky to get unbiased advice about the best choice for your next processor.

But in many cases, the answer is actually very clear. In fact, for most users, it's a blowout win in AMD's favor. That's an amazing reversal of fortunes for the chipmaker after it teetered on the edge of bankruptcy a mere three years ago, making its turnaround all the more impressive as it continues to upset the entrenched Intel that enjoyed a decade of dominance... Pricing is the most important consideration for almost everyone, and AMD is hard to beat in the value department. The company offers a plethora of advantages, like bundled coolers and full overclockability on all models, not to mention complimentary software that includes the innovative Precision Boost Overdrive auto-overclocking feature.

You also benefit from the broad compatibility of Socket AM4 motherboards that support both forward and backward compatibility, ensuring that not only do you get the most bang for your processor buck, but also your motherboard investment. AMD also allows overclocking on all but its A-Series motherboards (see our article on how to overclock AMD Ryzen), which is another boon for users. And, in this battle of AMD vs Intel CPUs, we haven't even discussed the actual silicon yet. AMD's modern processors tend to offer either more cores or threads and faster PCIe 4.0 connectivity at every single price point.

"We're not covering laptop or server chips," the article notes, adding "There's a clear winner overall, but which brand of CPU you should buy depends most on what kind of features, price and performance are important to you."

Still, it's noteworthy that AMD beats Intel in 7 out of 10 comparisons. The three in which Intel won were gaming performance ("only because we measure strictly by the absolute top performance possible"), drivers and software ("the company has an army of software developers [and] a decade of dominance also finds most software developers optimizing almost exclusively for Intel architectures"), and overclocking, where Intel "has far more headroom and much higher attainable frequencies. Just be prepared to pay for the privilege."
  • by Anonymous Coward

    I've always found it funny that gaming performance testing is always so broken. It's always done on a clean Windows 10 install.
    There are no badly written device drivers (Corsair), no 3 or 4 game launchers (Epic, Steam, Bethesda etc), no Spotify, no Discord, or even a browser with a few open windows. They don't even have the standard Windows lag & bloat that accumulates over time.
    The only advantage Intel has had lately is in single threaded performance in some games, and I bet the more additional processes there are, the quicker that advantage disappears.

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Sunday April 26, 2020 @03:57PM (#59993562)
      Comment removed based on user account deletion
    • ECS, which is the new wondertechnology for multithreading, still runs like an absolute dream on my ancient i7 4790K, so well in fact that I am holding off upgrading for a bit longer. Doom Eternal uses ECS, and the way it makes use of my CPU and everything is crazy good. It also highlights the issue of bloat and terrible approaches to performance by many actors in the industry. Unity 3D is going to use ECS eventually, but its current job system is also impressive for performance increase on existing systems due t
      • In case anyone was wondering what ECS stands for, it is Entity Component System.

        As great as it is, here is a great talk [youtube.com] on why you might not want it.

        Part of the problem is the clusterfuck of OOP [youtube.com], namely the Pitfalls of OOP [cat-v.org] -- objects scattered across memory completely blow the data cache. The common case is NOT one object, but MANY objects -- hence object pooling is an inexpensive way to get some performance back.

        In contradistinction, Data Oriented Design takes advantage of the cache. Matt Godbolt gave a great talk [youtube.com] comparing OOP, functional, and DOD.
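
        A minimal sketch of the idea (my own illustrative example, not taken from any of the linked talks): when a pass over the data only touches a couple of fields, a struct-of-arrays layout streams exactly what it needs through the cache, while an array-of-structs layout drags every unused field along with it.

        #include <cstddef>
        #include <vector>

        // Array-of-structs: updating positions pulls velocity and health
        // through the cache even though they are never read here.
        struct ParticleAoS { float x, y, z, vx, vy, vz; int health; };

        void update_aos(std::vector<ParticleAoS>& ps, float dt) {
            for (auto& p : ps) { p.x += p.vx * dt; p.y += p.vy * dt; p.z += p.vz * dt; }
        }

        // Struct-of-arrays: each pass streams only the fields it needs,
        // so far more useful data fits in every cache line.
        struct ParticlesSoA {
            std::vector<float> x, y, z, vx, vy, vz;
            std::vector<int> health;
        };

        void update_soa(ParticlesSoA& ps, float dt) {
            for (std::size_t i = 0; i < ps.x.size(); ++i) {
                ps.x[i] += ps.vx[i] * dt;
                ps.y[i] += ps.vy[i] * dt;
                ps.z[i] += ps.vz[i] * dt;
            }
        }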

        • First, I am not singing the praises of OOP, however...

          OOP is a way to use code to document data, and in that regard it's better than what most OOP AND non-OOP programmers do, which is document the code instead of the data.

          i++; // increment the index
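
          To make the contrast concrete (a toy example of my own, not the poster's): the comment above documents the code and adds nothing, whereas naming the data documents what the program actually means.

          #include <cstddef>

          // The type and field names carry the meaning, so the code that
          // uses this data barely needs comments at all.
          struct SampleWindow {
              std::size_t first_sample;   // index of the oldest sample still kept
              std::size_t sample_count;   // how many samples are currently valid
          };

          void drop_oldest(SampleWindow& w) {
              ++w.first_sample;
              --w.sample_count;
          }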

          With that being said, OOP code is terribly inefficient. It's not even a close contest.

          The efficient code in any domain structures the data for the important algorithms that process it. Anyone who has followed the writing of someone like Abrash for the past 3
        • Re:Still broken (Score:5, Interesting)

          by TechyImmigrant ( 175943 ) on Monday April 27, 2020 @12:49AM (#59995102) Homepage Journal

          >In contradistinction, Data Oriented Design takes advantage of the cache. Matt Godbolt gave a great talk [youtube.com] comparing OOP, functional, and DOD.

          I design silicon chip innards. RTL logic design is entirely data and dataflow oriented design. So when I write software, which I also do a lot, I find myself naturally using data oriented design: identify all the state, all the state transformations, and where and when the data needs to move, then build the code around those aspects of the design.

          I can still do that within an OOP or functional or pythonesque or old-school C sort of way. But the efficiency of the data design and transformations and how they interact with the processor memory architecture is what usually counts for the speed and memory performance of the code.
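
          A rough sketch of that workflow in software (a hypothetical FIR-filter example of my own, not the poster's code): write down the state first as plain data, make the transformation a function over it, and treat everything else as plumbing. The Q15 scaling at the end is an assumption purely for illustration.

          #include <cstddef>
          #include <cstdint>
          #include <vector>

          // 1. State: everything the design holds, written down explicitly as data.
          struct FilterState {
              std::vector<std::int32_t> taps;      // coefficients, fixed at configuration time
              std::vector<std::int32_t> history;   // last N input samples, newest first
          };

          // 2. Transformation: one well-defined step that moves data through the
          //    state, loosely analogous to what a single clock edge does in RTL.
          std::int32_t step(FilterState& s, std::int32_t input) {
              if (s.history.empty() || s.taps.size() != s.history.size())
                  return 0;                        // mis-configured; nothing sensible to do

              // shift the delay line and insert the new sample
              for (std::size_t i = s.history.size() - 1; i > 0; --i)
                  s.history[i] = s.history[i - 1];
              s.history[0] = input;

              // accumulate taps * history
              std::int64_t acc = 0;
              for (std::size_t i = 0; i < s.taps.size(); ++i)
                  acc += std::int64_t(s.taps[i]) * s.history[i];
              return std::int32_t(acc >> 15);      // assumed Q15 coefficient scaling
          }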

    • By the same logic, Motor Trend should only test badly-maintained* and neglected cars.

      *The fact that Windows needs to be nuked from orbit and reinstalled every ten days in order to be kept in proper operating condition... is a different discussion.

      • by Junta ( 36770 )

        By the same logic, Motor Trend should only test badly-maintained* and neglected cars.

        I don't know about "only", but that actually sounds like a cool sort of evaluation.

        If a drivetrain in pristine shape with OCD maintenance does great, but does so by insanely unrealistic tolerances that fall over if some dust gets in somehow, then it may be less interesting than a slightly more underwhelming drivetrain that could just have sludge all over the place and barely notice.

    • Setting aside all the security concerns with the Xeon chips, the HP z400/z600/z800 workstations are very upgradeable & very aggressively priced these days & are built like tanks [you could probably shoot an HP z400 with a 7.62 NATO and there'd be a good chance the machine would still boot]. Note that the HP z420/z620/z820s are even nicer platforms, although more expensive.

      What would be the equivalent for AMD, in terms of a platform which is two or three generations out of production, widely available, aggressively priced, built to last, and ripe for the picking off of Craigslist or eBay?
      • I checked all the local Craigslists [out to about 300 miles], and got precisely zero hits on the keyword "Epyc".

        At eBay, the cheapest Epyc motherboard [used] is $300, and the cheapest Epyc processor [open box] is $350.

        That's $650 + S&H before you've even started looking at cases, power supplies, RAM, graphics cards, etc etc etc.

        Where's the quality [& affordable] AMD used market, like we have for the z400/z600/z800 Xeon workstations from HP?
      • Any affordable AMD Server-Class [or even just Workstation-Class] recommendations would be most welcome.

        Again, we're talking used equipment, which could be purchased off of eBay or Craigslist or similar.

        And also the proper keywords to search on, when looking for these AMD systems, would be most appreciated.
      • by Kjella ( 173770 )

        What would be the equivalent for AMD, in terms of a platform which is two or three generations out of production, widely available, aggressively priced, built to last, and ripe for the picking off of Craigslist or eBay?

        Doesn't exist in any meaningful way. It's only with Zen that AMD returned to the server market, and they're not "two or three generations out of production" yet. In June it'll be three years since the first gen server CPUs (Naples) launched; maybe you'll see a few showing up second hand after that, but realistically I'd keep a server around for at least 5 years these days. I wouldn't expect anything on the market now to be cheap...

        • Thanks.

          That wasn't exactly the answer I wanted to hear, but it'll save me a lot of time which I would otherwise have wasted in searching for something which doesn't actually exist.

          So I guess it'll be circa 2025 to 2030 before we bargain hunters can afford a nice used AMD Server/Workstation?
          • Depending on your requirements, you might not even need Epyc. What performance levels do those old Xeons have anyway?
    • There is a reason for that. When you have that software installed, yes, it does impact performance, but it could impact more at one second than at another. Tests are done on a clean install to remove all possible things that could skew the results toward one side or the other just because of one burp in CPU usage from those applications.
    • I've always found it funny that gaming performance testing is always so broken. It's always done on a clean Windows 10 install.
      There are no badly written device drivers (Corsair), no 3 or 4 game launchers (Epic, Steam, Bethesda etc), no Spotify, no Discord, or even a browser with a few open windows. They don't even have the standard Windows lag & bloat that accumulates over time.
      The only advantage Intel has had lately is in single threaded performance in some games, and I bet the more additional processes there are, the quicker that advantage disappears.

      Yep! It's what clued-in people call a "baseline".

  • I remember hearing AMD was starting to pull ahead, but I also remember them being held back by utterly crap motherboard selection.

    • by darkain ( 749283 ) on Sunday April 26, 2020 @04:14PM (#59993614) Homepage

      That has since changed. Vendors are no longer putting shit-tier components on the motherboards. AMD's X570 platform has been treated as top-tier by all of the various vendors.

      For instance, I'm currently using a X570 Mini-ITX board that boasts Thunderbolt 3, quality and quantity of VRMs, great RAM compatibility, PCIe Gen 4 NVMe, and 64GB ECC RAM. It is currently running a 3900X CPU (only because the 3950X wasn't on the market yet when building). This is a 12c/24t CPU, but could easily support the 16c/32t CPU. Those used to be reserved for the HEDT market, and are now in the normal consumer level market. ECC RAM is another thing missing from Intel's consumer lineups.

      • Does ECC have consumer level value outside of a homebrew NAS?

        • Probably depends on how much RAM you have.

          Double the RAM, double the probability of a corrupted bit at any given moment. A couple of doublings add up....
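
            To put a number on the scaling (the rate below is an assumed placeholder; published DRAM studies disagree by orders of magnitude, so treat it purely as illustration): expected flips grow linearly with the number of bits, so doubling the RAM doubles the expected count.

            #include <cstdio>

            int main() {
                // Assumed, purely for illustration: one bit flip per 1e17 bit-hours.
                const double flips_per_bit_hour = 1.0e-17;
                const double hours_per_year = 24.0 * 365.0;
                const double sizes_gib[] = {16.0, 32.0, 64.0, 128.0};

                for (double gib : sizes_gib) {
                    double bits = gib * 8.0 * 1024.0 * 1024.0 * 1024.0;
                    double expected = bits * flips_per_bit_hour * hours_per_year;
                    std::printf("%6.0f GiB -> ~%.3f expected flips/year\n", gib, expected);
                }
                return 0;
            }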
          • Isn't the probability of error chiefly a matter of surface area?

            • by Kjella ( 173770 )

              Isn't the probability of error chiefly a matter of surface area?

              That's for manufacturing; for reliability it's measured in operations/bit. It's pretty reliable anyway, and it would take a lot for it to be of any actual consequence, though. Like, my browser takes up many megabytes, and flipping the amount or account number in my online banking would be extremely bad luck. And even then I'd probably catch it on some confirmation screen. Most likely it'd just be a discolored pixel or a game crashing, as they do quite often anyway. I wouldn't skimp on it in the data center or a

          • As long as your RAM passes a full memtest86 run when you obtain it, the odds of any errors over its entire lifetime are statistically null.

            This is how we started treating our corporate laptop intake when ThinkPad switched to soldered-on RAM. If it passed memtest, deploy. If it failed memtest in any way, RMA. We haven't had a RAM failure in years.
        • AMD needs an E3-level CPU with ECC for systems that don't need the cost jump to even basic Epyc CPUs.
          My PBX box does not need 128 PCIe lanes or 8 channels of RAM, but it does need ECC and IPMI.

          • Comment removed based on user account deletion
            • Supporting ECC memory may not be the same as actually doing error correction. Modern ECC memory just has an extra bit per byte to store the code used for correcting errors. The memory controller actually does the calculation and comparison work to implement the error correction. One can get motherboards that will work with ECC memory by ignoring the extra bits.

              Of course your $50 board might really implement error correction - but it would pay to make sure before you shell out extra for ECC memory...

              • The actual ECC handling is done by the CPU, not the motherboard. So it doesn't matter if it's a cheap motherboard or an expensive one. The motherboard support itself amounts to early-boot configuration of the memory controller by the BIOS.

                For AMD, all AGESA updates (part of the BIOS image) as of around 2 years or so ago included full detection and enablement support for ECC on Zen, Zen+, and Zen 2 platforms.
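
                If you want to sanity-check that ECC reporting is actually live on a Linux box, rather than merely "supported", the kernel's EDAC counters are one place to look. A minimal sketch, assuming an EDAC driver is loaded and exposes the usual sysfs layout (the exact path can differ per platform):

                #include <fstream>
                #include <iostream>
                #include <string>

                int main() {
                    // Per-memory-controller corrected-error counter exposed by the EDAC
                    // subsystem. If the file is missing, no EDAC driver is loaded and ECC
                    // events are likely not being reported (assumed path; check your kernel docs).
                    const std::string path = "/sys/devices/system/edac/mc/mc0/ce_count";

                    std::ifstream f(path);
                    if (!f) {
                        std::cout << "No EDAC memory controller found; ECC reporting looks inactive\n";
                        return 1;
                    }
                    long corrected = 0;
                    f >> corrected;
                    std::cout << "Corrected ECC errors since boot: " << corrected << "\n";
                    return 0;
                }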

                There really isn't much of a distinction between 'cheap' motherboards and 'expensive' motherboards a

            • Yes, basically all AMD motherboards that take Zen/Zen+/Zen 2 architecture CPUs support and implement ECC. This includes many of the original Zen mobos, which did not have it enabled in the BIOS but subsequently made the feature available after a BIOS update.

              The main trade-off is that virtually no ECC sticks are certified for overclocking out of the box, meaning that you have to overclock them yourself if you want to match non-ECC DDR4 that you can buy OC'd out of the box. This isn't difficult to do, but is n

          • I think Intel did have the Avoton line for that niche. Pity about the C2000 bug causing Avoton boards to brick themselves, and Intel allegedly abusing NDAs to cover it up.

          • by darkain ( 749283 )

            Unless things changed in newer generations, E3 CPUs don't use ECC, they use the same RAM as i3/i5/i7 systems. That's what makes them E3s vs E5s. The E5 CPUs use RDIMMs. AMD CPUs can use UDIMMs, so less common, but still very practical. My AMD 3900X is running on a pair of 32GB UDIMM ECC sticks right now clocked at 3200MHz, so very decent performance wise while maintaining stability.

        • In terms of the value of ECC, there are several components to answering this question. The first thing to note is that DDR4 has error detection and retry for physical trace errors built-in so normal non-ECC DDR4 will be considerably less error-prone than non-ECC DDR3 (or earlier) ever was.

          In terms of whether the ECC helps on top of that, the answer basically comes down to a combination of the use-case, amount of memory installed, and uptime.

          So, for example, I would never even consider putting non-ECC sticks into a thread-ripper system with loads of memory in it. That's just asking for bit rot to happen.

          • So, for example, I would never even consider putting non-ECC sticks into a thread-ripper system with loads of memory in it. That's just asking for bit rot to happen.

            Why? The only way to get undetected bit rot is during write to disk. Any other case will likely result in application error, and I suspect if I had a high end threadripper with 128GB of RAM I wouldn't necessarily use all that power to hit the save button in my applications any faster.

            Yeah, I get it, the odds of an error are higher, but the odds of something you care about (working data waiting to be written to disk) getting affected remain unchanged.

            • That's not really how it works. Both file data and filesystem meta-data are cached in memory. If you have 128GB of RAM, then that's up to potentially 128GB worth of data and meta-data subject to bit-rot.

              Many modern filesystems must update significant amounts of meta-data whenever they synchronize modifications to storage. Even small modifications can result in hundreds of kilobytes being written to storage. Much of that is meta-data that had been cached in RAM and then modified as part of the topology sy

      • The X570 board I bought late 2019 was just complete shit. (Aorus Elite-something-something). Windows wouldn't recognize NVME disks, and refused to install. Spent over a week trying to figure out what the issue was, then it started showing up in my research that it was a known issue.

        Set up Linux on it and the board would randomly fail to POST about 20% of the time. Then after 2 weeks the board just died completely. No lights, no beep code.

        Swapped the board out with an X470 board - all other component

        • Just a data point, but you may want to put the MSI MEG Unify X570 on your watchlist. It's been running flawlessly for the last 5 months for me (3900X, 32GB G.Skill F4-3200C14D-32GFX, Linux only). Like what you went through, I read ad nauseam about all the problems among Asus, Gigabyte, etc. back then and almost gave up and went X470. Then this one was released and seems to have been the most well designed right out of the gate. Worth a look anyway.

        • Odd, as that Gigabyte board you mentioned is the most reliable and got the top ratings.

          Chipset issues and quality are why I stuck with Intel/Nvidia this time around, as I want something that just works. AMD has fallen greatly, and even back when they were competing with Intel it was well known AMD CPUs were faster but were for cheap folks like me, who was much younger then and had a tight budget. VIA and Nvidia chipsets all had quirks and bugs, and Intel chipsets on the Intel side were reliable and just worked

          • Chipset issues and qualities is why I stuck with Intel/Nvidia this time around

            How often do you keep telling yourself that? I mean the Ryzen platform has seen 6 different chipsets all of which have reached the point of maturity.

            AMD got stuck with a stigma of motherboard makers keeping their best boards for Intel, which pissed them off. It wasn't until the X570 that they included at least quality components, including your Aorus.

            It sounds like you, like the GP, are basing an opinion on a product line from multiple companies on a single bad experience. The reality is there are many quality X470 and B450 boards on the market.

        • by ncc74656 ( 45571 ) *

          I might be trying another X570 board later when the good ones come down in price and/or reach maturity. Maybe that's now, I haven't looked since October.

          About three months ago, I upgraded from a Core i5 4690K to a Ryzen 7 3800X. I went with an MSI X570-A Pro [amzn.to] as (1) it was one of the cheaper X570 boards available and (2) as a more business-PC-oriented board, it's not "riced out" like the gamer boards. It's been pretty solid with 32 GB of non-ECC memory under both Windows 10 and Gentoo Linux. (On Linux, I

        • Comment removed based on user account deletion
        • I bought that board a couple months ago. It's been completely stable, although picky with RAM. The only thing I found it lacking in is fan headers, but a splitter took care of that. I'm using an NVMe boot drive and dual booting Windows 10 and Ubuntu. It's been through major stress tests with days-long runs without any problems. Sounds like you got a bad board. It's frustrating but it happens. I had a new Intel i5-9600k go bad on me, which is why I switched to AMD. I don
    • but I also remember them being held back by utterly crap motherboard selection.

      Which public relations firm are you believing, and why are you believing them? Oh, it's Intel's, and Intel never lies......

    • by guruevi ( 827432 )

      They do well in the benchmarks but that's about it though. What holds back AMD right now is the chip design. It's great for single threaded benchmarking, but when comparing multi-core, high memory and I/O benchmarks they could be as bad as 50% of Intel performance.

      AMD is great for budget gaming platforms and perhaps even office machines, but then again, you don't need much for those.

  • by Ecuador ( 740021 ) on Sunday April 26, 2020 @04:46PM (#59993726) Homepage

    AMD was further ahead of Intel in the Athlon 64 / Opteron era and it didn't really make much of a difference to their bottom line. Some reviews showed they were faster, but if you try to find the articles of the era, you'll see most tech magazines & sites rated the ludicrous Pentium 4 as roughly equal. And we found out Intel was paying everyone, including large integrators like Dell, to keep AMD out. They got a judgement for that, but even if it sounds like a big amount, it was just a slap on the wrist if you look at the company numbers. So I hope AMD capitalizes on their current lead, because we really need the competition; Intel left unchecked is not good at all.
    Interesting anecdote: back in 2004 I was asked by my Uni professor to benchmark offers for servers from a few companies with the software our lab would run on them, for an order of about 100 rackmounted servers. Two offers used 64-bit Opterons (one was HP), one offer used Xeons (from Dell - it was the Northwood or Prescott equivalent Xeon, HT but 32-bit). We mainly wanted them for two things: an NLP pipeline mostly in Perl, and custom C programs for bioinformatics. The Perl stuff was running about 30% faster on the 64-bit Linux with the Opteron, even using extra threads on the Xeon (HT), while compiling our C programs for 64 bits without modifications made them fast enough that overall the Opteron was at least 2x faster. The prices of the systems were the same and the Xeons used more power, so my report was about choosing which Opteron system. Dell came back and told them "we will give you the servers for half the already low educational price" (complicated tax scheme), and my prof said they were getting twice the servers. But... what about the power and heat??? So we got the servers and they stayed off for a couple of years until they could be housed with the appropriate power/AC setup... And they were sort of obsolete when they went online; think of a 32-bit P4 in 2006-7 (I forget when they went online).

    • by redback ( 15527 )

      Wasn't hard to compete when the P4 was such a turd.

      It's no surprise that current Intel chips are descended from the P3

      • by Luckyo ( 1726890 )

        Initial K7 Athlons killed P3s too. Most people forget that era because the P4 was absolutely destroyed by the Athlon 64 right after it, but the P3 was actually a worse option than the first Athlon offerings. Even in Slot A format.

        It was incredible after the garbage-tier performance of the K6. I owned a 700MHz variant of the Slot A Athlon back then, and apart from the weird way you had to slot the processor in and mount the cooler, which didn't do it any favours on the thermal/overclocking front (which is why AMD dropped it within the sa

        • They didn't kill P3s. It was more of trading blows. That era was exciting because Intel and AMD were truly competitive and each new chip was faster than the last. It seemed every month or so they traded the performance crown. The P4 was never really competitive though. The first gen chips were slower than P3s and used expensive RDRAM. The Northwoods were competitive for a short time because the classic 32-bit Athlons were at the end of their life. But that all changed when the A64 came out and the P4 never caught u

    • by steveha ( 103154 ) on Sunday April 26, 2020 @05:18PM (#59993816) Homepage

      AMD was further ahead of Intel in the Athlon 64 / Opteron era and it didn't really make much of a difference... ...we found out Intel was paying everyone ... to keep AMD out

      There's an important difference this time around.

      In the Opteron era, Intel had committed to two processors: Pentium 4 and Itanium. Both were disasters (for different reasons). But Intel is a large company, and they had a team (in Israel if I recall correctly) that had been researching laptop processors, and had been playing with a die-shrink of the Pentium III design.

      The problem with the Itanium was that it was an experimental design and the experiment didn't work out. Plus nobody wanted it.

      The problem with the Pentium 4 was that it was designed for clock rate above all else, and it didn't get much done per clock. Intel was planning to push clocks to crazy levels to end up with performance, but that didn't work because the chip got too hot.

      The Pentium III architecture got more work done per clock, and with a die shrink it was significantly better than the Pentium 4. Not just better for laptops... better all-around. Intel grabbed this project with both hands, named it the "Core" architecture, and the rest is history.

      Intel copied every other thing AMD was doing better, and had competitive products.

      Then Intel pulled ahead on chip fabrication technology. Intel at one point was two generations ahead and AMD could not make competitive products. AMD was forced to slash prices and try to sell inferior chips on price.

      Now, Intel's big problem is that their 10 nm fab process doesn't work, while AMD is using 7nm in their chips. Now it's AMD who is ahead of Intel on process. And I consider it improbable in the extreme that Intel will find some team somewhere in their empire who has a working 10 nm process (and Intel didn't notice before now).

      So AMD has the better chips again, but Intel doesn't look like it will be able to pull a rabbit out of a hat again. So Intel can bribe companies to hold off on AMD... but how long can Intel afford to do that? Last time, Intel just needed to stall AMD for long enough for the Core CPUs to become available; this time it could take years for Intel to become competitive again.

      P.S. Even through the bad times I have been buying AMD and not Intel. When I read about how Intel's compilers and libraries sabotage their competition [reddit.com] I decided that I don't want to give money to a company that would do that.

      It would be sweet icing on the cake for me if, once AMD chips become really widespread, Intel was forced to stop sabotaging non-Intel chips. The US government didn't stop Intel, just made them publicly admit to it (and they posted a notice as an image file on the web instead of text so it would be harder to search for) but if customers who buy the compiler start treating this sabotage as a bug, then either the compiler will get fixed or everyone will stop using it. Either would make me happy.

      I'd even be happy if it simply became common practice to run a post-processor that nerfed the sabotage.

      • There was also the big issues last time of Intel's monopolistic behavior that they eventually settled for $1.3B with AMD.

      • by Osgeld ( 1900440 )

        Let's not forget that around the time all this was going on, AMD paid a boatload of money for the steaming turd of the rotten old ATI corpse, from which they still have not fully recovered

        • Comment removed based on user account deletion
          • by Billly Gates ( 198444 ) on Sunday April 26, 2020 @09:36PM (#59994500) Journal

            Go on the AMD subreddit and read the comments. They were all BSODs, black screens, "How do I fix X", "I returned my 5700 XT for Nvidia", etc.

            I used to be an AMD fanboy but their shit is junk. I returned my 5700 XT and my BSODs and black screens went away when I paid a $200 premium for those Nvidia drivers. I have DLSS 2.0 and raytracing, which AMD doesn't even have on the radar as Nvidia keeps innovating.

            AMD Ryzen 1 was notoriously unstable. I had customers call me complaining, asking me to take back their AMD systems and requesting to pay extra for Intel, etc. All their problems went away when they switched from Ryzen 1700s to i5s. AMD is like the Kia of chips: you never know what you are going to get, and they require work from the user, while Intel and Nvidia just work.

            I know your shop uses AMD products and I may have made you mad, but in my experience I had to give up on AMD/ATI. I still love my older RX 470 that my 2080 Super replaced, but I switched from an AMD Phenom II to Intel 5 years ago and never looked back on the CPU side. Intel offers me Intel RST for RAID and embedded virtualization for my SCCM Hyper-V labs and MCSE labs that AMD chips won't do.

            • Comment removed based on user account deletion
            • /r/amd comments are few and far between, and mostly just the loud complainers with autism, or straight up astro-turfing from Intel.
            • If you go on ANY subreddit for people to talk about a product it is filled with problems and people needing help. That is why they are there asking for help. Very few people make threads just to say everything is okay and works fine.

            • by modecx ( 130548 )

              Going to a forum where people are requesting help isn't going to enhance your confirmation bias? Go figure. It's like telling a hypochondriac to browse r/medical...

          • by tlhIngan ( 30335 )

            Riiiight...who OWNS the console market again? What APU does the Playstation and Xbox use? Oh that is right...AMD thanks to them buying ATI. Which means not only are they getting a cut of every single console being sold but when all those games get ported to PC guess which chips it'll be better optimized for?

            AMD became the console chip because Intel punted Sony and Microsoft to AMD. Intel had a fun experience doing the original Xbox and didn't want that ever again.

            And AMD was in dire straits years ago - they

          • by AmiMoJo ( 196126 )

            AMD laptops are very attractive now: Intel-beating CPU performance, and integrated graphics that are good enough to play a lot of games, especially eSports titles.

            The only thing they are lacking is Thunderbolt, but I think USB-C is enough for most people. If you don't want an external GPU then USB-C has everything else covered.

      • The Pentium 4 architecture was designed around the horrid Rambus RAM; Rambus offered Intel free stock and cheated Micron/JEDEC by patenting their specs. Basically the ultra-expensive Rambus had high bandwidth AND high latency. It needed long, slow instruction pipelines to maximize its performance with the Netburst architecture.

        The Pentium 4 was also designed to be slower to make Itanium look good. They didn't want too good performance for workstation applications. The Pentium 4 chipset crippled PCI Express

      • I can't speak for the server space, but Every. Single. Time. I got AMD+Radeon on a PC for gaming, I ran into BSODs, compatibility issues, and no support, as everybody was geared toward Intel. Nowadays I buy Intel, not caring about speed, but about support and not getting BSOD'ed regularly. It may have changed in the meantime, but I would really need evidence that it has. A quick chip is good, but if the driver/OS support is poor...
      • Now, Intel's big problem is that their 10 nm fab process doesn't work, while AMD is using 7nm in their chips. Now it's AMD who is ahead of Intel on process.

        Right you were up until this line. It's hard to say AMD is ahead of Intel on process given that a) AMD spun off their manufacturing, and b) "nm" has been a marketing term since the move to FinFETs. If you look at dimensional comparisons, density per square mm, and other such manufacturing stats, you'll see that they stopped being related to the dimension quoted back around the time the 22nm process came out, e.g. Intel's 10nm process is smaller in every dimension than TSMC's 10nm process from 2 years earlier. TSMC's

        • by steveha ( 103154 )

          I'm not really an expert on semiconductor fabrication process technology. But my understanding is that Ryzen 4000 series chips are using 7nm for the "chiplets" but also have an I/O die that was fabbed on 14nm. IMHO this counts as "using" 7nm. I didn't claim that AMD developed the technology; what I care about is what can I get when I buy AMD chips.

          Meanwhile, Intel has some kind of 10nm process but yields are horrible, which I summarized by saying Intel's 10nm process "doesn't work."

          Anyone who wants to un

        • Intel's not behind on fab and process. They are behind in architecture and design. That's what makes AMD have better chips.

          Intel has vast architectural coverage in the Xeon server space that no one product from AMD is going to cover overnight, no matter how much any one AMD product cleans up on its primary Intel competitor at the dead center of the datacenter sweet spot.

          Intel retains immense architectural equity, but right at the moment, it's strangely absent from the center of the sweet spot.

          Intel Expects [extremetech.com]

      • by AK Marc ( 707885 )
    Intel's problem is hubris. They believe their win is inevitable, so they don't fight for it. AMD was the "loser", so they could only win or go out of business like Cyrix. They worked on hitting gaps in Intel's lineup. Intel "owned" the high end, so make something 80% as good for 50% of the price. Boom, sales. Intel's move to x64 was slower than it should have been, so AMD beat Intel to x64. Intel chased a physical impossibility, clock speed. AMD chased efficiency and price. AMD led for a while, before Intel managed to beat AMD ba
    • The Opteron was popular in the server space but Intel was faster in the desktop space due to better single threaded performance especially in the gaming space.

      These days there is very little difference between the two for gaming. AMD has the unofficial tag line: "Core for cheap". Intel basically held back quad-core gaming for almost a decade.

    • The only reason AMD didn't do better back then was because Intel was spending literally billions of dollars every year in direct payments and discounts paying off OEMs not to ship machines using AMD chips. In their own internal communications they referred to Dell as "the best friend money can buy" and were very annoyed at them, even going as far as reminding them of the "consequences", when they buckled to market demand and started selling Opteron-based servers. Which I might add sold pretty well so
  • When I did my last build in October with a Ryzen 3800 and X570 board, all the benchmarks online, as well as my own, showed little-to-no gain with Precision Boost Overdrive. Between a 5% drop and 3% increase in FPS, depending on the game or benchmark.

    • by Luckyo ( 1726890 )

      Depends on the game. If you're playing a modern GPU-intensive game at high resolution, it's not going to matter much.

      Play something less graphically demanding at low resolution and high refresh rate, run an emulator, or other CPU-heavy stuff, and you will probably see more of a difference.

    • Unless you are gaming at a low resolution like 1080p, or have an extreme refresh-rate monitor (120Hz or higher), the fps differences are irrelevant. And even if you do, if you game at any decent resolution (1280p maybe, but mostly 1440p or higher), the game will be GPU-bound and the differences will not really be noticeable.

      I look at it this way... why spend an extra few hundred dollars on an Intel chip when you could instead spend that extra dough on a better GPU or more memory and still get a CPU that's j

  • In TFS there is a false dichotomy:

    If you're looking for the best gaming CPU or the best CPU for desktop applications, there are only two choices to pick from: AMD and Intel.

    This is false. There is a third option: Zhaoxin.

    This company (a joint venture between Via and the Shanghai municipality) makes processors based on old Via technology (with S3 integrated graphics, also at one time a property of Via) which are 100% IA-32/AMD-64 compatible.

    While these processors are not suitable (for the time being) for servers or high-end workstations, they are adequate for office desktops, HTPCs, automotive/infotainment and low power/single board applications.

    • by DRJlaw ( 946416 )

      If you're looking for the best gaming CPU or the best CPU for desktop applications, there are only two choices to pick from: AMD and Intel.

      This is false. There is a third option: Zhaoxin. ...
      While these processors are not suitable (for the time being) for servers or high-end workstations, they are adequate for office desktops, HTPCs, automotive/infotainment and low power/single board applications.

      Right now they have a 4 core processor which is nothing to write home about...

      If you completely redefine the problem, you can shoehorn in an "adequate" processor "that is nothing to write home about."

      • If you're looking for the best gaming CPU or the best CPU for desktop applications, there are only two choices to pick from: AMD and Intel.

        This is false. There is a third option: Zhaoxin. ...
        While these processors are not suitable (for the time being) for servers or high-end workstations, they are adequate for office desktops, HTPCs, automotive/infotainment and low power/single board applications.

        Right now they have a 4 core processor which is nothing to write home about...

        If you completely redefine the problem, you can shoehorn in an "adequate" processor "that is nothing to write home about."

        Got it.

        Do you mean that since the 2007 AMD Barcelona cores, all the way to the 2017 AMD Zen cores, users had only one choice and that is Intel?

        I myself conceded that these machines cannot work as servers or workstations (gaming included). You quoted me on that (I made it bold for your convenience).

        Desktop applications can be something like browsing the web, and writing emails and memos in an office. A Core i9 or a Threadripper would be a waste (of money and energy) in these situations.

        • by DRJlaw ( 946416 )

          Do you mean that since the 2007 AMD Barcelona cores, all the way to the 2017 AMD Zen cores, users had only one choice and that is Intel?

          Irrelevant to this issue. However, if you're excluding rather than including the Zen cores, then basically yes. 2009-17 was Intel's oyster. But that's as far as I'm willing to digress from the original topic, so I don't recommend that you attempt to pursue this line of argument any further.

          I myself conceded that these machines can not work as servers or workstations (gaming

      • Yup, I saw it a few days ago. Interesting, even in the boring monotone of Gamers Nexus.

        Waiting for the review of the 8 core variant with the non-S3 graphics.

    • Comment removed based on user account deletion
      • by AmiMoJo ( 196126 )

        You do that, just don't come crying to us in a few years when suddenly Chinese x86 chips are taking over the budget end and even the performance side is seeing competition.

    • by hawk ( 1151 )

      One would be *insane* to put such a chip into a computer that even *might* handle sensitive data.

      These are for internal ChiCom use, for their clients, and for damn fools (regular fools will not use them).

      hawk

  • AMD is useless as they black screen all the time due to lack of stable drivers. You always go with Intel CPUs and Nvidia GPUs if you are a serious gamer or PC enthusiast, as that is common sense.

    • Comment removed based on user account deletion
    • by Khyber ( 864651 )

      What lack of stable drivers? My 5700XT had no issues out of the box and remains operational.

    • by rl117 ( 110595 )

      This is nonsense. I've been running AMD GPUs with a mixture of Intel and AMD CPUs over the last 15 years from the HD4850 to the 580X on Vista to Windows 10, and they have all been absolutely fine throughout for both Windows and Linux for normal and gaming use.

      The ATI drivers had a bad rap back in the '90s and early 2000s. Maybe it was justified; I don't know myself since I wasn't using their stuff back then (having a Matrox G400), but I seriously doubt that the same degree of criticism holds true today.

  • Which is probably why Intel is not ultra performance-focused and instead focused on scale and power usage... also max bugs in their chips, heh. Well really, we're probably using machine learning to find bugs faster or some such.
  • Wait a minute... (Score:5, Informative)

    by msauve ( 701917 ) on Sunday April 26, 2020 @10:41PM (#59994740)
    Intel won in drivers? It seems it was just in the past year where they removed their archives of "old" drivers from their website, as if they couldn't afford the storage space. That should be considered. If you want me to buy your new stuff, support your old stuff.
    • Intel won in drivers? It seems it was just in the past year where they removed their archives of "old" drivers from their website, as if they couldn't afford the storage space. That should be considered. If you want me to buy your new stuff, support your old stuff.

      I'm not sure why you think Intel's support for older drivers has anything to do with *other companies* supporting and optimising for Intel, and the generally higher quality of Intel's driver releases.

      Seriously AMD, how many drivers and AGESA releases do you go through on each new product before people's PCs actually boot consistently and run stably?

    • Intel won in drivers? It seems it was just in the past year where they removed their archives of "old" drivers from their website, as if they couldn't afford the storage space. That should be considered. If you want me to buy your new stuff, support your old stuff.

      Yeah, because pretty much every OS has those old archived drivers baked in now.

  • I had been thinking about going with a Core i7 for the slight edge in gaming performance, but with the price advantage and overall performance advantage going to the Ryzen 7, it was clear what the better option was. So far, I'm really enjoying it.

    LK

  • The more they stay the same. Except, nothing really has changed here in the last 20+ years. Although AMD occasionally managed to outperform Intel for periods of time, Intel would continue to claim the lion's share of years at the top of the list. And AMD was always the value leader.

    I'm glad to see that as a company, AMD is performing better financially. Intel needs a strong competitor to keep them on the right innovation track. But Intel has way too much depth in their bench to allow AMD a chance to dominat

  • I find that of the rounds where Intel wins, one stands above all:
    "AMD has been beset by issues with its CPU chipset drivers and graphics drivers of late, which is a natural byproduct of its limited resources compared to its much-larger rivals. Intel isn't without its missteps on the driver front, but the company's reputation for stability did help earn it the top spot in the processor market, particularly with OEMs."
    Forget OEMs, for every penny I have saved in AMD builds, I have paid for it in frustration

  • by Inglix the Mad ( 576601 ) on Monday April 27, 2020 @02:28PM (#59997408)
    Intel and AMD swapping the performance crown is a good thing in my mind. This keeps both companies on their toes, but Intel has more resources to cover for when they screw up. AMD typically gets near bankruptcy when they really mess up.

    Back in the 486 days, AMD procs were good chips, you could get a DX4-100 that was a decent chip for a good price. Intel's chips were good, but the price was higher. We won't go into Cyrix and others.

    However, AMD's K6-2/3 chips were -expletive- for new builders. They were also a PITA for OEMs. They just ran too hot, and the heatsink was way too easy for new builders to screw up. I can't count how many came back to the store, even with a warning (and offer to mount on the board) about the processor / heatsink. This was just a poor design, and the thermals were terrible. At the time, if you ran an Intel chip with an improperly mounted heatsink, the system would start glitching and shut down. MUCH BETTER and newbie friendly.

    Intel rode the dominance for a few years with the newer Pentiums until the Pentium D... aka the space heater. That CPU was just an inefficient way to turn electricity into heat, terrible. However, for many years, the Pentiums were usually better... but more expensive. AMD's were competitive, but really you bought them for the price point or because you were a fanboy.

    AMD came back with gusto, targeting smaller makers like the one I worked for, using the A64 architecture. Intel was farked in comparison, and could only beat the A64 in a few very specific tests, at a much higher price point. Yep, Intel had their back to the wall. The Athlon 64, and any decent A64 board, would run circles around an Intel system in gaming and many other uses.

    Intel got lucky, breaking out of their design dead-end later thanks to their laptop chip design team quite accidentally developing a kick-butt CPU that could be scaled to desktops. The Core series of processors was just head and shoulders above anything AMD had at the time. Yes, it literally ran rings around AMD, and then dunked on them to be safe. Intel also released the Corporate Stable Model (CSM) boards around the same time. Boring to a gamer, CSM boards were welcomed by IT departments and system builders

    I kind of fall out of the game here in comparison. I quit working part-time for the PC maker, and started flying more as I had more free time. Suffice to say, I think Intel and AMD leapfrogging each other is a good thing. Both companies have had their -expletive- CPU's (and other products), but by and large they've both done fairly well. My latest Intel machine is a few years old, and I've been looking to replace it for gaming. Should I get around to it this year, although I don't have a new AAA title to aim for yet, looks like it will be an AMD this time around.

    I have no problems with this, as I am not a zealot for either. I thought the Athlon64-3200 I built was awesome, and it held up well because my brother is now using it in the original Lian Li case with the original motherboard, raptor 10k's, and 6800 Ultra. On the flip side my Core i7 (original series) is still running at home as a guest computer, and works flawlessly with the "new" solid state drive I installed in it after the other drive croaked.

    The only part of AMD I don't really care for is the ATI/Graphics division. I've always had issues with ATI cards and 3D gaming. Not that the hardware is terrible, it's usually fairly good, but the driver issues drive me nuts. I've gotten Nvidia chip cards to replace some ATI chip cards, just because I got sick of the driver issues. However it has been over 3 years, might be time to give them another try. (please, oh please, don't f**k up the drivers)

    Best of luck everyone! Enjoy what you buy!
  • Then what's the point? All of these gorgeous synthetic benchmarks fall flat on their faces when it comes to light that Intel was lazy yet again with some part of their core design, requiring a predictive pipeline to be disabled for security reasons. If you're OK with a significant chance of becoming a gigantic fireball, there are redeeming qualities to a Pinto's performance too!

"If I do not want others to quote me, I do not speak." -- Phil Wayne

Working...