Intel

MSI Leaks Intel 14th Gen Core Specs, Confirms It's 3% Faster on Average Than 13th Gen (videocardz.com) 56

An anonymous reader shares a report: MSI mistakenly shared an unlisted video, which has now leaked out. The leak is a product training video showing MSI's latest Intel 700-series motherboards and upcoming PC cases. While the majority of the video focuses on improvements to these motherboards, one slide briefly outlines Intel's next-gen Core series. The slide (check the linked news post) confirms that Intel's 14th Gen Core series will see no major core-count upgrades. Only one of the upcoming K-series CPUs will see a change to its hybrid configuration of Performance and Efficient cores.

The Core i7-14700K is getting 12 Efficient cores, an upgrade over the 8 E-cores of the Core i7-13700K. The Core i9-14900K, now confirmed by MSI as well, will use the same 8P+16E configuration. The same applies to the Core i5-14600K with its 6P+8E config. [...] The CPU series will use the same Intel 7 process technology and will only add higher DDR5 frequency support. The company confirms that Intel 14th Gen Core is 3% faster on average compared to the current-gen series. The most important upgrade involves the Core i7-14700K, which has up to 17% faster multi-threaded (MT) performance thanks to the extra cores.
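
For a rough sense of what those configurations amount to, here is a minimal Python sketch (not taken from the leak itself: the 13th Gen figures are Intel's published specs, the 14th Gen figures are from the leaked slide, and the thread counts assume Hyper-Threading on P-cores only, as on current hybrid parts):

    # Hybrid core configurations as (P-cores, E-cores).
    configs = {
        "i5-13600K": (6, 8),  "i5-14600K": (6, 8),
        "i7-13700K": (8, 8),  "i7-14700K": (8, 12),
        "i9-13900K": (8, 16), "i9-14900K": (8, 16),
    }

    for name, (p, e) in configs.items():
        cores = p + e
        threads = 2 * p + e  # only P-cores are Hyper-Threaded
        print(f"{name}: {p}P+{e}E = {cores} cores / {threads} threads")

On those numbers, only the i7 tier actually changes, going from 16 cores / 24 threads to 20 cores / 28 threads, which is where the claimed "up to 17%" multi-threaded gain comes from.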

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward
    Maybe they're focusing on security improvements rather than speed.
  • by ddtmm ( 549094 ) on Thursday August 24, 2023 @10:09AM (#63793310)
    Pretty sure the price increase won't be just 3%.
    • This is sad, but true.
    • Re: (Score:3, Informative)

      by KiloByte ( 825081 )

      A spec increase of just 3% is misleading.

      Haven't bought x86 hw for a while, but here's an example from early 2019: at the time, The Linus and I both bought/got a fat 64-way Threadripper, a few months apart, so his was a slightly newer model (2990 vs 3950). On paper they have identical specs, with just a 3% performance diff claimed.

      But that's only if your marketing dept is so stuck on benchmarks or "TCO" as to be unable to find other issues.
      On my piece, memory channels are wired to only two out of four chip

    • by Moof123 ( 1292134 ) on Thursday August 24, 2023 @12:00PM (#63793600)

      I was expecting a lot Moore than 3%...

    • Even if the difference between Intel 13th and 14th gen were ONLY 3% (which it's not, it's more), that's still an entirely new purchase. (On a usually non-upgradable laptop with a soldered-in CPU.)

      $$$$$$$$$

      (I mean, the same is true of every CPU upgrade, ahahahha.)

  • Gotcha.

    Anyway, I'm a Ryzen fanboy now.

    • by AmiMoJo ( 196126 )

      I've been waiting for a really good Ryzen laptop for years. I'm hoping this year Lenovo does one because so far they have all had deal-breaking issues.

      If you go Intel it's no problem to get a couple of Thunderbolt ports, a couple of NVMe slots, a couple of DIMM sockets, and a reasonable price. Every Ryzen laptop falls short somewhere, usually a single USB4 (Thunderbolt) port, or soldered RAM because Ryzen really needs high RAM clocks.

      I don't want to get an Intel because they run hot and Intel suck for a var

      • I'm sure you have already assessed this, but if not, AFAIK this is the ideal geek Ryzen laptop:
        https://frame.work/products/la... [frame.work]

        • meant to link the 16 inch
          https://frame.work/products/la... [frame.work]

        • by AmiMoJo ( 196126 )

          Yeah, I took a look at it. It very nearly hits all the marks, only missing the dual NVMe capability.

          The ports are decent, although the USB4 ones are only the top two which is a shame. If you want to use them both for USB4, you end up with a cable sticking out of either side. If they were both on the same side it would be a lot easier.

          Minor, I know. And maybe not such a big deal as you can have a USB3.2/DP port in one of the lower slots.

          The 16" version might be a better option for me. The only issue I can se

      • by jiriw ( 444695 )

        I bought an ASUS ROG Strix Scar two years ago for a little over 2K euro: AMD Ryzen 5900HX with 2 M.2 slots, and the USB-C 3.1 Gen 2 port does support 4K HDMI with a Vention USB hub I bought for a couple of tenners. (In Device Manager the monitor seems to be directly connected to the Nvidia 3070 mobile in there.) With the included HDMI port you can drive 2 external 4K monitors. Don't know if those specs are good enough for you but they are for me :P It doesn't have soldered RAM, though, and I'm glad for tha

        • by AmiMoJo ( 196126 )

          Thanks. Even the 2023 model doesn't have any USB4 ports though. Lack of USB4 is a real deal-breaker for me.

      • or soldered RAM because Ryzen really needs high RAM clocks.

        Also because LPDDR has no non-soldered form factor yet. CAMM should change that when it's finalised.

        • by AmiMoJo ( 196126 )

          Ah yes. Still, I'd rather trade some battery life for RAM sockets.

          • Yeah depends on mobility (weight, size) too.

            Sockets certainly improve longevity. I recently upgraded my ThinkPad W510 to the maximum (reasonable?) amount of RAM, 32GB, which is still more than respectable for day-to-day tasks. It's a 13-year-old machine, so not bad that it's still going!

            It's a massive brick though. If I'm to be traveling around a lot I'd opt for portability over longevity, TBH.

            I'm glad CAMM is coming. Best of both worlds.

            • by AmiMoJo ( 196126 )

              I don't travel much these days... I think my next one will be heavier than my current ultrabook, an NEC.

              Probably either a Framework 16 or one of the larger Thinkpads. The problem with Thinkpads is there is only one good time of the year to buy them, which is around Black Friday. Without the big discounts they are very expensive.

              • I don't travel much these days... I think my next one will be heavier than my current ultrabook, an NEC.

                Me neither. My SO does, and if I did, well, weight is probably my #1 priority. I've managed to mitigate most of the problems with my bad back, but carrying anything on my shoulders, unless it's in a very good camping- or hiking-type backpack, sucks for me.

                The problem with Thinkpads is there is only one good time of the year to buy them, which is around Black Friday. Without the big discounts they are very expensi

                • by AmiMoJo ( 196126 )

                  I'm starting to lean towards the Framework. Last year Lenovo Japan did a Black Friday deal where the RAM and SSD upgrades were priced about the same as buying and fitting them yourself, but there is no option to buy the machine without RAM and SSD. You have to at least get the lowest-spec option, even if you don't have a pre-installed OS.

                  It might be for support reasons, as in they require you to fit the original parts and show that the fault persists before doing any warranty work. Which is fair enough I guess,

                  • Irritating, but I don't think it's warranty related, having invoked the onsite repair.

                    The machine in question had a disk upgrade. Before calling the repair in, I ran their diagnostic tools and wrote it up thoroughly enough that the repair person went straight for the motherboard swap. I think they have spares of every replaceable part anyway because they don't know what it might be until they get there and the diagnostics all run booting off USB sticks.

                    • by AmiMoJo ( 196126 )

                      I do worry a bit about having only one primary machine. Last year my PSU died and I was able to swap it out very quickly with one from a local supplier. Now with USB C charging at least the PSU part is replaceable, but a laptop motherboard fault is likely to mean a few days of downtime at minimum.

                      I should make the effort to keep a backup system reasonably up to date.

                    • Yep!

                      Currently I've got a decent desktop, an old laptop and an old desktop in bits.

                      My general solution is to use Borg to backup my laptop, desktop and SO's laptop to a spinning disk in the desktop and sync that with a cloud provider using rclone. So I have full on and off site backups.

                      Setting up a new machine, or the equivalent, is very easy. The vast majority of the time is spent restoring a snapshot, which in practice means a fair amount of time is spent trying to remember how to set up a second interface
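
                      A minimal sketch of that sort of pipeline in Python, assuming borg and rclone are already installed and configured; the repo path, source list and rclone remote below are hypothetical stand-ins, not the actual setup described above:

                        import subprocess

                        # Hypothetical locations -- substitute your own.
                        BORG_REPO = "/mnt/spinning-disk/borg-repo"    # Borg repository on the desktop's HDD
                        SOURCES = ["/home/me"]                        # directories to back up
                        RCLONE_REMOTE = "offsite:backups/borg-repo"   # cloud destination configured in rclone

                        def backup_and_sync():
                            # Create a new deduplicated archive in the local Borg repository.
                            subprocess.run(
                                ["borg", "create", "--stats",
                                 f"{BORG_REPO}::{{hostname}}-{{now}}", *SOURCES],
                                check=True,
                            )
                            # Mirror the whole repository to the cloud remote for the off-site copy.
                            subprocess.run(["rclone", "sync", BORG_REPO, RCLONE_REMOTE], check=True)

                        if __name__ == "__main__":
                            backup_and_sync()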

                    • by AmiMoJo ( 196126 )

                      I've put a lot of effort into making everything as portable as possible, so reinstalls are easier. Some stuff can be scripted, but I only really use that for settings. I'm probably very out of date with it, I know a lot of people use OS images and share them between machines.

                    • I know a lot of people use OS images and share them between machines.

                      That sounds like a bit of a faff. I certainly don't reinstall that often, and half the time it's driven by an OS being out of support, so that wouldn't help much either.

                      Mostly I've got my files backed up just in case, and got my desktop config in git, with my bashrc, fvwm2rc and start scripts, so all I need do is clone that, install a few packages from apt and I'm good to go. Recently I've been trying out XFCE instead of my usual FVWM jus
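
                      A bootstrap along those lines, sketched in Python; the repo URL and package list here are hypothetical placeholders rather than the poster's actual setup:

                        import subprocess
                        from pathlib import Path

                        DOTFILES_REPO = "https://example.com/me/dotfiles.git"   # hypothetical repo
                        PACKAGES = ["git", "fvwm", "xfce4"]                     # hypothetical package list
                        CLONE_DIR = Path.home() / "dotfiles"

                        def bootstrap():
                            # Clone the dotfiles repo and pull in the handful of needed packages.
                            subprocess.run(["git", "clone", DOTFILES_REPO, str(CLONE_DIR)], check=True)
                            subprocess.run(["sudo", "apt", "install", "-y", *PACKAGES], check=True)
                            # Symlink tracked dotfiles (.bashrc, .fvwm2rc, ...) into $HOME.
                            for dotfile in CLONE_DIR.glob(".*"):
                                if dotfile.name == ".git":
                                    continue
                                target = Path.home() / dotfile.name
                                if not target.exists():
                                    target.symlink_to(dotfile)

                        if __name__ == "__main__":
                            bootstrap()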

      • by gweihir ( 88907 )

        I bought a Lenovo E14 with a Ryzen a while ago for teaching. No complaints. It can even do not-too-demanding gaming. RAM is in a slot and there is a second long NVMe slot that is empty. No Thunderbolt, but I really do not want that anyway.

        • by AmiMoJo ( 196126 )

          I'm keen on Thunderbolt because I want to use it as a desktop replacement most of the time. Two 4k monitors, loads of USB, 2.5Gb or even 10Gb ethernet. I'll get a couple of docks. Maybe down the road look at an eGPU if needed, although the Framework 16 has the option to fit one right into the case.

          • by gweihir ( 88907 )

            You are looking in the wrong place there. That is the job of a real computer with PCI-E slots, not some laptop. "Desktop replacement" only works if you have low requirements.

            • by AmiMoJo ( 196126 )

              Maybe... But it seems like a laptop with iGPU is powerful enough, in the right configuration. An Intel model would be fine.

      • I have an HP convertible laptop with 2 USB Thunderbolt ports and 16GB of RAM (don't know if it's soldered).
        But anyway, I find that HP is currently the best laptop maker regarding bang for the buck and longevity.

        • by AmiMoJo ( 196126 )

          I just dislike HPs, having had a series of crappy ones from employers. Their high end ones are probably really good.

      • In my experience, simply having been made by Lenovo is sufficiently "deal-breaking".

        The temps on the last two generations of Intels are frikkin absurd. 100 degrees Celsius is not okay on a laptop. I've no idea why they thought they could do that.

        • by AmiMoJo ( 196126 )

          That's pretty much just Intel for you though. Unless you buy a much thicker laptop, like a gaming one, they run insanely hot.

          Their AMD thin ones are pretty good on thermals.

    • There's no question that Ryzen is good. But their major advantage is TSMC having a smaller process node than Intel. The gap is smaller than before, but it's still there, and it makes a huge difference in speed and efficiency.

      Intel was way behind until the 10th generation, and only got down to a 7nm equivalent with the 12th generation. TSMC is already making Ryzen CPUs on a 4nm node.

      TSMC is doing some amazing work - supplying Apple for all of their iPhones and Macs, making all the Ryzen stuff, all the NVIDIA stuff. An

  • Is that before or after all the security patches to fix all the ways they've tried to boost speed for several generations of chips? I'll go with AMD, but thanks for the info.
  • by xack ( 5304745 ) on Thursday August 24, 2023 @10:33AM (#63793366)
    Technically we are still shrinking transistors, but only one company can still make the fabbing equipment due to the economics of complexity. But we are effectively getting slower YoY thanks to new security patches, modern apps being embedded web browsers with gigabytes of RAM usage, plus operating systems in general just being vehicles for subscriptions these days. People are too addicted to dopamine levers to actually make progress, so the new Moore's law is that enshittification doubles every six months.
    • Technically we are still shrinking transistors, but only one company can still make the fabbing equipment due to the economics of complexity.

      Economics of wha? Not quite sure what the MBA-speak is implying, but the investment into shrinking transistors is either worth it or it is not. There isn't even a question of worrying about demand, so the math should be far easier than what most businesses have to endure to justify the investment.

      But we are effectively getting slower YoY thanks to new security patches, modern apps being embedded web browsers with gigabytes of RAM usage, plus operating systems in general just being vehicles for subscriptions these days. People are too addicted to dopamine levers to actually make progress, so the new Moore's law is that enshittification doubles every six months.

      Won't disagree too much with you there. The ad-infected UI devolved much like the No Child Left Behind classroom did: to the lowest common denominator. For electronics that meant marketing to toddlers to sta

      • The economics of fabs are a whole subject in themselves. In general the margins are low, and the required investment is huge for each node shrink. If you don't "win" the next node you have very high sunk costs and an empty fab. Fabs only turn a profit when they are substantially full, so you actively burn cash just to stay in place, starving yourself of the funds needed to compete for the next smaller node.

        Global Foundries threw in the towel and stopped chasing the smaller nodes during the move to 7 nm a few years back, pointi

        • What happens 10-20 years from now when the entire planet has decided to simply stop chasing, and all modern portable electronics are fully reliant on a single company that just suffered massive attacks against its main processing plants by some nutjob off the chain who just wanted to screw with THE global supply provider for shits and giggles?

          That scenario is not even diving into the real politics of needing TSMC to remain a corporate Switzerland no matter what. TSMC suddenly finding a reason to stop

    • But we are effectively getting slower YoY thanks to new security patches, modern apps being embedded web browsers with gigabytes of RAM usage, plus operating systems in general just being vehicles for subscriptions these days.

      [Citation Required]. Hitting the browser icon and clicking Slashdot is still orders of magnitude faster than doing the same thing 10 years ago, or even 20 years ago.
      Even my pathetic work laptop has no issue opening a browser or Word/Excel in just a couple of seconds, which, since I don't wear rose-coloured glasses, is massively faster than opening Office 2007 back in the day, or firing up Mozilla or a v2.x Firefox.

      Now if you were one of those people who spent a fortune on RAM back 10 years ago and insis

    • Technically we are still shrinking transistors

      Not even that any more. The nanometer figure given for the process has not actually been the gate size for a very long time. They are "process nodes", given in pseudo-nanometer gate sizes as marketing terms, with zero relation to any actual gate pitch or length. The increase in transistor count per chip is due more to tighter packing (eliminating wasted space) and stacking than to actual gate shrinkage. They keep using smaller pseudo-gate sizes simply as a marketing term to try and convin

      • Agreed. It would be nice to have a better metric for CMOS nodes. Some standard hunk of Verilog code and what the resulting area consumption is, all-in (power busses, routing, etc). Maybe like how many ARM XYZ cores per mm^2, and the resulting clock speed for a fixed number of retiming stages. Gate length was already pretty useless, now it is just marketing mumbo jumbo. I realize there are too many variables, but maybe even just the minimum gate/flop density of the standard cells. X um^2 per invert/NAN

    • 486 DX2-66
      16MB
      2x256MB hard drives
      Windows for Workgroups 3.11

      From the time I double-clicked on the MS Word 2.0 icon, to the time I could type?

      Less than a second.

      Things have never been that good since.

    • Being a retro enthusiast, it never ceases to amaze me how little power is actually required to do what we really need computers to do.

      But hey, animated special effects and dark patterns go BRRR...

  • I'm seeing that on servers too: they are still not able to catch up to AMD. Not only that, ARM solutions are starting to catch up as well (and I am not referring to Apple Silicon, which is way faster but does not really compete directly with AMD/Intel).

    My last cloud server performance comparison [dev.to] included Intel's latest Sapphire Rapids and it was pretty much toe to toe with AMD's previous gen (Milan). And Amazon's own Graviton3 was at their heels for peak single core performance (or even ahead for multi-threa

    • I'm evaluating AMD's next gen (Genoa), which is in preview with some providers, and while it's under NDA and you won't hear much about it yet, there is an actual per-core improvement (in addition to the increased number of cores), unlike the sort of 3% gains Intel has been managing lately...

      It may interest you to look at the recent Phoronix benchmarks of the new Epyc Genoa based AWS EC2 instances here: https://www.phoronix.com/revie... [phoronix.com] These AMD chips are super fast.

  • I am not sure all that speed is for me. It might encourage an increase in bloatware. What I mean is, it may turn out like my annual gym membership: I would not use it except as an excuse to eat more donuts because I'm going to the gym.

  • Still running happily on a 10700F and will be for a while yet. I have a Ryzen 9 3900x that was relegated to server use because the cooling solutions were just too noisy and/or complicated. Still have a bunch of FX-8350 Bulldozers doing useful work here and there.

    When someone tames price/performance to the point where it is useful to upgrade some stuff, maybe I'll pay attention then.

  • This announcement is so boring I fell asleep on the spot and hit my head on the desk.

  • That might be in the measurement noise range.
  • So no net gain. They need faster processors because no one can code properly anymore. All applications are floundering; the skills and knowledge are lost. It's just going to degrade from here. We've passed the peak (about '86).
  • This was/is a token processor, only released so that Intel can claim they released a new processor, while helping manufacturers get rid of 1700 mobos and chipsets. And you know what? This is 100% OK. Manufacturers need to get inventory levels in check for the market to be healthy, and if this helps achieve that, so be it.

    In any other timeline, this would have been catalogued as a mask revision of the previous chip, not as "next gen".

    I am chugging along with a 6 core 12 threads 8th gen core, and I s
