Intel's Horrible Quarter Revealed an Inventory Glut and Underused Factories (cnbc.com)

An anonymous reader quotes a report from CNBC: Intel's December earnings showed significant declines in the company's sales, profit, gross margin, and outlook, both for the quarter and the full year. [...] In short: Intel had a difficult 2022, and 2023 is shaping up to be tough as well. Here are some of the most concerning bits from Intel's earnings report and analyst call: Intel didn't give full-year guidance for 2023, citing economic uncertainty. But the data points for the current quarter suggest tough times. Intel guided for about $11 billion in sales in the March quarter, which would be a 40% year-over-year decline. Gross margin will be 34.1%, a huge decrease from the 55.2% in the same quarter in 2021, [CEO Pat Gelsinger's] first at the helm. But the biggest issue for investors is that Intel guided to a 15 cent non-GAAP loss per share, a big decline for a company that a year ago was reporting $1.13 in profit per share. It would be the first loss per share since last summer, which was the first loss for the company in decades.
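
As a quick sanity check on the guidance math above, here is a minimal C sketch using only the figures quoted from the report (the ~40% decline and $11 billion guide together imply the year-ago quarter):

    #include <stdio.h>

    int main(void) {
        double guided_sales = 11.0;   /* March-quarter guidance, $B (from the article) */
        double yoy_decline  = 0.40;   /* guided ~40% year-over-year drop */
        /* Implied year-ago quarter: 11.0 / (1 - 0.40) ~= $18.3B */
        printf("Implied year-ago quarter: ~$%.1fB\n",
               guided_sales / (1.0 - yoy_decline));
        /* Per-share swing: -0.15 guided loss vs. 1.13 year-ago profit */
        printf("EPS swing: $%.2f per share\n", -0.15 - 1.13);
        return 0;
    }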

Management gave several reasons for the tough upcoming quarter, but one theme that came through was that its customers simply have too many chips and need to work through inventory, so they won't be buying many new chips. Both the PC and server markets have slowed after a two-year boom spurred by remote work and school during the pandemic. Now, PC sales have slowed and the computer makers have too many chips. Gelsinger is predicting PC sales during the year to be around 270 million to 295 million -- a far cry from the "million units-a-day" he predicted in 2021. Now, Intel's customers have to "digest" the chips they already have, or "correct" their inventories, and the company doesn't know when this dynamic will shift back. "While we know this dynamic will reverse, predicting when is difficult," Gelsinger told analysts.

Underpinning all of this is that Intel's gross margin continues to decline, hurting the company's profitability. One issue is "factory load," or how efficiently factories run around the clock. Intel said that its gross margin would be hit by 400 basis points, or 4 percentage points, because its factories are running below full load amid soft demand. Ultimately, Intel forecasts a 34.1% gross margin in the current quarter -- a far cry from the 51% to 53% goal the company set at last year's investor day. The company says it's working on it, and the margin could get back to Intel's goal "in the medium-term" if demand recovers. "We have a number of initiatives under way to improve gross margins and we're well under way. When you look at the $3 billion reduction [in costs] that we talked about for 2023, 1 billion of that is in cost of sales and we're well on our way to getting that billion dollars," Gelsinger said.
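
The basis-point arithmetic is easy to check; a small C sketch using only the numbers above:

    #include <stdio.h>

    int main(void) {
        double guided_margin   = 34.1;   /* % forecast for the current quarter */
        double factory_load_bp = 400.0;  /* basis-point hit from under-loaded fabs */
        /* 400 basis points = 4.0 percentage points */
        printf("Margin without the factory-load hit: %.1f%%\n",
               guided_margin + factory_load_bp / 100.0);   /* 38.1 */
        printf("Gap to the 51%% investor-day floor: %.1f points\n",
               51.0 - guided_margin);                      /* 16.9 */
        return 0;
    }
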
The bright spot for Intel: Mobileye, its self-driving subsidiary that went public during the December quarter. According to CNBC, the company reported earnings per share of 27 cents and revenue growth of 59%, to $656 million. "It also forecast strong 2023 revenue of between $2.19 billion and $2.28 billion," the report adds.

Comments:
  • by iAmWaySmarterThanYou ( 10095012 ) on Friday January 27, 2023 @09:14AM (#63244423)

    Global economic growth is down. Inventories rising everywhere. This is not an Intel specific event.

    But they have an additional problem as a CPU maker that's only getting worse every year. Most people can easily get by with a 5-10 year old CPU for daily use. Email, light surfing, document creation, small office spreadsheets, etc. don't need a 24-core 3 GHz CPU. Yes, my pedantic OCD friends, there are always some people with unlimited CPU needs, but they're not the core market.

    Step outside an office and look at the gamers as the next largest buyers. For many years gaming has been about the GPU first; the CPU just needs to be sufficient for most games. Great for Nvidia, nice for AMD, meh for Intel.

    I used to get a new cpu/system every year or two. Now I look at Newegg out of old habit and can't justify any of it.

    This trend is only going to get worse as CPUs continue to get faster but the average buyer's needs don't keep pace.

    • by kalpol ( 714519 )
      It's true. I'm still flogging an AMD FX-8350 that is just fine for everything I've needed it for so far. It runs hot, but with a big old radiator on it I'm getting my money's worth. And as I recall from last year and the year before, Ryzen processors were the thing to have anyway, not Intel.
      • I just sold my FX-8350 system to someone who lost their home server in a landslide-related outage. With 32GB of very fast RAM (for DDR3 anyway) it is surprisingly peppy. My Ryzen 5 1600AF (continuing my tradition of using $100 CPUs) has significantly better single thread performance, but the multithread performance is surprisingly not that much better. I got every penny out of that FX — including roughly 50% of the purchase price of MB, CPU and RAM back out at sale time, I kept all of the storage devi

        • by MobyDisk ( 75490 )
          Despite being a gamer and a developer, I've been using an i5-4570 which was first launched in 2013! I just kept upgrading the video card and SSD. That setup got 50-70fps playing modern games like RDR2, Witcher 3, and Midnight Suns. Interestingly, games got the same frame rate on "Low" and "Ultra" settings since it was CPU-limited. I finally upgraded to an AM5 setup.
          • Yes, i5-4590 here. Finally upgraded to 1440p and a 2019 GPU rather than my 2014 GPU.

            I had everything ordered for a new, 13th gen Intel build. And then I realized "Why?". And took everything back to the UPS store.

            My plan is to move to AM5 when the next-gen AMD CPUs come out and 7600 prices drop. And motherboard prices drop. And DDR5 prices drop.

            And then perhaps move to the beefiest reasonable CPU from the last AM5 gen, whenever that comes out. And then wait 10 more years until I can build again.

            • by MobyDisk ( 75490 )
              I endorse your plan. I game at 1080p, but on a 144fps monitor. I jumped at a 7700X because Micro Center offered $20 off a motherboard and a free 32GB of DDR5-6000 RAM (G.SKILL Flare X5, CL36). So that $344 processor became $344-$160=$184. The 13th-gen Intel still looks good, but that sale made the AMD a winner. I also love how AMD kept AM4 around for so long. If Intel had that philosophy, we would probably have upgraded our 4th-gen chips. I expect to be able to do that with my AM5 chip.
      • by gweihir ( 88907 )

        Indeed. Just replaced an FX8350 with a Ryzen 9 7900X. Much better value for money overall than Intel.

    • by DarkOx ( 621550 )

      Well, the average user isn't using a 24-core 3 GHz CPU for those things now; they're either using much older kit or they have moved on to an ... iPad Pro.

      There is a huge market in replacing the still-common breadbox PC, with all its bulk and fan noise, across millions of corporate cubicle spaces and older units under desks in homes. The problem Intel and probably AMD have or will have in the near future is

      1) Relatively high-end parts (as compared to other makers of CPUs, not, say, i9 vs i5) make up a large portion of t

      • There is a huge range of efficiency, measured in operations per watt-hour, across the same architecture. Chips running at higher clock rates tend to use disproportionately more power while providing more absolute performance.
        So compact and energy efficient vs. maximum performance is mostly a choice of the manufacturer. Both Intel and AMD have CPUs in the 15W class, in AMD's case even with semi-decent graphics performance.
        Currently, the typical desktop is more of a power hog, but Intel/AMD could produce boards
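
        The clock-vs.-power tradeoff can be made concrete with the usual first-order model: dynamic power is roughly P = C * V^2 * f, and since supply voltage must rise roughly with frequency, power grows close to f^3. A minimal C sketch; the 15 W-at-2 GHz baseline and the cubic exponent are illustrative assumptions, not measurements of any real chip:

            #include <stdio.h>
            #include <math.h>

            int main(void) {
                const double base_f = 2.0;   /* GHz, assumed baseline clock */
                const double base_p = 15.0;  /* W, assumed baseline power */
                for (double f = 2.0; f <= 5.0; f += 1.0) {
                    double p = base_p * pow(f / base_f, 3.0); /* cubic scaling assumption */
                    printf("%.0f GHz: ~%5.1f W  (%.2fx perf for %.2fx power)\n",
                           f, p, f / base_f, p / base_p);
                }
                return 0;
            }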

    • Indeed, this is why I'm still running a 13 year old computer that I put together myself. It was a decent computer at the time (i7 920, 6GB of RAM, NVIDIA GTX260) but even now it does all I need. The only improvements I put into it were 6GB of extra RAM because it was cheap and a slightly better video card Radeon 7770) because the original card died on me. I run Linux as my main OS (dual booting to Windows 10 when needed) so it's still a pretty peppy system. About the only thing I can't do is run current
      • Indeed, this is why I'm still running a 13 year old computer that I put together myself. It was a decent computer at the time (i7 920, 6GB of RAM, NVIDIA GTX260) but even now it does all I need. The only improvements I put into it were 6GB of extra RAM because it was cheap and a slightly better video card Radeon 7770) because the original card died on me. I run Linux as my main OS (dual booting to Windows 10 when needed) so it's still a pretty peppy system. About the only thing I can't do is run current games, but I have consoles for that. I'm sure I'll have to upgrade eventually, but that most likely won't be for another 5 or so years (or when components start failing). The requirement/need to upgrade your computer every 5 years is long gone unless you're running some specialty applications or are a gamer.

        Amazingly, I did the same, except I started with 12 GiB of RAM as a way to future-proof it: I didn't know how long compatible RAM would be available. I am still using an Nvidia card. I put it in a full-size quiet case; what did you use to contain it?

        A couple of years ago I decided that 12 GiB wasn't enough for the image processing I wanted to do, so I built a replacement with 256 GiB of RAM. I still run the old system, though it is usually headless and just drives the storage tower that holds the backup di

        • >> I put it in a full-size quiet case, what did you use to contain it?

          I have a full sized case, I can't remember the manufacturer (Apevia maybe?). One thing I've learned is that while blue LEDs were cool when they first came out, they're kind of bright and obnoxious now. My next case will be a little more on the functional side and less on the 'cool' side.
          • I put it in a full-size quiet case, what did you use to contain it?

            I have a full sized case, I can't remember the manufacturer (Apevia maybe?). One thing I've learned is that while blue LEDs were cool when they first came out, they're kind of bright and obnoxious now. My next case will be a little more on the functional side and less on the 'cool' side.

            Good choice. My case is a Nexus with no lights at all. My new computer is in a Be Quiet case with LEDs, but they can be turned off.

    • We had growth of 2.9% last quarter. That's pretty good, actually. A few people want to make the economy slow because they either think it'll help inflation or they want to destroy worker bargaining power, or both.
      • Read this and see if you feel the same. Details matter. A lot.

        https://www.foxbusiness.com/ma... [foxbusiness.com]

        • Hmm yes Fox Business and Larry there would always find something

          • Did you bother reading it, or just go ad hominem after glancing at the URL?

            But feel free to invest your own money like the economy is doing great and getting better, and ignore factual financial business data because you don't like the source URL.

            • Did you bother reading it, or just go ad hominem after glancing at the URL?

              I did click the link, saw who it was, and decided that I don't need to listen to what Larry "there's no recession coming" Kudlow has to say about recessions.

            • I read it and it's pointless fear-mongering. It's just talking about the economy slowing down. The head of the Federal Reserve keeps cranking up interest rates to slow the economy down. They're measuring the delta of a delta to get the numbers they want. Despite everything the 1% are doing to try and crash the economy, it's not working.

              Mind you the reason it's not working is because the boomers are just dying too fast. That's freeing up job opportunities for younger workers who don't have any savings and
              • If they give any attention to Fox media properties, they're likely indoctrinated to vote and hope against their own interests.

                You need look no further than everyone hoping that the Biden administration fails, not recognizing that if the administration fails that literally everyone except those wealthy enough to ride it out (read: nobody posting to Slashdot) will end up being fucked.

                It's a disease that has infected politics on the right for decades.

        • I was pretty anxious but if fox business says the sky is falling then everything must be ok.

          • Yeah all that math and factual data was obviously just made up by fascist right wingers.

            Or you could actually read the article and learn something.

            • And you're a typical right winger trying to evade by using numbers taken out of context.

              Fox News took a delta of a delta and some oddball numbers and tried to use that to fear-monger. It's similar to how they will show you a graph that's technically correct but has been doctored to exaggerate the point they're trying to make.

              I get that you're having fun shit posting and all but don't you ever get tired of being lied to and don't you think that it's ever going to bite you in the ass? I mean you're her
              • NPR: Advertisers are major wealth management firms, popular b2b SaaS providers, and the estates of various former captains of industry.
                Bloomberg: Advertisers are major wealth management firms, luxury auto manufacturers.
                Fox Business: cash for gold, reverse mortgages, overpriced flip phones for the elderly, boner pills, hair club for men, medicare mobility scooters.

                I’m gonna listen to Fox News guys.

            • I’m actually pretty concerned with how the wealthy seem to be preparing for a major recession. But you can’t take Fox News content at face value. It can be a good tool for revealing the motives of the ruling class when you find the difference between reality and what they want you to believe.

              Just because the article has some math and figures doesn’t mean it’s telling the truth. Right wing media usually takes care to delve into information outside of the general public’s unde

        • They have a vested interest in encouraging Americans to believe that the economy is doing poorly so they can win the 2024 election. You're going to have to find a better source if you want to fear-monger effectively.
        • You should really find a second source that is saying the same things if you want to be taken seriously. Fox news media properties are basically a propaganda network.

          Nothing that lists Rupert Murdoch in their org chart is anything I'll trust any more without second source confirmation from a completely different news organization.

    • Outside of gaming/content creation, desktops are becoming dinosaurs, period. 80% of back-office types and even professionals (doctors, lawyers) can get by with a laptop, even an iPad or Surface. Currently, most of Intel's product line is made up of discrete CPUs (Core i5/i7/i9, etc.); they have products in the mobile/SoC space, but that's currently dominated by vendors like Qualcomm and Samsung.

      Gamers are never going to make up for that shortfall.

    • It's true, I am currently using a Mini PC (like, 5" x 4") with an AMD Ryzen 5600H (6c/12t, 4.2GHz), 32GB RAM, 500GB M.2 SSD, 1TB SSD, three 4K screens, yada yada.
      I will be able to use this thing for 10+ years, certainly. Why upgrade?

    • by Z80a ( 971949 )

      A lot of companies were betting on riding the covid lockdown economy, but it died with the actual thing

    • But they have an additional problem as a CPU maker that's only getting worse every year. Most people can easily get by with a 5-10 year old CPU for daily use. Email, light surfing, document creation, small office spreadsheets, etc. don't need a 24-core 3 GHz CPU. Yes, my pedantic OCD friends, there are always some people with unlimited CPU needs, but they're not the core market.

      Step outside an office and look at the gamers as the next largest buyers. For many years gaming has been about the GPU first; the CPU just need

    • Yep, I recently upgraded to a used Skylake i7-6700K from my i7-3770. Found a new old stock MSI Z170A Titanium motherboard, paired it with the 6700K, 32GB of RAM, dual Samsung SSDs, and an nvme as the boot drive. Still plugging along with my 960GTX as well. I'm sure this system will be my daily driver for the next 5 years. Cost was about 1/4 the price of new modern offerings. Runs Windows 7 PRO SP2, Mint, and POP! OS with ease. I'll probably never buy a new CPU again. Last time I did it was an i7-2600 (that
    • by hencar ( 7991772 )
      I would say that GPUs are not the reason we don't need faster CPUs. It is rather the lack of faster CPUs that has made gamers and heavy computation like machine learning move to GPUs. It is also the lack of increase in clock speed that has made CPU makers put more cores into their CPUs. 30 years ago, if you upgraded your computer every third year, you got a computer that was twice as fast as your old one. That is no longer true, and now people no longer upgrade their computers every third y
      • Clock speed isn't everything, and Pentium 4 should have proven that beyond any doubt.

        Intel started cranking clock speed during Pentium 4 because they made the pipelines so deep they could get away with it. However, they quickly discovered that there were very real performance penalties should their branch prediction and speculative execution fail to predict / speculate the next instruction properly, because you had to flush the entire 21+ stage pipeline in order to get the right instruction / data in there
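
        The flush penalty is easy to demonstrate. A self-contained C sketch: the same loop runs noticeably faster over sorted data because the branch becomes predictable. Timings are machine-dependent; compile with mild optimization (e.g. -O1), since aggressive settings may replace the branch with a conditional move and hide the effect.

            #include <stdio.h>
            #include <stdlib.h>
            #include <time.h>

            #define N 1000000

            static int cmp(const void *a, const void *b) {
                return *(const int *)a - *(const int *)b;
            }

            /* Sums elements >= 128; the if() is the branch under test. */
            static long long sum_big(const int *data, int n) {
                long long sum = 0;
                for (int i = 0; i < n; i++)
                    if (data[i] >= 128)   /* ~50/50, unpredictable on random data */
                        sum += data[i];
                return sum;
            }

            int main(void) {
                int *data = malloc(N * sizeof *data);
                if (!data) return 1;
                for (int i = 0; i < N; i++) data[i] = rand() % 256;

                clock_t t0 = clock();
                long long s1 = sum_big(data, N);
                double random_s = (double)(clock() - t0) / CLOCKS_PER_SEC;

                qsort(data, N, sizeof *data, cmp);   /* make the branch predictable */
                t0 = clock();
                long long s2 = sum_big(data, N);
                double sorted_s = (double)(clock() - t0) / CLOCKS_PER_SEC;

                printf("random: %.3fs  sorted: %.3fs  (sums %lld == %lld)\n",
                       random_s, sorted_s, s1, s2);
                free(data);
                return 0;
            }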

        • Getting back to CPUs, the reason you haven't seen clock speeds move very much is because Intel, AMD, Samsung, Qualcomm, Nvidia, etc. have been focusing on something that matters far more: instructions per clock.

          They have been focusing on efficiency like instructions per clock, but they did it because the room to increase clock speed has been steadily shrinking. We're very much rubbing up against a barrier where the current process nodes won't get you any faster without producing a tonne of heat and eating a tonne of power. Like it or not, silicon simply won't give you a 7-8 GHz clock without liquid nitrogen cooling and a shitload of power, no matter how tightly you pack in the transistors.
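
          The relationship is just performance = IPC x clock. A toy C comparison; the IPC figures are illustrative assumptions, not published numbers for any real core:

              #include <stdio.h>

              int main(void) {
                  /* Hypothetical deep-pipeline core: high clock, low IPC */
                  double narrow_ipc = 1.0, narrow_ghz = 3.8;
                  /* Hypothetical modern wide core: lower clock, higher IPC */
                  double wide_ipc = 2.5, wide_ghz = 2.4;
                  printf("narrow: %.1f G-instructions/s\n", narrow_ipc * narrow_ghz);
                  printf("wide:   %.1f G-instructions/s\n", wide_ipc * wide_ghz);
                  return 0;
              }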

  • Better bang for the buck.
    Better bang anyway

    • Better bang for the buck. Better bang anyway

      Not when it comes to video cards. Both AMD and Nvidia are still charging crypto prices, while Intel's Arc A770 LE is much cheaper for the same "bang." It's why I'm buying a second one next month - people using Flight Simulator 2020 are reporting it's more than "good enough," and with 16 GB of RAM per card, two cards outperform an Nvidia 4090 "toaster/air fryer" card at half the price.

      • Intel's Arc is cheaper because it has a ton of driver issues. I am not talking about a hit of 5-10% in performance, I am talking about various issues such as not seeing connected monitors or major performance issues in current AAA games.
        • Intel's Arc is cheaper because it has a ton of driver issues. I am not talking about a hit of 5-10% in performance, I am talking about various issues such as not seeing connected monitors or major performance issues in current AAA games.

          Here - let me fix that for you - it HAD a lot of driver issues. MSFS is the current leader in demanding high-end performance to get the most out of it, and the Arc A770 LE runs 2 screens at 4096x2160 just fine. Which is why I'm waiting day by day for an email from my current supplier so I can pass by and pick up another one, which I will dedicate to the other 2 screens. A 100" 8192x4320 video wall is an awesome thing to fly with, which is what my friends will experience the day after I get it.

          • What matters is how it runs games. And in that area, Intel Arc isn't so good.
            • What matters is how it runs games. And in that area, Intel Arc isn't so good.

              You haven't been keeping up with the times, I see. Arc runs games just fine on newer hardware with ReBAR (Resizable BAR) enabled on machines with enough RAM to actually use it. And they (Intel) keep issuing updates on a regular basis to make it even better.

              And once game developers implement XeSS [intel.ca] any advantage held by NVidia and AMD will be over. But that's okay - keep paying a crazy premium instead of taking advantage of Intel muscling its way into the market with non-crypto-era video card pricing.

    • by mobby_6kl ( 668092 ) on Friday January 27, 2023 @11:47AM (#63244685)

      It's not; Intel is better right now, on the desktop CPU side at least.

    • by gweihir ( 88907 )

      Indeed. AMD always had much better CPU engineering than Intel, and now they have chip-making that is on par. That makes for a better offer from AMD.

  • Poor Intel - can't even make money during a global chip shortage. Having an *excess* of inventory right now pretty much tells you no one wants to buy Intel.

    Arm vendors are (still) backed up by at least an extra 6-12 months over their usual lead times, and yet Intel can't find a way to muscle into that space, after what, two years of it? Anecdotally, no SBC that's small and low power is Intel based - or at least, if there are any, they're 5-10 times the price of something similar with Arm on it. That's

    • by Anonymous Coward
      The chip shortage was never about PC/server CPUs. It's about microcontrollers and all sorts of ancillary devices for motherboards and embedded devices. Spend some time trying to buy a microcontroller and you'll see. No shortage of PCs/tablets/phones. Maybe at the peak, but again, almost certainly due to the bits-and-pieces chips, not CPUs.
  • by laughingskeptic ( 1004414 ) on Friday January 27, 2023 @09:55AM (#63244487)
    They are running at their lowest production levels in a decade and assign only 4 points of their 17-point margin decline to factory under-utilization. That sounds like impressive management of manufacturing costs to me. For many companies in this situation, this would be the long pole on the loss chart.
  • by xack ( 5304745 ) on Friday January 27, 2023 @09:55AM (#63244491)
    Because Intel wants to sell you new crap. And this is also why they keep making new versions of SSE and AVX, so your perfectly good processor is now "obsolete". An Intel Atom or even a Pentium 4 could still be useful if it weren't for arbitrary crap like this.
    • Because Intel wants to sell you new crap. And this is also why they keep making new versions of SSE and AVX, so your perfectly good processor is now "obsolete".

      That's not how anything works.

      An Intel Atom or even a Pentium 4 could still be useful if it weren't for arbitrary crap like this.

      The Pentium 4 was a 32-bit processor. It wasn't until Core 2 that Intel had anything amd64-compatible. 32-bit is over. Nobody wants to support it any more. This has nothing to do with SSE. Windows 11 refuses to run on anything without a current TPM, although if you trick it into installing anyway it runs fine. None of this has anything to do with Intel.

      • Yes, I agree, 32-bit had a good run (introduced in 1985, I believe, with the 386DX) but 64-bit is the only way forward now. I wonder when 128-bit will become the norm? I do have a problem with WinTel pulling their shit with 7th-gen processors, though. There is no reason that Windows 7 wouldn't run just fine on them.
        • I wonder when 128bit will become the norm?

          Most likely never. There is no need for it. The reason we abandoned 32-bit was because: 1) sometimes you have integers that are bigger than 4,294,967,295 while not being a BigNum and 2) because we wanted to use more than 4GB of address space (which is the limit for 32-bit pointers). The thing is, 64-bit allows for integer values up to 18,446,744,073,709,551,615 and pointers that can address 16,777,216 terabytes of address space. For larger integer values, you are
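
          The quoted limits are quick to verify in C:

              #include <stdio.h>
              #include <stdint.h>
              #include <inttypes.h>

              int main(void) {
                  printf("2^32     = %" PRIu64 "\n", (uint64_t)1 << 32);  /* 4,294,967,296 */
                  printf("2^64 - 1 = %" PRIu64 "\n", UINT64_MAX);  /* 18,446,744,073,709,551,615 */
                  /* Full 64-bit address space in binary terabytes: 2^64 / 2^40 = 2^24 */
                  printf("address space = %" PRIu64 " TiB\n", (uint64_t)1 << 24);  /* 16,777,216 */
                  return 0;
              }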

          • BTW it's worth mentioning that the x87 floating-point unit in x86 has always worked at 64-bit (in fact, 80-bit extended) precision, even on 32-bit CPUs.
          • Also, "Wintel" pulled what? Microsoft chose to not support CPUs made earlier than 1st January 2017 from Intel and AMD. If you are going to use a highly dubious term even back when it was common, at least try to make sure it fits the facts.

            There was never anything dubious about it, Microsoft and Intel were deeply in bed [hbs.edu] from early days. The relationship appears to have weakened somewhat, but it's worth keeping in mind that Intel is the primary beneficiary of Microsoft changes which require new hardware.

            • Umm... no. For example, AMD CPUs historically had an endorsement from Microsoft printed on the box that the CPU will in fact run Windows. You see, Microsoft wants hardware to be a commodity; they aren't Apple, to foolishly tie themselves to a single CPU vendor (Intel) and then, when that CPU vendor disappoints, take the huge risk of chip design themselves (see SPARC or PowerPC for why this is high-risk, even if it looks like a good idea initially). But Microsoft wants CPU vendors to duke it out on who will make the
              • Microsoft and Intel were deeply in bed from early days.

                Umm... no. For example, AMD CPUs historically had an endorsement from Microsoft printed on the box that the CPU will in fact run Windows.

                Found the guy who doesn't know marketing exists, who doesn't know corporations do stuff to get around antitrust law without doing anything substantive...

                Intel is the primary beneficiary anymore only as much as their market share allows; there is no special treatment.

                So you're saying it's not happening now, which I doubt, but that has no bearing on whether it's been historically true. Try to stay on topic.

                So, the term "Wintel" was highly dubious even back then, and it's even more dubious now that Windows has an ARM version and all Linux and Unix servers are x86-64.

                Oh yes, we've seen Windows' commitment to ARM and lack of special treatment. [appleinsider.com]

                You are pretending things which happened and are still happening aren't happening. That's not compelling.

                • Found the guy who doesn't know marketing exists, who doesn't know corporations do stuff to get around antitrust law without doing anything substantive...

                  The burden of proof lies with the person making a claim. There is no evidence this has happened other than basic collaboration, which is something you want for a memory-protected OS, especially an OS that had to use special CPU features to support older real-mode apps in protected mode (see: https://devblogs.microsoft.com... [microsoft.com]). Similar meetings happen with

                  • BTW Nvidia dropping the ball when it comes to driver support for their ARM SoCs is especially annoying, because the experience with their discrete GPUs shows they have no problem delivering good and timely driver support when they care. They just couldn't be arsed for their ARM SoCs. And it's the reason Qualcomm ate their lunch as a result.
        • Has Intel done focus-group surveys to discover the real reason? They saw Broadcom with all-in-one CPUs, and they saw Apple also laying down the hammer. They saw AMD in bed with TSMC. I remember CPU bugs in Intel, and I am waiting to hear if they have fixed them. When all that failed, they tried a low-end budget CPU to staunch market share losses. Sorry Intel, you have failed to wow me. And MS's inability to let me do a painless hardware upgrade means a lost sale. I am not upgrading to run Win 10 or 11. Wha
      • It wasn't until Core 2 that Intel had anything amd64 compatible.

        Incorrect; they began including it with Pentium 4.

        • I sit corrected. I didn't even know about that whole generation of P4s. My lady used to have one of the prior-generation mobile P4s, and it was a beast of a thing. The HP laptop it was in was one of the thickest I've ever seen.

    • Although I take your point about planned obsolescence, I must protest the claim that Atom CPUs have, or indeed ever had, any use that people would ever want to return to.

    • Very little software really requires any new CPU instructions. The reason is that requiring them drastically cuts into the market of who can run your software, so software makers won't require anything until it's pretty much ubiquitous.

      I would guess the vast majority of software out there will run on a Pentium 4 so long as it's one of the later ones that does EM64T. It may not run very well, but it'll still work.

      Funnily enough, Intel actually removed AVX-512 support from their 12th and 13th ge
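
      This is why software typically detects features at runtime rather than requiring a newer chip outright. A minimal C sketch using the __builtin_cpu_supports extension available in GCC and Clang (x86 only; the feature-name strings are checked at compile time):

          #include <stdio.h>

          int main(void) {
              __builtin_cpu_init();  /* populate the CPU-feature cache */
              printf("SSE4.2:  %s\n", __builtin_cpu_supports("sse4.2")  ? "yes" : "no");
              printf("AVX2:    %s\n", __builtin_cpu_supports("avx2")    ? "yes" : "no");
              printf("AVX512F: %s\n", __builtin_cpu_supports("avx512f") ? "yes" : "no");
              return 0;
          }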

  • by rlwinm ( 6158720 ) on Friday January 27, 2023 @10:01AM (#63244505)
    So I have a very, very powerful laptop I use for CAD work (mainly PCB layout) and a pretty powerful server box doing storage, compiling, etc. My laptop is 4 years old and my server is almost 14 years old. While I do wait occasionally for a lengthy compile or VHDL simulation - my machines are for the most part fast enough.

    Sure something faster would be nice. But I tend to only upgrade when something fails beyond repair. Markets saturate. Even the smartphone market has saturated. Sometimes it's not even a question of "good enough" but rather "getting your money's worth."
    • There comes a point where saving time (and electricity) is worth the expense of upgrading. So, when an upgrade kind of "pays for itself", it's logical to repurpose/sell/give away older kit.

      Anyone still running a Sandy Bridge or earlier really needs to think about upgrading.
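
      Whether an upgrade "pays for itself" in electricity is a quick calculation. A minimal C sketch; the 100 W delta, 8 hours/day of use, and $0.15/kWh price are illustrative assumptions, not measurements:

          #include <stdio.h>

          int main(void) {
              double watts_saved   = 100.0;   /* assumed: old box draws 100 W more */
              double hours_per_day = 8.0;     /* assumed daily usage */
              double price_kwh     = 0.15;    /* assumed electricity price, $/kWh */
              double cost_new      = 1000.0;  /* price of the replacement PC */

              double yearly = watts_saved / 1000.0 * hours_per_day * 365.0 * price_kwh;
              printf("Savings: ~$%.0f/yr -> payback in ~%.0f years\n",
                     yearly, cost_new / yearly);   /* ~$44/yr, ~23 years */
              return 0;
          }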

      • Sandy Bridge is a reasonable cutoff point. I have an Ivy Bridge desktop and it's still fine even for hobby-level programming, CAD, photo and video editing. Sure, a new PC would be an order of magnitude faster but in practice it doesn't really matter if my vacation video takes 3 or 30 minutes to render if that can be done while I'm off doing something else. Likewise with electricity, a new $1000 PC will never pay for itself.

        Still, I was about to make a 13700k build but some parts went out of stock just as I

    • You can also get some really decent refurb systems pretty cheap. Quad cores with 16 GB of RAM and an SSD run between 100 and 200 bucks. I have several of these and they have worked very well for several years now. They fit the bill for what I need, and if one dies, no need to cry about it.

      • by rlwinm ( 6158720 )
        Yup! This is great advice. I often buy refurb systems. Heck, my DNS/DHCP/NAT/VPN/misc network OpenBSD box at the office is a refurb. And even doing some VPN crypto work it still has tons of CPU to spare.
  • by gnasher719 ( 869701 ) on Friday January 27, 2023 @11:04AM (#63244601)
    I'm just curious, how would revenue and everything be different if Apple hadn't switched its Macs to ARM, and for every Mac sold Intel had sold the equivalent Intel CPUs?
    • by UnknowingFool ( 672806 ) on Friday January 27, 2023 @11:54AM (#63244709)
      Millions of Intel CPUs a year alone would have been sold to Apple customers. The loss in revenue/profit was not the only fallout of Apple leaving; the publicity also hurt Intel's reputation. The reputation was hurt more by Intel's almost juvenile reaction [tomshardware.com], which made Intel appear jealous that their former partner had moved on after a breakup that was entirely Intel's fault. It is rumored that Apple had been unhappy with Intel processors for a few generations; at one point, reportedly, Apple, not Intel, was the leading reporter of bugs in Intel's hardware. The underlying assertion was that Apple was uncovering issues that Intel QA should have found and fixed before releasing hardware to a customer for production.
      • by Ed Tice ( 3732157 ) on Friday January 27, 2023 @12:03PM (#63244727)
        I think that there is more than this. Many software vendors (my employer included) tended to not have ARM support even though there was demand from datacenter customers. We had good OS support (Windows, Linux, MacOS, FreeBSD, NetBSD, Solaris, HP/UX, AIX, and probably some more I'm forgetting) but none of them were ARM architecture. Suddenly with Apple making their own ARM processor we have to have a product for ARM. Once that happens, an ARM/Linux build is a much smaller hurdle. Apple moving to ARM will also ultimately enable more ARM in the datacenter.
        • It would seem that the obvious beneficiary of this would be Nvidia, who had been pushing an ARM server solution for a while. However, I think Nvidia's solution was ARM paired with their GPUs, which would be beneficial in only some scenarios.
    • Intel doesn't release that detail of sales, but people have estimated Apple accounted for 2-4% of Intel sales; so it's not a huge amount, but losing 4% of sales still hurts. However, I think there is more damage than just losing 4% of your sales. If Apple made the switch to ARM-based processors, well, that sort of shows others they can too. Great reviews of Apple chips may have even PC laptop buyers looking at an ARM laptop (there are Surface ARM laptops), and as more ARM chips are sold, well, more software maker
  • The bloated operating systems of today are the only reason there is any significant demand for more powerful processors. Each upgrade of the OS brings dwindling performance for the end user, with very few improvements, if any, in the functionality of the system.

    If I take an 8 year old Core i5 with 8GB of memory, I can run a system written for another processor, through emulation, and the OS for that processor will boot nearly instantaneously (say, System 7.5.3 using Basilisk II). All of the applications

  • Would love for Intel to break the Nvidia/AMD duopoly, which right now is robbing us blind. Arc is a good start, and with Intel's engineering and fabs at its disposal there's no reason it can't compete in the long run.

  • by Billyzee ( 9266653 ) on Friday January 27, 2023 @12:37PM (#63244801)
    It's been tough for Intel but I see them slowly recovering. The truth is that Intel cannot be allowed to fail. They are just too important.
  • I overbought initially, a Dell T3610 circa 2014, and am still using it for CPU- and GPU-intensive work (Adobe Creative Cloud, not gaming -- I don't game). I'm just now thinking of upgrading, but to a more performant, though still legacy, server. Unless you're a cutting-edge, 250 fps gamer, current hardware is a waste of money. (And now that I write that, gaming is GPU-intensive anyway.)

    (In Adobe's case, they seem to actually be going backwards -- a very useful, compute intensive feature -- shake reduction -- w

  • Hey Fella,
    How's about tightening up the Windows 11 CPU restrictions by, say... as much as you can. We need to shift inventory, and you, my fine friend, are in a perfect position to help us out. All those 15-month-old CPUs that barely meet your W11 CPU spec could... with a simple patch, be made unable to run Windows 11 after the next-but-one update?
    How about it then?
    How about it then?

    Give my love to the wife and kids, my mansion in Aspen is available whenever you need it.
    Yours
    Intel CEO

    Naturally the above is illega

    • "What Intel giveth, Microsoft taketh away". How old is that, and it still applies?

      As for me and my house, we run Linux so we watch in amusement. MX and Mint, mostly, average PC age about 8 years.
