Intel: Just You Wait. Again (mondaynote.com)
Analyst Jean-Louis Gassee, writing at Monday Note about Intel's habit of asking investors to wait for the company to catch up to the competition: Concurrently, the company's revenue for its new IFS foundry business decreased by 24% to an insignificant $118M, with a $140M operating loss gingerly explained as "increased spending to support strategic growth." Other Intel businesses such as Networking (NEX) products and Mobileye -- yet another Autonomous Driving Technology -- add nothing promising to the company's picture. This doesn't prevent [Intel CEO Pat] Gelsinger from once again intoning the Just You Wait refrain. This time, the promise is to "regain transistor performance and power performance leadership by 2025."
Is it credible?
We all agree that the US tech industry would be better served by Intel providing a better alternative to TSMC's and Samsung's advanced foundries. Indeed, We The Taxpayers are funding efforts to stimulate our country's semiconductor sector to the tune of $52B. I won't comment other than to reminisce about a difficult late 80s conversation with an industry CEO when, as an Apple exec, I naively opposed an attempt to combat the loss of semiconductor memory business to foreign competitors by subsidizing something tentatively called US Memories. But, in this really complicated 2023 world, what choices do we actually have?
For years I've watched Intel's repeated mistakes, the misplaced self-regard, the ineffective leadership changes for this Silicon Valley icon, the inventor of the first commercial microprocessor, only to be disappointed time and again as the company failed to shake the Wintel yoke -- while Microsoft successfully diversified. I fervently hope Pat Gelsinger succeeds. His achievement would resonate deeply; it would bring to mind another historic turnaround: Steve Jobs' 1997 return to the Apple he had "left" in 1985.
Patience is a virtue (Score:5, Interesting)
Not just low margin (Score:1)
Re: (Score:3)
why do they think they could compete with Samsung and TSMC?
More importantly - will Intel be able to convince anyone in the tech industry that they are capable of competing with TSMC and Samsung?
They certainly need to do something... they lost Apple, and Microsoft has signaled they're probably gonna lose Windows too - so will they even have a market for their CPUs in 5-6 years? If they can't grab a significant chunk of the fab business, what's left?
Re: (Score:2)
Microsoft doesn't make much hardware, and if they end up dumping Intel in the slate tablets that won't be the end of the world.
Intel isn't going to "lose Windows", since that would mean Microsoft dropping support for x86, x86-64, etc., which includes AMD and is something like 90+% of their market.
Re: (Score:1)
Keep going with ARC/Battlemage etc (Score:2)
If you don't give up, you might just get in there.
Both ATi and Nvidia took a while to actually produce a good 3D video card.
Re: (Score:2)
RISC-V needs a standard computer that uses it to be a proper replacement for x86, or it turns into the same mess ARM is
Re: (Score:2)
Their problem isn't mistakes (Score:5, Funny)
And we're hitting the limits on what can be done with silicon, so there isn't going to be a great leap in power that opens up new computing paradigms to the run-of-the-mill user. AI will eat a lot of cycles, but that stuff runs on GPUs and Intel's still way behind on those.
Re: (Score:2)
Next step will be tiles/chiplets
CPUs made out of tiny, cheap, modular individual dies glued together like a super megazord chip.
They've been trying that for decades (Score:2)
They might get it this time, but even if they do I don't see it solving the cratering demand problem.
Re: (Score:2)
It is a synchronization issue. For example, there are still some operations where all CPU cores have to wait for one doing a thing. Hence the maximum you want is direct die-to-die communication, nothing indirect; anything more costs too much performance and becomes too difficult. Yes, this _could_ be solved on the software side, but the tech to do that is generally not there yet and you have to do it manually. Most "coders" do not even begin to have the skills for that.
GPUs are different in that a lot of their work is embarrassingly parallel, so their cores rarely have to wait on one another.
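A minimal sketch of the stall the parent comment describes, in Python (the worker count and timings are illustrative assumptions, not from the thread): every worker has to sit at a barrier until the slowest one finishes, so a single busy core wastes everyone else's time.

```python
import threading
import time

N_WORKERS = 8
barrier = threading.Barrier(N_WORKERS)

def worker(core_id: int) -> None:
    # Pretend core 0 is stuck doing "a thing" (e.g., a serial section).
    work_time = 1.0 if core_id == 0 else 0.01
    time.sleep(work_time)
    start = time.monotonic()
    barrier.wait()  # every core waits here for the slowest one
    waited = time.monotonic() - start
    print(f"core {core_id} wasted {waited:.2f}s at the barrier")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Running this, seven of the eight workers report roughly a second of dead time: exactly the "all cores wait for one" effect.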
Re: (Score:2)
There's the entire Ryzen line that basically swims in the spoils of that.
Re: (Score:3)
That's why I don't believe the US government has alien technology in a shed, if they did then Intel would have already brought out affordable diamondine optical processors or somesuch.
Which brings us to: who is going to finally spend enough money to break through to the next technology? Or at least to another, better, affordable semiconductor.
Re: (Score:2)
What makes you think there even is a "next technology"? The search for that is as old as semiconductors, with zero results so far. It is quite plausible that there is no competing tech in store in this physical universe that is fundamentally faster than semiconductors as we currently know them.
As to faster semiconductors, that is going on all the time. It may run into a wall pretty soon as well though. Fabs for current cutting-edge tech are so expensive that in the near future all the world together may be
Re: (Score:2)
What makes you think there even is a "next technology"?
Optical's already superior in some ways, so there's that if nothing else.
Re: (Score:2)
It is not really. The only real application for optical is interconnect, not computing. And even there it cannot compete, because it cannot be scaled down enough. My take is that for computing, optical can ultimately reach maybe 1% of the performance of electrical (both are semiconductor, incidentally) at the same cost.
Re: (Score:2)
It is not really
Switching is two to three orders of magnitude faster. It's hard to see how that could not be a benefit. The question is how far the price can be driven down.
Re: (Score:2)
No. The question is how much can you pack in a given volume and how much power does it need. Price is a distant second. You do not win anything if you can only build a 4 bit processor that is 100x as fast as a 64 bit processor with other tech. In fact, in most workloads, the 64 bit CPU will be faster. Also, the lightspeed limit applies to electronic and optical all the same. And interconnect has been the main limit on computing speeds for about two decades in semiconductors. Make a CPU physically larger, and interconnect latency increases because more distance needs to be covered.
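A rough back-of-the-envelope for that last point, in Python (the die sizes, the 3 GHz clock, and the ~0.5c on-chip signal speed are illustrative assumptions, not figures from the comment): even at a good fraction of lightspeed, crossing a large die costs a meaningful slice of a clock cycle, which is why physically bigger chips pay an interconnect-latency tax.

```python
C = 3.0e8            # speed of light in m/s
WIRE_FRACTION = 0.5  # assume on-chip signals propagate at ~0.5c (optimistic)
CLOCK_HZ = 3.0e9     # a 3 GHz clock -> ~333 ps per cycle

cycle_ps = 1e12 / CLOCK_HZ
for die_mm in (10, 30, 100):
    delay_ps = (die_mm / 1000) / (C * WIRE_FRACTION) * 1e12
    print(f"{die_mm:>3} mm die: ~{delay_ps:6.0f} ps one-way, "
          f"{delay_ps / cycle_ps:4.1f} clock cycles at 3 GHz")
```

At 100 mm the one-way trip alone is about two full clock cycles, before any gate or capacitance delays, and real wires are much slower than the assumption here.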
Re: (Score:2)
No. The question is how much can you pack in a given volume and how much power does it need. Price is a distant second. You do not win anything if you can only build a 4 bit processor that is 100x as fast as a 64 bit processor with other tech.
If you could make it 100x as fast and the processor had to be 100x the volume (of the die, preferably, and not the whole package) then you could definitely still sell that so long as the overall solution had only 100x the TCO or less. You would still be getting several racks' worth of performance in one box.
And interconnect has been the main limit on computing speeds for about two decades in semiconductors. Make a CPU physically larger, and interconnect latency increases because more distance needs to be covered.
The interconnection speeds are limited by the fact that you have an electrical interface there, have to deal with capacitance, etc. Presumably you'll need optical memory as well in order to make the opti
Re: (Score:2)
Which brings us to, who is going to spend enough money finally to break us through to the next technology? Or at least another, better, affordable semiconductor.
It's the lithography, not the semiconductor, that's the problem. Silicon has plenty of life left: the feature size is the hard thing, and neither smaller features nor hitting the atomic wall is helped by other semiconductors.
Currently optical EUV is the best there is, and those machines are astoundingly expensive. But it would also be astoundingly expensive
Re: (Score:2)
Intel relied on their own perceived "exceptionalism" far too long. That one, such as it was, is now completely gone and they have nothing.
Yes, AMD is also making losses with CPUs, but only in some areas. And AMD has both a far more modern design and design process and can actually easily outsource manufacturing to whoever just happens to have the best manufacturing process. In addition, it really is not only x86 anymore.
Had Intel realized 10 years ago that they could not compete on manufacturing anymore, they might be in a far better position today.
Re: (Score:2)
Intel relied on their own perceived "exceptionalism" far too long. That one, such as it was, is now completely gone and they have nothing.
They relied on their superior process technology, and on skipping effective security checks in their processors. The process technology really was superior, until it wasn't.
But instead they were fat and lazy and so full of themselves that they did not notice how times were changing. They are not the only ones. A certain US plane maker comes to mind, for example.
Boeing was OK until they merged with McDD, which already had full-blown corporate malaise.
Re: (Score:2)
Dude what
Intel was king of the datacenter not too long ago. They are losing the DC market rapidly to AMD. Do you even pay attention to this market? At all?
You ever heard of Sapphire Rapids or Genoa?
Just You Wait (Score:1)
'Enry 'Iggins, just you wait.
EU (Score:2)
That will be the result of MAGA: other countries looked at the USA as a potential liability, and they also felt making themselves "Great Again" was a good idea.
Re:EU (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Just remember the ARM chip was a UK design and now it powers most of the phone market.
Re: (Score:2)
Yeah, ARM is just a design. We didn't manufacture the actual chips. The Register article is correct, there's no point in pouring money into spinning up fabs in the UK. We're too small a country, the barriers to entry are too high and the prospective gains too few. Fabs here would never, ever be price-competitive.
Taiwan has always been convenient because it (generally) shortened supply chains, and kept the exploitative work practices and ecological damage "over there".
Now with China eyeing Taiwan like a hung
Re: (Score:2)
Jean-Louis Gassee? He's a legend! (Score:3)
Oh shut up (Score:2)
Business ghouls... (Score:2)
Oregon's waiting... (Score:1)
Intel is screwed (Score:1)
And they did all the screwing themselves. First, for a long time, they screwed their customers. Then they began screwing their own organization, because they could not get enough. And now there is no way of fixing this. Leadership failure over a long, long time.
The only way Intel will ever amount to anything again is if China invades Taiwan and takes TSMC out of the picture. But even then, Samsung will have the lead on them.