Chip Industry Had Worst Sales Year Since Dot-Com Bubble Burst (bloomberg.com)
The semiconductor industry last year suffered its worst annual slump in almost two decades, hurt by a trade war between the largest chip producer, the U.S., and the largest consumer, China. From a report: Revenue fell 12% to $412 billion in 2019, the Semiconductor Industry Association said Monday in a statement. That's the biggest drop since 2001, when industry sales slumped 32% as the dot-com bubble burst. The rate of decline last year abated with sales growing slightly in the fourth quarter from the preceding three-month period, the industry association said. For that to continue, China and the U.S. need to build on the phase one trade agreement announced last month.
producer, consumer (Score:2)
largest chip producer, the U.S., and the largest consumer, China.
Isn't it the other way around?
Has nothing to do with CHINA (Score:2, Insightful)
The chip industry is having poor sales because frankly, their products really suck right now. Intel gave up on their 5G modem effort, fail. All of the CPUs with speculative execution have hardware security issues, fail.
Things are so bad, they're even shipping devices pre-installed with malware and adware these days (Windows 10/Android, fail). Still others have the nerve to lock down their devices into pay-walled gardens where the end user has no control over their own devices (Apple/Android, fail).
Who the
The chip industry is having poor sales because... (Score:2)
People are satisfied with the cell phones and desktop computers they already have.
Re: (Score:2)
Isn't it the other way around?
China is the biggest consumer by far.
China is the biggest producer by volume only if you consider Taiwan to be part of China.
America is the biggest producer by value.
But not for long. America's politicization of trade makes it an unreliable source. American companies will either move their production to Asia, or device manufacturers will look elsewhere for their semiconductors.
Re:producer, consumer (Score:5, Informative)
Taiwan is definitely an independent entity from mainland China: free elections, a distinctly different government, military, trade rules, visa processes, etc. Taiwan's people are ethnically similar to mainland China's, but otherwise Taiwan is as independent from the mainland as the Philippines is.
Re: (Score:1)
I have an idea. (Score:3, Funny)
Chip Industry Had Worst Sales Year Since Dot-Com Bubble Burst
I like my chips (crisps for all you UK folks) Jalapeño flavored, maybe they should try that to help boost sales. (Don't know what the Dot-Com Bubble has to do with anything, it was nothing compared to the potato famine.)
Re: (Score:1)
They are talking about digital computer chips, not potato chips.
Re: (Score:3)
I don't know why. Potato chips taste much better than computer chips. Computer chips don't crisp as nicely either.
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
Re: (Score:2, Funny)
I don't get it. What does "woosh" mean? The article is about COMPUTER chips and you are talking about POTATO chips. Did you not read the article?
Re: (Score:2)
Yes, I read the summary and article. "Whoosh" [urbandictionary.com] means my obviously dumb joke sailed over your head.
[ and *whoosh* for this one too -- cheers :-) ]
Re: (Score:2)
Ooooh *whooooosh*
Re: (Score:2)
Did you not read the article?
You must be new here
Re: (Score:2)
Well, one computer company used to be called Apple Computer, Inc., so maybe he thought there was such a thing as Potato Computer, Inc.
You know, the company that puts a bag of potato chips in the packaging box of every computer they sell?
Does this account for scandals? (Score:3)
Revenue fell 12% to $412 billion in 2019
Does this account for the fact that memory manufacturers got busted for price fixing (again)?
per Wikipedia: [wikipedia.org]
On 27 April 2018, Hagens Berman filed a class-action lawsuit against Samsung, Hynix, and Micron in U.S. District Court alleging the trio engaged in DRAM price fixing causing prices to skyrocket through 2016 and 2017.[5] Between June 2016 and January 2018 the price of DRAM nearly tripled.[6]
more at El Reg [theregister.co.uk]
Re: (Score:2)
They weren't "busted"; they were accused in a civil suit, and from what I can tell there has yet to be a judgment on it.
Re: (Score:2)
From the story
Memory chips were the hardest hit. Prices of those commodity chips fell as production outran demand. Memory revenue dropped 33% from 2018 led by declines in computer memory.
So if they are attempting price control, they're doing an awfully shitty job of it by making too many. In fact, from your Reg piece:
The suit claims that, after a global decline in the price of RAM chips, the three companies began in June of 2016 to deliberately limit their output of DRAM storage chips.
So it seems reality contradicts that statement somewhat. However, there is the idea that perhaps they were caught and decided to back off a bit. I think at that point they can use a defense that was recently used in the US Senate: "no harm, no foul". (c'mon, that's a joke)
In all seriousness, it does seem that indeed some price fixing was in place but that
Re: (Score:2)
Could be. Here's from the SIA newsbrief: [semiconductors.org]
Sales of memory products were down significantly in 2019 by dollar value, decreasing 32.6 percent compared to 2018. Memory unit volume increased slightly, however.
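Taking those SIA figures at face value, revenue down 32.6 percent while unit volume grew implies the average selling price per unit fell even faster than revenue did. A minimal Python sketch of that arithmetic (the 2 percent unit-volume growth is a made-up stand-in for "increased slightly"):

# Implied average selling price (ASP) change from the SIA figures quoted above.
# The 2% unit-volume growth is a hypothetical stand-in for "increased slightly".
revenue_change = -0.326   # memory revenue down 32.6% (SIA)
volume_change = 0.02      # assumed slight unit-volume increase

asp_change = (1 + revenue_change) / (1 + volume_change) - 1
print(f"Implied ASP change: {asp_change:.1%}")   # roughly -33.9%

That is hard to square with an effective price-fixing cartel, whatever may have been attempted in 2016-2017.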
This bubble is scarier (Score:2, Interesting)
Much scarier. The dotcom bubble was driven by a ton of retail investors dumping their life savings into pets.com. The current everything bubble is that, plus the Fed dumping $50b of liquidity indirectly to hedge funds via repo every day. And that cash goes into driving mega caps and wildly overvalued stocks like Tesla into the stratosphere.
So what's the scary thing? At the tail end of a bubble retail eventually runs out of cash. In this case the Fed can keep expanding the balance sheet for years to kee
Re: (Score:2)
Re: (Score:2)
Blame tablets (Score:5, Interesting)
Part of the problem is that there's fundamentally zero real year-to-year improvement in CPU performance anymore, because the industry went from trying to replace PCs with tablets to turning PCs INTO crippled tablets. All Intel ever talks about anymore is reducing power consumption so Apple and Microsoft can make glued-together 5mm slabs of lucite and silicon that pretend to be real computers.
Give us back CPUs that can do real, honest-to-god multi-chip SMP, so we can pack 2-4 CPUs into a computer and run ALL of them at balls-to-the-wall max turbo 4-5GHz speed, instead of having the CPU pretend it's a Raspberry Pi.
If Intel feels retro, it can resurrect the Pentium II candybar idea and pack 2-4 SMP-enabled i7/i9-class CPUs into a sealed, watertight candybar with peltier-assisted liquid cooling (keeping the cold parts sealed inside a dry gas atmosphere to prevent condensation, and exposing only the searing-hot outside to the world).
Then, lean on Microsoft to give us a desktop that would have made Aero Glass jealous, with realtime-raytraced translucency and lens effects to give our RTX GPU a good workout when it doesn't have anything better to do, and let us feel like we got our money's worth after spending $3,000 on a new computer for the first time in 20 years.
The point is, if all anything aspires to be anymore is a glorified shitty thin tablet with an even shittier keyboard, there's nothing to drive the relentless arms race towards faster and more powerful computers. And off on the horizon, we HAVE something that genuinely needs all the computing power we can throw at it: dynamic Javascript-driven web sites... er, just kidding, I mean, "Virtual and Augmented Reality".
Today's AR and VR is egregiously and tragically underpowered compared to what it really NEEDS to even APPROACH "not a party trick or toy" status. If you're going to use something in the future like an 8K-resolution Oculus Rift, you're going to need WAY more than a shitty netbook or overgrown tablet to drive it... you're going to need the equivalent of a maxed-out i9 server just to provide realtime tracking of your gaze and motion relative to your surroundings. Anything less (or a framerate slower than 400fps), and you'll never overcome the "slosh" problem that makes even the VR/AR elite go worship the porcelain god occasionally.
And no, "cloud computing" won't save it. Not even if you scream "5G! 5G! 5G!" all day. Well, unless the "cloud" is more like "fog", and consists of a server farm at most a mile or two away run by your ISP ("fog" is basically a cloud at ground level that's nearby and/or surrounds you, instead of being nebulously off in the distance). Anything further, and latency just kills you and makes your world "slosh" again.
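To put rough numbers on that latency argument, here is a minimal Python sketch. The distances and the 400fps frame budget above are illustrative assumptions, propagation in fiber is taken as roughly two-thirds of c, and real networks add switching and queueing delay on top of pure propagation:

# Round-trip propagation latency vs. a 400fps frame budget (illustrative only).
C_FIBER_KM_PER_MS = 200.0           # ~2/3 the speed of light, in km per millisecond

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / C_FIBER_KM_PER_MS

frame_budget_ms = 1000 / 400        # 2.5 ms per frame at 400 fps

for name, km in [("ISP 'fog' node, ~3 km", 3),
                 ("regional data center, ~150 km", 150),
                 ("distant cloud region, ~1500 km", 1500)]:
    rtt = round_trip_ms(km)
    print(f"{name}: {rtt:.2f} ms round trip "
          f"({rtt / frame_budget_ms:.0%} of a {frame_budget_ms} ms frame)")

A nearby "fog" node costs a few hundredths of a millisecond; a distant cloud region eats several entire frames before a single packet comes back.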
Re: (Score:2)
All Intel ever talks about anymore is reducing power consumption.
That's quite funny when you realize that US data centres alone dissipate roughly 140 billion kilowatt-hours annually. And rising fast.
https://www.nrdc.org/experts/p... [nrdc.org]
Re: Blame tablets (Score:2)
> US data centres alone dissipate roughly 140 billion kilowatt-hours annually
So... in other words, the same amount of electricity (divided by 300 million Americans) per year that a typical 3-ton central A/C unit in Florida running at an average 50% duty cycle consumes in slightly over a week?
140 billion kWh / 300,000,000 Americans = 466.7 kWh per American per year
3-ton central A/C drawing 20 A @ 220 V:
20 A x 220 V = 4.4 kW
4.4 kW @ 50% duty cycle = 2.2 kW average
466.7 kWh / 2.2 kW = 212 hours
212 hours / 24 hours per day = 8.83 days
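Rechecking that arithmetic in a minimal Python sketch (the 140 billion kWh figure and the 20 A @ 220 V, 50% duty-cycle A/C are the numbers cited above; everything else is plain arithmetic):

# Data-center electricity per American vs. a Florida A/C unit.
US_DATACENTER_KWH = 140e9       # ~140 billion kWh/year (figure cited upthread)
POPULATION = 300e6              # ~300 million Americans

per_person_kwh = US_DATACENTER_KWH / POPULATION     # ~466.7 kWh per person per year

ac_power_kw = 20 * 220 / 1000                       # 20 A at 220 V -> 4.4 kW
ac_avg_kw = ac_power_kw * 0.5                       # 50% duty cycle -> 2.2 kW average

hours = per_person_kwh / ac_avg_kw                  # ~212 hours
days = hours / 24                                   # ~8.8 days

print(f"{per_person_kwh:.1f} kWh/person/year ≈ {days:.1f} days of A/C runtime")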
Re: (Score:2)
Re: Blame tablets (Score:2)
AMD's improvements only help for software that explicitly goes out of its way to take full advantage of multithreading. In most random Windows use, you'll see a bigger boost from two SMP CPUs each maxed out at 4-5GHz on a single core than from a single chip with 16+ cores but a thermal budget that limits each core to only 1.8-2.8GHz.
My point is, massively increasing performance at this point requires the kind of heat & power budgets we had 20 years ago. We had a decade or so (starting with t
Re: (Score:1)
Puntium CPU (Score:1)
So, their chips are down.
Not surprising, given performance stagnation (Score:3)
For many applications, performance has been delegated to the GPU anyway, and the CPU is more of a conduit for leveraging its throughput, so even if more impressive chips were available, the marginal utility of that extra power might be low.
If NVidia's sales are also down by the industry average, I'd be convinced. They at least have been continuing to keep the flops cruising upward.
Not good news (Score:1)