Chip Industry Had Worst Sales Year Since Dot-Com Bubble Burst (bloomberg.com)

The semiconductor industry last year suffered its worst annual slump in almost two decades, hurt by a trade war between the largest chip producer, the U.S., and the largest consumer, China. From a report: Revenue fell 12% to $412 billion in 2019, the Semiconductor Industry Association said Monday in a statement. That's the biggest drop since 2001, when industry sales slumped 32% as the dot-com bubble burst. The rate of decline last year abated with sales growing slightly in the fourth quarter from the preceding three-month period, the industry association said. For that to continue, China and the U.S. need to build on the phase one trade agreement announced last month.
  • largest chip producer, the U.S., and the largest consumer, China.

    Isn't it the other way around?

    • by Anonymous Coward

      The chip industry is having poor sales because frankly, their products really suck right now. Intel gave up on their 5G modem effort, fail. All of the CPUs with speculative execution have hardware security issues, fail.

      Things are so bad, they're even shipping devices pre-installed with malware and adware these days (Windows 10/Android, fail). Still others have the nerve to lock down their devices into pay-walled gardens where the end user has no control over their own devices (Apple/Android, fail).

      Who the

    • Isn't it the other way around?

      China is the biggest consumer by far.

      China is the biggest producer by volume only if you consider Taiwan to be part of China.

      America is the biggest producer by value.

      But not for long. America's politicization of trade makes it an unreliable source. American companies will either move their production to Asia, or device manufacturers will look elsewhere for their semiconductors.

    • China is big on assembly. Most chip production is in Taiwan, the US, and Europe (ST and Infineon). China is scrambling to develop its own IC production technology, and once it does, non-Chinese companies will steadily lose their hold on the Chinese market.
  • by fahrbot-bot ( 874524 ) on Tuesday February 04, 2020 @01:23PM (#59689932)

    Chip Industry Had Worst Sales Year Since Dot-Com Bubble Burst

    I like my chips (crisps for all you UK folks) Jalapeño-flavored; maybe they should try that to help boost sales. (Don't know what the Dot-Com Bubble has to do with anything; it was nothing compared to the potato famine.)

  • by Gravis Zero ( 934156 ) on Tuesday February 04, 2020 @01:34PM (#59689986)

    Revenue fell 12% to $412 billion in 2019

    Does this account for the fact that memory manufacturers got busted for price fixing (again)?

    per wikipedia: [wikipedia.org]

    On 27 April 2018, Hagens Berman filed a class-action lawsuit against Samsung, Hynix, and Micron in U.S. District Court alleging the trio engaged in DRAM price fixing causing prices to skyrocket through 2016 and 2017.[5] Between June 2016 and January 2018 the price of DRAM nearly tripled.[6]

    more at El Reg [theregister.co.uk]

    • by geek ( 5680 )

      They weren't "busted"; they were accused in a civil suit, and from what I can tell there has yet to be a judgment on it.

    • From the story

      Memory chips were the hardest hit. Prices of those commodity chips fell as production outran demand. Memory revenue dropped 33% from 2018 led by declines in computer memory.

      So if they are attempting price control, they're doing an awfully shitty job of it by making too many. In fact, from your Reg piece:

      The suit claims that, after a global decline in the price of RAM chips, the three companies began in June of 2016 to deliberately limit their output of DRAM storage chips.

      So it seems reality contradicts that statement somewhat. However, there is the idea that perhaps they were caught and decided to let it go a bit. I think at that point they can use a defense that was recently used in the US Senate: "no harm, no foul". (C'mon, that's a joke.)

      In all seriousness, it does seem that indeed some price fixing was in place but that

    • Could be. Here's the relevant bit from the SIA newsbrief: [semiconductors.org]

      Sales of memory products were down significantly in 2019 by dollar value, decreasing 32.6 percent compared to 2018. Memory unit volume increased slightly, however.

  • by Anonymous Coward

    Much scarier. The dotcom bubble was driven by a ton of retail investors dumping their life savings into pets.com. The current everything bubble is that, plus the Fed dumping $50b of liquidity indirectly to hedge funds via repo every day. And that cash goes into driving mega caps and wildly overvalued stocks like Tesla into the stratosphere.

    So what's the scary thing? At the tail end of a bubble retail eventually runs out of cash. In this case the Fed can keep expanding the balance sheet for years to kee

  • Blame tablets (Score:5, Interesting)

    by Miamicanes ( 730264 ) on Tuesday February 04, 2020 @02:32PM (#59690314)

    Part of the problem is that there's fundamentally zero real year-to-year improvement in CPU performance anymore, because the industry went from trying to replace PCs with tablets to turning PCs INTO crippled tablets. All Intel ever talks about anymore is reducing power consumption so Apple and Microsoft can make glued-together 5mm slabs of lucite and silicon that pretend to be real computers.

    Give us back CPUs that can do real, honest-to-god multi-chip SMP, so we can pack 2-4 CPUs into a computer and run ALL of them at balls-to-the-wall max turbo 4-5GHz speed, instead of having the CPU pretend it's a Raspberry Pi.

    If Intel feels retro, it can resurrect the Pentium II candybar idea and pack 2-4 SMP-enabled i7/i9-class CPUs into a sealed watertight candybar with peltier-assisted liquid cooling (keeping the cold side sealed inside a dry gas atmosphere to prevent condensation, and exposing only the searing-hot side to the world).

    Then, lean on Microsoft to give us a desktop that would have made Aero Glass jealous, with realtime-raytraced translucency and lens effects to give our RTX GPU a good workout when it doesn't have anything better to do, and let us feel like we got our money's worth after spending $3,000 on a new computer for the first time in 20 years.

    The point is, if all anything aspires to be anymore is a glorified shitty thin tablet with an even shittier keyboard, there's nothing to drive the relentless arms race towards faster and more powerful computers. And off on the horizon, we HAVE something that genuinely needs all the computing power we can throw at it: dynamic Javascript-driven web sites... er, just kidding, I mean, "Virtual and Augmented Reality".

    Today's AR and VR is egregiously and tragically underpowered compared to what it really NEEDS to even APPROACH "not a party trick or toy" status. If you're going to use something in the future like an 8K-resolution Oculus Rift, you're going to need WAY more than a shitty netbook or overgrown tablet to drive it... you're going to need the equivalent of a maxed-out i9 server just to provide realtime tracking of your gaze and motion relative to your surroundings. Anything less (or a framerate slower than 400fps), and you'll never overcome the "slosh" problem that makes even the VR/AR elite go worship the porcelain god occasionally.

    And no, "cloud computing" won't save it. Not even if you scream "5G! 5G! 5G!" all day. Well, unless the "cloud" is more like "fog", and consists of a server farm at most a mile or two away run by your ISP ("fog" is basically a cloud at ground level that's nearby and/or surrounds you, instead of being nebulously off in the distance). Anything further, and latency just kills you and makes your world "slosh" again.

    • All Intel ever talks about anymore is reducing power consumption.

      That's quite funny when you realize that US data centres alone dissipate roughly 140 billion kilowatt-hours annually. And rising fast.

      https://www.nrdc.org/experts/p... [nrdc.org]

      • > US data centres alone dissipate roughly 140 billion kilowatt-hours annually

        So... in other words, the same amount of electricity per year (divided by 300 million Americans) that a typical 3-ton central A/C unit in Florida running at an average 50% duty cycle consumes in slightly over a week?

        140 billion kWh / 300,000,000 Americans = 466.7 kWh per American per year

        3-ton central A/C drawing 20 A @ 220 V:

        20 A x 220 V = 4.4 kW

        4.4 kW @ 50% duty cycle = 2.2 kW average

        466.7 kWh / 2.2 kW = 212 hours

        212 hours / 24 hours/day = 8.8 days
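
        For anyone who wants to re-run that back-of-envelope math, here's a quick sketch in Python. The inputs (140 billion kWh of annual data-centre usage, 300 million people, a 20 A @ 220 V unit at 50% duty cycle) are the figures assumed above, not official numbers:

```python
# Back-of-envelope check of the A/C comparison above.
# All inputs are the figures assumed in the comment, not measured data.
DATA_CENTER_KWH_PER_YEAR = 140e9   # claimed US data-centre usage, 140 billion kWh/yr
POPULATION = 300e6                 # ~300 million Americans

kwh_per_person = DATA_CENTER_KWH_PER_YEAR / POPULATION   # ~466.7 kWh per person per year

ac_power_kw = 20 * 220 / 1000      # 20 A @ 220 V -> 4.4 kW while the compressor runs
avg_power_kw = ac_power_kw * 0.5   # 2.2 kW average at a 50% duty cycle

hours_of_ac = kwh_per_person / avg_power_kw   # ~212 hours
days_of_ac = hours_of_ac / 24                 # ~8.8 days, i.e. "slightly over a week"

print(f"{kwh_per_person:.1f} kWh/person/yr = {hours_of_ac:.0f} h = {days_of_ac:.1f} days of A/C")
```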

    • Comment removed based on user account deletion
      • AMD's improvements only help for software that explicitly goes out of its way to take full advantage of multithreading. In most random Windows use, you'll see a bigger boost from two SMP CPUs each running a single core flat-out at 4-5GHz than from a single chip with 16+ cores but a thermal budget that limits each core to only 1.8-2.8GHz.

        My point is, massively increasing performance at this point requires the kind of heat & power budgets we had 20 years ago. We had a decade or so (starting with t

    • Lower power per FLOP means that you can do more calculation in a given volume. Thermal power is an important constraint for, e.g., self-driving cars, computer vision, and robotics. The whole cloud computing thing is a result of the fact that we have offloaded computing from mobile devices to more powerful servers.
  • So, their chips are down.

  • by DavenH ( 1065780 ) on Tuesday February 04, 2020 @04:11PM (#59690898)
    Isn't a slowdown expected, now that for multiple cycles Intel and other chipmakers have failed to produce any chips worth upgrading to? Without acknowledging that, it's easy to misdiagnose the slump as economic problems or trade wars. It can be some of both, of course, but the article doesn't even mention what seems like an obvious component.

    For many applications, performance has been delegated to the GPU anyway, and CPUs are more of a conduit for leveraging its throughput, so even if more impressive chips were available, the marginal utility of that extra power might be low.

    If NVidia's sales are also down by the industry average, I'd be convinced. They at least have been continuing to keep the flops cruising upward.

  • Not good news. I also have a business related to sales, and in general all is good; I have good profit. I'm even thinking about using a special mobile wallet from https://walletfactory.com/ [walletfactory.com] to improve payment functionality. I think this will also boost my results. What can you say about this?
