
Nvidia Hits $1 Trillion in Market Value on Booming AI Demand (reuters.com) 59

Nvidia on Tuesday became the first chipmaker to join the trillion-dollar club, as investors bet on a surge in demand for its AI chips, which power chatbot sensation ChatGPT and many other applications. From a report: The gaming and AI chip company's shares rose 4.2%. Taiwan Semiconductor Manufacturing Co is the next-largest chipmaker globally, valued at about $535 billion. Meta Platforms, valued at about $670 billion as of the last close, clinched the trillion-dollar market capitalization milestone in 2021, while Apple, Alphabet, Microsoft and Amazon are the other U.S. companies in the club.
  • by ac22 ( 7754550 ) on Tuesday May 30, 2023 @09:25AM (#63561035)

    Nvidia market cap $1.02 trillion
    Intel market cap $122.84 billion

    Nvidia revenues $26.97 billion
    Intel revenues $63.05 billion

    • by CEC-P ( 10248912 )
      Maybe they should copy Intel and build their own chip fabrication plant. I mean, they won't, but I thought I'd point out that they REALLY should, since they're a glorified $1 trillion reseller yet precariously worth more than eBay, Newegg, etc.
      • I feel it's totally unfair to have the prices of devices I need for mining driven up by uses like helping people do things. My friends tell me I need to look up "irony". What's that? Is it some new cryptocurrency I haven't heard of? How do you mine it without a GPU?

    • by gweihir ( 88907 )

      This is just AI hype bullshit. At least 25x overvalued, I would say.

    • There is still so much money just looking for somewhere productive to park.
    • by UMichEE ( 9815976 ) on Tuesday May 30, 2023 @12:18PM (#63561479)

      Kroger market cap $33B
      Kroger revenue $148B
      The value of a company is more complicated than just its revenue.

      Market cap is just the total value of all of the shares of a company. People are willing to pay more for a company with a recent track record of growing (Nvidia) than for one with a recent track record of shrinking (Intel). And ultimately, it's profits, not revenue, that are paid back to investors, which is why Kroger's enormous revenue numbers don't impress investors as much as Nvidia's larger (and growing) profits.
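[Editor's note] The relationship this comment is describing, market cap versus profit, is the price-to-earnings (P/E) ratio. A minimal sketch using the figures quoted elsewhere in this thread (rough 2023 snapshots, purely illustrative):

```python
# Illustrative only: market cap and net profit figures ($bn) as quoted
# in this thread (2023 snapshots); real valuations change daily.

def pe_ratio(market_cap_bn: float, net_profit_bn: float) -> float:
    """P/E ratio: what investors pay per dollar of annual profit."""
    return market_cap_bn / net_profit_bn

nvidia_pe = pe_ratio(1020.0, 9.7)   # roughly 105 on these numbers
intel_pe = pe_ratio(122.84, 8.0)    # roughly 15

print(f"Nvidia P/E ~ {nvidia_pe:.0f}, Intel P/E ~ {intel_pe:.0f}")
```

Another comment in this thread cites a P/E of 210 for Nvidia; that presumably uses a different (lower) trailing-earnings base, which is why the profit figure chosen matters as much as the market cap.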

      But is Nvidia overvalued relative to Intel? I think so. Nvidia is a great company, and both the people I've met who work there and the people I've met who work at competitors tell me its execution and vision are great. Still, it's not entirely clear that Nvidia will completely dominate the AI market long term or be able to maintain the insane margins it currently enjoys. Already, larger players (e.g. Microsoft, Google) are fabbing their own chips for AI acceleration, and AMD and Intel are both competing in the market. It's like Tesla, in my opinion: it's not entirely clear whether Tesla will be able to maintain its EV market dominance (and margins) once all the legacy automakers have been in the market for a little while. But certainly, it's hard to imagine Tesla not continuing to be an important player in that market.

      • Intel has been fumbling while nvidia has been executing. This doesn't necessarily mean nvidia's not overvalued, but there are at least rational reasons for the possibly irrational valuation.

        Nvidia is also an important player in the EV market, since you bring that up. AI and EV are both pretty buzzy.

      • Still, it's not entirely clear that Nvidia will completely dominate the AI market long term or be able to maintain the insane margins that they experience currently. Already, larger players (e.g. Microsoft, Google) are fabbing their own chips for AI acceleration and AMD and Intel are both competing in the market.

        This is the trillion-dollar question: whether some competitor can challenge Nvidia. It's been several years, and the answer is still no, at least not yet. AMD has decent GPUs, but the support SW is a non-starter. Intel has inferior HW and SW, and it's not like they haven't been trying for many years. Google has also been doing the TPU for many years. Amazon and Microsoft are also trying to develop their own HW. Tesla does have their own HW. There are many startups.

        That these competitors exist is not a

    • Yeah, it is when you look at more than one set of figures.

      NVIDIA net profit $9.7bn
      Intel net profit $8bn

      NVIDIA revenue y/y change average trailing 4Q -7%
      Intel revenue y/y change average trailing 4Q -50%

      And that's before you get into:
      Latest hot trend: served by NVIDIA's core product portfolio
      Latest hot trend: Intel is the last major chip manufacturer to add any AI hardware to its CPU line-up, behind the entire industry.

      Single year-end revenue figures alone paint Intel in a very different light.

  • Not a chipmaker (Score:5, Interesting)

    by monkeyxpress ( 4016725 ) on Tuesday May 30, 2023 @09:31AM (#63561051)

    Nvidia is not a chipmaker. Hilariously, they use TSMC to make most of their chips, so it's pretty weird to say they are a bigger chipmaker than TSMC.

    I really have no idea about Nvidia these days. They made so much money out of the hype fest that was crypto mining that it's hard to believe their marketing department/stock spruikers have not completely taken over the business.

    AFAIK, people used their GPUs because they were the only mass produced matrix processors around, but as the CNN field matures, there will presumably be much more refined hardware solutions that start appearing. Whether Nvidia has any real secret sauce or ability to insert themselves into the market I guess we will soon see.

    • by ac22 ( 7754550 )

      Yes, pretty bizarre that one of TSMC's customers (Nvidia) is apparently worth double what TSMC themselves are worth.

      I won't say that US tech stocks are overvalued, but they certainly command a very high value compared to other countries and sectors.

      • Re:Not a chipmaker (Score:4, Informative)

        by timeOday ( 582209 ) on Tuesday May 30, 2023 @11:50AM (#63561387)
        Foxconn is only worth a lousy $77B to Apple's $2,790B.
      • Not really bizarre. TSMC makes components for other companies, so it can be worth less than its customers. Since they make components, each unit is worth less than the product it goes into. For example, TSMC makes NVidia RTX GPU chips, which NVidia or its partners then build into graphics cards with memory, PCBs, connectors, and other chips, selling for more than the GPU alone costs.
      • Yes, pretty bizarre that one of TSMC's customers (Nvidia) is apparently worth double what TSMC themselves are worth.

        Why is that bizarre, and why do you consider the two to be competitors? One provides a service to the other. Their relative sizes are completely irrelevant, but even if they weren't, it is almost universal that service companies have a lower value than the companies they service.

        Just because you can make good shovels doesn't mean you profit more than the owner of the gold mine you're supplying.

    • Re:Not a chipmaker (Score:5, Insightful)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday May 30, 2023 @10:36AM (#63561207) Homepage Journal

      AFAIK, people used their GPUs because they were the only mass produced matrix processors around

      People used their GPUs because of CUDA. Nvidia made it easy to use them, which is why nvidia cards are preferred for compute over AMD cards even though OpenCL exists, and even when the AMD cards are cheaper.

      Whether Nvidia has any real secret sauce or ability to insert themselves into the market I guess we will soon see.

      They don't have to insert themselves into the market, that part happened already. What they have to do now is keep their performance competitive. They have already done all the other parts.
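[Editor's note] The point about CUDA making GPUs "easy to use" is the data-parallel kernel model: you write one scalar kernel and the runtime maps it across thousands of hardware threads. A toy sketch of that model, where `launch` is a hypothetical pure-Python stand-in for a GPU kernel launch, not a real CUDA API:

```python
# Hypothetical sketch of the data-parallel kernel model CUDA popularised;
# `launch` is a pure-Python stand-in for a GPU kernel launch, not a real API.

def launch(kernel, n_threads, *args):
    """Run `kernel` once per thread index; real GPU hardware would run
    these instances in parallel rather than in a loop."""
    for tid in range(n_threads):
        kernel(tid, *args)

def saxpy(tid, a, x, y, out):
    """Classic SAXPY kernel: out[i] = a * x[i] + y[i], one element per thread."""
    out[tid] = a * x[tid] + y[tid]

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * len(x)
launch(saxpy, len(x), 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

The appeal of the real thing is that the kernel body stays this simple while the hardware provides the parallelism; that ergonomic win, plus years of tuned libraries on top of it, is the moat the comments here are describing.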

    • Exactly! They're fabless!
    • by ceoyoyo ( 59147 )

      AFAIK, people used their GPUs because they were the only mass produced matrix processors around

      Might be worth googling "AMD graphics card".

      People use Nvidia because they put a lot of work into their CUDA libraries. The competitors came late to the party with OpenCL, which is a committee effort, and the partners didn't really seem to put the same effort into it.

      Lots of people would love to use AMD cards for various reasons, but Nvidia cards are much better supported.

    • by tlhIngan ( 30335 )

      Nvidia is not a chipmaker. Hilariously, they use TSMC to make most of their chips, so it's pretty weird to say they are a bigger chipmaker than TSMC.

      Yeah, "fabless chipmaker" is the correct term for nVidia. The same group includes Apple, Broadcom, Qualcomm, Marvell, and AMD, among others.

      Though to say they're bigger than TSMC isn't too unusual. After all, Apple is technically bigger than TSMC as well.

      Chipmaking is extremely expensive - this isn't like the 80s, when it seemed everyone and their dog owned a fab.

      • Maybe we should come up with some clever (and short) way to refer to companies which only do chip design, and don't actually build any. They also don't build cards, as Foxconn does that. They only do design, marketing, and... hell, do they even do distribution? If I was contracting out all of the assembly, I'd want to contract out the distribution too so I never had to touch any hardware except for doing QA and other analysis... though I suppose they could contract that too :)

    • but as the CNN field matures, there will presumably be much more refined hardware solutions that start appearing

      I was telling myself that for a few years, and then I bought another NVIDIA card because I got tired of waiting for that more refined hardware. The rest of the industry is very much on the back foot in this market. NVIDIA has a huge head start, not just in terms of hardware performance but also in terms of integration with software and development tools, thanks to pushing CUDA long before it was the latest hot stuff.

  • Too bad: as of their recent presentation, their CEO doesn't seem to know what GPUs do, who their customers are, or what their products are. And most importantly, who cares about gaming or card availability or prices when they've got AI things to do!
    • by Junta ( 36770 )

      As a tech customer, it's disappointing.

      As a business plan, it's pretty smart. "AI" is a torrent of hype right now, and for nVidia it's an easy win without a lot of room for bad consequences. They spew stuff like "AI means you won't need specially trained programmers anymore" and fuel the fire, where companies either directly toss a million or two at nVidia hardware to give it a go, or they demand it from their respective cloud vendors, nVidia specifically, who in turn have to toss many millions.

      • I don't know about the business utility of AI, but I know training AI models is reliable today. You can be sure they learn the dataset. There's no trick, just optimisation and modelling.
        • by Junta ( 36770 )

          The main point is that it hardly even matters if it works or doesn't, either way nVidia gets paid well and is not accountable for anything.

          Training works, but the applicability of the approach varies from scenario to scenario. Currently, depending on how much Kool-Aid you drink, every single human profession can be changed over to running on nVidia GPUs (or, on the flip side, you'll be able to make $300k/year as a 'prompt engineer', if that's the end of the hype that targets you).

  • Dropping cash or dropping trou? Twitter and Slashdot...
  • by xack ( 5304745 ) on Tuesday May 30, 2023 @09:53AM (#63561101)
    Their traditional gpus for gaming and workstation are being let down by lack of focus to the point where new products are worse than the previous generation. Unless they use their AI windfall to focus their core products again they could be in deep trouble. They already had the crypto bubble, learn your lesson!
    • Re: (Score:3, Informative)

      by Gavino ( 560149 )
      The thing is, gaming and workstation GPUs are no longer their core. The datacentre is.
      • And for the average person to get familiar with their ecosystem, the only reasonable GPU they can buy to experiment are the gaming GPUs.

        Screw them up too much, and you will find it harder to get developers who at least have some experience with Nvidia's ecosystem.

        Same thing for AMD. Their ROCm ML stuff did not really work with consumer gear when I checked last, so I didn't even bother looking that deeply at their ML / GPU gear. I understand it is improving now, but it will be some time before I look in again.

        • for the average person to get familiar with their ecosystem, the only reasonable GPU they can buy to experiment are the gaming GPUs.

          You can learn how that works with a cheap old GPU though, so that's no skin off their nose. Plus, I don't know if you noticed, but part of the premise of this AI stuff is reducing programmer count. Nvidia is really not worried about there being enough programmers who know how to use their stuff.

    • Mobile and console gaming is the majority of the market now. Mobile is diverse, and console is a matter of securing sweetheart deals with Nintendo or Sony, now that Xbox has fallen by the wayside.

    • new products are worse than the previous generation

      Citation required. Even talking purely on a value proposition their current gen performance per dollar is better than the previous gen, and that's before you take into account inflation.

      They already had the crypto bubble, learn your lesson!

      The crypto bubble earned them a metric fuckton of revenue. They learnt the lesson, and will ride the express hype train to Profitville every opportunity they can get.

      That you can't get a budget GPU is not of NVIDIA's concern. They are too busy lighting cigars on fire with $100 bills.

    • Their traditional gpus for gaming and workstation are being let down by lack of focus to the point where new products are worse than the previous generation.

      no, it's only the 4060, which is still faster (higher fps) and more efficient (frames per watt) than the 3060, except for a couple of games where the 4060 has slightly lower fps (which is ridiculous, I agree).

      but that's it.
      all their other cards are much better in every way (except price).

  • Comment removed based on user account deletion
    • I do think there will be a lot of work in making neural nets more efficient, but they'll still be performing a fantastic number of FLOPS compared to something like a query from a well-structured and indexed database. But that said, consuming and producing "messy" data like speech and written language probably can't be done all that cheaply. When you ask somebody to add 3+2 I suppose 99.9% of the neural firings in their brain is sensory and motor to hear the question and say the answer.
      • Comment removed based on user account deletion
        • Python is just an interface with the neural net which is written in C.
          • Comment removed based on user account deletion
            • No, that's not how it works. Imagine you are compressing a 4K 100-hour-long video using an advanced video encoder like x265 (which is written in C++ and assembly) in "veryslow" mode, and you are complaining that the UI that lets you change some settings and press a button to generate the x265 CLI command is written in Python. The UI is a drop in the lake when it comes to resources consumed; the slowness comes from the fact you have an enormous dataset and you are running some very complex algorithm.
              • Comment removed based on user account deletion
                • Having chosen a different language for the upper logic interfaces would have been acceptable, a much quicker language.

                  Python sucks in a lot of ways, but few of them are relevant for the job it's doing here, and performance is not one of them. All it's handling is telling the software what to do, and it's irrelevant while that's happening. So yes, Python is slow, but the part it's doing is the part that could be done by literally anything. Performance is literally the least relevant of the applicable ways in which it sucks in this case. You get sucked into the Python package ecosystem and have to install yet another dependency.

    • Efficient programming has been done for generations. Old mainframe code had to run within very limited memory and speed, so it HAD to be written efficiently. And it takes effort to write efficient code. I've read old FORTRAN code where the author was making a serious effort to save individual bytes, let alone kilobytes. And it WAS efficient and fast.

      But to a "business perspective", "efficiency" is about ROI. Now we have GB of RAM and TB of storage, whereas we used to be stuck with kB of RAM (maybe less!)

    • The AI field is using a lot of AI in its own stack, from chip-layout AIs to optimising the neural-network kernels for new compute hardware with different caching and communication-speed structures. You don't have to code by hand to produce the most efficient code.
  • This valuation puts Nvidia's P/E ratio at 210! For context, Apple's current market cap is $2.79T with a P/E ratio of 30.13. Even Tesla's P/E ratio is "only" 58.81. The only way Nvidia's valuation makes any sense is if this is part of a short squeeze. As others have stated, Nvidia doesn't even manufacture their own chips and their entire business depends on approximately two companies on the planet doing the manufacturing for them. In addition to that, I don't think it's that difficult for a company to design their own specialized chips for AI considering Tesla has already done it and I feel pretty confident other companies will be doing the same.
    • Nvidia doesn't even manufacture their own chips and their entire business depends on approximately two companies on the planet doing the manufacturing for them.

      That is irrelevant. There's no reason valuations should be limited to companies who produce things for others, especially when those companies lack the core competence to do what their customers are doing. NVIDIA may be dependent on TSMC but that has zero to do with their ability to capitalise on the latest trend du jour, something they are very good at.

      I don't think it's that difficult for a company to design their own specialized chips for AI considering Tesla has already done it and I feel pretty confident other companies will be doing the same.

      You don't seem to know the difference between providing general purpose compute devices, and designing a highly specialised ASIC capable of doing one and only one thing.

      • NVIDIA may be dependent on TSMC but that has zero to do with their ability to capitalise on the latest trend du jour

        All it would take is for China to invade Taiwan and Nvidia will have few, if any, ways of manufacturing their designs. Worse yet, those companies depend on equipment produced by a single company, ASML, which can only crank out a small quantity of EUV machines per year to replace any equipment damaged in a conflict. Semiconductor manufacturing is the single most important industry at the moment.

  • A PC with a graphics card used for AI/crypto/additional calculations rather than actual graphics is, in general, a SIMD (Single Instruction, Multiple Data) system, much like the MasPar systems of the 1980s and 1990s. Since we are using these for non-graphics tasks, I would assume we could make a better architecture for AI calculations than a GPU, tuned for the tasks it is heavy on and not wasting space and money on features that will not be used.
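[Editor's note] The workload behind this comment is mostly matrix math, which is embarrassingly parallel: each output cell can be computed independently, which is exactly what SIMD-style hardware exploits. A minimal pure-Python sketch of the operation (real hardware computes many cells simultaneously rather than looping):

```python
def matmul(a, b):
    """Naive matrix multiply. On SIMD/GPU hardware, each output cell
    (i, j) can be assigned to an independent lane or thread, since no
    cell depends on any other."""
    inner, cols = len(b), len(b[0])
    return [[sum(row[k] * b[k][j] for k in range(inner))
             for j in range(cols)] for row in a]

a = [[1, 2],
     [3, 4]]
b = [[5, 6],
     [7, 8]]
print(matmul(a, b))  # [[19, 22], [43, 50]]
```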

  • And yet, NVIDIA is the least "diverse" of the big tech companies: https://www.trueup.io/diversit... [trueup.io] But don't tell that to all those investment firms that have been duped into believing that "diversity" supposedly makes corporations stronger and push corporations to waste time and money getting all kinds of useless DEI certificates from private consultants (while potentially compromising internal meritocracy in order to get those certificates).
  • Today Nvidia dropped below $1 Trillion.
    On Waning AI Demand, presumably.

    No Slushdit Story?
