
Nvidia Posts Record Revenue Up 265% On Booming AI Business (cnbc.com)

In its fourth quarter earnings report today, Nvidia beat Wall Street's forecast for earnings and sales, causing shares to rise about 10% in extended trading. CNBC reports: Here's what the company reported compared with what Wall Street was expecting for the quarter ending in January, based on a survey of analysts by LSEG, formerly known as Refinitiv:

Earnings per share: $5.16 adjusted vs. $4.64 expected
Revenue: $22.10 billion vs. $20.62 billion expected

Nvidia said it expected $24.0 billion in sales in the current quarter. Analysts polled by LSEG were looking for $5.00 per share on $22.17 billion in sales. On a call with analysts, Nvidia CEO Jensen Huang addressed investor fears that the company may not be able to keep up this level of growth or sales for the whole year. "Fundamentally, the conditions are excellent for continued growth" in 2025 and beyond, Huang told analysts. He said demand for the company's GPUs will remain high due to generative AI and an industry-wide shift away from central processors to the accelerators that Nvidia makes.

Nvidia reported $12.29 billion in net income during the quarter, or $4.93 per share, up 769% from last year's $1.41 billion, or 57 cents per share. Nvidia's total revenue rose 265% from a year ago, driven by strong sales of AI chips for servers, particularly the company's "Hopper" chips such as the H100, it said. "Strong demand was driven by enterprise software and consumer internet applications, and multiple industry verticals including automotive, financial services and health care," the company said in commentary provided to investors. Those sales are reported in the company's Data Center business, which now comprises the majority of Nvidia's revenue. Data center sales were up 409% to $18.40 billion. Over half the company's data center sales went to large cloud providers. [...]
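As a quick arithmetic sanity check on those figures, here is a minimal Python sketch; the inputs are the rounded numbers from the summary above, so the results land within a couple of points of the reported percentages.

def yoy_growth(current, prior):
    """Percent change from the prior-year value to the current value."""
    return (current - prior) / prior * 100

# Net income: $12.29B vs. $1.41B a year earlier (reported growth: 769%).
# Rounded inputs give roughly 772%.
print(f"Net income growth: {yoy_growth(12.29, 1.41):.0f}%")

# Revenue growth of 265% implies prior-year revenue of roughly:
prior_revenue = 22.10 / (1 + 2.65)
print(f"Implied prior-year revenue: ${prior_revenue:.2f}B")  # ~$6.05B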

The company's gaming business, which includes graphics cards for laptops and PCs, was merely up 56% year over year to $2.87 billion. Graphics cards for gaming used to be Nvidia's primary business before its AI chips started taking off, and some of Nvidia's graphics cards can be used for AI. Nvidia's smaller businesses did not show the same meteoric growth. Its automotive business declined 4% to $281 million in sales, and its OEM and other business, which includes crypto chips, rose 7% to $90 million. Nvidia's business making graphics hardware for professional applications rose 105% to $463 million.


Comments Filter:
  • "People get rich over over-hyped bullshit and yacht builders now backed up for years"
  • by HBI ( 10338492 ) on Wednesday February 21, 2024 @08:18PM (#64258458)

    Someday, in business schools, Nvidia will be used as an example of surfing repeated bubbles - crypto and now AI. Probably a cautionary tale.

    For now, a lot of new cars and other luxury expenses for Nvidia people.

    • >AI
Talk about an 'emperor has no clothes' technology. Looking at you, Gemini. It's so bad that it's comical. Jensen had better not blow all his money on leather jackets, because the only people seemingly convinced of a bright future for 'AI' are the firms producing it.

    • Probably a cautionary tale.

      I doubt it. Unless your analysis is "Caution: correctly predicting and being best placed to take advantage of repeated bubbles can make your company filthy rich and influential."

      NVIDIA's success wasn't a fluke on account of simply being in the GPU business. They positioned themselves and their cards to take advantage of bubbles before they occurred. Say what you want about their price gouging of PC gamers, but business schools will hold this up as a shining example of how to run a business.

      We mocked them

      • by sinij ( 911942 )
        Is it luck or planning? At least with AI you could construct a plausible scenario where they repurposed crypto mining hardware for the task. For the original crypto mining bubble, I can't see how you could make a similar case. Keep in mind, chip orders are planned years in advance.
      • by HBI ( 10338492 )

        Predicting their future success requires two things to be true:

        1) LLMs will continue to require greater and greater processing power for larger and larger corpuses of data
        2) Competitors won't devote effort to designing similar chips that undercut them on price and/or performance

        I don't think you can say either will be true with any confidence. I'm pretty sure both of them are false for a variety of reasons. I expect their functionality to be commoditized and we may find that it is functionality that is no

  • $2 trillion market cap totally makes sense under the assumption that NVDA somehow retains market dominance for at least the next 30 years. Tulips, anyone?
    • by thegarbz ( 1787294 ) on Thursday February 22, 2024 @04:07AM (#64259100)

      NVIDIA isn't selling tulips. They are selling shovels to people who grow and sell tulips. So as one tulip fad fades (crypto), the next can start (though, that said, AI acceleration has actual practical purposes, so I don't see the justification for comparing it to a tulip).

    • by ranton ( 36917 )

      I agree their stock is too high right now, but there are defensible reasons why it has gone up by many multiples in the last 18 months. NVidia's stock is currently 313% higher than it was in Nov 2022, right before ChatGPT kicked off the current AI craze. NVidia's revenue in Q4 2023 was also 274% higher than in Q3 2022, right before the craze. So there is at least some justification for the higher stock price.

      But NVidia still enjoys a huge P/S ratio. Using a discounted cash flow valuation method, NV
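For readers unfamiliar with the valuation method the parent invokes, here is a minimal discounted-cash-flow sketch in Python; every number in it is a hypothetical placeholder for illustration, not an actual NVidia projection.

def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Present value of projected cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    return pv + terminal / (1 + discount_rate) ** len(cash_flows)

# Hypothetical inputs: $50B free cash flow growing 20%/yr for five years,
# a 10% discount rate, and 3% terminal growth.
flows = [50 * 1.20 ** t for t in range(1, 6)]
print(f"Estimated value: ${dcf_value(flows, 0.10, 0.03):,.0f}B")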

  • I just bought an AMD 7900XTX! Thanks NVidia!!

  • by icejai ( 214906 ) on Thursday February 22, 2024 @05:11AM (#64259206)

    I'm not sure I completely understand the hate AI gets from people who think it's a fad or a bubble. There are certain eras that we can point to that clearly define significant shifts in computing, and technology in general.
    First, the digitization of information, i.e., room-sized servers, the ability to perform thousands of computations per second, etc.
    Second, the personal digitization of information, i.e., households and ordinary businesses could now benefit from those thousands and millions of computations per second.
    Third, the high-speed transmission of information, i.e., 56K modems, cable internet, DSL, which changed the internet from simply serving pages into serving media.
    Fourth, another personal shift in the digitization of information, i.e., smartphones: incredibly powerful computers the size of a calculator.
    Fifth, another shift in the transmission of information, i.e., wireless internet, mobile networks, widespread WiFi, etc.

    This new era brought on by all the machine learning is the era of "analysis/interpretation of information". All previous eras can be simplified as advances in the storage, calculation, and transmission of information. But to have interpretation? That's a really big step. Given how completely technology and computation dominate our lives today, I really don't see how the world will ever forgo the desire to benefit from discovering new information, by automating analysis, in the mountains and mountains of data humanity generates on a daily basis.

    We generate oodles of data, and the only feasible way of even scratching the surface of making sense of it all is with machine learning. And the compute required to do so is absolutely astronomical. With that said, the only bubble here may be an Nvidia bubble. Nvidia is definitely enjoying monopoly prices and profits at the moment. It's only a matter of time before other chip designers catch up.

    But this new technique for automating the interpretation of information? The same information that's already generated, and will continue to be generated?

    Like servers, PCs, high-speed internet, wireless internet, and smartphones, this new tool is here to stay.

    • Garbage in--->Absolutely Astronomical Compute--->Garbage out.
      • by sinij ( 911942 ) on Thursday February 22, 2024 @08:22AM (#64259412)

        Garbage in--->Absolutely Astronomical Compute--->Garbage out.

        It is worse than GIGO, as the garbage encompasses the agendas and biases of the people who train it.

        • by icejai ( 214906 )

          I completely agree. How *do* you train a neural model?

          The best practices of training, application, and reliability are a loooooong way off from anything approaching what we desire as acceptable. But this problem (conceptually) is no different from coming up with best practices for "training" a child into adulthood. ie. Widespread disagreement, overflowing with personal anecdotes, actual studies are limited in scope and full of variance, but general best practices have been determined. At least with a machin

          • by sinij ( 911942 )
            Thing is, as any parent knows, children have a built-in contrarian algorithm. If you try to raise a child and in doing so push something, anything, too hard, there is a good chance that child will grow into an adult doing exactly the opposite. The same is not true for AI.
            • by icejai ( 214906 )

              Kids don't have a "contrarian" algorithm, they have a "self" algorithm, i.e., they have their own algorithm to achieve their own goals. And often, those goals conflict with the goals of their parents. Thus, their algorithm may *appear* to be "contrarian", but it's an illusion, because what's really going on is the kid is simply doing whatever they can to get what they want. And if the output of a child's general AI repeatedly conflicts with the goals of a parent's general AI, then the scenario we have is one where

    • The tool is here to stay but we can't actually afford it. It's increasing our energy consumption while our energy production still has too much ecosystem rape built in. We're reaching the reasonable limits of improvements in silicon IC technology and have no practicable plan to move to another tech that will continue to permit process shrinks, reductions in power consumption, etc. AI as we know it today is decades in the making, but the rate of improvement of the hardware has decreased dramatically over the

      • by icejai ( 214906 )

        Silicon doesn't need to shrink. Machine learning doesn't work well because it's fast, it works because it's "embarrassingly parallel". The chips for compute don't have to get smaller or faster, they just need to get "wider".

        As for training requirements, we can take the example of driving. I would argue that there is at least one more large discovery to be made when it comes to training algorithms -- one as big as deep nets and transformers. For example, currently, humans seeking out a driver's license definitel
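As an aside, here is a minimal sketch of what "embarrassingly parallel" means in this context, assuming a plain dense layer and using numpy as a stand-in for GPU kernels (the shapes are arbitrary): the rows of a batch are independent, so the work splits across chips with no communication between the pieces.

import numpy as np

def forward(weights, batch):
    """One dense layer applied to a batch of inputs."""
    return batch @ weights

rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 512))
batch = rng.standard_normal((1024, 512))

# Serial result...
out = forward(weights, batch)

# ...equals the concatenation of independent per-chunk results. Each chunk
# could run on its own chip ("wider"), with no inter-chunk communication.
chunks = np.array_split(batch, 4)
out_parallel = np.concatenate([forward(weights, c) for c in chunks])
assert np.allclose(out, out_parallel)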

        • "Silicon doesn't need to shrink. Machine learning doesn't work well because it's fast, it works because it's "embarrassingly parallel". The chips for compute don't have to get smaller or faster, they just need to get "wider"."

          "Wider" means "more cores" which without a shrink means "more power".

          • by icejai ( 214906 )

            Wait. I don't understand.
            Society can't afford it (the energy for specialized computation?), environmentally?

            I think if a society's environment suffers from dirty power generation and lack of green energy, I'm gonna need a bit more to be convinced it's a "too many ML ASICs" issue, and not a "government policy" issue.

            • It doesn't matter which of those things it is, because government isn't going to get its shit together and get away from greed and corruption any time soon.

    • by ceoyoyo ( 59147 )

      AI pushes a bunch of people's buttons:

      1) People who put their lives into becoming experts in a particular type of model. A friend of mine, an expert in Bayesian graph models, put it this way: "deep neural networks are annoyingly effective. Much more so than they have any right to be." That's the magic of depth.

      2) People who get their news from the hype peddlers. If all you do is read summaries on Slashdot, there's not much difference between Pets.com, Bitcoin and current AI.

      3) People who don't understand what

  • It probably does not help that a lot of companies are not only flubbing their deployment of AI models, but have also enabled a lot of scams and less-than-ethical business practices.
      On top of the weirdo cult types trying to peddle the idea of building a robot god or some bs.
      For me right now though, I just personally hate Nvidia. This is a company notorious for price gouging and pushing shoddy quality control for their products. There is a reason why the Nvidia house fire meme has persisted for well over a de
