AMD

AMD Likely To Offer Details on AI Chip in Challenge To Nvidia (reuters.com)

Advanced Micro Devices on Tuesday is expected to reveal new details about an AI "superchip" that analysts believe will be a strong challenger to Nvidia, whose chips dominate the fast-growing artificial intelligence market. From a report: AMD Chief Executive Lisa Su will give a keynote address at an event in San Francisco on the company's strategy in the data center and AI markets. Analysts expect fresh details about a chip called the MI300, AMD's most advanced graphics processing unit, the category of chips that companies like OpenAI use to develop products such as ChatGPT. Nvidia dominates the AI computing market with 80% to 95% of market share, according to analysts.

Last month, Nvidia's market capitalization briefly touched $1 trillion after the company said it expected a jump in revenue after it secured new chip supplies to meet surging demand. Nvidia has few competitors working at a large scale. While Intel and several startups such as Cerebras Systems and SambaNova Systems have competing products, Nvidia's biggest sales threat so far is the internal chip efforts at Alphabet's Google and Amazon's cloud unit, both of which rent their custom chips to outside developers.

  • by ranton ( 36917 ) on Tuesday June 13, 2023 @10:32AM (#63598580)

    IMHO NVidia's competitive advantage is in their software, not really their hardware. They don't own enough of the manufacturing process for their hardware to be that different from their competitors'. While they have a slight edge in hardware now, there is little guarantee that will continue. Their edge appears to be in the software ecosystem built around their hardware, and this is where I think AMD will need to improve if they really want to take NVidia on.

    And I think AMD and other competitors will be able to cross NVidia's software moat in time, which is why I am surprised NVidia's stock is trading at such a high level.
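
    To make the "software moat" concrete, here is a minimal, illustrative sketch (assuming Numba and a CUDA-capable GPU; not any vendor's own code): a toy kernel written against Numba's CUDA backend, one of many CUDA-only tools, has no drop-in AMD equivalent, so the investment that is hard to replace is the code base, not the silicon.

    ```python
    # Illustrative sketch only: a toy SAXPY kernel written with Numba's
    # CUDA backend, one of many CUDA-only tools in NVidia's ecosystem.
    from numba import cuda
    import numpy as np

    @cuda.jit
    def saxpy(a, x, y, out):
        i = cuda.grid(1)          # absolute thread index
        if i < out.size:
            out[i] = a * x[i] + y[i]

    n = 1_000_000
    x = np.arange(n, dtype=np.float32)
    y = np.ones(n, dtype=np.float32)
    out = np.empty(n, dtype=np.float32)

    # 4096 blocks x 256 threads covers all n elements.
    saxpy[4096, 256](np.float32(2.0), x, y, out)   # needs an NVidia GPU + CUDA
    assert np.allclose(out, 2.0 * x + y)
    ```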

    • I am surprised NVidia's stock is trading at such a high level

      Watch out, you're going to be branded as an evil short seller by the slashcommie wannabes.

      • by ranton ( 36917 )

        Watch out, you're going to be branded as an evil short seller by the slashcommie wannabes.

        Ha, I'm a firm believer that the market can remain irrational far longer than I can remain solvent. I think Tesla should be trading at 10% of its current valuation as well (which would still leave its market-cap-to-revenue ratio at almost triple Ford's), but I'm not looking to short them any time soon either. I'll stick with my low-fee index funds and wait until retirement.

    • Comment removed based on user account deletion
      • If only it even ran properly on Linux (hint: I'm on Linux, have an AMD GPU, and no luck).
      • I'm on Linux; I'm not a pro, but I might occasionally have to run CUDA code, so I went with NVidia (and I regret it every day with my half-working suspend). AMD advertises ROCm as easy to convert to from CUDA, and they provide nearly-automated tools. But if it's so easy that (according to them) it takes just a couple of hours to port a project, they would only need a very small team to submit patches and port hundreds of OSS projects on GitHub/GitLab/SourceForge/... Many OSS communities are happy with extending to...
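
        One caveat on the porting picture: for the Python frameworks, the story can be simpler than hipify, since ROCm builds of PyTorch expose AMD GPUs through the existing torch.cuda namespace. A minimal sketch, assuming a working CUDA or ROCm build of PyTorch:

        ```python
        # Minimal sketch: device-agnostic PyTorch. On ROCm builds, AMD GPUs
        # are reported through torch.cuda (the HIP backend), so this same
        # script runs unmodified on NVidia and supported AMD hardware.
        import torch

        device = "cuda" if torch.cuda.is_available() else "cpu"

        model = torch.nn.Linear(1024, 1024).to(device)
        x = torch.randn(8, 1024, device=device)
        y = model(x)   # cuBLAS on NVidia, rocBLAS on AMD, plain BLAS on CPU
        print(y.shape, "on", device)
        ```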

      • by ceoyoyo ( 59147 )

        AMD needs to get their cards relatively seamlessly supported in the major deep learning packages.

        nVidia had CUDA working pretty well when those packages were being written, so that's what got supported. Now AMD is playing catch-up, so they're going to have to do a lot of that work themselves, or make someone else really, really interested in doing it for them.

        Docker containers and supporting specific versions of specific operating systems isn't going to cut it.

    • by gweihir ( 88907 )

      And I think AMD and other competitors will be able to cross NVidia's software moat in time, which is why I am surprised NVidia's stock is trading at such a high level.

      Indeed. In a sense, it is already happening with some very impressive optimizations people have come up with. It is even quite possible that specialized hardware will turn out to be essentially irrelevant, as ChatAI seems to be strongly subject to diminishing returns for larger models.

      The NVidia stock price is just a mix of hype and "greater fool theory".
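
      For a concrete example of the kind of optimization alluded to above, here is a hedged sketch (toy code, not any particular project's method) of naive symmetric int8 weight quantization, which cuts weight memory 4x versus float32 and is one reason large models increasingly fit on commodity hardware:

      ```python
      # Hedged sketch of one such optimization: naive symmetric int8
      # weight quantization, 4x smaller than float32 in memory.
      import numpy as np

      def quantize_int8(w):
          """Map float32 weights to int8 plus one per-tensor scale."""
          scale = np.abs(w).max() / 127.0
          q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
          return q, scale

      def dequantize(q, scale):
          return q.astype(np.float32) * scale

      w = np.random.randn(4096, 4096).astype(np.float32)
      q, scale = quantize_int8(w)
      print(w.nbytes // q.nbytes, "x less memory")              # -> 4
      print("max abs error:", np.abs(w - dequantize(q, scale)).max())
      ```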

    • IMHO NVidia's competitive advantage is in their software, not really their hardware. They don't own enough of the manufacturing process for their hardware to be that different from their competitors'. While they have a slight edge in hardware now, there is little guarantee that will continue. Their edge appears to be in the software ecosystem built around their hardware, and this is where I think AMD will need to improve if they really want to take NVidia on.

      And I think AMD and other competitors will be able to cross NVidia's software moat in time, which is why I am surprised NVidia's stock is trading at such a high level.

      The insight that much of Nvidia's current advantage lies in its software is likely correct. However, the million-dollar question is why Nvidia's software moat has held up this long. It's not as if AMD and other companies only realized in the last few months that the AI picks and shovels were a big business. They've known this for many years and have tried to make up the difference. So why hasn't that difference closed by now? And why would it close in the near future? In the last few years, it...

  • There's marketing in there somewhere...

  • by gweihir ( 88907 ) on Tuesday June 13, 2023 @03:43PM (#63599706)

    By the time specialized hardware becomes available in sufficient quantity, the approaches have usually been optimized and changed enough to make that hardware obsolete. That is one of the claims my CS 101 prof made 35 years back, and I have seen it pan out time and again.

    I mean, people are already training new models on normal PCs, and it does not take forever. They are running the models on phones with some small restrictions and still reasonable responsiveness. This whole "AI hardware" push is yet more hype from those who do not understand the tech.

    • by ceoyoyo ( 59147 )

      Yup, remember when you used to pay extra for an FPU? Lol. Nobody has those anymore!

      Also, vector processing units, dedicated graphics processing units, hardware interfaces....

      • by gweihir ( 88907 )

        Obviously I am talking about cutting-edge non-mature algorithms. Hence your argument has no merit. Context matters.

        • by ceoyoyo ( 59147 )

          Multiply-add is pretty mature. Hell, matrix multiplication is too.

          • by gweihir ( 88907 )

            You just sound like somebody clueless that cannot admit being clueless. Pathetic.

            • by ceoyoyo ( 59147 )

              You should probably look up what this card we're talking about does before you get any more mouthy. AI Accelerators are generally SIMD processors with small instruction sets designed to do linear algebra. GPUs, basically. This one, like nVidia's, is literally a GPU.

              From AMD's website:

              "The AMD Instinct MI200 series accelerators are the newest data center GPUs from AMD"

  • nVidia even has some scare wording about their consumer-grade GPUs, saying they "pose a fire risk" compared to their datacenter GPUs.

    If AMD wants to kick nVidia's ass, they need to do three things:
    * Make a GPU roughly as good as (it doesn't have to be better than) a 4090.
    * Make a version of tensorflow that works with it on Linux, Windows, and Mac.
    * Give it gobs of memory. As in 24 GB or more.

    This last is super important, as the key feature of the nVidia high-end cards is not their performance but their memory capacity (a runnable check is sketched below).
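
    A minimal sketch of that bar as a test, assuming a standard TensorFlow install (the check reads the same whether the GPU underneath is CUDA- or ROCm-backed):

    ```python
    # Minimal sketch: does this TensorFlow install see a GPU, and can it
    # hold a large matmul? 3 x (16384^2 x 4 bytes) = ~3 GB of operands,
    # which is where the "gobs of memory" requirement bites.
    import tensorflow as tf

    gpus = tf.config.list_physical_devices("GPU")
    print("GPUs found:", gpus)

    if gpus:
        with tf.device("/GPU:0"):
            a = tf.random.normal([16384, 16384])
            b = tf.random.normal([16384, 16384])
            c = tf.matmul(a, b)
        print(c.shape)
    ```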
