
Alibaba Creates AI Chip To Help China Fill Nvidia Void

Alibaba, China's largest cloud-computing company, has developed a domestically manufactured, versatile inference chip to fill the gap left by U.S. restrictions on Nvidia's sales in China. The Wall Street Journal reports: Previous cloud-computing chips developed by Alibaba have mostly been designed for specific applications. The new chip, now in testing, is meant to serve a broader range of AI inference tasks, said people familiar with it. The chip is manufactured by a Chinese company, they said, in contrast to an earlier Alibaba AI processor that was fabricated by Taiwan Semiconductor Manufacturing. Washington has blocked TSMC from manufacturing AI chips for China that use leading-edge technology.

[...] Private-sector cloud companies including Alibaba have refrained from bulk orders of Huawei's chips, resisting official suggestions that they should help the national champion, because they consider Huawei a direct rival in cloud services, people close to the firms said. China's biggest weakness is training AI models, for which U.S. companies rely on the most powerful Nvidia products. Alibaba's new chip is designed for inference, not training, people familiar with it said. Chinese engineers have complained that homegrown chips including Huawei's run into problems when training AI, such as overheating and breaking down in the middle of training runs. Huawei declined to comment.

Comments:
  • Eventually I will have a wall of solar batteries, powering AI chips in the middle with a large screen on the inside.

    And it could be anywhere without any difference at all.

    I won't need the real world.

    • by gtall ( 79522 )

Brilliant! We won't have to feed and water them.

      • Circular economy, man, mouth to ass, like you it to Mars already.

        • Circular economy, man, mouth to ass, like you it to Mars already.

          Ah yes - mouth to ass as sycophant to Trump. I do hear that economic circle is getting smaller and smaller though. Just like a tightening sphincter.

Because you expected them(?) to serve pi to the nth degree... but they serve steaming framegen farming slop. But surely GOOG won't cut 30% of middle-mommagers if AAPL has a round crust.
  • LLM inference is perfect for older nodes. You just need a ton of space for wide memory buses and very little compute.

Of course, given that NVIDIA chooses not to even create inference-only systems, it says a bit about the relative market size of training vs. inference.

It's quite obvious where the market difference comes from.

Training is the "AGI, AGI, AGI! Pour in a boatload of money now, and we win everything!11!" outlook of shitstains like the zuck, sam altman and similar.

Inference is what may someday be useful to many in some way, but isn't really at the moment.

The second category isn't very likely to shell out money now, while the first will burn all they can.

      • In other words, chips for training are an economical empty-calorie sugar rush. Chips for inference are more nutritious in the long term.

      • ... shitstains like the zuck, sam altman and similar.

You're being unfair to the shitstains, which, after all, have no will of their own and can be removed with a laundering or two. Zuck, Altman, and their ilk know that they're stinking disease vectors, yet refuse to change their shitty ways.

      • by allo ( 1728082 )

You also need to train the useful AI. If DeepSeek doesn't train new models, you'll be laughing in a few years that their model doesn't know the current American president. The models also haven't hit their intelligence ceiling yet. You don't need to be striving for AGI/ASI to want to train models quickly and efficiently.

    • by Slayer ( 6656 )

Of course, given that NVIDIA chooses not to even create inference-only systems, it says a bit about the relative market size of training vs. inference.

Inference may well be a huge market in the future; think autonomous cars. Their total compute will quickly outstrip anything we've thrown at training so far. And yes, NVIDIA does provide proper platforms for inference [nvidia.com].
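The parent comments' claim that inference is memory-bandwidth-bound rather than compute-bound can be sketched with a back-of-the-envelope roofline estimate. The numbers below (a hypothetical 70B-parameter fp16 model, ~2 TB/s of HBM bandwidth, ~1 PFLOP/s of dense fp16 compute) are illustrative assumptions, not measurements of any real chip:

```python
# Rough roofline sketch for batch-1 LLM token decoding.
# All hardware and model numbers are assumed for illustration.
params = 70e9          # hypothetical 70B-parameter model
bytes_per_param = 2    # fp16 weights
model_bytes = params * bytes_per_param

bandwidth = 2.0e12     # ~2 TB/s HBM (assumed)
peak_flops = 1.0e15    # ~1 PFLOP/s dense fp16 (assumed)

# At batch size 1, each decoded token reads every weight once
# and performs roughly 2 FLOPs (multiply + add) per weight.
flops_per_token = 2 * params
intensity = flops_per_token / model_bytes        # FLOPs per byte moved

tokens_per_sec_bw = bandwidth / model_bytes          # bandwidth-bound ceiling
tokens_per_sec_compute = peak_flops / flops_per_token  # compute-bound ceiling

print(f"arithmetic intensity: {intensity:.1f} FLOP/byte")
print(f"bandwidth-bound ceiling: {tokens_per_sec_bw:.1f} tok/s")
print(f"compute-bound ceiling:   {tokens_per_sec_compute:.0f} tok/s")
```

With these assumptions the arithmetic intensity is only ~1 FLOP per byte moved, so the bandwidth ceiling (~14 tok/s) sits hundreds of times below the compute ceiling. That is the sense in which older process nodes with wide memory buses can still serve inference respectably, while training (which batches heavily and is compute-hungry) cannot get by the same way.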

that this happened; whoever could have seen this coming?
    • Before:

Oh no, China is buying American AI technology! This is bad for our economy. Let's ban them from buying our stuff!

      After:

      China now manufactures AI technology cheaper than America.

Flawless victory for America.

  • by Baron_Yam ( 643147 ) on Saturday August 30, 2025 @08:42AM (#65626162)

Politics will always ultimately lose to fundamental economics. If there's an economic unit big enough to provide for its own needs, you can't force it to depend on you for them. You can reduce its efficiency a bit by refusing trade so it has to do things itself, but unless you're willing to go to war, you can't stop it from doing so if it has the will.

The Chinese are not stupid; they have more than enough people, and they have the natural resources. They've also spent the last several decades having the world hand them technological advances, and they've been making their own additions to those for some time.

The horse is out of the barn; you cannot dictate to China what technology it can or cannot have. It won't be long before they're regularly producing novel improvements ahead of everyone else.

    • Politics will always ultimately lose to fundamental economics. If there's an economic unit big enough to provide for its own needs, you can't force them to depend on you for them.

This is true. However, it further emphasizes the current advantage that Nvidia holds over all US and Chinese companies. There isn't a single company in the US or China that doesn't hate Nvidia for overcharging for its GPUs. All the big companies in both countries are trying to make their own AI chips, and a few are trying to make GPUs. Yet not one of them has been able to replace Nvidia. Not just for training; even for inference, Nvidia has significant sales. A majori

  • Is he? (Score:4, Interesting)

    by hcs_$reboot ( 1536101 ) on Saturday August 30, 2025 @08:58AM (#65626174)
    Isn’t it obvious that cutting China off from U.S. technology only pushes them to innovate and close the gap?
    By helping China in this way (even if indirectly), maybe our man isn’t a Russian agent after all.
  • and then it kills them while they are crossing the street, and hides the bodies so the press doesn't learn of it.

  • by MpVpRb ( 1423381 ) on Saturday August 30, 2025 @11:08AM (#65626344)

...from developing tech is futile and counterproductive. There are LOTS of really talented Chinese engineers and scientists who are good at working around restrictions. Cooperation would be better.

    • given the current methods.

Recommended reading: Empire of AI by Karen Hao [wikipedia.org].

How much money can be made by cramming all copyrighted works and the postings on the Internet into a machine that statistically predicts the next word in a sentence or the next pixel in an image? It must have guardrails installed post-training so it isn't a biased, racist a-hole, since the Internet is statistically full of such ideas.

      Yes, it will find all kinds of patterns and it also can be tweaked to produce alternate results by using different

The Cloud: multiple virtual machines running on a PC cluster.

    Cloud-computing chips: customized chips based on GPUs and designed to efficiently perform matrix algebra.
  • Around 2017, I started looking at Huawei as an alternative to Cisco. I have been abandoning Cisco in favor of Huawei ever since.

    It all started when the US government started advertising that Huawei was so good that without government intervention, American companies would not be able to compete against them.

    I researched them quite extensively and even visited their headquarters. And the US is right. Huawei products are really good and their customer support was way better than Cisco's. And with the inflatio
