Asus Will Offer Local ChatGPT-Style AI Servers For Office Use (arstechnica.com) 9

An anonymous reader quotes a report from Ars Technica: Taiwan's Asustek Computer (known popularly as "Asus") plans to introduce a rental AI server for businesses that will operate on-site, addressing the security and data-control concerns raised by cloud-based AI systems, Bloomberg reports. The service, called AFS Appliance, will feature Nvidia chips and run an AI language model called "Formosa" that Asus claims is equivalent to OpenAI's GPT-3.5.

Asus hopes to offer the service at about $6,000 per month, according to Bloomberg's interview with Asus Cloud and TWS President Peter Wu. The highest-powered server, based on an Nvidia DGX AI platform, will cost about $10,000 a month. The servers will be powered by Nvidia's A100 GPUs and will be owned and operated by Asus. The company hopes to provide the service to 30 to 50 enterprise customers in Taiwan at first, then expand internationally later in 2023. "Nvidia are a partner with us to accelerate the enterprise adoption of this technology," Wu told Bloomberg. "Before ChatGPT, the enterprises were not aware of why they need so much computing power."

According to Asus, the "Formosa Foundation Model" that will run on the AFS Appliance is a large language model that generates text with traditional Chinese semantics. It was developed by TWS, a subsidiary of Asustek. Like ChatGPT, it will offer AI-powered text generation and coding capabilities. Despite the growing demand for AI-training chips, Bloomberg reports that companies like Asus hope to secure a share of the market by offering "holistic AI systems" that bundle a complete AI solution into a service package. Asus claims that its existing partnership with Nvidia will ensure there is no supply shortage of Nvidia's chips as the AFS Appliance service rolls out.

This discussion has been archived. No new comments can be posted.

  • by Joe_Dragon ( 2206452 ) on Thursday June 01, 2023 @06:06PM (#63568673)

    Even if the hardware costs $25K, it's better to own it than rent it at that price.

    • Unless you discover in under four months that AI doesn't improve your efficiency.
    • by EvilSS ( 557649 )

      Even if the hardware costs $25K, it's better to own it than rent it at that price.

      Yeah, add a zero to that $25K and you'll be in the ballpark for one server.

      • That's only due to Nvidia intentionally crippling their consumer GPUs by disabling NVLink.

        So, much like diamonds, it's only done to force their more expensive chips down people's throats via artificial scarcity. What we need is a decent competitor to CUDA from AMD or Intel.

        • by EvilSS ( 557649 )
          Doesn't matter why; the fact is they are expensive as hell. And do you honestly think any competitor won't separate their consumer and enterprise GPU lines and charge huge markups on the enterprise gear?
  • ... and certainly raises more security concerns than it answers. "Local" would be an appliance that needs no connection to the Internet, and which is actually owned and operated by the buyer.
    • by gweihir ( 88907 )

      Indeed. And one that gets wiped completely when you give it back. But I guess the same people that are interested in an artificial well-spoken moron do not understand IT security either.

  • Until I see comparisons between GPT-3.5 and Formosa, this is just an ad.
