Microsoft is Trying To Lessen Its Addiction To OpenAI as AI Costs Soar (theinformation.com)

Microsoft's push to put artificial intelligence into its software has hinged almost entirely on OpenAI, the startup Microsoft funded in exchange for the right to use its cutting-edge technology. But as the costs of running advanced AI models rise, Microsoft researchers and product teams are working on a plan B. The Information: In recent weeks, Peter Lee, who oversees Microsoft's 1,500 researchers, directed many of them to develop conversational AI that may not perform as well as OpenAI's but that is smaller in size and costs far less to operate, according to a current employee and another person who recently left the company. Microsoft's product teams are already working on incorporating some of that Microsoft-made AI software, powered by large language models, in existing products, such as a chatbot within Bing search that is similar to OpenAI's ChatGPT, these people said.

[...] Microsoft's research group has no illusions about developing a large AI model like GPT-4. The team doesn't have the same computing resources as OpenAI, nor does it have armies of human reviewers giving feedback on how well its LLMs answer questions so engineers can improve them. Undeniably, OpenAI and other developers -- including Google and Anthropic, which on Monday received $4 billion from Amazon Web Services -- are firmly ahead of Microsoft when it comes to developing advanced LLMs. But Microsoft may be able to compete in a race to build AI models that mimic the quality of OpenAI's software at a fraction of the cost, as it showed in June with the release of an in-house model it calls Orca.
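Orca was trained by imitating the outputs of a larger model, a form of knowledge distillation. As a rough sketch of the core idea only (the actual Orca training pipeline is more involved, and all numbers below are made up for illustration), a small student model is trained to match a teacher's temperature-softened output distribution:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distill_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student): how far the student is from imitating the teacher."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [4.0, 1.0, 0.2]       # hypothetical teacher logits over 3 tokens
good_student = [3.8, 1.1, 0.3]  # roughly imitates the teacher
bad_student = [0.3, 3.8, 1.1]   # prefers a different token
print(distill_loss(teacher, good_student))  # small
print(distill_loss(teacher, bad_student))   # much larger
```

Minimizing this loss over many teacher outputs is what lets a much smaller (and cheaper-to-run) model approximate the big one's behavior.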


Comments Filter:
  • by ebunga ( 95613 ) on Tuesday September 26, 2023 @04:20PM (#63879233)

    Just a reminder, the costs here are ultimately measured in watt hours. They're turning coal into plagiarism at an alarming rate.
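Back-of-the-envelope, with entirely assumed numbers (actual per-query figures for these services are not public), the watt-hour framing works out like this:

```python
# All numbers are assumptions for illustration, not disclosed OpenAI figures.
power_kw = 5.0                # one 8-GPU inference server under load
requests_per_hour = 100 * 60  # assumed 100 requests/minute per server
price_per_kwh = 0.10          # assumed industrial electricity rate, USD

kwh_per_request = power_kw / requests_per_hour
cost_per_request = kwh_per_request * price_per_kwh
print(f"{kwh_per_request * 1000:.2f} Wh/request, ${cost_per_request:.6f}/request")
```

Fractions of a cent per request, but multiplied by billions of requests it becomes a real electricity bill, which is the cost pressure the story describes.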

    • by ShanghaiBill ( 739463 ) on Tuesday September 26, 2023 @05:33PM (#63879403)

      OpenAI's LLMs are running in data centers in Ohio and Iowa.

      Ohio gets 35% of its energy from coal. Iowa gets 25%.

      At least it's not in WV, where 90% comes from coal.

      • The state gets power from there, but what about the data centers?

        Almost no large-scale modern data centers are designed without massive amounts of renewable energy. Many are designed with solar and wind, with fuel cells as a stopgap. For cooling, they'll try anything, and often operate at 50C... And as someone who spends a lot of time in high-temperature datacenters, I can tell you they are super uncomfortable.

        You should follow Mark Russinovich. He often posts really cool stuff about improving data center efficiency.
        • Co-locating renewables and data centers is stupid.

          Renewables should be placed where they perform best. Ohio is often cloudy or snowy. It's not the best place for wind either.

  • by phantomfive ( 622387 ) on Tuesday September 26, 2023 @04:21PM (#63879235) Journal

    Microsoft may be able to compete in a race to build AI models that mimic the quality of OpenAI software at a fraction of the cost, as Microsoft showed in June with the release of one in-house model it calls Orca.

    It is unlikely that Microsoft has in-house talent to build a decent AI model.

    • Re:No they can't (Score:4, Insightful)

      by ShanghaiBill ( 739463 ) on Tuesday September 26, 2023 @05:37PM (#63879409)

      It is unlikely that Microsoft has in-house talent to build a decent AI model.

      Microsoft has long had an amazing research department.

      However, they are almost as famous as Xerox for ignoring the ideas and squandering the opportunities those researchers produce.

      • The only place where Microsoft research has really paid off is in their developer tools.

        In terms of AI research, their most notable development is Tay [wikipedia.org].
        • I'm ignorant of what they publish externally, but we have an internal team that is publishing some absolutely cracking kit. It's not as impactful as foundation models, but for "here's a better hammer" kinds of tools, I've been blown away.

          e.g. Github copilot is SSS+,
          e.g. The age progression series here was very cool.
          https://github.com/Azure/gen-c... [github.com]

          • That's really great, but Copilot and Stable Diffusion are built on external AI research; it's not Microsoft's homegrown stuff.
  • For scraping any data it can find, laws be damned.
  • by larryjoe ( 135075 ) on Tuesday September 26, 2023 @05:19PM (#63879359)

    Microsoft earlier this year announced further investments in OpenAI, reportedly to the tune of $10 billion [nytimes.com] over several years. That's on top of the $3 billion already invested. So, Microsoft is definitely planning on continuing to work with OpenAI in the future. This idea of having your researchers look at ideas in a complementary area is a good thing. OpenAI is looking at literally bigger and better systems. Microsoft is probing smaller and still useful systems. Makes sense to look at both.

    • I think the dirty [not so] secret of LLMs and machine learning is finally popping up at a rate where it cannot be ignored. I will admit I only really started to understand the problem in the last month or two; I had previously thought that it was just a training issue, which was essentially one-time. What I hear now is that the chips coming in 3 years will be able to solve today's scaling problems... of course, in 3 years the silicon will still be at least two years behind the problems they are facing then.


  • "Hi there! It looks like you're trying to create a lightweight chat AI..."
  • They could try, I don't know, not automatically printing an AI answer on every web search, when 99% of the time the person doesn't want and doesn't read it (and most of the time it amounts to "I don't understand what you're saying because you typed a search term, not a question"). That should save some costs. It'd also improve the user experience considerably, since Bing results pages now suddenly change size after they load and the links you want move out of reach.
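One cheap way to implement that kind of gate (a hypothetical heuristic, not anything Bing actually ships) is to invoke the chatbot only when the query reads like a natural-language question:

```python
QUESTION_STARTERS = {
    "who", "what", "when", "where", "why", "how",
    "is", "are", "can", "does", "do", "should", "will",
}

def looks_like_question(query: str) -> bool:
    """Crude gate: only hand the query to the LLM if it reads like a question."""
    words = query.strip().lower().split()
    if not words:
        return False
    if words[-1].endswith("?"):
        return True
    # Interrogative opener plus at least a few words suggests a real question.
    return words[0] in QUESTION_STARTERS and len(words) >= 3

print(looks_like_question("how do I reset my router"))   # True
print(looks_like_question("python list comprehension"))  # False
```

Even a gate this crude would skip the LLM call for bare keyword searches, which is exactly the case the comment complains about.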
