AI Businesses

OpenAI Launches a ChatGPT Plan For Enterprise Customers

An anonymous reader quotes a report from TechCrunch: Seeking to capitalize on ChatGPT's viral success, OpenAI today announced the launch of ChatGPT Enterprise, a business-focused edition of the company's AI-powered chatbot app. ChatGPT Enterprise, which OpenAI first teased in a blog post earlier this year, can perform the same tasks as ChatGPT, such as writing emails, drafting essays and debugging computer code. But the new offering also adds "enterprise-grade" privacy and data analysis capabilities on top of the vanilla ChatGPT, as well as enhanced performance and customization options. That puts ChatGPT Enterprise on par, feature-wise, with Bing Chat Enterprise, Microsoft's recently launched take on an enterprise-oriented chatbot service.

ChatGPT Enterprise provides a new admin console with tools to manage how employees within an organization use ChatGPT, including integrations for single sign-on, domain verification and a dashboard with usage statistics. Shareable conversation templates allow employees to build internal workflows leveraging ChatGPT, while credits to OpenAI's API platform let companies create fully custom ChatGPT-powered solutions if they choose. ChatGPT Enterprise, in addition, comes with unlimited access to Advanced Data Analysis, the ChatGPT feature formerly known as Code Interpreter, which allows ChatGPT to analyze data, create charts, solve math problems and more, including from uploaded files. For example, given a prompt like "Tell me what's interesting about this data," ChatGPT's Advanced Data Analysis capability can look through the data -- financial, health or location information, for example -- to generate insights.
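
For a sense of what a "fully custom ChatGPT-powered solution" built on those API credits might look like, here is a minimal sketch using OpenAI's Python client as it existed around the time of the announcement (the openai 0.x package and its ChatCompletion endpoint). The ticket-summarization use case and the summarize_ticket helper are illustrative assumptions, not anything OpenAI described.

    # Hypothetical sketch of an internal workflow built on OpenAI's API platform.
    # Assumes the openai Python package (0.x API) and an API key in the environment;
    # the support-ticket use case is illustrative, not from the announcement.
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    def summarize_ticket(ticket_text: str) -> str:
        """Ask GPT-4 for a two-sentence summary of a customer support ticket."""
        response = openai.ChatCompletion.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": "You summarize support tickets in two sentences."},
                {"role": "user", "content": ticket_text},
            ],
            temperature=0.2,
        )
        return response["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(summarize_ticket("Customer reports login loops after the 2.3 update on iOS."))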

Advanced Data Analysis was previously available only to subscribers to ChatGPT Plus, the $20-per-month premium tier of the consumer ChatGPT web and mobile apps. To be clear, ChatGPT Plus is sticking around -- OpenAI sees ChatGPT Enterprise as complementary to it, the company says. ChatGPT Enterprise is powered by GPT-4, OpenAI's flagship AI model, as is ChatGPT Plus. But ChatGPT Enterprise customers get priority access to GPT-4, delivering performance that's twice as fast as the standard GPT-4 and with an expanded 32,000-token (~25,000-word) context window. Context window refers to the text the model considers before generating additional text, while tokens represent raw text (e.g. the word "fantastic" would be split into the tokens "fan," "tas" and "tic"). Generally speaking, models with large context windows are less likely to "forget" the content of recent conversations.
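
To make tokens and context windows concrete, the short sketch below counts tokens with OpenAI's open-source tiktoken library. The exact sub-word pieces depend on the tokenizer in use, so the "fan/tas/tic" split above should be read as an illustration rather than the actual encoding.

    # Sketch: counting tokens with OpenAI's tiktoken library to see how much of a
    # 32,000-token context window a given piece of text would consume.
    import tiktoken

    enc = tiktoken.encoding_for_model("gpt-4")

    text = "Tell me what's interesting about this data."
    token_ids = enc.encode(text)

    print(f"{len(token_ids)} tokens for {len(text.split())} words")
    # Decode each token id individually to see the actual sub-word pieces.
    print([enc.decode([t]) for t in token_ids])

    CONTEXT_WINDOW = 32_000
    print(f"Fraction of a 32k window used: {len(token_ids) / CONTEXT_WINDOW:.6f}")
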
Crucially, OpenAI said that it "won't train models on business data sent to ChatGPT Enterprise or any usage data and that all conversations with ChatGPT Enterprise are encrypted in transit and at rest," notes TechCrunch.

"OpenAI says that its future plans for ChatGPT Enterprise include a ChatGPT Business offering for smaller teams, allowing companies to connect apps to ChatGPT Enterprise, 'more powerful' and 'enterprise-grade' versions of Advanced Data Analysis and web browsing, and tools designed for data analysts, marketers and customer support."

A blog post introducing ChatGPT Enterprise can be found here.
Comments:
  • But they'll keep it until they can sneak in a license change

  • A semi-reliable poster suggests it's in the ball-park of $100k base plus optionals. Probably out of price range for most smaller orgs, where the ROI may not be quantifiable by any particular killer use-case. It may be a while before bean-counters treat "AI consultants" as just another software package to be expensed, but we'll see, I guess. It may be enough to keep the lights on at OpenAI even if mom-and-pop shops can't get a subscription for the employees at their brownstone shingle-hanger firms. I could be wrong, but hey,
    • by blue trane ( 110704 ) on Monday August 28, 2023 @08:15PM (#63804698)

      Why does capitalism incentivize selling subscriptions to centralized, controlled production, rather than standalone technology that would let us each customize and run our own chatbots locally? Why is control prized over the general welfare?

      • Why does capitalism incentivize selling subscriptions to centralized, controlled production, rather than standalone technology that would let us each customize and run our own chatbots locally? Why is control prized over the general welfare?

        Cost to copy.

        We're used to products being made of matter, and sometimes quite complicated matter. A TV set or VCR requires a fair bit of labor to make, and once you have the working design it still requires labor to make copies. Books used to be that way: it used to be labor intensive to copy a book, and the book maker had expensive equipment and significant expertise.

        Nowadays the products in question cost *zero* to copy. Once you've put in the labor of writing a book, it costs essentially nothing to copy the text.

        • Or maybe LLMs and tech of that sort weren't meant to be monetized in such a way? It's not like there are people out there profiting from K-Means or anything. Is OpenAI compensating the people that have researched LLMs over the years?
        • What if the actual engineers got a strong basic income and could just develop stuff they wanted instead of writing a lot of gatekeeping code to enforce enclosure over what we all know deep down just shouldn't be enclosed in the first place?

          • You have a very myopic view of what would happen if you "give them a basic income." Even liberal economist extraordinaire Paul Krugman argues that UBI a) costs too much money, b) disincentivizes work, and c) works less well than targeted programs (income-based welfare).
      • Because compute and energy aren't cheap. The startup costs are virtually the same for a few users or a bazillion users. You can only amortize/democratize the cost of inference, and that has a very specific point of profitability where the cost and price functions intersect. There aren't any companies that have figured out how to do what you want. I'm sure some are trying.
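
        As a back-of-the-envelope illustration of that break-even point (every number below is invented for illustration, not OpenAI's actual economics):

          # Toy break-even model: fixed costs vs. per-user inference cost and seat price.
          # All figures are made up; nothing here reflects OpenAI's real cost structure.
          fixed_cost_per_month = 2_000_000.0   # hosting, staff, model serving, etc.
          inference_cost_per_user = 12.0       # marginal compute per user per month
          price_per_user = 60.0                # hypothetical per-seat subscription

          # Profit(n) = n * (price - inference cost) - fixed cost; break-even where it hits zero.
          break_even_users = fixed_cost_per_month / (price_per_user - inference_cost_per_user)
          print(f"Break-even at roughly {break_even_users:,.0f} users per month")

          for n in (10_000, 40_000, 100_000):
              profit = n * (price_per_user - inference_cost_per_user) - fixed_cost_per_month
              print(f"{n:>7,} users -> monthly profit {profit:>12,.0f}")
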
    • The enterprise SaaS places I've been at and bought from mostly had a lower per-seat price for smaller orgs.

      Better to get $25k out of a small org if support costs are small than get nothing.

  • by iAmWaySmarterThanYou ( 10095012 ) on Monday August 28, 2023 @10:23PM (#63804860)

    So now GPT hallucinations will be enterprise-ready and integrate with AD!
