Mixtral 8x22B Description

Mixtral 8x22B is our latest open model, setting new standards for performance and efficiency in the AI community. It is a sparse Mixture-of-Experts (SMoE) model that uses only 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. It is fluent in English, French, Italian, German, and Spanish, and has strong math and coding skills. It is natively capable of function calling; together with the constrained-output mode implemented on La Plateforme, this enables application development at scale and modernization of tech stacks. Its 64K-token context window allows precise information recall from large documents. We build models with unmatched cost efficiency for their respective sizes, delivering the best performance-to-cost ratio among models provided by the community. Mixtral 8x22B continues our open model family, and its sparse activation patterns make it faster than any dense 70B model.
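
As an illustration of the function-calling capability mentioned above, the sketch below shows how a tool call against Mixtral 8x22B on La Plateforme might look. It assumes the mistralai Python SDK, the open-mixtral-8x22b model identifier, and a hypothetical get_exchange_rate tool; consult the official documentation for the exact interface.

    import os
    from mistralai import Mistral

    # Sketch only: the SDK surface and model name below are assumptions drawn from
    # the public mistralai Python client; verify against the current documentation.
    client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

    # A hypothetical tool the model can decide to call.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_exchange_rate",
            "description": "Return the exchange rate between two currencies.",
            "parameters": {
                "type": "object",
                "properties": {
                    "base": {"type": "string"},
                    "quote": {"type": "string"},
                },
                "required": ["base", "quote"],
            },
        },
    }]

    response = client.chat.complete(
        model="open-mixtral-8x22b",
        messages=[{"role": "user", "content": "What is the EUR to USD exchange rate?"}],
        tools=tools,
        tool_choice="auto",
    )

    message = response.choices[0].message
    if message.tool_calls:
        # The model returns a structured call rather than free text.
        call = message.tool_calls[0]
        print(call.function.name, call.function.arguments)
    else:
        print(message.content)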

Pricing

Pricing Starts At:
Free
Free Version:
Yes

Integrations

Reviews


No User Reviews.

Company Details

Company:
Mistral AI
Year Founded:
2023
Headquarters:
France
Website:
mistral.ai/news/mixtral-8x22b/

Media

Mixtral 8x22B Screenshot 1

Product Details

Platforms:
SaaS
Type of Training:
Documentation
Customer Support:
24/7 Live Support
Online

Mixtral 8x22B Features and Options

Mixtral 8x22B Lists
