Mixtral 8x7B Description
Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference. It is the strongest open-weight model available under a permissive license and the best model overall in terms of cost/performance trade-offs, matching or exceeding GPT-3.5 on most standard benchmarks.
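To make the "sparse mixture-of-experts" idea concrete, here is a minimal PyTorch sketch of top-2 expert routing, the mechanism behind Mixtral's sparsity. The hidden dimensions, expert width, and router details below are illustrative assumptions, not Mixtral's published configuration; only the 8-expert / 2-active-per-token pattern mirrors the model's design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative top-2 sparse mixture-of-experts layer.

    Each token is routed to 2 of 8 feed-forward experts, so only a
    fraction of the layer's parameters is active for any given token.
    A sketch of the general SMoE idea, not Mixtral's exact code.
    """

    def __init__(self, dim=512, hidden=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, dim)
        logits = self.gate(x)                           # (tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # pick 2 experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                   # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

# Example: route a batch of 4 token vectors through the layer.
layer = SparseMoELayer()
tokens = torch.randn(4, 512)
print(layer(tokens).shape)  # torch.Size([4, 512])

Because only 2 of the 8 experts run per token, forward cost scales with the roughly 13B active parameters rather than the full ~47B, which is where the inference-speed advantage over a comparable dense model comes from.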
Pricing
Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes
Company Details
Company:
Mistral AI
Year Founded:
2023
Headquarters:
France
Website:
mistral.ai/news/mixtral-of-experts/
Product Details
Platforms
Windows
Mac
Linux
On-Premises
Type of Training
Documentation
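Given the Linux and on-premises platform support listed above, a common way to run the open weights locally is through Hugging Face transformers. The following is a minimal sketch, not an official quickstart: the repo ID is the one Mistral AI published on the Hugging Face Hub, and practical details (tens of GB of GPU memory, or a quantized variant for smaller machines) are left out.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # published repo ID on the HF Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" (requires the accelerate package) spreads layers across
# available devices; torch_dtype="auto" keeps the checkpoint's native precision.
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

prompt = "Explain sparse mixture-of-experts in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))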