Mixtral 8x7B Description

Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference. It is the strongest open-weight model with a permissive license and offers the best cost/performance tradeoffs, matching or exceeding GPT-3.5 on most standard benchmarks.
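The "sparse mixture-of-experts" design means a small router picks, for each token, a subset of expert feed-forward networks (in Mixtral, 2 of 8 per layer), so only a fraction of the total parameters are active per token. The sketch below illustrates that routing idea in PyTorch; the dimensions, router, and expert definitions are illustrative assumptions, not Mixtral's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy top-k sparse MoE layer (illustrative, not Mixtral's real code)."""

    def __init__(self, hidden_dim: int, ffn_dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each expert for each token.
        self.router = nn.Linear(hidden_dim, num_experts, bias=False)
        # Experts: independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden_dim, ffn_dim), nn.SiLU(), nn.Linear(ffn_dim, hidden_dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, hidden_dim)
        logits = self.router(x)                            # (tokens, experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)  # top-2 experts per token
        weights = F.softmax(weights, dim=-1)               # normalize over the chosen experts
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            token_idx, slot = (chosen == i).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue
            # Only the tokens routed to expert i pass through it (the "sparse" part).
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
        return out

layer = SparseMoELayer(hidden_dim=64, ffn_dim=256)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Because each token activates only 2 of the 8 experts, inference cost scales with the active parameters rather than the full parameter count, which is how Mixtral achieves its speed relative to dense models of similar quality.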

Pricing

Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes
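Since the weights are open, the model can be run locally. A minimal sketch, assuming the Hugging Face transformers library and the "mistralai/Mixtral-8x7B-Instruct-v0.1" weight repository (an assumption about hosting; check mistral.ai for official download links). Note the full model needs substantial GPU memory (on the order of ~90 GB in fp16).

```python
# Minimal load-and-generate sketch; model_id is an assumed Hugging Face repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Explain sparse mixture-of-experts in one sentence.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```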

Company Details

Company:
Mistral AI
Year Founded:
2023
Headquarters:
France
Website:
mistral.ai/news/mixtral-of-experts/


Product Details

Platforms
Windows
Mac
Linux
On-Premises
Type of Training
Documentation

Mixtral 8x7B Features and Options