Mixtral 8x7B Description

Mixtral 8x7B is a high-quality sparse mixture-of-experts model (SMoE) with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks, with 6x faster inference. It is the strongest open-weight model with a permissive license and offers the best overall cost/performance trade-off, matching or exceeding GPT-3.5 on most standard benchmarks.
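
In a sparse mixture-of-experts layer, a small router network picks, for each token, a few "expert" feed-forward networks out of a larger pool, so only a fraction of the model's parameters are active per token; Mixtral routes each token to 2 of 8 experts per layer. The PyTorch sketch below is a toy illustration of that top-2 routing idea, not Mistral's implementation; the dimensions and expert architecture are invented for the example.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseMoELayer(nn.Module):
        """Toy sparse MoE layer: route each token to the top-2 of 8
        expert MLPs and mix their outputs by router weight. Sizes are
        illustrative, not Mixtral's actual configuration."""
        def __init__(self, dim=512, hidden=1024, n_experts=8, top_k=2):
            super().__init__()
            self.top_k = top_k
            self.router = nn.Linear(dim, n_experts)  # gating network
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(),
                              nn.Linear(hidden, dim))
                for _ in range(n_experts)
            )

        def forward(self, x):                      # x: (tokens, dim)
            logits = self.router(x)                # (tokens, n_experts)
            weights, picks = logits.topk(self.top_k, dim=-1)
            weights = F.softmax(weights, dim=-1)   # renormalize over top-2
            out = torch.zeros_like(x)
            for slot in range(self.top_k):
                for e, expert in enumerate(self.experts):
                    mask = picks[:, slot] == e     # tokens routed to expert e
                    if mask.any():
                        out[mask] += weights[mask, slot, None] * expert(x[mask])
            return out

    x = torch.randn(4, 512)                        # 4 tokens
    print(SparseMoELayer()(x).shape)               # torch.Size([4, 512])

Only the 2 selected experts run per token, which is why a model with many total parameters can still serve tokens at the cost of a much smaller dense model.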

Pricing

Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes

Reviews

No user reviews yet.

Company Details

Company:
Mistral AI
Year Founded:
2023
Headquarters:
France
Website:
mistral.ai/news/mixtral-of-experts/

Media

Mixtral 8x7B Screenshot 1

Product Details

Platforms
Windows
Mac
Linux
On-Premises
Type of Training
Documentation
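
Because the weights are open, the model can be run locally. Below is a minimal sketch using the Hugging Face transformers library with the Mixtral-8x7B-Instruct checkpoint that Mistral AI publishes on the Hugging Face Hub; the prompt and generation settings are arbitrary, and fp16 inference needs on the order of 90 GB of GPU memory, so quantized variants are a common alternative.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # published checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision to reduce memory
        device_map="auto",          # shard across available GPUs
    )

    prompt = "Explain a sparse mixture-of-experts model in one sentence."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))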
