MPT-7B Description

Introducing MPT-7B, the latest addition to our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, licensed for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML Platform in 9.5 days, with zero human intervention, at a cost of roughly $200k.

You can now train, fine-tune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three fine-tuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!

Pricing

Pricing Starts At:
Free
Pricing Information:
Open source
Free Version:
Yes

Integrations

Reviews

No User Reviews.

Company Details

Company:
MosaicML
Year Founded:
2021
Headquarters:
United States
Website:
www.mosaicml.com/blog/mpt-7b

Media

MPT-7B Screenshot 1

Product Details

Platforms
SaaS
Windows
Mac
Linux
On-Premises
Type of Training
Documentation

MPT-7B Features and Options
