Description

Dolly is a low-cost large language model that exhibits a surprising degree of the instruction-following ability seen in ChatGPT. Whereas the Alpaca team showed that a state-of-the-art model can be coaxed into high-quality instruction following, we find that even years-old open-source models with earlier architectures display striking behaviors when fine-tuned on a modest amount of instruction-tuning data. Dolly takes an existing open-source model with 6 billion parameters from EleutherAI and fine-tunes it lightly to elicit instruction-following capabilities, such as brainstorming and text generation, that were not present in the base model. This result not only highlights the untapped potential of older models but also opens new avenues for putting existing technology to work in innovative ways.
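
As a concrete illustration of the kind of fine-tuned checkpoint described above, here is a minimal Python sketch using the Hugging Face transformers library. The checkpoint name databricks/dolly-v1-6b, the precision, and the generation settings are assumptions for illustration, not details taken from this listing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name for the GPT-J-6B-based Dolly release; substitute
# whichever Dolly checkpoint you actually intend to use.
model_id = "databricks/dolly-v1-6b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single modern GPU
    device_map="auto",          # requires the accelerate package
)

# An instruction-style prompt: the fine-tuning is what makes the base model
# respond to requests like this instead of merely continuing the text.
prompt = "Brainstorm five names for a data engineering newsletter."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```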

Description

We are excited to present MPT-7B, the latest addition to the MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1 trillion tokens of text and code. It is open source, ready for commercial applications, and matches the quality of LLaMA-7B. Training took 9.5 days on the MosaicML platform with no human intervention, at a cost of roughly $200,000. With MPT-7B you can train, fine-tune, and deploy your own customized MPT models, either starting from one of our provided checkpoints or training from scratch. Alongside the base MPT-7B we are also releasing three fine-tuned variants: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which supports a 65,000-token context window for very long-form content generation. These releases open up new possibilities for developers and researchers looking to build transformer models into their projects.
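
As an illustration of the "begin with one of our provided checkpoints" path, here is a minimal Python sketch with Hugging Face transformers. The mosaicml/mpt-7b checkpoint name and the trust_remote_code flag are assumptions about how MPT checkpoints are typically distributed, not details taken from this listing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name; the Instruct, Chat, and StoryWriter variants
# follow the same loading pattern.
model_id = "mosaicml/mpt-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # MPT ships custom model code alongside the weights
    device_map="auto",
)

prompt = "A transformer language model works by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```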

Description

Mistral 7B is a language model with 7.3 billion parameters that demonstrates superior performance compared to larger models such as Llama 2 13B on a variety of benchmarks. It utilizes innovative techniques like Grouped-Query Attention (GQA) for improved inference speed and Sliding Window Attention (SWA) to manage lengthy sequences efficiently. Released under the Apache 2.0 license, Mistral 7B is readily available for deployment on different platforms, including both local setups and prominent cloud services. Furthermore, a specialized variant known as Mistral 7B Instruct has shown remarkable capabilities in following instructions, outperforming competitors like Llama 2 13B Chat in specific tasks. This versatility makes Mistral 7B an attractive option for developers and researchers alike.
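
For the instruction-following variant mentioned above, a minimal Python sketch with Hugging Face transformers might look like the following. The mistralai/Mistral-7B-Instruct-v0.1 checkpoint name and the chat-template call are assumptions about the published release, not details taken from this listing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name for the instruction-tuned release.
model_id = "mistralai/Mistral-7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# The Instruct variant expects an [INST] ... [/INST] chat format; the
# tokenizer's chat template applies it for us.
messages = [{"role": "user", "content": "In one paragraph, explain what Grouped-Query Attention does."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
output = model.generate(input_ids, max_new_tokens=150)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```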

API Access

Has API
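
Where "Has API" refers to a hosted endpoint, usage generally looks like the sketch below. The URL, model identifier, and response shape are assumptions based on Mistral's hosted chat-completions API; Dolly and MPT-7B are typically served behind whatever endpoint you or your platform provider stands up, so those details will differ.

```python
import os
import requests

# Assumed endpoint and model name for Mistral's hosted API; adjust both for
# other providers or for a self-hosted deployment of Dolly or MPT-7B.
response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "open-mistral-7b",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```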

Integrations

302.AI
AI-FLOW
Amazon Bedrock
CSS
Concierge AI
Go
Gopher
Kiin
LibreChat
Literal AI
Mammouth AI
Mathstral
Mistral AI
Pipeshift
StackAI
SydeLabs
Verta
Weave
Yaseen AI
thisorthis.ai

Pricing Details

Free
Free Trial
Free Version

Deployment

Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook

Customer Support

Business Hours
Live Rep (24/7)
Online Support

Types of Training

Training Docs
Webinars
Live Training (Online)
In Person

Vendor Details (Dolly)

Company Name: Databricks
Founded: 2013
Country: United States
Website: databricks.com

Vendor Details (MPT-7B)

Company Name: MosaicML
Founded: 2021
Country: United States
Website: www.mosaicml.com/blog/mpt-7b

Vendor Details (Mistral 7B)

Company Name: Mistral AI
Founded: 2023
Country: France
Website: mistral.ai/

Alternatives

mT5 (Google)

Alternatives

Dolly (Databricks)

Alternatives

Command R (Cohere AI)
Alpaca (Stanford Center for Research on Foundation Models, CRFM)
Command R+ (Cohere AI)
GPT-J (EleutherAI)
Llama 2 (Meta)
Pixtral Large (Mistral AI)
PaLM (Google)
Falcon-40B (Technology Innovation Institute, TII)