
Description

Llama 3.1 is an open-source AI model that can be fine-tuned, distilled, and deployed across a wide range of platforms. The latest instruction-tuned release comes in three sizes: 8B, 70B, and 405B, offering options to suit different needs. Through an open ecosystem, you can accelerate development with a broad array of tailored product offerings, and you can choose between real-time inference and batch inference services depending on your project's demands. You can also download the model weights to improve cost per token while fine-tuning for your application, boost performance with synthetic data, and deploy your solutions on-premises or in the cloud. Llama system components let you extend the model's capabilities through zero-shot tool use and retrieval-augmented generation (RAG) to support agentic behaviors, and high-quality data generated with the 405B model can be used to refine specialized models for distinct use cases.
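
As a rough illustration of the download-and-deploy workflow described above, the sketch below loads an instruction-tuned 8B checkpoint for local chat inference with the Hugging Face transformers library. The model id, dtype, and generation settings are assumptions made for illustration, not details from this listing, and access to the weights is gated on Hugging Face.

```python
# Minimal local-inference sketch for an instruction-tuned Llama 3.1 checkpoint.
# Assumes the transformers and accelerate packages and access to the gated
# weights; the model id below is an assumption and may differ from the
# official repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "meta-llama/Llama-3.1-8B-Instruct"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # lower memory use for on-prem / single-GPU runs
    device_map="auto",           # place layers on available GPU(s) or CPU
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize retrieval-augmented generation in two sentences."},
]

# Apply the model's chat template so the prompt matches its instruction tuning.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same pattern scales to the 70B and 405B checkpoints given enough hardware, or the weights can be swapped for a fine-tuned or distilled variant without changing the calling code.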

Description

On June 23, 2025, Microsoft unveiled Mu, a 330-million-parameter encoder–decoder language model built to power the agent experience in Windows by translating natural-language queries into function calls for Settings, processed entirely on-device via NPUs at over 100 tokens per second with high accuracy. Building on Phi Silica optimizations, Mu's encoder–decoder design uses a fixed-length latent representation that reduces both compute and memory demands, achieving a 47 percent reduction in first-token latency and 4.7 times faster decoding on Qualcomm Hexagon NPUs compared with decoder-only models. The model also benefits from hardware-aware tuning: a 2/3–1/3 split of encoder and decoder parameters, shared weights for input and output embeddings, Dual LayerNorm, rotary positional embeddings, and grouped-query attention, enabling inference rates above 200 tokens per second on devices such as the Surface Laptop 7 and sub-500 ms response times for Settings-related queries. Together, these features make Mu a notable advance in on-device language processing.
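
Mu itself ships inside Windows and does not expose a public API, but grouped-query attention, one of the hardware-aware techniques listed above, can be sketched directly. In the minimal PyTorch example below, several query heads share each key/value head, which shrinks the KV projections and cache that dominate memory traffic on NPU-class hardware; all dimensions are illustrative assumptions, not Mu's actual configuration.

```python
# Illustrative grouped-query attention (GQA): groups of query heads share each
# key/value head, shrinking the KV projections and KV cache. Dimensions are
# made-up examples, not Mu's real configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GroupedQueryAttention(nn.Module):
    def __init__(self, d_model=512, n_q_heads=8, n_kv_heads=2):
        super().__init__()
        assert n_q_heads % n_kv_heads == 0
        self.n_q, self.n_kv = n_q_heads, n_kv_heads
        self.head_dim = d_model // n_q_heads
        self.wq = nn.Linear(d_model, n_q_heads * self.head_dim, bias=False)
        self.wk = nn.Linear(d_model, n_kv_heads * self.head_dim, bias=False)  # fewer KV heads
        self.wv = nn.Linear(d_model, n_kv_heads * self.head_dim, bias=False)
        self.wo = nn.Linear(n_q_heads * self.head_dim, d_model, bias=False)

    def forward(self, x):
        b, t, _ = x.shape
        q = self.wq(x).view(b, t, self.n_q, self.head_dim).transpose(1, 2)
        k = self.wk(x).view(b, t, self.n_kv, self.head_dim).transpose(1, 2)
        v = self.wv(x).view(b, t, self.n_kv, self.head_dim).transpose(1, 2)
        # Repeat each KV head so every group of query heads attends to it.
        group = self.n_q // self.n_kv
        k = k.repeat_interleave(group, dim=1)
        v = v.repeat_interleave(group, dim=1)
        out = F.scaled_dot_product_attention(q, k, v)  # (b, n_q, t, head_dim)
        return self.wo(out.transpose(1, 2).reshape(b, t, -1))

x = torch.randn(1, 16, 512)
print(GroupedQueryAttention()(x).shape)  # torch.Size([1, 16, 512])
```

Reducing the number of KV heads is the design lever here: it trades a small amount of modeling flexibility for a proportional cut in KV-cache size and memory bandwidth, which is what helps make high token rates feasible on memory-constrained accelerators.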

API Access

Has API

API Access

Has API


Integrations

1min.AI
Agenta
Alpaca
Azure AI Foundry Agent Service
Azure Marketplace
BlueFlame AI
Deep Infra
Firecrawl
Gopher
Hermes 3
HumanLayer
Jspreadsheet
Klee
Not Diamond
Ragas
Revere
Simplismart
SurePath AI
Tune Studio
webAI

Integrations

1min.AI
Agenta
Alpaca
Azure AI Foundry Agent Service
Azure Marketplace
BlueFlame AI
Deep Infra
Firecrawl
Gopher
Hermes 3
HumanLayer
Jspreadsheet
Klee
Not Diamond
Ragas
Revere
Simplismart
SurePath AI
Tune Studio
webAI

Pricing Details

Free
Free Trial
Free Version

Pricing Details

No price information available.
Free Trial
Free Version

Deployment

Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook

Deployment

Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook

Customer Support

Business Hours
Live Rep (24/7)
Online Support

Customer Support

Business Hours
Live Rep (24/7)
Online Support

Types of Training

Training Docs
Webinars
Live Training (Online)
In Person

Types of Training

Training Docs
Webinars
Live Training (Online)
In Person

Vendor Details

Company Name

Meta

Founded

2004

Country

United States

Website

llama.meta.com

Vendor Details

Company Name

Microsoft

Founded

1975

Country

United States

Website

blogs.windows.com/windowsexperience/2025/06/23/introducing-mu-language-model-and-how-it-enabled-the-agent-in-windows-settings/

Alternatives

Athene-V2
Nexusflow

Alternatives

CodeQwen
Alibaba

Yi-Large
01.AI

Pixtral Large
Mistral AI

Falcon-7B
Technology Innovation Institute (TII)

Falcon-40B
Technology Innovation Institute (TII)