Average Ratings
LLM Gateway: 0 Ratings
Microsoft MCP Gateway: 0 Ratings
Description (LLM Gateway)
LLM Gateway is a fully open-source, unified API gateway that routes, manages, and analyzes requests to large language model providers such as OpenAI, Anthropic, and Gemini Enterprise Agent Platform through a single OpenAI-compatible endpoint. Multi-provider support eases migration and integration, and dynamic model orchestration directs each request to the most suitable engine. Built-in usage analytics track requests, token usage, response times, and costs in real time, while performance-monitoring tools compare models on accuracy and cost-effectiveness. Secure key management consolidates API credentials under role-based access control. Teams can self-host LLM Gateway under the MIT license or use the hosted service as a progressive web app; integration requires only changing the API base URL, so existing code in any language or framework, whether cURL, Python, TypeScript, or Go, keeps working without alteration. Overall, LLM Gateway gives developers a versatile, efficient way to leverage multiple AI models while keeping control over usage and spend.
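Because the gateway exposes an OpenAI-compatible endpoint, "integration" amounts to pointing an ordinary chat-completion request at a different base URL. The sketch below (plain Python standard library, no vendor SDK) builds such a request without sending it; the base URL shown is an assumption for illustration, and the real value comes from your own deployment or the hosted service.

```python
import json
from urllib import request

# Hypothetical base URL; substitute the one for your LLM Gateway deployment.
BASE_URL = "https://api.llmgateway.example/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> request.Request:
    """Build an OpenAI-compatible chat completion request aimed at the gateway.

    Only the base URL differs from a direct provider call; the path,
    headers, and JSON body follow the usual OpenAI wire format.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello!", api_key="YOUR_KEY")
# req is ready to send with urllib.request.urlopen(req)
```

The same pattern applies to any OpenAI client library: configure its base URL to the gateway and leave the rest of the code untouched.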
Description (Microsoft MCP Gateway)
Microsoft MCP Gateway is an open-source reverse proxy and management layer for Model Context Protocol (MCP) servers, providing scalable, session-aware routing, lifecycle management, and centralized oversight of MCP services, particularly in Kubernetes environments. Acting as a control plane, it routes requests from AI agents (MCP clients) to the appropriate backend MCP servers while preserving session affinity, exposing many tools and endpoints through a single gateway with authorization and observability built in. Teams can deploy, update, and remove MCP servers and tools through RESTful APIs, registering tool definitions and managing those resources under bearer-token authentication and role-based access control (RBAC). The architecture cleanly separates the control plane, which handles CRUD operations on adapters, tools, and metadata, from the data plane, which handles streamable HTTP connections and dynamic, session-aware stateful routing. This separation improves operational efficiency while keeping the management of AI services secure.
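To make the control-plane workflow concrete, here is a minimal sketch of registering a tool definition through a RESTful management API with a bearer token, as the description outlines. The gateway address, endpoint path, and payload schema below are illustrative assumptions, not the gateway's documented API; the actual routes are defined in the project's own documentation.

```python
import json
from urllib import request

# Hypothetical gateway address for illustration only.
GATEWAY_URL = "https://mcp-gateway.example.com"

def build_tool_registration(adapter: str, tool: dict, token: str) -> request.Request:
    """Sketch of a control-plane request that registers a tool definition.

    CRUD operations on adapters and tools go through the control plane;
    the bearer token is checked against RBAC before the change is applied.
    The /adapters/.../tools path is an assumed example, not the real route.
    """
    body = json.dumps(tool).encode("utf-8")
    return request.Request(
        url=f"{GATEWAY_URL}/adapters/{adapter}/tools",  # assumed path
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_tool_registration(
    "weather-adapter",
    {"name": "get_forecast", "description": "Return a weather forecast"},
    token="SERVICE_TOKEN",
)
# Data-plane traffic (agent requests to the tool) then flows through the
# gateway separately, with session-aware routing to the backend MCP server.
```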
API Access
LLM Gateway: Has API
Microsoft MCP Gateway: Has API
Integrations (both products)
ChatGPT
Claude
DeepSeek
Docker
Gemini Enterprise Agent Platform
Go
Google AI Studio
Groq
Java
Microsoft Azure
Pricing Details
LLM Gateway: $50 per month; free trial and free version available
Microsoft MCP Gateway: Free; free trial and free version available
Deployment (both products)
Web-Based
On-Premises
iPhone App
iPad App
Android App
Windows
Mac
Linux
Chromebook
Customer Support (both products)
Business Hours
Live Rep (24/7)
Online Support
Types of Training (both products)
Training Docs
Webinars
Live Training (Online)
In Person
Vendor Details (LLM Gateway)
Company Name: LLM Gateway
Country: United States
Website: llmgateway.io
Vendor Details (Microsoft MCP Gateway)
Company Name: Microsoft
Founded: 1975
Country: United States
Website: microsoft.github.io/mcp-gateway/