# OpenAI-Compatible Integration for MCP Gateway

This guide shows how to add OpenAI-compatible endpoints to your MCP Gateway so Open-UI and other OpenAI clients can use your MCP tools.

## Quick Setup (2 steps)

### Step 1: Import the OpenAI routes in `gateway-proxy/gateway_proxy.py`

At the top of the file, add:

```python
from openai_routes import chat_completions, list_models
```

### Step 2: Add the routes to your Starlette app

In the `app = Starlette(routes=[...])` section, add these routes before the closing bracket:

```python
# OpenAI-compatible endpoints
Route("/v1/models", list_models, methods=["GET"]),
Route("/v1/chat/completions", chat_completions, methods=["POST"]),
```

So your routes section should look like:

```python
app = Starlette(
    routes=[
        # Well-known discovery (Claude tries both with and without /mcp suffix)
        Route("/.well-known/oauth-protected-resource", well_known_protected_resource, methods=["GET"]),
        Route("/.well-known/oauth-protected-resource/mcp", well_known_protected_resource, methods=["GET"]),
        Route("/.well-known/oauth-authorization-server", well_known_oauth_authorization_server, methods=["GET"]),
        Route("/.well-known/oauth-authorization-server/mcp", well_known_oauth_authorization_server, methods=["GET"]),
        Route("/.well-known/openid-configuration", well_known_oauth_authorization_server, methods=["GET"]),

        # OAuth endpoints at /oauth/* (spec-standard)
        Route("/oauth/register", oauth_register, methods=["POST"]),
        Route("/oauth/authorize", oauth_authorize, methods=["GET", "POST"]),
        Route("/oauth/token", oauth_token, methods=["POST"]),

        # OAuth endpoints at root (Claude may construct these from base URL)
        Route("/register", oauth_register, methods=["POST"]),
        Route("/authorize", oauth_authorize, methods=["GET", "POST"]),
        Route("/token", oauth_token, methods=["POST"]),

        # MCP endpoint (OAuth-protected)
        Route("/mcp", handle_mcp, methods=["GET", "HEAD", "POST", "DELETE"]),

        # OpenAI-compatible endpoints
        Route("/v1/models", list_models, methods=["GET"]),
        Route("/v1/chat/completions", chat_completions, methods=["POST"]),

        # Monitoring
        Route("/health", health, methods=["GET"]),
        Route("/status", status, methods=["GET"]),
    ],
    lifespan=lifespan,
)
```

## Configuration

The OpenAI route handlers need the gateway URL and token store passed in. In `gateway_proxy.py`, bind those parameters with `functools.partial`:

```python
from functools import partial

# Near the top, where the other routes are defined:
chat_completions_handler = partial(
    chat_completions,
    mcp_gateway_url="http://localhost:4444/mcp",
    access_tokens=ACCESS_TOKENS,
)

list_models_handler = partial(
    list_models,
    gateway_config={"backends": BACKENDS},
)
```

Then update the routes to use the bound handlers:

```python
Route("/v1/models", list_models_handler, methods=["GET"]),
Route("/v1/chat/completions", chat_completions_handler, methods=["POST"]),
```
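
As a sanity check that `partial` produces something Starlette can call, here is a minimal stdlib-only sketch. The handler body and the `ACCESS_TOKENS` value are illustrative stand-ins, not the real `openai_routes.py` code; the point is that the bound handler still only needs the `request` argument at call time:

```python
import asyncio
from functools import partial

# Simplified stand-in for the Starlette endpoint: Starlette calls the bound
# handler with just (request); partial pre-fills the extra keyword arguments.
async def chat_completions(request, mcp_gateway_url=None, access_tokens=None):
    return {"url": mcp_gateway_url, "known_tokens": len(access_tokens)}

ACCESS_TOKENS = {"token-abc": "user-1"}  # illustrative token store

handler = partial(
    chat_completions,
    mcp_gateway_url="http://localhost:4444/mcp",
    access_tokens=ACCESS_TOKENS,
)

result = asyncio.run(handler({"fake": "request"}))
print(result)
```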

## Using with Open-UI

Once integrated and running:

1. **Gateway URL**: `https://mcp.wilddragon.net` (or your gateway URL)
2. **API Base URL in Open-UI**: `https://mcp.wilddragon.net/v1`
3. **Model**: `mcp-gateway`
4. **Bearer Token**: Use your generated token

### In the Open-UI Admin Panel

1. Go to Admin → Settings → Connections
2. Scroll to "Direct Connections"
3. Add a new direct connection:
   - **API Base URL**: `https://mcp.wilddragon.net/v1`
   - **API Key**: your bearer token

## API Endpoints

### `GET /v1/models`

List available models. Returns the MCP Gateway as a model option.

**Example:**

```bash
curl -H "Authorization: Bearer YOUR_TOKEN" \
  https://mcp.wilddragon.net/v1/models
```
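
The response follows the standard OpenAI model-list shape. A sketch of what the gateway could return (assumed structure; consult `openai_routes.py` for the actual fields):

```python
import json
import time

def build_models_response(model_id="mcp-gateway"):
    """Build an OpenAI-style /v1/models response body (assumed shape)."""
    return {
        "object": "list",
        "data": [
            {
                "id": model_id,
                "object": "model",
                "created": int(time.time()),
                "owned_by": "mcp-gateway",
            }
        ],
    }

body = build_models_response()
print(json.dumps(body, indent=2))
```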

### `POST /v1/chat/completions`

OpenAI-compatible chat completions endpoint.

**Example:**

```bash
curl -X POST \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "mcp-gateway",
    "messages": [
      {"role": "user", "content": "What tools are available?"}
    ]
  }' \
  https://mcp.wilddragon.net/v1/chat/completions
```
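
On the gateway side, the handler has to pull the user message out of this payload. A minimal sketch of that extraction (a hypothetical helper; the real logic lives in `openai_routes.py`), matching the "No user message found" check noted under Troubleshooting:

```python
def extract_user_message(payload):
    """Return the latest user message from an OpenAI-style chat request."""
    messages = payload.get("messages", [])
    user_msgs = [m["content"] for m in messages if m.get("role") == "user"]
    if not user_msgs:
        raise ValueError("No user message found")
    return user_msgs[-1]

payload = {
    "model": "mcp-gateway",
    "messages": [{"role": "user", "content": "What tools are available?"}],
}
print(extract_user_message(payload))  # What tools are available?
```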

## What's Currently Supported

- ✅ **Available tools listing** - shows all available MCP tools
- ✅ **Bearer token authentication** - uses your existing OAuth tokens
- ✅ **Streaming responses** - OpenAI-compatible streaming format
- ✅ **Standard OpenAI request/response format**
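
Streaming responses use OpenAI's `chat.completion.chunk` server-sent-events framing. A sketch of how such chunks can be built (field values like `id` are illustrative, and the real formatting in `openai_routes.py` may differ):

```python
import json

def sse_chunk(text, model="mcp-gateway", done=False):
    """Format one SSE chunk in the OpenAI streaming style."""
    if done:
        # The stream is terminated with a literal [DONE] sentinel.
        return "data: [DONE]\n\n"
    event = {
        "id": "chatcmpl-mcp-gateway",  # illustrative id
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [
            {"index": 0, "delta": {"content": text}, "finish_reason": None}
        ],
    }
    return f"data: {json.dumps(event)}\n\n"

stream = [sse_chunk("Hello"), sse_chunk(" world"), sse_chunk("", done=True)]
print("".join(stream))
```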

## Next Steps (Optional Enhancements)

1. **Tool Calling**: Implement actual tool execution via MCP
2. **System Prompts**: Add custom system prompts for tool use
3. **Session Management**: Preserve MCP session state across requests
4. **Error Handling**: More detailed error messages for tool failures
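
For the tool-calling enhancement, MCP `tools/list` entries would need to be mapped onto the OpenAI function-calling `tools` array. A possible sketch, assuming the standard MCP tool fields (`name`, `description`, `inputSchema`):

```python
def mcp_tools_to_openai(mcp_tools):
    """Map MCP tools/list entries to OpenAI function-calling tool specs."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP inputSchema is already JSON Schema, which is what
                # OpenAI expects under "parameters".
                "parameters": t.get(
                    "inputSchema", {"type": "object", "properties": {}}
                ),
            },
        }
        for t in mcp_tools
    ]

tools = mcp_tools_to_openai([
    {
        "name": "search",
        "description": "Search the web",
        "inputSchema": {"type": "object", "properties": {"q": {"type": "string"}}},
    },
])
print(tools[0]["function"]["name"])  # search
```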

## Troubleshooting

**"Unauthorized" error**: Make sure your bearer token is valid. Check that it's in `ACCESS_TOKENS` on the gateway.

**"No user message found"**: Ensure the request includes at least one message with role `"user"`.

**Tools not showing**: Check that your MCP backends are initialized and responding to `tools/list` requests.
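
The token check behind the "Unauthorized" error can be pictured as follows (a simplified sketch; the real validation lives in the gateway's OAuth code):

```python
def check_bearer(auth_header, access_tokens):
    """Return the token if the Authorization header carries a known
    bearer token, else None."""
    if not auth_header or not auth_header.startswith("Bearer "):
        return None
    token = auth_header[len("Bearer "):]
    return token if token in access_tokens else None

ACCESS_TOKENS = {"token-abc"}  # illustrative token store
print(check_bearer("Bearer token-abc", ACCESS_TOKENS))  # token-abc
print(check_bearer("Bearer bad", ACCESS_TOKENS))        # None
```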

## Files

- `gateway-proxy/openai_routes.py` - OpenAI-compatible route handlers
- `openai_adapter.py` - Adapter class (optional, for more advanced usage)