Remove mcp-gateway/OPENUI_SCHEMA_FIX.md

This commit is contained in:
parent a20f6018eb
commit f1506dc81f

1 changed file with 0 additions and 164 deletions

@@ -1,164 +0,0 @@
# OpenUI Tool Discovery Schema Fix

## Problem Analysis

OpenUI fails to load any tools from the MCP Gateway with errors like:

```
"Email Digest string not found"
"DocType string not found"
```

### Root Causes

1. **Missing OpenAI-compatible endpoints** — Gateway only exposes MCP protocol (`/mcp`), not OpenAI API endpoints (`/v1/tools`, `/v1/models`, `/v1/chat/completions`)

2. **Complex JSON schemas break OpenUI parsing** — MCP tool schemas use features OpenAI API doesn't support:
   - `type` as array: `["string", "null"]` instead of single type
   - `anyOf` / `oneOf` with mixed content
   - Tuple validation with `items` as array
   - Tool names getting truncated due to improper escaping

3. **No schema normalization** — While `gateway_proxy.py` has `_normalize_schema()` for MCP protocol, the OpenAI adapter doesn't simplify schemas properly for OpenUI

4. **Missing `/v1/tools` endpoint** — OpenUI expects a dedicated tools discovery endpoint, not just embedded in `/v1/chat/completions`

## Solution: Schema Conversion & New Endpoints

### Step 1: Use Fixed OpenAI Routes Handler

Replace the basic `openai_routes.py` with `openai_routes_fixed.py`, which includes:

- **`_simplify_schema()`** — Recursively converts complex schemas to OpenAI-compatible format
- **`convert_mcp_tool_to_openai()`** — Properly maps MCP tool definitions to OpenAI function schema
- **`/v1/tools` endpoint** — Dedicated tools discovery (required by OpenUI)
- **Error handling** — Gracefully handles malformed tool definitions

### Step 2: Update gateway_proxy.py

Add the following import after the existing imports (around line 27):

```python
from .openai_routes_fixed import convert_mcp_tool_to_openai, list_models, tools, chat_completions
```

Add these routes to the routes list (before the closing bracket, around line 820):

```python
# OpenAI-compatible API endpoints (for Open-UI and other OpenAI clients)
Route("/v1/models", list_models, methods=["GET"]),
Route("/v1/tools", lambda r: tools(r, TOOL_DEFINITIONS), methods=["GET"]),
Route("/v1/chat/completions", lambda r: chat_completions(r, TOOL_DEFINITIONS), methods=["POST"]),
```

## Schema Conversion Examples

### Before (MCP format - causes errors)

```json
{
  "name": "erpnext_get_document",
  "description": "Retrieve a single document",
  "inputSchema": {
    "type": "object",
    "properties": {
      "doctype": {
        "type": ["string", "null"],
        "description": "DocType name"
      },
      "name": {
        "type": ["string", "null"],
        "description": "Document name"
      }
    },
    "required": ["doctype", "name"]
  }
}
```

### After (OpenAI format - works in OpenUI)

```json
{
  "type": "function",
  "function": {
    "name": "erpnext_get_document",
    "description": "Retrieve a single document",
    "parameters": {
      "type": "object",
      "properties": {
        "doctype": {
          "type": "string",
          "description": "DocType name"
        },
        "name": {
          "type": "string",
          "description": "Document name"
        }
      },
      "required": ["doctype", "name"]
    }
  }
}
```
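
The before/after mapping can be sketched as a small conversion function. This is a hypothetical illustration of what `convert_mcp_tool_to_openai()` might do, derived only from the two examples above (rename `inputSchema` to `parameters`, wrap in the `function` envelope, and collapse nullable type arrays); the real implementation may differ.

```python
def _collapse_type(value):
    """Collapse a type array like ["string", "null"] to a single type."""
    if isinstance(value, list):
        non_null = [t for t in value if t != "null"]
        return non_null[0] if non_null else "string"
    return value

def convert_mcp_tool_to_openai(tool: dict) -> dict:
    """Wrap an MCP tool definition in the OpenAI function-calling shape.

    Hypothetical sketch based on the examples above.
    """
    params = dict(tool.get("inputSchema", {"type": "object", "properties": {}}))
    props = {}
    for name, prop in params.get("properties", {}).items():
        prop = dict(prop)  # copy so the MCP definition isn't mutated
        prop["type"] = _collapse_type(prop.get("type", "string"))
        props[name] = prop
    params["properties"] = props
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": params,
        },
    }
```

Feeding the "before" document through this sketch yields the "after" document, which is the shape OpenUI expects from `/v1/tools`.
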

## Implementation Checklist

### Phase 1: Apply Fixes

- [ ] Deploy `openai_routes_fixed.py` to `gateway-proxy/`
- [ ] Update `gateway_proxy.py` with the import and routes (see Step 2 above)
- [ ] Restart the gateway container: `docker-compose restart gateway-proxy`

### Phase 2: Verify

- [ ] Test that `/v1/models` returns a valid OpenAI model list
- [ ] Test that `/v1/tools` returns tools with proper OpenAI schemas
- [ ] Verify no truncated tool names in the response
- [ ] Check gateway logs for errors: `docker logs mcp-gateway`

### Phase 3: Test in OpenUI

- [ ] Add the gateway URL to OpenUI: `http://mcp.wilddragon.net:8000`
- [ ] Verify tools appear in the model selector
- [ ] Try searching for tools by name
- [ ] Call a tool to verify end-to-end functionality

## Files Involved

| File | Purpose | Status |
|------|---------|--------|
| `gateway-proxy/openai_routes_fixed.py` | Fixed OpenAI schema conversion | ✅ Created |
| `gateway-proxy/gateway_proxy.py` | Main gateway with route definitions | ⏳ Needs updates |
| `openai_adapter.py` | Can be removed (superseded by `openai_routes_fixed.py`) | Optional |

## Testing Commands

```bash
# Test tool discovery
curl http://localhost:8000/v1/tools

# Test models endpoint
curl http://localhost:8000/v1/models

# Check a specific tool schema (example)
curl http://localhost:8000/v1/tools | jq '.data[] | select(.function.name == "erpnext_get_document")'
```

## Troubleshooting

### Tools still not loading

1. Check that `openai_routes_fixed.py` is in the correct directory
2. Verify the import statement was added to `gateway_proxy.py`
3. Check for import errors in the gateway logs
4. Ensure the routes were added before the closing `]` bracket

### Tools loading but parameters wrong

1. Check that `_simplify_schema()` is being called
2. Verify property names aren't being corrupted
3. Test an individual tool schema with `jq` to inspect the conversion

### OpenUI still showing errors

1. Verify `/v1/tools` returns valid JSON (use `jq` to validate)
2. Check the browser console for actual error messages
3. Ensure no truncation of tool names in the response

## Additional Notes

- The `lambda r: tools(r, TOOL_DEFINITIONS)` syntax is required because `TOOL_DEFINITIONS` is a module-level variable that gets populated during initialization
- Schema conversion happens at request time, so adding new backends will automatically make their tools available via the OpenAI API
- Error handling is intentionally lenient (malformed tools are skipped rather than failing the whole request)
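
The point about the lambda can be shown in isolation. A minimal sketch (the names mirror the gateway code, but the snippet itself is illustrative): a `functools.partial` built at import time captures the empty list object, while the lambda re-reads the module-level name on every call, so it still sees `TOOL_DEFINITIONS` after initialization rebinds it.

```python
from functools import partial

TOOL_DEFINITIONS: list = []  # empty at import time, rebound during initialization

def tools(request, definitions):
    # Stand-in for the real endpoint; it just echoes the definitions it was given.
    return definitions

bound_early = partial(tools, definitions=TOOL_DEFINITIONS)  # captures the empty list now
bound_late = lambda r: tools(r, TOOL_DEFINITIONS)           # looks up the name per call

# Simulate initialization rebinding the module-level variable.
TOOL_DEFINITIONS = [{"name": "erpnext_get_document"}]

print(bound_early(None))  # still the old, empty list: []
print(bound_late(None))   # sees the rebound list: [{'name': 'erpnext_get_document'}]
```

This is why wiring the route as `Route("/v1/tools", tools_with_captured_defs, ...)` at import time would serve an empty tool list, while the lambda form works.
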