The MCP LLM Bridge translates between Model Context Protocol (MCP) servers and OpenAI-compatible language models: it converts MCP tool specifications into OpenAI function schemas and routes the resulting function invocations back to the originating MCP tools. This bidirectional protocol layer lets any OpenAI-compatible LLM, whether a cloud model such as GPT-4o or a local runtime such as Ollama or LM Studio, discover and execute MCP-compliant tools through a standardized interface, bridging the gap between MCP tool ecosystems and LLMs that speak OpenAI's function-calling interface rather than MCP's native protocol.
```shell
claude mcp add --transport stdio bartolli-mcp-llm-bridge uvx mcp-llm-bridge
```
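To illustrate the translation the bridge performs, the following is a minimal sketch of mapping one MCP tool definition (whose `name`, `description`, and `inputSchema` fields come from the MCP specification) into an entry for OpenAI's `tools` parameter. The function name and the sample `read_file` tool are hypothetical; the actual bridge implementation may differ.

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Map one MCP tool definition to an OpenAI 'tools' entry (illustrative sketch)."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is JSON Schema, which is also what
            # OpenAI's 'parameters' field expects, so it passes through.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Hypothetical MCP tool, shaped like a tools/list result item.
mcp_tool = {
    "name": "read_file",
    "description": "Read a file from disk",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}

openai_tool = mcp_tool_to_openai_function(mcp_tool)
print(openai_tool["function"]["name"])
```

When the model responds with a `tool_calls` entry naming `read_file`, the bridge would look up the matching MCP tool by name and forward the parsed arguments to the MCP server, completing the round trip described above.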