A straightforward bridge that lets Claude call any OpenAI-compatible chat completion API through a single `chat` tool. Configure it with an API key and endpoint for a service like OpenAI, Perplexity, Groq, or xAI, and Claude can then query that model directly during conversations. It works well when you want to compare responses across different LLMs or tap a specialized model for a specific task. Setup is clean: point the server at any OpenAI SDK-compatible endpoint via environment variables. Each instance binds to one provider, so you can run several instances side by side to give Claude access to your entire LLM toolkit.
```shell
claude mcp add --transport stdio pyroprompts-any-chat-completions-mcp uvx any-chat-completions-mcp
```
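Since the server reads its provider configuration from environment variables, a fuller install passes them with `--env`. A sketch, assuming the `AI_CHAT_*` variable names from the server's README and Perplexity's `sonar` model as the target; substitute your own key, model, and base URL:

```shell
# Register one instance of the bridge pointed at a single provider.
# AI_CHAT_NAME labels the provider, AI_CHAT_KEY holds its API key,
# AI_CHAT_MODEL picks the model, AI_CHAT_BASE_URL is the
# OpenAI-compatible endpoint. The `--` separates the server command
# and its args from the flags to `claude mcp add`.
claude mcp add --transport stdio perplexity-chat \
  --env AI_CHAT_KEY=your-api-key \
  --env AI_CHAT_NAME=Perplexity \
  --env AI_CHAT_MODEL=sonar \
  --env AI_CHAT_BASE_URL=https://api.perplexity.ai \
  -- uvx any-chat-completions-mcp
```

Repeat the command with a different server name and a different set of `AI_CHAT_*` values to register a second provider alongside the first.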