Bridges Claude with locally running LM Studio instances through the standard OpenAI-compatible API. Exposes nine tools covering health checks, model listing, chat completions, text generation, embeddings, and stateful multi-turn conversations. Handles both simple prompt-response workflows and persistent sessions where the system prompt stays fixed across conversation turns. Supports flexible deployment via Python, Docker, or direct GitHub installation. Reach for this when you want to route specific queries to your private models while staying in Claude's interface. It's especially useful for sensitive data that shouldn't hit external APIs, or when you need specialized local models for embeddings and RAG workflows.
claude mcp add --transport stdio infinitimeless-lmstudio-mcp uvx lmstudio-mcp
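Under the hood, LM Studio's local server speaks the OpenAI chat completions schema (by default at `http://localhost:1234/v1`). A minimal sketch of the kind of request the server forwards; the model name and prompts are illustrative placeholders, not names the tool requires:

```python
import json
import urllib.request

def build_chat_request(model, messages, temperature=0.7):
    # Payload shape follows the OpenAI chat completions schema,
    # which LM Studio's OpenAI-compatible endpoint accepts.
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }

payload = build_chat_request(
    "local-model",  # hypothetical identifier for a model loaded in LM Studio
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this document."},
    ],
)

# Sending it requires LM Studio's server to be running locally:
req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",  # LM Studio's default endpoint
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

In a stateful session, the MCP server keeps the `messages` list (with the locked system prompt at index 0) and appends each new turn before re-sending.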