Sentry MCP acts as middleware between AI coding assistants and the Sentry error tracking platform, exposing tools for debugging workflows such as searching issues and events, retrieving traces, and analyzing performance data. The server supports both remote deployment and stdio transport, the latter useful for self-hosted Sentry instances. Its AI-powered search requires configuring an LLM provider (OpenAI or Anthropic) to translate natural-language queries into Sentry's query syntax. In short, it lets AI coding agents such as Claude and Cursor programmatically access and investigate Sentry error data during development and debugging sessions.
To register the server with Claude Code over stdio:

claude mcp add --transport stdio getsentry-sentry-mcp -- uvx sentry-mcp
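For clients configured via a JSON file rather than a CLI, the same stdio server can be declared directly. A minimal sketch, assuming an `mcpServers`-style config layout and the environment variable names `SENTRY_ACCESS_TOKEN`, `SENTRY_HOST`, and `OPENAI_API_KEY` — the exact variable names are assumptions here, so check the Sentry MCP documentation for your version:

```json
{
  "mcpServers": {
    "sentry-mcp": {
      "command": "uvx",
      "args": ["sentry-mcp"],
      "env": {
        "SENTRY_ACCESS_TOKEN": "<your Sentry auth token>",
        "SENTRY_HOST": "sentry.example.com",
        "OPENAI_API_KEY": "<key used for AI-powered search>"
      }
    }
  }
}
```

`SENTRY_HOST` matters only for self-hosted instances; the LLM key is needed only if you use the natural-language search tools.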