A prompt management server that exposes CRUD operations for LLM prompts through MCP tools such as add_prompt, get_prompt, and apply_template. It handles template variable substitution, supports multiple storage backends (including DynamoDB and S3), and can run in stdio mode for Claude Desktop or in HTTP mode for REST integrations. Cognitive architecture layers add context-aware prompt recommendations and cross-domain pattern matching. Reach for it when building applications that need centralized prompt storage, versioning, and intelligent template management across AI workflows, especially if you already run on AWS infrastructure or want Stripe integration for prompt marketplace features.
claude mcp add --transport stdio sparesparrow-mcp-prompts uvx mcp-prompts
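MCP prompt templates conventionally mark variables with {{name}} placeholders. As a rough illustration of what a tool like apply_template does (this is a hedged sketch, not the server's actual implementation; the function name and error behavior here are assumptions):

```python
import re

def apply_template(template: str, variables: dict[str, str]) -> str:
    """Replace {{name}} placeholders with values, failing loudly on missing keys.

    Illustrative only: mirrors the general idea of apply_template, not the
    real mcp-prompts code.
    """
    def substitute(match: re.Match) -> str:
        key = match.group(1)
        if key not in variables:
            raise KeyError(f"missing template variable: {key}")
        return variables[key]

    # Match {{ name }} with optional surrounding whitespace inside the braces.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

# Usage: fill a stored review prompt with concrete values.
filled = apply_template(
    "Review this {{language}} code:\n{{code}}",
    {"language": "Python", "code": "print('hi')"},
)
```

Failing on a missing variable (rather than silently leaving the placeholder) is a deliberate choice: it surfaces broken prompt templates at call time instead of sending a malformed prompt to the model.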