MCP Setup
Contxt includes a built-in MCP (Model Context Protocol) server. This lets AI tools load your project's context automatically — no copy-pasting, no manual prompts.
What is MCP?
MCP is an open protocol that lets AI applications access external tools and data sources. Contxt implements an MCP server that exposes your project's memory as tools the AI can call.
Available MCP Tools
When connected via MCP, your AI client gets access to the following tools:
| Tool | Description |
|---|---|
| contxt_suggest | Get relevant context for a task (Smart Suggest) |
| contxt_search | Search across all memory entries |
| contxt_get_context | Load current active context |
| contxt_get_decisions | List decisions (with optional filters) |
| contxt_get_patterns | List patterns (with optional filters) |
| contxt_log_decision | Log a new decision from the AI session |
| contxt_log_pattern | Save a new pattern from the AI session |
| contxt_update_context | Update active working context |
| contxt_end_session | End and log the current session |
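As a rough illustration of how a client uses one of these tools, here is the JSON-RPC 2.0 `tools/call` request an MCP client might send for `contxt_suggest`. The argument key (`task`) is an assumption for the sake of the example; clients discover each tool's real input schema from the server via `tools/list`.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "contxt_suggest",
    "arguments": {
      "task": "add rate limiting to the API gateway"
    }
  }
}
```

The server replies with a standard MCP tool result, and the AI reads its content as project context.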
Generic MCP Configuration
For any MCP-compatible client, add Contxt as a server:
{"mcpServers": {"contxt": {"command": "contxt","args": ["mcp", "serve"],"env": {"CONTXT_PROJECT": "/path/to/your/project"}}}}
The CONTXT_PROJECT environment variable tells the MCP server which project to load. If omitted, it uses the current working directory.
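For example, if your MCP client launches servers from inside the project directory, you can omit the env block entirely and let Contxt fall back to the current working directory:

```json
{
  "mcpServers": {
    "contxt": {
      "command": "contxt",
      "args": ["mcp", "serve"]
    }
  }
}
```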
Starting the MCP Server Manually
```bash
contxt mcp serve
```

This starts the MCP server on stdio (standard input/output), which is the transport most AI tools expect.
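Over stdio, the client and server exchange newline-delimited JSON-RPC 2.0 messages. As a sketch, the first thing a client sends is an `initialize` request along these lines (the client name and version strings are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 0,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

Your AI tool handles this handshake for you; it is shown only to make clear what actually travels over stdin/stdout.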
```bash
contxt mcp serve --port 3100
```

This starts the server over HTTP, for tools that use HTTP transport instead of stdio.
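To smoke-test the HTTP transport, you can POST a JSON-RPC request to the server. The endpoint path below (the server root) is an assumption; adjust it to whatever path Contxt actually exposes. A strictly compliant server may insist on an `initialize` request first, but any JSON-RPC response confirms it is listening:

```bash
# POST a tools/list request to the local Contxt MCP server.
# Assumptions: port 3100 (from the command above) and the root path as the MCP endpoint.
curl -s -X POST http://localhost:3100/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
```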
Verifying the Connection
Once configured, ask your AI tool: "What decisions have been made in this project?" If Contxt is connected, it will call contxt_get_decisions and return your logged decisions.
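Behind the scenes, that question should turn into a tool call like the sketch below, passing no filters (whether an empty arguments object is accepted depends on the tool's schema). Most clients show or log the tool calls they make, which is another way to confirm that Contxt is wired up.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "contxt_get_decisions",
    "arguments": {}
  }
}
```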