Your AI's next
breakthrough, powered
by memory

Contxt gives AI coding agents persistent, versioned, project-scoped memory — so they work like a teammate, not a stranger.

~/my-saas-app
$ contxt init
✓ Initialized in ./my-saas-app
Stack detected: Next.js · Prisma · Postgres
$ contxt push
✓ Synced 14 decisions · 8 patterns · active context
→ mycontxt.ai/kareem/my-saas-app
$ contxt load --task "add Stripe webhooks"
✓ 5 entries loaded · 812 tokens
Filtered 24 irrelevant entries (saved 3,388 tokens)
81% fewer tokens

Context that persists
across every session

Your architecture. Your decisions. Always there.

Every AI coding session starts from zero. You spend 40% of your prompts re-explaining your architecture, your patterns, your past decisions. Your AI doesn't remember that you use Prisma, that you prefer server actions over API routes, or that you already decided against Redis.

The result? Wasted tokens. Generic suggestions. Code that doesn't fit your stack. You become a human clipboard, copying context into every conversation.

Contxt solves this. Your AI gets persistent, project-scoped memory. Push your architecture once — every tool remembers it. Forever.

40%
of prompts wasted on re-explaining context
3.2×
faster iteration with persistent memory
5K+
tokens saved per session on average
0
manual copy-paste. Contxt loads automatically.
#01 Workflow

Push once.
Never repeat yourself.

Store decisions, patterns, and active context in your project's .contxt/ directory. One command syncs it to the cloud. Every AI tool you use can pull from it.

  • Decisions (why you chose Prisma over TypeORM)
  • Patterns (your API error handler template)
  • Context (current blockers, active features)
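A minimal sketch of that flow; the pattern add subcommand is an assumption (only decision add and push appear elsewhere on this page):

$ contxt decision add --title "Use Prisma over TypeORM"
# hypothetical subcommand, shown for illustration
$ contxt pattern add --title "API error handler template"
$ contxt push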
Push to cloud
Sync happens automatically
~/.claude/claude_desktop_config.json
{
  "mcpServers": {
    "contxt": {
      "command": "contxt",
      "args": ["mcp"]
    }
  }
}
#02 Integrations

Works with every AI coding tool

Contxt uses MCP (Model Context Protocol) — the open standard for AI context. One integration, every tool. Claude Code, Cursor, Copilot, Windsurf — they all read from the same source of truth.

Claude Code · Cursor · Copilot · Windsurf
#03 Structure

Git-like branching.
Time travel included.

Want to experiment with a new approach? Create a branch. Try it out. Merge it back or throw it away. Every change is versioned — you can revert to any point in your project's memory history.

$ contxt branch create experiment
$ contxt decision add --title "Try approach B"
$ contxt branch merge experiment
main
  Decision: Use Prisma
  Pattern: API errors
experiment
  Decision: Try approach B
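Every change lands in the version history described above. The log and revert subcommands in this sketch are hypothetical, shown only to illustrate reverting to an earlier point:

$ contxt log                 # hypothetical: list the memory history
$ contxt revert <entry-id>   # hypothetical: restore an earlier snapshot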

The relevance engine

Contxt doesn't dump everything into your prompt. It intelligently scores each memory entry and surfaces only what matters for your current task.

Task Signal
  Query: "implement Stripe webhook handler"
  Active files: src/app/api/webhooks/route.ts · src/lib/stripe.ts
  Constraints: max 2,000 tokens

Ranked Results · 5 entries · 1,847 tokens
  0.94 · Decision: Use Stripe for payments · 432 tokens
         (keyword overlap · file match · type priority)
  0.87 · Pattern: API error handler · 318 tokens
         (recency · type priority)
  0.82 · Context: Payment flow blockers · 245 tokens
         (semantic similarity · recency)

Filtered 18 irrelevant entries · saved 4,203 tokens
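From the terminal, this is the same filtering the load command at the top of the page performs; the output below mirrors that demo and is illustrative rather than exact CLI output:

$ contxt load --task "implement Stripe webhook handler"
✓ 5 entries loaded · 1,847 tokens
Filtered 18 irrelevant entries (saved 4,203 tokens)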

One integration.
Every tool.

Contxt uses MCP, the open standard for AI context. Install once, use everywhere.

Claude Code
Cursor
GitHub Copilot
Windsurf
Zed
Continue
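As an example, Cursor reads MCP servers from a JSON file with the same shape as the Claude config shown earlier; the ~/.cursor/mcp.json path is our assumption about where Cursor looks, so check your tool's MCP settings:

~/.cursor/mcp.json
{
  "mcpServers": {
    "contxt": {
      "command": "contxt",
      "args": ["mcp"]
    }
  }
}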

Simple, transparent pricing

Start free. Scale as you grow.

Starter
$0
Forever free
  • 1 project
  • Up to 100 memory entries
  • Local + cloud sync
  • MCP integration
  • Basic search
Most Popular
Pro
$29
per month
  • Unlimited projects
  • Unlimited memory entries
  • Semantic search (AI-powered)
  • Branch & version history
  • Priority support
Enterprise
Custom
Let's talk
  • Everything in Pro
  • Team collaboration
  • Self-hosted deployment
  • SSO & advanced security
  • Dedicated support

Give your AI
a memory

Stop repeating yourself. Start shipping faster.