
Quick Start


You need two things: a Postgres database and an embedding provider API key. Then the wizard handles everything else.

1. Get a database #

Pick one:

  • Supabase – free tier, managed, no setup. Create a project and grab your URL + secret key from Settings > API Keys.
  • Neon – free tier, serverless Postgres. Create a database and grab the connection string.
  • Self-hosted Postgres – needs PostgreSQL 15+ with pgvector installed.
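
For self-hosted Postgres, the pgvector extension has to be enabled in the target database before the schema migration can run. A minimal sketch, assuming the extension package is already installed on the server (e.g. via your distro's postgresql-15-pgvector package):

```sql
-- Enable pgvector in the target database.
-- Run as a superuser or a role with CREATE privilege on the database.
CREATE EXTENSION IF NOT EXISTS vector;
```

Supabase and Neon ship pgvector pre-installed, so this step only applies to self-hosted instances.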

2. Run the wizard #

Supabase users:

uvx --from ogham-mcp ogham init

Neon or self-hosted Postgres users:

uvx --from 'ogham-mcp[postgres]' ogham init

The [postgres] extra installs the psycopg driver needed for direct Postgres connections.

The wizard walks you through:

  • Connecting to your database
  • Picking an embedding provider (OpenAI, Voyage, Ollama, Mistral)
  • Running the schema migration
  • Writing MCP client configs for Claude Code, Cursor, VS Code, and others

When it finishes, your clients are configured and ready to go.

3. Add to your MCP client #

The wizard configures your database, embedding provider, and API keys. Once it finishes, it prints the exact config to add to your client – including all the environment variables the server needs.

For Claude Code, the wizard runs claude mcp add for you automatically. For other clients, copy the config snippet it prints into the right file:

  • Claude Desktop (macOS) – ~/Library/Application Support/Claude/claude_desktop_config.json
  • Cursor – ~/.cursor/mcp.json
  • VS Code – .vscode/mcp.json
  • Windsurf – ~/.windsurf/mcp.json
  • Kiro – ~/.kiro/settings/mcp.json
  • Codex – ~/.codex/config.toml
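
For the JSON-based clients, the entry the wizard prints will look roughly like the following sketch. Treat the shape as illustrative only: the exact command, arguments, and environment variables come from the wizard's output (the serve entry point here is assumed from the SSE section, and the env var names from the Supabase example):

```json
{
  "mcpServers": {
    "ogham": {
      "command": "uvx",
      "args": ["--from", "ogham-mcp", "ogham", "serve"],
      "env": {
        "SUPABASE_URL": "https://your-project.supabase.co",
        "SUPABASE_KEY": "your-secret-key",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```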

Restart the client after editing its config file (Claude Code and Codex pick up changes automatically).

4. Try it #

Tell your agent:

“Remember that our Azure resource groups use the naming convention {env}-{service}”

Then in the same client, or a different one:

“What do you know about our Azure resource naming conventions?”

It works across clients because they all share the same database.


Docker setup #

If you prefer Docker over uvx:

docker pull ghcr.io/ogham-mcp/ogham-mcp

Create a .env file with your database and embedding config:

SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your-secret-key
EMBEDDING_PROVIDER=openai
OPENAI_API_KEY=sk-...
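
Before handing the file to docker via --env-file, it can be worth a quick sanity check that every key the server needs is present. This is a hypothetical helper, not part of ogham-mcp; the key names match the Supabase example above, so adjust the list for your provider:

```shell
# Sanity-check a .env before passing it to `docker run --env-file`.
ENV_FILE="${ENV_FILE:-.env}"

# For demonstration only: write the example file if none exists yet.
if [ ! -f "$ENV_FILE" ]; then
  printf '%s\n' \
    'SUPABASE_URL=https://your-project.supabase.co' \
    'SUPABASE_KEY=your-secret-key' \
    'EMBEDDING_PROVIDER=openai' \
    'OPENAI_API_KEY=sk-...' > "$ENV_FILE"
fi

# Report any missing keys rather than failing silently at container start.
status=ok
for key in SUPABASE_URL SUPABASE_KEY EMBEDDING_PROVIDER OPENAI_API_KEY; do
  grep -q "^${key}=" "$ENV_FILE" || { echo "missing: $key"; status=incomplete; }
done
echo "env $status"
```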

Run the schema migration first (paste sql/schema.sql into Supabase SQL Editor, or run sql/schema_postgres.sql against Neon/self-hosted).

Then add to your MCP client config:

{
  "mcpServers": {
    "ogham": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "--env-file", "/absolute/path/to/.env",
        "ghcr.io/ogham-mcp/ogham-mcp"
      ]
    }
  }
}

If you use Ollama locally, add --add-host=host.docker.internal:host-gateway to the args so the container can reach Ollama on your host machine.
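
With that flag added, the args array from the config above becomes:

```json
"args": [
  "run", "--rm", "-i",
  "--add-host=host.docker.internal:host-gateway",
  "--env-file", "/absolute/path/to/.env",
  "ghcr.io/ogham-mcp/ogham-mcp"
]
```

Inside the container, host.docker.internal then resolves to your host machine, where Ollama is listening.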


SSE mode (multiple agents) #

If you run multiple AI clients at the same time, SSE mode lets them share one server process:

ogham serve --transport sse --port 8742

Then point your clients at the URL instead of a command:

{
  "mcpServers": {
    "ogham": {
      "url": "http://127.0.0.1:8742/sse"
    }
  }
}

One server, one database pool, shared memory. Health check at http://127.0.0.1:8742/health.


Workflow skills (optional) #

Skills wire up common workflows so you don’t have to call MCP tools one at a time:

npx skills add ogham-mcp/ogham-mcp

This installs ogham-research (capture), ogham-recall (retrieval), and ogham-maintain (admin). See the skills docs for details.