Quick Start
Get Epitome running and connect your first AI agent in under 2 minutes.
Prerequisites
Depending on whether you use the hosted service or self-host, you will need some or all of the following installed (a quick version check is sketched after the list).
- Node.js 22+ — Runtime for the API server
- PostgreSQL 17 with pgvector 0.8+ — Primary datastore with vector search
- Docker (optional) — Simplest way to run everything locally
- OpenAI API key — Required for entity extraction and embeddings (gpt-5-mini + text-embedding-3-small)
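If you are self-hosting, it is worth confirming the toolchain before going further. The commands below are a quick sanity check, not part of the install itself; the psql queries assume you can already connect to your PostgreSQL instance, so add -h/-U/-d flags as needed for your environment.
# Check runtime versions
node --version        # should print v22.x or newer
docker --version      # only needed for the Docker Compose path

# Confirm the PostgreSQL server version and that pgvector is available
psql -c "SHOW server_version;"
psql -c "SELECT default_version FROM pg_available_extensions WHERE name = 'vector';"   # expect 0.8 or newer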
Hosted Setup
The fastest way to get started is with the hosted version at epitome.fyi. There is nothing to install — you get a fully managed Epitome instance with your own isolated database schema.
- Visit epitome.fyi and sign in with GitHub or Google
- Copy your personal MCP URL from the dashboard Settings page
- Configure your AI agent to use that URL (see Connect Your First Agent below)
Your MCP URL will look something like this:
https://epitome.fyi/mcp/usr_abc123def456
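If you want to confirm the URL is live before configuring an agent, you can send a minimal MCP initialize request by hand. This is a rough sketch that assumes the endpoint speaks MCP's Streamable HTTP transport; the clientInfo values are placeholders, and the response may arrive as plain JSON or as a single SSE event depending on the server.
# Replace the path with the MCP URL copied from your dashboard
curl -s https://epitome.fyi/mcp/usr_abc123def456 \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","clientInfo":{"name":"curl-check","version":"0.0.0"},"capabilities":{}}}'
# A JSON-RPC result describing the server means the URL works; a 401 or 404
# usually means the token portion of the URL is wrong.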
Self-Hosted Setup
Clone the repository and start the services with Docker Compose:
# Clone the repository
git clone https://github.com/gunning4it/epitome.git
cd epitome
# Copy and configure environment variables
cp .env.example .env
# Edit .env with your DATABASE_URL, JWT_SECRET, OPENAI_API_KEY, etc.
# Start everything with Docker Compose
docker compose up -d
# The API will be available at http://localhost:3000
# The dashboard will be available at http://localhost:5173
Docker Compose starts three containers: the PostgreSQL database (with pgvector pre-installed), the Hono API server, and the React dashboard. The init.sql file runs automatically on first startup to create the shared schema, extensions, and bootstrap data.
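Once the stack is up, a quick health check saves debugging later. The service names used below (api) are illustrative; use the names defined in the project's docker-compose.yml.
# Confirm all three containers are running
docker compose ps

# Tail the API server logs to verify it connected to Postgres and is listening
docker compose logs -f api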
If you prefer to run services individually (without Docker), see the Self-Hosting Guide for detailed instructions.
Connect Your First Agent
Epitome uses the Model Context Protocol (MCP) to communicate with AI agents. Configure your agent with Epitome's MCP server URL and it will gain access to 9 tools for reading/writing your profile, memories, tables, knowledge graph, and activity log.
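If you'd like to see those tools before wiring up an agent, you can list them over the same endpoint. This sketch assumes a Streamable HTTP server that returns an Mcp-Session-Id header on initialize; strictly speaking a client should also send a notifications/initialized message first, so treat it as a rough smoke test rather than a complete MCP client.
MCP_URL="https://epitome.fyi/mcp/YOUR_MCP_TOKEN"

# Initialize and capture the session id from the response headers
# (may be empty if the server is stateless)
SESSION=$(curl -si "$MCP_URL" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","clientInfo":{"name":"curl-check","version":"0.0.0"},"capabilities":{}}}' \
  | awk 'tolower($1)=="mcp-session-id:" {print $2}' | tr -d '\r')

# Ask the server which tools it exposes; the result should list all nine
curl -s "$MCP_URL" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: $SESSION" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}'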
Claude Desktop
Add the following to your Claude Desktop MCP configuration file (claude_desktop_config.json):
{
"mcpServers": {
"epitome": {
"url": "https://epitome.fyi/mcp/YOUR_MCP_TOKEN"
}
}
}
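Where that file lives depends on your platform. The paths below are the usual defaults for Claude Desktop, but confirm them against your installation, and restart Claude Desktop after editing so the new server is picked up.
# Typical claude_desktop_config.json locations (verify on your machine):
#   macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
#   Windows: %APPDATA%\Claude\claude_desktop_config.json

# Example: open the macOS config in TextEdit
open -e "$HOME/Library/Application Support/Claude/claude_desktop_config.json"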
Claude Code
Add Epitome as an MCP server in your project or global settings:
claude mcp add epitome \
--transport streamable-http \
https://epitome.fyi/mcp/YOUR_MCP_TOKEN
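After adding it, you can check that Claude Code registered the server; the exact output format varies by version, but an epitome entry should appear.
# List configured MCP servers and inspect the new entry
claude mcp list
claude mcp get epitome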
OpenClaw
Running agents locally with OpenClaw? Epitome works seamlessly as the shared memory layer — every local agent gets the same context. See the Architecture page for the full integration pattern.
Self-hosted
For self-hosted instances, replace the URL with your local server and use an API key for authentication:
{
"mcpServers": {
"epitome": {
"url": "http://localhost:3000/mcp",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
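You can verify the key and endpoint with the same kind of hand-rolled initialize request used for the hosted service above. This is a sketch; whether a bad key surfaces as an HTTP 401 or as a JSON-RPC error depends on the server.
curl -si http://localhost:3000/mcp \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","clientInfo":{"name":"curl-check","version":"0.0.0"},"capabilities":{}}}'
# A 200 with an initialize result means the key is accepted; a 401 usually
# points to a missing or mistyped Authorization header.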
System Prompt Template
For the best experience, include the following in your agent's system prompt so it knows how to use Epitome effectively. This template tells the agent what tools are available and when to use them:
You have access to Epitome, the user's personal AI database. Use it to:
1. READ the user's profile at the start of every conversation with read_profile.
This gives you their name, preferences, family, work, health info, and more.
2. STORE important facts the user shares using store_memory.
Examples: "I just got promoted to VP", "My daughter's birthday is March 15",
"I'm allergic to shellfish". Always store with a descriptive collection name.
3. SEARCH past memories with search_memory before making recommendations
or answering questions where context matters. This uses semantic vector search.
4. QUERY structured data with query_table for organized information
like reading lists, project trackers, or habit logs.
5. INSERT structured records with insert_record when the user wants to
track something in a table format.
6. EXPLORE the knowledge graph with query_graph and get_entity_neighbors
to understand relationships between people, places, and things in the user's life.
7. LOG significant actions with log_activity so the user has an audit trail
of what their AI agents have done.
Guidelines:
- Always read the profile first in a new conversation.
- Store memories proactively when the user shares personal information.
- Search memories before giving personalized advice.
- Never fabricate information — if you don't know, search first.
- Respect the user's privacy: only store what they share directly.
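For reference, this is roughly what a store_memory call looks like on the wire when an agent follows the prompt above. It reuses the MCP_URL and SESSION variables from the tools/list sketch earlier; the argument names (content, collection) are illustrative, so check the schema returned by tools/list for the exact fields.
# Illustrative tools/call request for store_memory (field names assumed)
curl -s "$MCP_URL" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: $SESSION" \
  -d '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"store_memory","arguments":{"content":"I love hiking in the Cascades","collection":"hobbies"}}}'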
Verify It Works
Once your agent is configured, verify the connection by asking it to interact with Epitome:
- Read your profile: Ask the agent "What do you know about me?" — it should call read_profile and return your profile data (or an empty profile if this is your first time).
- Store a memory: Tell the agent something about yourself, like "I love hiking in the Cascades". It should call store_memory.
- Search memories: In a new conversation, ask "What are my outdoor hobbies?" — it should call search_memory and find the hiking memory.
- Check the dashboard: Open the Epitome dashboard and navigate to the Memories page. You should see the stored memory with its vector embedding and confidence score.
If the agent reports connection errors, see the Troubleshooting page for common issues and solutions.