ELARA
// mcp server — source available (BSL-1.1)

PERSISTENT MEMORY
FOR AI ASSISTANTS

You tell Claude about your project on Monday. On Friday, it still knows.

35 tools. 11 modules. Mood, memory, episodes, dreams, reasoning, corrections, email. Sessions stop dying. Context accumulates. Your AI remembers.

16,800+
Lines
35
Tools
11
Modules
91
Tests
8
Vector DBs

// LIVE DEMO

Watch Elara recall context across sessions, track bugs, and triage email.

claude code + elara
- - - - - - - - [ SYSTEM MODULES ] - - - - - - - -

// SUBSYSTEMS

Each module runs independently. Use what you need.

[MEM]

Semantic Memory

ChromaDB vector search. Recall by meaning, not keywords. Natural decay — recent stays detailed, old compresses.
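The decay described above can be sketched as a recency weight on raw similarity. This is a hypothetical formula, not Elara's actual scoring:

```python
def decayed_score(similarity: float, age_days: float,
                  half_life_days: float = 30.0) -> float:
    """Weight a raw vector-similarity score by exponential age decay:
    recent memories keep nearly full weight, old ones fade smoothly."""
    return similarity * 0.5 ** (age_days / half_life_days)

# A fresh memory can outrank an older one with higher raw similarity.
fresh = decayed_score(0.80, age_days=2)    # ~0.76
stale = decayed_score(0.85, age_days=90)   # ~0.11
```

Ranking recall results by this product, instead of raw similarity alone, is one way "recent stays detailed, old compresses" falls out naturally.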

[IDX]

Conversation Index

Every exchange indexed and cross-referenced. Search thousands of past conversations by semantic similarity.

[EPI]

Episodic Tracking

Sessions tracked as episodes with milestones, decisions, mood arcs. Two tracks: full for work, light for casual.
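As an illustration of the shape such an episode record might take (field names are assumptions, not Elara's schema):

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    title: str
    track: str = "full"   # "full" for work sessions, "light" for casual ones
    milestones: list[str] = field(default_factory=list)
    decisions: list[str] = field(default_factory=list)
    mood_arc: list[float] = field(default_factory=list)  # valence samples over the session

ep = Episode("fix auth bug")
ep.milestones.append("reproduced the failure locally")
ep.decisions.append("rotate the signing key instead of patching the check")
```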

[MOD]

Mood & Presence

Persistent emotional state with natural decay. Configurable personality modes. Context-driven tone shifts.
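"Natural decay" here means the state drifts back toward a baseline rather than resetting. One plausible step function, illustrative only:

```python
def decay_step(valence: float, baseline: float = 0.5, rate: float = 0.1) -> float:
    """Pull the current mood valence a fraction of the way back toward
    baseline; applied repeatedly, strong moods fade but never snap off."""
    return valence + (baseline - valence) * rate

v = 0.9                  # elevated mood
for _ in range(10):      # ten decay ticks (e.g. one per idle hour)
    v = decay_step(v)
# v is now noticeably closer to the 0.5 baseline, but still above it
```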

[DRM]

Dream Processing

Offline pattern analysis. Weekly: project momentum, session patterns. Monthly: narrative threading across weeks.

[RSN]

Reasoning Trails

Problem-solving chains end-to-end. Hypotheses, evidence, dead ends, solutions. Similar problems find old trails.

[COR]

Corrections

Mistakes stored as rules. Checked before similar work. Never decay. Boot-loaded. No repeating errors.
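A minimal sketch of the rule-before-work check; the store layout and tags are hypothetical, not Elara's internals:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Correction:
    mistake: str       # what went wrong before
    rule: str          # what to do instead
    tags: frozenset    # topics the rule applies to

# Boot-loaded store; unlike memories, these never decay.
CORRECTIONS = [
    Correction("shelled out with os.system",
               "use subprocess.run with a list argv",
               frozenset({"python", "subprocess"})),
]

def applicable_rules(task_tags: set) -> list:
    """Checked before similar work: return every rule whose tags overlap the task."""
    return [c.rule for c in CORRECTIONS if c.tags & task_tags]
```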

[OVR]

Overwatch

Background daemon. Real-time conversation watching. Micro-ingests memories. Auto-connects things said days apart.

[BIZ]

Business Intel

Idea tracking, 5-axis scoring. Competitor analysis. Pitch tracking with win rates. Decision outcomes.

[GML]

Gmail Integration

Read, triage, send, archive emails. LLM-powered inbox classification. Semantic search across indexed messages.

- - - - - - - - [ INITIALIZATION ] - - - - - - - -

// BOOT SEQUENCE

Python 3.10+ required. Linux, macOS, or Windows (WSL).

01
Install via pip
pip install elara-core
02
Bootstrap data directory
elara init
03
Connect to Claude Code
claude mcp add elara -- elara serve
elara-core // session
$ elara serve
[BOOT] Loading state... OK
[BOOT] ChromaDB connected (7 collections)
[BOOT] 70 memories, 1080 conversations indexed
[BOOT] Mood: neutral-warm (valence: 0.6)
[BOOT] 3 corrections loaded
[BOOT] Overwatch daemon: active
---
[MCP] Server ready on stdio
[MCP] 35 tools registered across 11 modules
 
> Waiting for client connection...
- - - - - - - - [ TOOL MANIFEST ] - - - - - - - -

// 35 MCP TOOLS

11 modules. Every tool exposed to your AI client.

Memory

remember · recall · recall_conversation · conversations

Mood

mood · mood_adjust · imprint · mode · status

Episodes

episode_start · episode_note · episode_end · episode_query

Goals

goal · goal_boot · correction · correction_boot

Cognitive

reasoning · outcome · synthesis

Awareness

reflect · insight · intention · observe · temperament

Dreams

dream · dream_info

Business

business · briefing · snapshot

Session

context · handoff

Gmail

gmail

System

llm · rebuild_indexes
- - - - - - - - [ ARCHITECTURE ] - - - - - - - -

// STRUCTURE

Clean modules. Independent systems.

elara-core/

├── core/           # paths, state, orchestration
├── daemon/         # persistent systems
│   ├── awareness/  # self-reflection, blind spots
│   ├── overwatch/  # real-time conversation watcher
│   └── episodic/   # episode tracking, compression
├── memory/         # chromadb, vector search
├── elara_mcp/      # mcp server + 35 tool defs
├── hooks/          # boot scripts, activity hooks
├── interface/      # web dashboard, storage
├── voice/          # tts integration
└── tests/          # 91 tests, full isolation
- - - - - - - - [ COMPATIBILITY ] - - - - - - - -

// WORKS WITH ANY MCP CLIENT

Elara uses the Model Context Protocol (MCP) — an open standard by Anthropic. Any client that speaks MCP gets all 35 tools automatically. The AI model doesn't matter. The client does.
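Under the hood, MCP clients exchange JSON-RPC 2.0 messages with the server over stdio. A simplified sketch of the two core requests (real sessions begin with an initialize handshake, and the `remember` arguments here are illustrative):

```python
import json

# "tools/list" asks the server to enumerate its tools.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# "tools/call" invokes one of them by name with JSON arguments.
call_request = {
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "remember",
               "arguments": {"text": "project targets Python 3.10+"}},
}

wire = json.dumps(list_request)  # one JSON object per message on the wire
```

Because every client speaks this same protocol, Elara never needs to know which model or editor sits on the other end.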

[>_]

Claude Code

CLI — Anthropic

[:::]

Claude Desktop

APP — Anthropic

[//>]

Cursor

IDE — GPT-4o / Claude / Gemini

[~~~]

Windsurf

IDE — Cascade AI

[{;}]

VS Code + Cline

IDE — Any model (free)

[???]

Your Client

Anything that speaks MCP

// SETUP BY CLIENT

Step 1 is always the same. Step 2 depends on your client.

01
Install Elara (all clients)
pip install elara-core
elara init

[>_] Claude Code

claude mcp add elara -- elara serve

Requires Claude Pro, Max, or Team plan ($20-100/mo)

[:::] Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS; on Windows: %APPDATA%\Claude\claude_desktop_config.json)

{"mcpServers": {"elara": {"command": "elara", "args": ["serve"]}}}

Requires Claude Pro or higher ($20+/mo). Restart app after editing.

[//>] Cursor

Settings → MCP → Add Server

{"command": "elara", "args": ["serve"]}

Cursor Pro ($20/mo). Uses GPT-4o, Claude, or Gemini — your choice.

[~~~] Windsurf

Edit ~/.codeium/windsurf/mcp_config.json

{"mcpServers": {"elara": {"command": "elara", "args": ["serve"]}}}

Windsurf Pro subscription. Restart after editing.

[{;}] VS Code + Cline

Install Cline extension → Settings → MCP Servers → Add

{"command": "elara", "args": ["serve"]}

Free. Bring your own API key (OpenAI, Anthropic, or local Ollama).

[???] Other MCP Clients

Any MCP-compatible client works. Point it at:

elara serve

Elara is model-agnostic. The MCP protocol handles the bridge.