From zero to persistent AI memory in under 2 minutes.
Requires Python 3.10+. Works on Linux, macOS, and Windows WSL.
pip install elara-core
If pip isn't recognized, use python -m pip install elara-core instead.
Next, initialize Elara. This creates the data directory (~/.elara/) and a default config:
elara init
Created /home/user/.elara/
Default config written. Ready to serve.
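For the curious, the init step amounts to roughly the following. This is an illustrative sketch, not Elara's actual source; the config filename and keys shown are hypothetical:

```python
import json
import tempfile
from pathlib import Path

def init_elara(home: Path) -> Path:
    """Create the data directory and write a default config if missing."""
    data_dir = home / ".elara"
    data_dir.mkdir(parents=True, exist_ok=True)
    config_path = data_dir / "config.json"  # hypothetical filename
    if not config_path.exists():
        # Hypothetical default contents, for illustration only.
        config_path.write_text(json.dumps({"version": 1}, indent=2))
    return data_dir

# Demo against a temporary directory instead of your real home:
tmp = Path(tempfile.mkdtemp())
created = init_elara(tmp)
print(created.name)  # .elara
```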
Pick your editor. Each connects differently.
claude mcp add elara -- elara serve
Done. Restart Claude Code and you'll see 34 tools loaded.
Add to .cursor/mcp.json in your project root:
{
  "mcpServers": {
    "elara": {
      "command": "elara",
      "args": ["serve"]
    }
  }
}
Restart Cursor, then check the MCP panel for the "elara" connection.
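If you'd rather script the Cursor setup, something like this writes the same JSON, merging with any existing mcpServers block rather than clobbering it (the path is the one given above; the helper name is ours):

```python
import json
import tempfile
from pathlib import Path

def add_elara_to_cursor(project_root: Path) -> Path:
    """Merge the elara server entry into .cursor/mcp.json."""
    config_path = project_root / ".cursor" / "mcp.json"
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["elara"] = {
        "command": "elara",
        "args": ["serve"],
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config_path

# Demo against a temporary project root:
root = Path(tempfile.mkdtemp())
path = add_elara_to_cursor(root)
print(json.loads(path.read_text())["mcpServers"]["elara"]["command"])  # elara
```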
Add to ~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "elara": {
      "command": "elara",
      "args": ["serve"]
    }
  }
}
Restart Windsurf. Elara tools appear in the Cascade panel.
In VS Code: Cline Settings → MCP Servers → Add:
{
  "elara": {
    "command": "elara",
    "args": ["serve"]
  }
}
The Cline extension will auto-detect the server and load its 34 tools.
Ask your AI something that uses memory. If it works, you'll see tool calls like these:
# Try any of these:
"Remember that I prefer dark themes"
→ elara_remember("User prefers dark themes", memory_type="fact")
"What do you remember about me?"
→ elara_recall("user preferences and facts")
"How are you feeling?"
→ elara_mood(detail="brief")
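Under the hood, each arrow above corresponds to an MCP tools/call request that your editor sends to the server. A minimal sketch of the JSON-RPC payload for the first example (the tool name comes from above; the "content" argument name is an assumption):

```python
import json

# JSON-RPC 2.0 request shape used by MCP's tools/call method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "elara_remember",
        "arguments": {
            "content": "User prefers dark themes",  # hypothetical argument name
            "memory_type": "fact",
        },
    },
}
print(json.dumps(request, indent=2))
```

If the tool calls never appear, check that the server process starts cleanly by running elara serve in a terminal and watching for errors.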