Getting Started
Up and running in 5 minutes
No Docker. No cloud account. No configuration required to start.
macOS / Linux / Windows · Single binary · No external dependencies
1. Install MuninnDB
One command installs the binary on macOS, Linux, or Windows:
# macOS / Linux
curl -fsSL https://muninndb.com/install.sh | sh
# macOS (Homebrew)
brew install scrypster/tap/muninn
# Windows (PowerShell)
irm https://muninndb.com/install.ps1 | iex

2. Initialize and connect your AI tools
One command sets everything up — connects Claude Desktop, Cursor, VS Code, or Windsurf, generates a bearer token, and starts all services:
muninn init
# Guided setup wizard — connects Claude Desktop, Claude Code, Cursor, OpenClaw, Windsurf, Codex, VS Code
# [1/3] Which AI tools would you like to connect?
# [2/3] Secure your MCP endpoint with a bearer token? [Y/n]
# [3/3] Start MuninnDB now? [Y/n]
#
# muninn started (pid 12345)
# MBP :8474 binary protocol
# REST :8475 JSON API
# gRPC :8477 gRPC API
# MCP :8750 AI tool integration
# UI :8476 http://localhost:8476 Web dashboard
Visit http://localhost:8476 for the visual dashboard — decay charts, relationship graphs, live activation log.
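The bearer token from the wizard protects the HTTP endpoints, so any HTTP client can talk to the REST port directly, not just the SDKs. Below is a minimal sketch of building an authenticated request against :8475. The `/v1/write` route and the payload fields are assumptions for illustration, not documented API routes; check the API reference for the actual paths.

```python
import json
from urllib import request

def build_rest_request(path, payload, token, base="http://localhost:8475"):
    """Build an authenticated POST for MuninnDB's JSON API on :8475.

    The route in `path` is whatever the caller supplies; the real API's
    routes may differ from the hypothetical one shown below.
    """
    return request.Request(
        base + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",  # token generated by `muninn init`
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example (hypothetical route and fields):
req = build_rest_request(
    "/v1/write",
    {"vault": "default", "concept": "user prefers dark mode"},
    "your-token",
)
print(req.full_url)                     # http://localhost:8475/v1/write
print(req.get_header("Authorization"))  # Bearer your-token
```

To actually send the request, pass it to `urllib.request.urlopen` once the server is running.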
3. Store your first memory
Using the Go or Python SDK (REST on port 8475):
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/scrypster/muninndb/sdk/go/muninn"
)

func main() {
	ctx := context.Background()
	client := muninn.NewClient("http://localhost:8475", "your-token")

	// Store a memory engram
	engID, err := client.Write(ctx, "default",
		"user prefers dark mode",
		"Always render UI in dark theme for this user",
		[]string{"preference", "ui"})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("Stored engram: %s\n", engID)
}

import asyncio

from muninn import MuninnClient

async def main():
    async with MuninnClient("http://localhost:8475", token="your-token") as client:
        # Store a memory engram
        eng_id = await client.write(
            vault="default",
            concept="user prefers dark mode",
            content="Always render UI in dark theme for this user",
            tags=["preference", "ui"],
        )
        print(f"Stored engram: {eng_id}")

asyncio.run(main())

4. Activate relevant memories
// Find the top 5 most relevant memories for a context
results, err := client.Activate(ctx, "default", []string{"what does the user want?"}, 5)
if err != nil {
	log.Fatal(err)
}
for _, r := range results.Engrams {
	fmt.Printf("%.2f — %s\n", r.Score, r.Concept)
	fmt.Printf("  Why: %s\n", r.Why)
}
// Output:
// 0.94 — user prefers dark mode
//   Why: BM25 match (0.78) + Hebbian boost (0.16)

# Find the top 5 most relevant memories for a context
results = await client.activate(
    vault="default", context=["what does the user want?"])
for r in results.engrams:
    print(f"{r.score:.2f} — {r.concept}")
    print(f"  Why: {r.why}")
# Output:
# 0.94 — user prefers dark mode
# Why: BM25 match (0.78) + Hebbian boost (0.16)

That's it
MuninnDB automatically handles decay, Hebbian learning, and association building from here. Your memories improve the more they're used. Use muninn shell to explore interactively.
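To build intuition for those mechanics, here is a toy model of decay and Hebbian boosting. It is purely illustrative: the half-life and the per-use boost constant are assumptions for the sketch, not MuninnDB's actual parameters or implementation.

```python
def decayed_strength(initial, age_days, half_life_days=30.0):
    """Exponential decay: an unused memory's strength halves every half-life."""
    return initial * 0.5 ** (age_days / half_life_days)

def activation_score(bm25, coactivations, boost_per_use=0.04):
    """Lexical relevance plus a Hebbian boost that grows with repeated co-activation."""
    return bm25 + coactivations * boost_per_use

# A memory untouched for one half-life drops to half strength...
print(f"{decayed_strength(1.0, 30):.2f}")  # 0.50
# ...while one that keeps firing alongside its context gets boosted,
# matching the shape of the score shown above (0.78 + 0.16 = 0.94).
print(f"{activation_score(0.78, 4):.2f}")  # 0.94
```

The point is the interplay: scores rise with use and sink with neglect, so frequently activated memories surface first without any manual curation.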