# Quick Start

Up and running in 5 minutes. No Docker. No cloud account. No configuration required.

Go 1.23+ · macOS / Linux / Windows · No external deps
## 1. Install MuninnDB

Build from source until pre-built binaries land with v0.1.0:

```bash
git clone https://github.com/scrypster/muninndb
cd muninndb
go build -o muninndb ./cmd/muninndb
./muninndb --version
```

## 2. Start the server
One command starts all four interfaces — binary protocol, gRPC, REST, and the web UI:
```bash
muninndb serve
# MuninnDB v0.1.0 starting
# MBP  :8747 ✓
# gRPC :8748 ✓
# REST :8749 ✓
# Web  :8750 ✓ → http://localhost:8750
# Ready.
```

### Web UI
Visit http://localhost:8750 for the visual dashboard — decay charts, relationship graphs, live activation log.
## 3. Create an API key

```bash
curl -X POST http://localhost:8749/v1/keys \
  -H "Content-Type: application/json" \
  -d '{"name": "my-agent", "vault": "default"}'

# Returns:
# {"key": "mn_live_abc123...", "vault": "default"}
```

## 4. Store your first memory
```go
mem := muninn.NewMemory("mn_live_abc123", "muninn://localhost:8747")

e, err := mem.Store(ctx, &muninn.StoreRequest{
    Concept:    "user prefers dark mode",
    Content:    "Always render UI in dark theme for this user",
    Tags:       []string{"preference", "ui"},
    Confidence: 0.9,
})
fmt.Printf("Stored: %s\n", e.ID)
```

```python
import muninn

mem = muninn.Memory("mn_live_abc123", "muninn://localhost:8747")

e = mem.store(
    concept="user prefers dark mode",
    content="Always render UI in dark theme for this user",
    tags=["preference", "ui"],
    confidence=0.9,
)
print(f"Stored: {e.id}")
```

## 5. Activate relevant memories
Activate returns the N most cognitively relevant engrams for a given context — ranked by BM25 score, decay state, Hebbian associations, and graph depth:
```go
results, err := mem.Activate(ctx, "what does the user want?", 5)
for _, r := range results.Engrams {
    fmt.Printf("%.2f — %s\n", r.Score, r.Concept)
    fmt.Printf("  Why: %s\n", r.Why)
}

// Output:
// 0.94 — user prefers dark mode
//   Why: BM25 match (0.78) + Hebbian boost (0.16)
```

```python
results = mem.activate("what does the user want?", limit=5)
for r in results.engrams:
    print(f"{r.score:.2f} — {r.concept}")
    print(f"  Why: {r.why}")

# Output:
# 0.94 — user prefers dark mode
#   Why: BM25 match (0.78) + Hebbian boost (0.16)
```

## Done.
Decay, Hebbian learning, and association building happen automatically from here. Your memories improve the more they're used.