Engrams
An engram is the fundamental unit of storage in MuninnDB. The term comes from neuroscience — a memory engram is the physical trace left in a neural network when a memory is formed. In MuninnDB, an engram is a structured memory record with cognitive properties built in.
Think of an engram as a "row" that knows:
- How confident we are in the information it contains
- How relevant it currently is (decaying over time)
- What other engrams it's related to, and how strongly
- How many times it's been retrieved (affects stability)
What is an Engram?
Every engram stores a single piece of knowledge — a fact, a preference, a decision, an observation. Unlike database rows, engrams are designed to model how human memory actually works:
- Frequently recalled memories become more stable and resistant to decay
- Memories that are recalled together build stronger associations over time
- Conflicting memories reduce each other's confidence
- Old, unaccessed memories gradually fade toward a background level
Engram Fields
| Field | Type | Description |
|---|---|---|
| ID | ULID | 16-byte sortable unique ID. Monotonic, URL-safe. |
| Concept | string (512B) | What this engram is about. Used as the primary search key. |
| Content | string (16KB) | The information stored. Supports full-text indexing. |
| Confidence | float32 (0–1) | Bayesian posterior. How reliable is this information? |
| Relevance | float32 (0–1) | Current Ebbinghaus decay score. Continuously updated. |
| Stability | float32 | Decay resistance. Grows with each spaced retrieval. |
| AccessCount | uint32 | Number of times retrieved. Affects Hebbian associations. |
| State | enum | Lifecycle: PLANNING, ACTIVE, PAUSED, BLOCKED, COMPLETED, ARCHIVED. |
| Tags | []string | User-defined labels for filtering and grouping. |
| Associations | []Association | Weighted edges to other engrams. 40 bytes fixed each. |
| Embedding | []float32 | Optional quantized vector (4× compressed). Requires embed plugin. |
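The field table can be pictured as a Go struct. This is an illustrative sketch only: the field names mirror the table, but the concrete types, enum values, and layout are assumptions, not MuninnDB's actual wire format:

```go
package main

import "fmt"

// State mirrors the lifecycle enum from the table; the numeric values
// are assumptions for illustration.
type State uint8

const (
	StatePlanning State = iota
	StateActive
	StatePaused
	StateBlocked
	StateCompleted
	StateArchived
)

// Association is a weighted edge to another engram
// (40 bytes fixed each, per the table).
type Association struct {
	TargetID [16]byte // ULID of the associated engram
	Weight   float32  // 0.0–1.0, evolved by Hebbian learning
}

// Engram is a sketch of the record described in the field table.
type Engram struct {
	ID           [16]byte      // ULID, 16 bytes, sortable
	Concept      string        // primary search key (up to 512 B)
	Content      string        // stored information (up to 16 KB)
	Confidence   float32       // Bayesian posterior, 0–1
	Relevance    float32       // current Ebbinghaus decay score, 0–1
	Stability    float32       // decay resistance
	AccessCount  uint32        // retrieval count
	State        State         // lifecycle state
	Tags         []string      // user-defined labels
	Associations []Association // weighted edges to other engrams
	Embedding    []float32     // optional quantized vector
}

func main() {
	e := Engram{
		Concept:    "database is PostgreSQL-compatible",
		Confidence: 0.95,
		State:      StateActive,
	}
	fmt.Println(e.Concept, e.Confidence)
}
```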
Lifecycle States
Engrams have a lifecycle state that controls how they're treated by the activation engine:
- PLANNING — Draft state. Not yet activated in queries by default.
- ACTIVE — Normal state. Included in ACTIVATE queries, subject to decay.
- PAUSED — Temporarily suspended. Not returned in queries, decay halted.
- BLOCKED — Dependency blocked. Similar to PAUSED.
- COMPLETED — Finished. Can be activated for historical context.
- ARCHIVED — Long-term storage. Excluded from normal activation.
Associations
Every engram can have weighted associations to other engrams. Associations are bidirectional and their weights evolve automatically through Hebbian learning — the more two engrams are retrieved together, the stronger their association becomes.
You can also set associations manually when storing an engram. The association weight is a float between 0.0 and 1.0. During ACTIVATE queries, Phase 5 (Graph Traversal) uses a BFS walk up to depth 2 to surface associated engrams, applying a hop penalty at each depth.
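The Phase 5 graph walk described above can be sketched as a standard BFS that discounts each hop. The concrete hop-penalty factor and scoring rule below are illustrative assumptions; only the depth-2 limit and the existence of a hop penalty come from the text:

```go
package main

import "fmt"

// edge is a weighted association from one engram to another.
type edge struct {
	to     string
	weight float32
}

// traverse runs a BFS from seed up to maxDepth hops, scoring each newly
// reached engram as parentScore * edgeWeight * hopPenalty. The multiplicative
// hop penalty is an assumption made for illustration.
func traverse(graph map[string][]edge, seed string, maxDepth int, hopPenalty float32) map[string]float32 {
	scores := map[string]float32{seed: 1.0}
	frontier := []string{seed}
	for depth := 1; depth <= maxDepth; depth++ {
		var next []string
		for _, id := range frontier {
			for _, e := range graph[id] {
				if _, seen := scores[e.to]; seen {
					continue // keep the score from the shortest path
				}
				scores[e.to] = scores[id] * e.weight * hopPenalty
				next = append(next, e.to)
			}
		}
		frontier = next
	}
	return scores
}

func main() {
	graph := map[string][]edge{
		"db-choice":  {{"pgvector", 0.9}, {"backups", 0.4}},
		"pgvector":   {{"embeddings", 0.8}},
		"embeddings": {{"too-far", 0.9}}, // 3 hops away: outside the depth-2 walk
	}
	for id, s := range traverse(graph, "db-choice", 2, 0.5) {
		fmt.Printf("%-10s %.3f\n", id, s)
	}
}
```

Because the penalty compounds per hop, a strongly associated engram two hops out can still rank below a weakly associated direct neighbor, which keeps the traversal from dragging in distant, loosely related memories.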
Storing Engrams
```go
e, err := mem.Store(ctx, &muninn.StoreRequest{
    Concept:    "database is PostgreSQL-compatible",
    Content:    "The backend uses PostgreSQL 15 with pgvector extension",
    Tags:       []string{"infrastructure", "database"},
    Confidence: 0.95,
    State:      muninn.StateActive,
})
```

```python
e = mem.store(
    concept="database is PostgreSQL-compatible",
    content="The backend uses PostgreSQL 15 with pgvector extension",
    tags=["infrastructure", "database"],
    confidence=0.95,
    state=muninn.STATE_ACTIVE,
)
```

Reading Engrams
Point reads by ID return a single engram. ACTIVATE queries return the N most cognitively relevant engrams for a given context.
```go
// Point read
e, err := mem.Get(ctx, engramID)

// Cognitive activation — finds most relevant engrams
results, err := mem.Activate(ctx, "what database do we use?", 5)
```

```python
# Point read
e = mem.get(engram_id)

# Cognitive activation — finds most relevant engrams
results = mem.activate("what database do we use?", limit=5)
```