Every concept connects to every other with weight, valence, and stability. The network thinks — it propagates, inhibits, decays, and learns.
reinforce_association creates bidirectional synapses. Stability grows 1.5× per reinforcement.

apply_homeostasis normalises all weights toward a target mean, preventing runaway drift while preserving relative strengths.

predict_next learns what typically follows a concept.

Any AI that interacts with a person over time can use associative memory to model that person's emotional landscape and reason about it.
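The homeostasis step can be sketched in a few lines of Python. This is an illustrative sketch, not the service's implementation: the target mean of 0.5 and the single-step rescaling rule are assumptions.

```python
def apply_homeostasis(weights, target_mean=0.5):
    """Rescale all synapse weights so their mean sits at target_mean,
    preserving relative strengths (illustrative sketch, not the real service)."""
    current_mean = sum(weights.values()) / len(weights)
    scale = target_mean / current_mean
    return {concept: w * scale for concept, w in weights.items()}

synapses = {"dog->fear": 0.9, "dog->danger": 0.6, "home->safe": 0.3}
balanced = apply_homeostasis(synapses)
# relative ordering is preserved; only the overall scale changes
```

Because every weight is multiplied by the same factor, a synapse that was twice as strong as another stays twice as strong, which is exactly the "preserving relative strengths" property.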
A user mentions they want to see owls. The agent extracts associations, stores them via the API, then days later — in a new session — retrieves exactly that signal to shape a personalised recommendation.
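In JSON-RPC terms, that flow might look like the following sketch. The method names match the ones documented on this page; the two-session split, the `user_likes` concept name, and the parameter values are illustrative assumptions.

```python
import json

# Session 1: the user says they want to see owls.
# The agent stores the signal as a positively valenced association.
store = {
    "jsonrpc": "2.0",
    "method": "reinforce_association",
    "params": {"src": "user_likes", "tgt": "owls",
               "importance": 1.0, "valence": 0.9},
}

# Days later, in a new session: activate the user's interests and read
# back what surfaces, to shape a personalised recommendation.
recall = [
    {"jsonrpc": "2.0", "method": "propagate_activation",
     "params": {"concept": "user_likes", "strength": 1.0, "hops": 2}},
    {"jsonrpc": "2.0", "method": "get_activations"},
]

print(json.dumps(recall, indent=2))
```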
A single function call activates a concept. The network does the rest — propagating signal, resolving conflict, and surfacing the emotional context that shapes how the agent responds.
reinforce_association with a source, target, importance weight, and emotional valence. The system writes bidirectional synapses.

propagate_activation with a concept and hop depth. Signal travels outward through synapses, attenuating at each step.

apply_lateral_inhibition. Dominant concepts suppress their competitors. The network converges to a coherent emotional state.

hedonic_state to get a scalar read on the current emotional valence across all active concepts.

decay_activations with your chosen rate. Working memory fades naturally. Long-term synapses are untouched.

// encounter a dog
[
  { "method": "propagate_activation", "params": { "concept": "dog", "strength": 0.9, "hops": 2 } },
  { "method": "apply_lateral_inhibition" },
  { "method": "get_activations" }
]

// response
{
  "fear": 0.900,
  "danger": 0.900,
  "alert": 0.810,
  "escape": 0.720,
  "safe": -0.720  // suppressed
}
// same dog, safe context
{
  "fear": 0.180,  // ← was 0.900
  "safe": 0.720   // ← home pre-activated safe
}
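The propagation and inhibition mechanics can be sketched like this. Everything here is an assumption for illustration: the attenuation factor, the winner-take-most inhibition rule, and the toy graph, so the numbers will not match the service's output above.

```python
def propagate_activation(graph, start, strength, hops, attenuation=0.9):
    """Spread activation outward through weighted synapses,
    attenuating the signal at each hop (illustrative sketch)."""
    activations = {start: strength}
    frontier = [(start, strength)]
    for _ in range(hops):
        nxt = []
        for node, act in frontier:
            for neighbour, weight in graph.get(node, {}).items():
                signal = act * weight * attenuation
                if abs(signal) > abs(activations.get(neighbour, 0.0)):
                    activations[neighbour] = signal
                    nxt.append((neighbour, signal))
        frontier = nxt
    return activations

def apply_lateral_inhibition(activations, factor=0.8):
    """One possible inhibition rule: the strongest concept
    suppresses all competitors by a fixed factor."""
    winner = max(activations, key=lambda c: abs(activations[c]))
    return {c: a if c == winner else a * (1.0 - factor)
            for c, a in activations.items()}

graph = {"dog": {"fear": 1.0}, "fear": {"alert": 1.0}}
acts = propagate_activation(graph, "dog", 0.9, hops=2)
calm = apply_lateral_inhibition(acts)
```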
Different surface forms of the same idea collapse to a single canonical concept. Your embedding model decides similarity. We store the registry, run the lookup, and cache every resolution.
Use any embedding provider. We accept vectors — you decide the model.
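A minimal sketch of that resolution loop, assuming cosine similarity over caller-supplied vectors, an in-memory registry and cache, and a similarity threshold of 0.85 (all assumptions; the service keeps these in Postgres):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

registry = {}   # canonical concept -> embedding vector (from your model)
cache = {}      # surface form -> resolved canonical concept

def resolve(surface, vector, threshold=0.85):
    """Collapse a surface form to its canonical concept,
    or register it as a new concept if nothing is close enough."""
    if surface in cache:
        return cache[surface]
    best, best_sim = None, threshold
    for concept, embedding in registry.items():
        sim = cosine(vector, embedding)
        if sim >= best_sim:
            best, best_sim = concept, sim
    if best is None:                 # no close match: new canonical concept
        registry[surface] = vector
        best = surface
    cache[surface] = best            # cache every resolution
    return best
```

With toy vectors, a near-duplicate surface form collapses onto the existing concept while a distant one registers as new; the embedding model, not this loop, is what decides similarity.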
The Ebbinghaus forgetting curve, implemented in Postgres. Each reinforcement builds stability. Emotional valence adds a second multiplier.
Each reinforcement multiplies stability by 1.5×. A concept seen 5 times has stability ≈ 7.6.
High absolute valence makes a synapse decay 3× slower than a neutral equivalent with identical stability.
weight ×= 1.0 − decay_rate ÷ ( max(stability, 1.0) × (1.0 + abs(valence) × 2.0) )
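Putting the two multipliers together, here is a direct Python transcription of the update rule above (the function name and the example decay rate of 0.3 are assumptions), showing the 3× slower decay for a fully valenced synapse:

```python
def decay(weight, decay_rate, stability, valence):
    # The update rule above: stability and |valence| both slow decay.
    return weight * (1.0 - decay_rate / (max(stability, 1.0)
                                         * (1.0 + abs(valence) * 2.0)))

# Neutral vs. fully valenced synapse, identical stability:
neutral = decay(1.0, 0.3, 1.0, 0.0)    # 0.7  -> loses 30% of its weight
charged = decay(1.0, 0.3, 1.0, -1.0)   # 0.9  -> loses 10%: decays 3x slower

# Stability from repeated reinforcement: 1.5x per exposure.
stability_after_five = 1.5 ** 5         # 7.59375, the "≈ 7.6" quoted above
```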
You call named functions over JSON-RPC. Batched. Authenticated with an API key. The Go middleware verifies your token, enforces quota, and routes to Postgres.
{
  "jsonrpc": "2.0",
  "method": "reinforce_association",
  "params": {
    "src": "dog",
    "tgt": "fear",
    "importance": 1.0,
    "valence": -1.0
  }
}
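Sending that request from Python could look like the sketch below, using only the standard library. The endpoint URL, the request `id`, and the Bearer-token header scheme are assumptions; substitute your actual base URL and API key.

```python
import json
import urllib.request

API_KEY = "your-api-key"                  # placeholder
ENDPOINT = "https://example.com/rpc"      # placeholder endpoint URL

payload = [{                              # a list, because requests batch
    "jsonrpc": "2.0",
    "id": 1,
    "method": "reinforce_association",
    "params": {"src": "dog", "tgt": "fear",
               "importance": 1.0, "valence": -1.0},
}]

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {API_KEY}"},
)
# response = urllib.request.urlopen(req)  # uncomment with real credentials
```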