Semantic recall can surface archived/dormant memories #3

Closed
opened 2026-03-01 04:09:24 +00:00 by cal · 1 comment
Owner

Problem

`recall()` in `client.py` filters out memories below `THRESHOLD_DORMANT` (0.05) in the keyword path, but `semantic_recall()` (in `embeddings.py`) scores all memories in `_embeddings.json` against the query vector with no decay filter. Dormant/archived memories can re-appear in the merged results if their semantic similarity is high enough.

This is a systemic issue affecting all recall paths, not just auto-edges. However, auto-edges make it more impactful since edges are created silently — an auto-edge to a dormant memory could re-surface it against the decay system's intent.

Proposed Solution

Apply the same `THRESHOLD_DORMANT` filter in `semantic_recall()` before scoring, or filter the merged results in `recall()` after the keyword+semantic merge. The latter is simpler, but the former avoids unnecessary computation.
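As a rough sketch of the simpler option (post-filtering the merged results), a helper like the following could drop anything the decay state marks as dormant. The function name `filter_dormant`, the `id`/`decay_score` dict shapes, and the default-to-active behavior are illustrative assumptions, not the actual API:

```python
THRESHOLD_DORMANT = 0.05  # decay scores below this count as dormant (per the issue)

def filter_dormant(results, decay_state):
    """Drop merged recall results whose memory is dormant/archived.

    Assumes `results` is a list of dicts with an "id" key and
    `decay_state` maps memory id -> {"decay_score": float}.
    Memories missing from the state default to fully active (1.0).
    """
    return [
        r for r in results
        if decay_state.get(r["id"], {}).get("decay_score", 1.0) >= THRESHOLD_DORMANT
    ]
```

For example, with `decay_state = {"a": {"decay_score": 0.5}, "b": {"decay_score": 0.01}}`, only memory `"a"` survives the filter.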

Scope

  • `embeddings.py` — `semantic_recall()` needs access to decay state for filtering
  • `client.py` — alternatively, post-filter merged results in `recall()`
  • No MCP server changes needed
cal added the
ai-working
label 2026-03-03 05:05:44 +00:00
Author
Owner

Fixed in PR #7: https://git.manticorum.com/cal/cognitive-memory/pulls/7

Applied the `THRESHOLD_DORMANT` filter in `semantic_recall()` before scoring (Option A from the proposal). Loads `_state.json` and skips any memory with `decay_score < 0.05` before computing cosine similarity — consistent with the keyword path in `recall()` and avoids unnecessary embedding computation for archived entries.
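A minimal sketch of that pre-scoring filter, under stated assumptions: the function name `semantic_recall_sketch`, the plain-Python `cosine` helper, and passing the decay state and embeddings as in-memory dicts are all illustrative stand-ins for the real `semantic_recall()` signature and the `_state.json`/`_embeddings.json` loading:

```python
import math

THRESHOLD_DORMANT = 0.05  # same cutoff as the keyword path in recall()

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def semantic_recall_sketch(query_vec, embeddings, state):
    """Score only non-dormant memories against the query vector.

    `embeddings` maps memory id -> vector; `state` maps memory id ->
    {"decay_score": float} (as loaded from _state.json). Dormant
    memories are skipped before any similarity is computed.
    """
    scored = []
    for mem_id, vec in embeddings.items():
        if state.get(mem_id, {}).get("decay_score", 1.0) < THRESHOLD_DORMANT:
            continue  # dormant/archived: never scored, never surfaced
        scored.append((mem_id, cosine(query_vec, vec)))
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

Skipping before the similarity computation (rather than after the merge) is what saves the wasted work on archived entries.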

cal added
ai-pr-opened
and removed
ai-working
labels 2026-03-03 05:07:53 +00:00
cal closed this issue 2026-03-10 18:26:37 +00:00