Additive Semantic CMR

Semantic similarity as additive boost to retrieval

Additive Semantic CMR extends base CMR with pre-experimental semantic associations between items. When computing retrieval activations, semantic similarity to the last-recalled item is added to temporal context support.

The Mechanism

In base CMR, retrieval depends only on temporal context: \[a_i = (M^{CF} \mathbf{c})_i\]

Additive Semantic CMR adds a semantic component: \[a_i = (M^{CF} \mathbf{c})_i + s \cdot S_{last,i}\]

where:

  • \(S_{last,i}\) = semantic similarity between the last recalled item and item \(i\)
  • \(s\) = the semantic_scale parameter
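A minimal NumPy sketch of the additive rule (matrix and vector values are illustrative, not from the library):

```python
import numpy as np

M_cf = np.array([[0.6, 0.2, 0.1],
                 [0.2, 0.6, 0.2],
                 [0.1, 0.2, 0.6]])   # context-to-feature associations (toy values)
c = np.array([0.1, 0.8, 0.1])        # current temporal context
S = np.array([[1.0, 0.7, 0.1],
              [0.7, 1.0, 0.2],
              [0.1, 0.2, 1.0]])      # pairwise semantic similarity
s = 0.3                              # semantic_scale
last = 0                             # index of the last recalled item

# a_i = (M^CF c)_i + s * S_{last,i}
a = M_cf @ c + s * S[last]
```

Items similar to the last recall receive a constant boost on top of whatever temporal support they already have.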

Why Additive?

The additive model reflects the idea that temporal and semantic cues provide independent sources of support:

  • Temporal: “What was studied near this item?”
  • Semantic: “What is related to this item?”

Adding them assumes both cues contribute to retrieval probability, and their effects don’t interact.

Mathematical Specification

Semantic Similarity Matrix

The model receives a pre-computed similarity matrix \(S\) where \(S_{ij}\) represents the semantic relationship between items \(i\) and \(j\):

\[S_{ij} = \text{similarity}(\text{item}_i, \text{item}_j)\]

Common sources:

  • Word embedding cosine similarity (Word2Vec, GloVe)
  • LSA (Latent Semantic Analysis) vectors
  • Co-occurrence statistics
  • Human similarity ratings
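For the embedding-based case, \(S\) can be built by normalizing each item vector and taking pairwise dot products. A small sketch (the embeddings here are made up):

```python
import numpy as np

embeddings = np.array([[1.0, 0.0],
                       [0.8, 0.6],
                       [0.0, 1.0]])  # one row per item (toy 2-d vectors)

# Normalize rows, then S[i, j] = cosine(item_i, item_j)
unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
S = unit @ unit.T
```

The result is symmetric with ones on the diagonal, which is the form the model expects for \(S\).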

Retrieval Activations

Code
def activations(self):
    # Temporal support from MCF
    temporal_support = self.mcf.probe(self.context.state) * self.recallable

    # Semantic support from last recalled item
    if self.recall_total == 0:
        semantic_support = jnp.zeros_like(temporal_support)  # no last recall yet
    else:
        last_item = self.recalls[self.recall_total - 1] - 1  # recalls are 1-indexed
        semantic_support = self.msem[last_item]  # row of S, pre-scaled by semantic_scale

    # Additive combination, then apply sensitivity
    combined = temporal_support + semantic_support
    return power_scale(combined, self.mcf_sensitivity) * self.recallable

The key: add before scaling. This means semantic support affects the competition before the choice sensitivity exponent is applied.
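The order of operations matters. A toy comparison (power_scale here is a simple stand-in for the model's sensitivity exponent):

```python
import numpy as np

def power_scale(x, sensitivity):
    # Stand-in for the model's choice-sensitivity transform
    return np.power(x, sensitivity)

temporal = np.array([0.4, 0.1])
semantic = np.array([0.0, 0.3])
tau = 2.0

add_first = power_scale(temporal + semantic, tau)    # the model's order
scale_first = power_scale(temporal, tau) + semantic  # hypothetical alternative
```

With these values, adding first makes the two items tie (both reach 0.4 before the exponent), while scaling first would leave the semantic boost untouched by the sensitivity transform and produce a different competition.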

Parameters

Parameter Symbol Description
semantic_scale \(s\) Scaling factor for semantic similarity

All other parameters are inherited from base CMR.

The Semantic Scale

The semantic_scale parameter controls the relative influence of semantic vs. temporal cues:

Value Effect
0.0 Pure temporal CMR (no semantic influence)
0.5 Moderate semantic boost
1.0 Semantic and temporal contribute equally (if magnitudes match)
>1.0 Semantic dominates
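To see the effect of the scale, one can sweep \(s\) and watch the share of activation captured by a semantic neighbor (values below are illustrative, with activations normalized to probabilities for simplicity):

```python
import numpy as np

temporal = np.array([0.5, 0.3, 0.2])
semantic = np.array([0.1, 0.1, 0.9])  # item 2 is the strong semantic neighbor

neighbor_probs = []
for s in (0.0, 0.5, 1.0):
    a = temporal + s * semantic
    neighbor_probs.append(a[2] / a.sum())  # neighbor's share of total activation
```

At \(s = 0\) the neighbor gets only its temporal share; as \(s\) grows, it increasingly dominates the competition.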

Usage

Code
from jaxcmr.models.additive_semantic_cmr import CMR, make_factory
import jaxcmr.components.linear_memory as LinearMemory
import jaxcmr.components.context as TemporalContext
from jaxcmr.components.termination import PositionalTermination

# Create factory with semantic features
Factory = make_factory(
    LinearMemory.init_mfc,
    LinearMemory.init_mcf,
    TemporalContext.init,
    PositionalTermination,
)

# Initialize with dataset and word embeddings
factory = Factory(dataset, word_embeddings)

params = {
    "encoding_drift_rate": 0.5,
    "start_drift_rate": 0.5,
    "recall_drift_rate": 0.5,
    "learning_rate": 0.5,
    "primacy_scale": 2.0,
    "primacy_decay": 0.8,
    "shared_support": 0.05,
    "item_support": 0.25,
    "choice_sensitivity": 0.6,
    "semantic_scale": 0.3,  # New parameter
    "stop_probability_scale": 0.05,
    "stop_probability_growth": 0.2,
    "learn_after_context_update": True,
    "allow_repeated_recalls": False,
}

# Create model for a specific trial
model = factory.create_trial_model(trial_index=0, parameters=params)

The Factory Pattern

Semantic CMR requires per-trial similarity matrices (since each list has different items). The factory pattern handles this:

Code
class CMRModelFactory:
    def __init__(self, dataset, features):
        # Pre-compute similarity matrices for all trials
        self.trial_connections = build_trial_connections(
            dataset["pres_itemids"], features
        )

    def create_trial_model(self, trial_index, parameters):
        # Create a model with the trial-specific similarity matrix
        return CMR(
            list_length,  # derived from the dataset
            parameters,
            connections=self.trial_connections[trial_index],
            ...
        )
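One plausible shape for the pre-computation step is slicing a per-trial submatrix out of a pool-wide similarity matrix. A sketch (the real jaxcmr helper may differ; names mirror the snippet above but the details are assumptions):

```python
import numpy as np

def build_trial_connections(pres_itemids, features):
    # Cosine similarity over the whole item pool, then one submatrix per trial
    unit = features / np.linalg.norm(features, axis=1, keepdims=True)
    full_S = unit @ unit.T
    trial_mats = []
    for row in pres_itemids:
        idx = np.asarray(row) - 1           # item ids are 1-indexed
        trial_mats.append(full_S[np.ix_(idx, idx)])
    return np.stack(trial_mats)
```

Pre-computing these matrices once, at factory construction, avoids redoing the similarity lookups every time a trial model is created.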

Predictions

Semantic Clustering

With semantic_scale > 0:

  • Items semantically related to the last recall are more likely to be retrieved
  • Recall sequences show category clustering
  • Transitions favor semantic neighbors even when temporally distant

Interaction with Temporal Contiguity

The additive combination means:

  • Temporally adjacent AND semantically related items get a double boost
  • Semantic cuing can “rescue” temporally distant items
  • Pure semantic transitions (ignoring temporal order) become possible

Comparison: Additive vs Multiplicative

Aspect Additive Multiplicative
Formula \(a_{temp} + s \cdot a_{sem}\) \(a_{temp}^\tau \times a_{sem}^s\)
Interpretation Independent cues Gating/foraging
Zero temporal Semantic alone works Zero activation
Zero semantic Temporal alone works Temporal alone works
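The zero-temporal row of the table can be checked numerically. A toy sketch with made-up supports (the multiplicative formula follows the table's \(a_{temp}^\tau \times a_{sem}^s\)):

```python
import numpy as np

temporal = np.array([0.0, 0.5])   # first item has no temporal support
semantic = np.array([0.8, 0.1])
s, tau = 0.5, 1.0

additive = temporal + s * semantic
multiplicative = np.power(temporal, tau) * np.power(semantic, s)
```

Under the additive rule the first item remains retrievable on semantic support alone; under the multiplicative rule its zero temporal support zeroes out the product entirely.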

Theoretical Background

This model follows Polyn, Norman & Kahana (2009), where semantic associations were incorporated into CMR to explain:

  • Category clustering in recall
  • Semantic intrusions
  • The interplay of temporal and semantic organization

The additive formulation reflects the idea that semantic memory provides an additional retrieval route independent of episodic temporal context.

References

  • Polyn, S. M., Norman, K. A., & Kahana, M. J. (2009). A context maintenance and retrieval model of organizational processes in free recall. Psychological Review, 116(1), 129-156.
  • Sederberg, P. B., Howard, M. W., & Kahana, M. J. (2008). A context-based theory of recency and contiguity in free recall. Psychological Review, 115(4), 893-912.