SuperLocalMemory
Your Choice

Three Operating Modes

You choose the privacy-accuracy tradeoff. Switch anytime. Your data stays consistent across all modes.

EU AI Act

Mode A

Local Guardian

Zero cloud calls. All memory operations — storage, encoding, retrieval, lifecycle — execute locally. Your data never leaves your machine under any circumstance.

74.8%

LoCoMo Retrieval — data stays local

60.4%

Pure zero-LLM (no LLM at any stage)

Best for: Regulated industries, air-gapped networks, privacy-conscious developers, EU compliance requirements.

switch to mode a
$ slm mode a

Mode B

Smart Local

Everything in Mode A, plus a local LLM via Ollama for answer synthesis and enhanced fact extraction. All processing stays on your machine — nothing sent to any cloud.

Local LLM enhanced

Best for: Developers who want composed answers but need data to stay local. Teams with 16GB+ RAM for local models.

switch to mode b
$ slm mode b

Mode C

Full Power

Maximum accuracy. A cloud LLM participates at every layer — fact extraction, answer synthesis, agentic multi-round retrieval. Data leaves your machine for processing. This is the configuration to benchmark against other memory systems in the field.

87.7%

LoCoMo — competitive with EverMemOS (92.3%)

Best for: Maximum accuracy when cloud access is organizationally approved. Comparable to the industry standard.
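For readers unfamiliar with the term, "agentic multi-round retrieval" generally means the LLM inspects each round's search results and issues refined queries until it has enough context to answer. A minimal generic sketch of that loop — the function names and toy stubs below are illustrative, not SuperLocalMemory's actual API:

```python
def multi_round_retrieve(question, search, ask_llm, max_rounds=3):
    """Generic agentic retrieval loop: the model either answers
    from the accumulated context or proposes a refined query."""
    context, query = [], question
    for _ in range(max_rounds):
        context.extend(search(query))
        status, payload = ask_llm(question, context)
        if status == "done":
            return payload
        query = payload  # refined query for the next round
    return ask_llm(question, context)[1]

# Toy stubs: a keyword index and a fake "LLM" that refines once.
docs = {"bob": ["Bob lives in Oslo"]}
def search(q):
    return docs.get(q.lower(), [])
def fake_llm(question, context):
    if any("Oslo" in c for c in context):
        return ("done", "Bob lives in Oslo")
    return ("refine", "bob")

print(multi_round_retrieve("Where does Bob live?", search, fake_llm))
```

The first round's broad query misses; the fake model narrows it to "bob", the second round retrieves the fact, and the loop terminates early.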

switch to mode c
$ slm mode c

Feature Comparison

Feature                          A    B    C
Semantic search                  ✓    ✓    ✓
BM25 keyword search              ✓    ✓    ✓
Entity graph traversal           ✓    ✓    ✓
Temporal retrieval               ✓    ✓    ✓
Fisher-Rao scoring               ✓    ✓    ✓
Sheaf consistency                ✓    ✓    ✓
Langevin lifecycle               ✓    ✓    ✓
Cross-encoder reranking          ✓    ✓    ✓
LLM answer synthesis             ✗    ✓    ✓
Agentic multi-round retrieval    ✗    ✗    ✓
Data leaves device               ✗    ✗    ✓
EU AI Act compliant              ✓    ✓    ✗
Internet required                ✗    ✗    ✓
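As a rough illustration of the BM25 keyword search listed in the comparison, here is the standard Okapi BM25 scoring formula in generic form — this is the textbook formula with common default parameters (k1=1.5, b=0.75), not SuperLocalMemory's internal implementation:

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against a query, given the
    full corpus for document frequencies and average length."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)          # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)   # smoothed IDF
        f = tf[term]                                      # term frequency
        norm = f + k1 * (1 - b + b * len(doc_terms) / avgdl)
        score += idf * f * (k1 + 1) / norm
    return score

corpus = [["local", "memory", "store"], ["cloud", "sync"], ["local", "search"]]
print(bm25_score(["local"], corpus[0], corpus))  # > 0: doc contains the term
print(bm25_score(["local"], corpus[1], corpus))  # 0.0: doc does not
```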

Frequently Asked Questions

Can I switch between modes after installation?

Yes. Run 'slm mode a', 'slm mode b', or 'slm mode c' at any time. Mode changes take effect immediately. Your stored memories are not affected — all modes use the same database.

Which mode should I start with?

Start with Mode A if you are unsure. It requires no API keys, no cloud setup, and no internet connection. You can upgrade to Mode B or C later without losing any data.

Does Mode A really work without any LLM?

Yes. Mode A uses mathematical retrieval (Fisher-Rao similarity, BM25 keyword matching, entity graph traversal, and temporal search) with cross-encoder reranking. No cloud dependency at any stage. Mode A Retrieval scored 74.8% on LoCoMo — the highest local-first score reported. The pure zero-LLM configuration (no LLM even for answer framing) scored 60.4%.
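For the curious, "Fisher-Rao similarity" refers to a geodesic distance between probability distributions. For discrete distributions the standard closed form is simple; the sketch below shows that generic formula, not the project's exact implementation:

```python
import math

def fisher_rao_distance(p, q):
    """Fisher-Rao geodesic distance between two discrete probability
    distributions over the same support: 2 * arccos of the
    Bhattacharyya coefficient."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    bc = min(1.0, bc)  # guard against floating-point overshoot past 1.0
    return 2.0 * math.acos(bc)

p = [0.5, 0.3, 0.2]
print(fisher_rao_distance(p, p))                    # ≈ 0.0 (identical)
print(fisher_rao_distance([1.0, 0.0], [0.0, 1.0]))  # ≈ 3.14159 (disjoint: pi)
```

Smaller distances mean more similar distributions, so a retrieval score can simply rank memories by ascending distance.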

What LLM providers does Mode C support?

Mode C supports Azure OpenAI, OpenAI, Anthropic (Claude), Ollama, and OpenRouter. Configure with 'slm provider set' or through the web dashboard settings.

Start with Mode A. Upgrade when you need to.

One install. Choose your mode. Switch anytime. Your memories stay consistent.