SuperLocalMemory Integration

OpenAI Codex Persistent Memory

Give OpenAI Codex persistent memory across autonomous coding sessions. Auto-configured via MCP. Free, local-first, works offline.

Codex Is Powerful — But It Forgets Everything

OpenAI Codex is an autonomous coding agent that can understand, generate, and refactor code at scale. But without persistent memory, every session starts from scratch.

Your project conventions, architectural decisions, preferred patterns, and debugging history — all gone when the session ends. You re-explain the same context over and over.

And if you switch between Codex, Claude Code, and Cursor, none of that context carries over. Each tool is an island.

SuperLocalMemory gives OpenAI Codex persistent, cross-tool memory — 100% local, private, and always available.

What Codex Remembers With SuperLocalMemory

| Context Type | With SuperLocalMemory | Without |
| --- | --- | --- |
| Project architecture | Persists across sessions | Re-explain every time |
| Code conventions | Remembered automatically | Inconsistent output |
| Debugging history | Builds on past context | Repeats same mistakes |
| Cross-tool context | Shared with 17+ AI tools | Locked to Codex only |
| Offline access | Always available | Requires connection |
| Privacy | 100% local, your machine | Depends on provider |
| Cost | Free forever | N/A |

Set Up Codex Memory in 3 Steps

1

Install & Start (Auto-Configures Codex)

install
$ npm install -g superlocalmemory && slm start

Installs SuperLocalMemory globally and auto-configures OpenAI Codex. No API keys required.

2

Or Configure Manually

Option A — Use the Codex CLI to add SuperLocalMemory:

terminal
$ codex mcp add superlocalmemory -- npx superlocalmemory

Option B — Edit ~/.codex/config.toml directly:

config.toml
[mcp_servers.superlocalmemory]
command = "npx"
args = ["-y", "superlocalmemory@latest"]

Using the Desktop App? MCP is configured automatically — skip this step.
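If you need to pass environment variables to the server, Codex's config format supports an env table on each MCP server entry. A sketch of what that could look like — the SLM_LOG_LEVEL variable below is purely illustrative, not a documented SuperLocalMemory setting:

```toml
# Same server entry, with environment variables passed through to the process.
# SLM_LOG_LEVEL is a placeholder for illustration only.
[mcp_servers.superlocalmemory]
command = "npx"
args = ["-y", "superlocalmemory@latest"]
env = { SLM_LOG_LEVEL = "debug" }
```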

3

Code With Memory

Codex can now store and recall memories locally. Use it naturally during autonomous coding:

terminal
# Codex stores project context via MCP
slm remember "This project uses modular architecture with strict separation of concerns"
# Recall context in any session
slm recall "project architecture"

What Persistent Memory Gets You

Autonomous Coding Memory

Codex remembers your project architecture, code patterns, and conventions across autonomous coding sessions. No more re-explaining your codebase.

Cross-Tool Memory

Memories from Codex carry over to Claude Code, Cursor, and 17+ other AI tools. Your project context is never locked to a single tool.

You Own Your Data

Single SQLite file, fully portable. Back it up, move it between machines, or delete it entirely. No cloud residue.
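Because the whole memory store is one file, a backup is just a file copy. A minimal sketch — the helper name slm_backup is ours, and the real database path varies by install, so the demo below uses a throwaway file in place of the actual database:

```shell
# Sketch: the memory store is a single SQLite file, so a backup is a plain copy.
# Point the first argument at your actual database file (path varies by install).
slm_backup() {
  db="$1"
  dest="$2"
  [ -f "$db" ] && cp "$db" "$dest"
}

# Demo with a throwaway file standing in for the real database:
demo_db="$(mktemp)"
printf 'demo' > "$demo_db"
slm_backup "$demo_db" "$demo_db.bak"
```

The same copy works in reverse for moving the file to another machine: transfer the single file and the memories come with it.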

Project Context Graphs

Builds semantic relationships between code concepts, dependencies, and decisions. Codex understands the connections in your codebase.

Instant Retrieval

10.6ms median memory recall from local storage. No network round-trips, no API rate limits, no cloud latency.

Private & Offline

All memory stays on your machine. Works offline. Your proprietary code patterns and architectural decisions never leave your device.

Who Benefits From Codex + Memory?

Solo Developers

Keep project context, code patterns, and architectural decisions persistent across autonomous coding sessions. Codex picks up right where it left off.

Teams & Enterprises

Share project conventions and coding standards through persistent memory. Every team member's Codex instance starts with the same context.

Multi-Tool Workflows

Use Codex for autonomous coding, Claude Code for architecture, and Cursor for editing — all sharing the same memory. No context silos.

Open Source Maintainers

Build a knowledge base of project conventions, contribution patterns, and codebase decisions that persists across sessions and tools.

Frequently Asked Questions

How does SuperLocalMemory integrate with OpenAI Codex?

SuperLocalMemory connects to OpenAI Codex as a local MCP server. Codex reads and writes memories through the MCP protocol, giving it persistent context across autonomous coding sessions. Configuration is automatic or via ~/.codex/config.toml.

Does OpenAI Codex support local MCP servers?

Yes. OpenAI Codex supports local MCP servers configured through ~/.codex/config.toml. SuperLocalMemory auto-configures this on install, or you can add it manually with a single command.

Can I use the same memories in Claude Code and Cursor?

Yes. SuperLocalMemory uses a shared local SQLite database. Any MCP-compatible tool can access the same memories — OpenAI Codex, Claude Code, Cursor, Windsurf, VS Code, and 17+ other tools.

Does Codex remember my project context between sessions?

With SuperLocalMemory, yes. Codex can store architectural decisions, code patterns, project conventions, and debugging context that persists across sessions. Without it, every Codex session starts from scratch.

Is SuperLocalMemory free?

Yes, completely free and open source under the MIT license. No subscriptions, no API costs, no usage limits. It runs entirely on your hardware.

Works With Your Entire AI Toolkit

Give Codex a Memory

Free, local, private. Persistent AI memory for OpenAI Codex that you fully own and control.