Link — Local LLM Memory System
URL: https://github.com/gowtham0992/link/wiki
Author: gowtham0992
Type: Open-source developer tool / MCP server
Core Argument
LLM agents (Claude, Cursor, Copilot, Codex) are stateless by default — they forget everything between sessions. Link solves this by providing a local, source-backed, auditable memory layer that agents query via MCP (Model Context Protocol), giving them durable knowledge about the user and their work without any cloud dependency.
Core concept: “Sources become wiki knowledge. Explicit ‘remember’ becomes agent memory. Queries use both.”
Storage Model (Three Layers)
| Layer | Contents |
|---|---|
| raw/ | Immutable originals: notes, transcripts, articles, PDFs, screenshots |
| wiki/ | LLM-generated pages: concepts, entities, source summaries |
| MCP tools | Compact query packets served to agents on demand |
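The three layers above form a simple pipeline: immutable raw sources feed LLM-generated wiki pages, which in turn back the compact packets served over MCP. The sketch below illustrates that flow; the directory names come from the table, but the data structures, method names, and the truncation standing in for LLM summarization are illustrative assumptions, not Link's actual internals.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryStore:
    """Illustrative three-layer store: raw originals -> wiki pages -> packets."""
    raw: dict = field(default_factory=dict)   # raw/  : immutable source files
    wiki: dict = field(default_factory=dict)  # wiki/ : LLM-generated summaries

    def ingest(self, name: str, text: str) -> None:
        # raw/ is append-only: originals are never overwritten
        if name in self.raw:
            raise ValueError(f"{name} already ingested; raw sources are immutable")
        self.raw[name] = text
        # In Link an LLM would summarize the source into a wiki page;
        # here a truncation stands in for that step.
        self.wiki[name] = {"summary": text[:80], "sources": [name]}

    def query(self, term: str) -> list[dict]:
        # The MCP layer serves compact packets; each one links back to raw/
        return [
            {"page": name, "summary": page["summary"], "sources": page["sources"]}
            for name, page in self.wiki.items()
            if term.lower() in page["summary"].lower()
        ]

store = MemoryStore()
store.ingest("standup.md", "Team agreed to migrate the billing service to Postgres.")
packets = store.query("postgres")
print(packets[0]["sources"])  # provenance back to raw/: ['standup.md']
```

The immutability check in `ingest` is the point of the raw layer: summaries can be regenerated at any time, but the originals they cite never change underneath them.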
Agent Integration Contract
Agents are expected to follow a four-step protocol:
- Check status (is memory available?)
- Ingest sources (add new raw material)
- Query context (retrieve relevant knowledge)
- Validate changes (confirm updates are correct)
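The four steps above can be sketched as an agent-side wrapper around an MCP client. The tool names (`link_status`, `link_ingest`, `link_query`) are hypothetical stand-ins for whatever tools Link actually exposes, and `FakeMCP` is a minimal test double, not a real MCP client.

```python
def run_agent_turn(mcp, source: str, question: str):
    # 1. Check status: bail out gracefully if memory is unavailable
    if not mcp.call("link_status").get("available"):
        return None
    # 2. Ingest new raw material before querying
    mcp.call("link_ingest", source=source)
    # 3. Query for a compact, answer-ready context packet
    packet = mcp.call("link_query", q=question)
    # 4. Validate the change: the ingested source should now be counted
    assert mcp.call("link_status")["sources"] >= 1
    return packet

class FakeMCP:
    """Minimal in-memory stand-in for an MCP client, for illustration only."""
    def __init__(self):
        self.sources = []

    def call(self, tool, **kwargs):
        if tool == "link_status":
            return {"available": True, "sources": len(self.sources)}
        if tool == "link_ingest":
            self.sources.append(kwargs["source"])
            return {"ok": True}
        if tool == "link_query":
            return {"answer_context": [s for s in self.sources if kwargs["q"] in s]}
        raise KeyError(tool)

packet = run_agent_turn(FakeMCP(), "meeting notes: ship v2 on Friday", "v2")
print(packet["answer_context"])
```

The ordering matters: ingesting before querying is what lets a single agent turn use material the user just provided, and the final status check is the cheapest form of the "validate changes" step.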
CLI Reference
```shell
brew install gowtham0992/link/link
link demo      # create sample wiki
link serve     # local web viewer (127.0.0.1 only)
link query     # retrieve answer-ready packets
link remember  # save a durable memory
```
Privacy & Security
- No cloud backend, no telemetry
- Web server binds to 127.0.0.1 only
- Secret detection in source files
- Git ignores raw sources by default
- Pre-sharing validation tools available
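The "secret detection" bullet describes a pre-sharing scan of source files. The sketch below shows the idea with a few regexes; these patterns are illustrative assumptions, and real scanners (presumably including Link's) use far larger rulesets plus entropy checks.

```python
import re

# A few common secret shapes; NOT Link's actual ruleset.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                        # AWS access key ID
    re.compile(r"ghp_[A-Za-z0-9]{36}"),                     # GitHub personal token
    re.compile(r"(?i)(api[_-]?key|secret)\s*[:=]\s*\S+"),   # generic key=value
]

def find_secrets(text: str) -> list[str]:
    """Return every substring that matches a known secret pattern."""
    hits = []
    for pattern in SECRET_PATTERNS:
        hits.extend(m.group(0) for m in pattern.finditer(text))
    return hits

note = "deploy notes\napi_key = sk-test-123\nnothing else here"
print(find_secrets(note))  # -> ['api_key = sk-test-123']
```

Running a check like this before a source file leaves the machine is what makes "pre-sharing validation" possible without any cloud dependency: the scan itself is local, like everything else in the system.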
Key Takeaways
- Local-first design makes memory inspectable and auditable — important for trust in agentic systems
- Three-layer architecture (raw → wiki → MCP packets) mirrors good knowledge-management practice
- The ingest/query split means agents don’t need to re-process raw sources on every query
- Source provenance is first-class: every wiki page links back to its raw inputs
- Privacy model (local-only) removes a common organizational blocker for AI adoption