Lock It Down: Local-Only Mode
You can explicitly lock repr to prevent any accidental cloud operations:

- ✅ Block `repr push`, `repr sync`, and `repr login`
- ✅ Refuse to make any network calls except to your local LLM
- ✅ Store everything in `~/.repr/` on your machine
- ✅ Show a warning if any command tries to access the network
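The lock command itself is named later in this guide; a minimal invocation looks like this (the `--permanent` variant for air-gapped machines is covered below):

```shell
# Block repr push/sync/login and all non-local network calls
repr privacy lock-local

# For machines that should never reconnect (see the air-gapped section)
repr privacy lock-local --permanent
```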
Set Up Your Local LLM
To use repr offline, you need a local language model. We recommend Ollama: it’s free, fast, and runs on your laptop.

1. Install Ollama

Download from ollama.com or install via Homebrew:
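The install snippet was stripped from this copy of the page; assuming the standard Homebrew formula name:

```shell
# macOS (or Linuxbrew): install Ollama via Homebrew
brew install ollama
```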
2. Start the Ollama service

Ollama serves a local API at http://localhost:11434. No data leaves your machine.
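The start command was stripped here; with a plain install, the foreground invocation is:

```shell
# Run the Ollama server in the foreground (listens on localhost:11434 by default)
ollama serve
```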
3. Download a model

Pick a model that fits your hardware. The first pull takes a few minutes; after that, it’s instant.
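The original model list was stripped from this copy; as a sketch, the model names below are common Ollama options, not repr requirements:

```shell
# Small and fast (~2 GB), fine for most laptops
ollama pull llama3.2

# Larger model for machines with more RAM
ollama pull llama3.1:8b
```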
4. Configure repr to use Ollama

Run the interactive setup; repr will auto-detect Ollama and show available models.
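The setup command was stripped here, but the checklist at the end of this guide names it:

```shell
# Interactive LLM setup; select Ollama when prompted
repr llm configure
```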
5. Verify it works

Test your local LLM connection.
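The original test command and its output were stripped from this copy. Independent of repr, you can at least confirm the local endpoint responds; Ollama’s `/api/tags` endpoint lists installed models:

```shell
# Should return JSON listing your pulled models
curl http://localhost:11434/api/tags
```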
When you run `repr generate --local`, repr uses your local model; no data leaves your machine.
Alternative: Bring Your Own Keys (BYOK)
Maybe you don’t want to run a local LLM (not enough RAM, laptop gets hot, whatever). You can still avoid repr’s servers by using your own API keys with OpenAI, Anthropic, or other providers. Your keys are stored in your OS keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service), not in repr’s config file.

When you run `repr generate`, it calls OpenAI directly with your key. Repr’s servers never see your data.
Network policy with BYOK:
- ✅ Direct connection to `api.openai.com` (or your chosen provider)
- ✅ No data goes through repr.dev servers
- ✅ Your code and diffs are sent only to your API provider
- ✅ Keys stored in OS keychain, not config files
How Private Is This?
Let’s be crystal clear about what happens in each mode:

Local-Only Mode (Ollama)
- Repr reads commits from your local git repos
- Diffs are sent to `http://localhost:11434` (your machine)
- Ollama processes them locally
- Stories are saved to `~/.repr/stories` (your machine)
BYOK Mode (Your API Keys)
- Repr reads commits from your local git repos
- Diffs are sent directly to `api.openai.com` (or your provider)
- OpenAI processes them and returns stories
- Stories are saved to `~/.repr/stories` (your machine)
Cloud Mode (repr.dev)
- Repr reads commits from your local git repos
- Diffs are sent to `api.repr.dev` for processing
- Stories are generated and synced to your account
- Stories are saved locally and in the cloud
Cloud mode is used only when you explicitly opt in (via the `push`, `pull`, and `sync` commands).
Verify Your Privacy Settings
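The audit command shown on the original page was stripped from this copy; based on the checklist at the end of this guide, it is presumably `repr privacy explain`:

```shell
# Report the current privacy mode and which endpoints repr may contact
# (assumed behavior; the command name comes from this guide's checklist)
repr privacy explain
```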
You can audit exactly what repr has done.

Air-Gapped or Restricted Networks
Working in a truly air-gapped environment? No problem.

- Install repr offline: download the binary on a connected machine, transfer via USB
- Transfer model weights: download Ollama models on a connected machine, copy to the air-gapped system
- Lock to local-only: `repr privacy lock-local --permanent`
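A sketch of the model-transfer step. Ollama keeps its model store in `~/.ollama/models` by default; the USB mount points below are example paths:

```shell
# On the connected machine: pull the model, then copy the model store
ollama pull llama3.2
cp -r ~/.ollama/models /media/usb/ollama-models

# On the air-gapped machine: restore the store, then start Ollama
cp -r /media/usb/ollama-models ~/.ollama/models
ollama serve
```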
Unlock Later (If Needed)
Changed your mind? Want to enable cloud sync?

The Bottom Line
Repr respects your privacy by default. But if you need guaranteed local-only operation:

- Lock it: `repr privacy lock-local`
- Use Ollama: `repr llm configure` → select Ollama
- Verify: `repr privacy explain`
- Generate: `repr generate --local`
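The checklist above, run as a single session (all four commands are named in this guide; the comments summarize its descriptions of each step):

```shell
repr privacy lock-local    # block push/sync/login and non-local network calls
repr llm configure         # interactive setup; select Ollama as the backend
repr privacy explain       # confirm local-only mode is in effect
repr generate --local      # generate stories using only the local model
```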

