The Universal Fix: Run Doctor
When something’s wrong, start here: `repr doctor` tells you how to fix it.
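The inline snippet appears to have been lost in this copy; it is presumably just the command itself:

```shell
# Runs repr's built-in diagnostics and suggests fixes
repr doctor
```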
Still broken? Read on.
Common Issues
“Local LLM not found”
Symptoms:
- Ollama (or your local LLM) isn’t running
- Wrong endpoint configured
1. Check if Ollama is running
2. Start Ollama
3. Pull a model if you haven’t
4. Test the connection
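The code snippets for these steps are missing from this copy; a sketch using standard Ollama commands, assuming Ollama’s default endpoint on port 11434:

```shell
# 1. Check if Ollama is running (it listens on port 11434 by default)
curl -s http://localhost:11434/api/tags

# 2. Start Ollama if it isn't running
ollama serve

# 3. Pull a model if you haven't already
ollama pull llama3.2

# 4. Test the connection with a quick prompt
ollama run llama3.2 "Say hello"
```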
“Not authenticated” / “Auth token expired”
Symptoms: Cloud commands fail with an authentication error.

“Repository not tracked”
Symptoms: Repr doesn’t recognize the current repository.

“Git hook not working” / “Commits not being queued”
Symptoms: You’ve installed hooks, but when you commit, nothing happens. `repr hooks status` shows hooks aren’t firing.
Diagnose:
1. Reinstall hooks
2. Check for conflicting hooks
Some repos have existing hooks that might conflict. If you see `post-commit.old` or `post-commit.sample`, there was a pre-existing hook. Repr tries to chain existing hooks, but this occasionally fails; manually check `.git/hooks/post-commit` and make sure it includes the repr hook.
3. Test manually
Make a commit and check whether it was queued; look for “Last triggered” to update.
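The original snippets for these steps are missing; a sketch, where `repr hooks install` is an assumed subcommand name (only `repr hooks status` is attested by this guide):

```shell
# 1. Reinstall hooks (hypothetical subcommand name)
repr hooks install

# 2. Check for conflicting or pre-existing hooks
ls .git/hooks/
cat .git/hooks/post-commit   # should include the repr hook

# 3. Test manually with an empty commit
git commit --allow-empty -m "repr hook test"
repr hooks status            # "Last triggered" should update
```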
“Corrupted config”
Symptoms: Repr errors when reading its config file.

1. Back up your config (just in case)
2. Try to view the config
3. Reset config to defaults
Then open `~/.repr/config.json` in your `$EDITOR` and copy settings back from your backup.
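The snippets for these steps are missing; a sketch using standard tools. The reset step is an assumption: deleting the config and letting `repr doctor` recreate defaults may or may not match repr’s actual behavior.

```shell
# 1. Back up your config (just in case)
cp ~/.repr/config.json ~/.repr/config.json.bak

# 2. Try to view the config; a parse error here confirms corruption
python3 -m json.tool ~/.repr/config.json

# 3. Reset config to defaults (assumed approach)
rm ~/.repr/config.json
repr doctor
```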
“Out of disk space” / “Slow performance”
Symptoms: Repr is slow or you’re running out of disk space.

Check storage, then consider:
- Using a smaller model: `ollama pull phi3` (lighter than llama3.2)
- Using a cloud LLM: `repr generate --cloud` (requires login)
- Using BYOK: `repr llm add openai` (your own API key)
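A sketch of the storage check, assuming Ollama’s default model directory:

```shell
# See how much space local models take
du -sh ~/.ollama/models

# List installed models with their sizes
ollama list

# Remove a model you no longer need
ollama rm llama3.2
```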
“Sync conflicts” / “Can’t push to cloud”
Symptoms: Syncing with the cloud fails with a conflict.

1. Force pull (use remote version)
Your local version is kept in `~/.repr/conflicts/`.
2. Or force push (use local version)
3. Or manually resolve
Check the conflict directory, pick the version you want, and manually restore it.
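The force pull/push commands are missing from this copy; only the manual-resolution steps can be reconstructed (file names below are placeholders):

```shell
# Inspect the conflicting versions
ls ~/.repr/conflicts/

# Pick the version you want and manually restore it
cp ~/.repr/conflicts/some-file ~/.repr/some-file
```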
“Stories not generating” / “LLM errors”
Symptoms: Story generation fails or the LLM errors out. Common causes and fixes:
- Local LLM crashed: restart Ollama
- Model not loaded: pull the model again
- Out of memory: your model is too big for your RAM
- API rate limit (BYOK): you hit your provider’s rate limit; wait a few minutes or switch providers
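Recovery steps for the local-LLM causes above, sketched with standard Ollama commands:

```shell
# Local LLM crashed: restart Ollama
pkill ollama
ollama serve

# Model not loaded: pull it again
ollama pull llama3.2

# Out of memory: switch to a smaller model
ollama pull phi3
```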
“Can’t find repr command” / “Command not found”
Symptoms: Your shell reports `command not found: repr`.

1. Check if repr is installed
2. If installed via Homebrew
3. If installed via pipx
4. Verify installation
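The snippets for these steps are missing; a sketch, assuming the package is named `repr` in both Homebrew and pipx:

```shell
# 1. Check if repr is on your PATH
command -v repr

# 2. If installed via Homebrew
brew reinstall repr

# 3. If installed via pipx
pipx ensurepath      # make sure pipx's bin dir is on PATH
pipx reinstall repr

# 4. Verify installation
repr --version
```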
You should see `repr version 0.2.0` (or similar).

Debugging Commands
When reporting issues or digging deeper, these commands help:
- Check Version
- Check Auth Status
- Check Configuration
- Check Repo Status
- Check Hook Queue
- Check Privacy/Network Audit
- Run Full Diagnostics: `repr doctor`
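The individual snippets are missing from this copy; only the commands that appear elsewhere in this guide can be stated with confidence:

```shell
repr --version      # check version
repr hooks status   # check hook queue
repr doctor         # run full diagnostics
```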
Getting Help
1. Check the Docs
2. Run Doctor
3. Check GitHub Issues
Search existing issues: https://github.com/repr-app/cli/issues
Someone might have already solved your problem.

4. File a Bug Report
If nothing works, file an issue with:
- What you were trying to do
- What happened (error message)
- Output of `repr --version`
- Output of `repr doctor`
- Operating system (macOS, Linux, Windows)
Edge Cases
repr Works, But Stories Are Bad Quality
Not a bug: this is an LLM problem. Fixes:
- Try a different model
- Use a better LLM via BYOK: OpenAI’s models are generally better than open-source local models.
- Add context with custom prompts
- Edit stories manually: polish them yourself. Repr gives you the 80% draft; you add the 20% polish.
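A sketch of the first two fixes, using commands that appear elsewhere in this guide plus a model name from the Ollama registry:

```shell
# Try a different (larger) local model
ollama pull llama3.1

# Or use a stronger hosted model via BYOK
repr llm add openai
```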
repr Is Too Slow
Causes:
- Local LLM is underpowered
- Processing too many commits at once
- Disk I/O bottleneck
“Permission denied” Errors
Symptoms: Repr commands fail with a “Permission denied” error.

Still Stuck?
If none of this helps:
- Try a fresh install
- Ask for help:
  - GitHub Issues: https://github.com/repr-app/cli/issues
  - Discord: https://repr.dev/discord (if available)
  - Email: support@repr.dev
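The fresh-install snippet is missing; a sketch covering the two install methods mentioned above (package names assumed):

```shell
# Homebrew
brew uninstall repr && brew install repr

# pipx
pipx uninstall repr && pipx install repr

# For a truly clean slate, also remove repr's data directory
# (this deletes your config and local data -- back up first)
rm -rf ~/.repr
```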
Include the output of `repr doctor` when asking for help; it makes debugging way easier.
