# Quickstart
This walk-through gets you from zero to a generated standup in under a minute. We’ll connect Git first because it requires no OAuth.
## 1. Verify the install

```shell
devrecall --version
```

If this fails, see Install.
## 2. Run the setup wizard

```shell
devrecall setup
```

The wizard creates `~/.devrecall/config.json` and the SQLite database,
walks you through which directories to scan for Git repos, and offers
to start OAuth flows for the other sources.

To skip the wizard and just initialize:

```shell
devrecall setup --quick
```

Then point DevRecall at your repos by editing `~/.devrecall/config.json`:

```json
{
  "git": {
    "enabled": true,
    "scan_paths": ["~/Projects", "~/work"],
    "emails": ["pavel@company.com"]
  }
}
```

If `emails` is left empty, author emails are auto-detected from `git config user.email` across your repos.
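You can approximate that auto-detection by hand to preview which emails DevRecall will find. A sketch using plain `git` (the `detect_emails` helper name and the scan paths are illustrative, not part of DevRecall):

```shell
# detect_emails: print the unique author emails configured in Git repos
# directly under the given directories. A manual approximation of the
# auto-detection described above, not DevRecall's actual implementation.
detect_emails() {
  for dir in "$@"; do
    for repo in "$dir"/*/; do
      # Only look inside directories that are Git repositories.
      [ -d "$repo/.git" ] && git -C "$repo" config user.email
    done
  done | sort -u
}

# Example: detect_emails ~/Projects ~/work
```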
## 3. Sync your activity

```shell
devrecall sync --since 7d
```

First run pulls the last 7 days of commits. Subsequent runs are incremental.
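Because subsequent runs are incremental, syncing is cheap enough to schedule. A sketch of a crontab entry (the schedule is an example, and it assumes `devrecall` is on cron's PATH):

```shell
# Run an incremental sync every weekday at 08:50, before standup.
# Add via `crontab -e`; use an absolute path to devrecall if needed.
50 8 * * 1-5 devrecall sync
```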
## 4. Generate a standup

```shell
devrecall standup
```

Output:

```
Yesterday (2026-04-24):
- backend-api: 3 commits — fix auth retry on 401 (PR #423)
- backend-api: reviewed PR #418 (payment integration)

Today: no meetings on calendar (calendar not connected)
```

If no LLM is configured, you get a structured template-based standup. Configure an LLM for natural-language output — see LLM strategy.
## 5. Connect a richer source

```shell
devrecall auth slack    # opens browser for OAuth
devrecall auth google   # Google Calendar
devrecall auth github   # OAuth, or --method pat / --method gh-cli
devrecall auth jira
devrecall auth linear
```

Each auth command opens a browser for OAuth (or prompts for an
API token where applicable). Tokens are stored in
`~/.devrecall/tokens/` with 0600 permissions.

Run `devrecall sync` after connecting a new source.
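If you want to double-check that no token file has ended up with looser permissions (for example after a backup restore), you can scan the directory. A sketch using standard `find`; the path comes from above, and nothing here is DevRecall-specific:

```shell
# List any token files whose mode is not exactly 0600
# (owner read/write only). No output means everything is locked down.
find ~/.devrecall/tokens -type f ! -perm 600
```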
## 6. Try chat

```shell
devrecall chat
> what did I ship last week?
```

Chat needs an LLM configured. Local Ollama is free and private:

```shell
# Install Ollama: https://ollama.com/download
ollama pull gemma4
```

`gemma4` is the default — no config needed. To override it, edit `~/.devrecall/config.json`:

```json
"llm": { "provider": "ollama", "model": "<other-model>" }
```

Or bring your own OpenAI / Anthropic key — see Configure.
## What’s next

- Configure sources, LLM, and sync behavior
- Integrations — what each source collects
- Features — standup, chat, brag doc