
Quickstart

This walk-through gets you from zero to a generated standup in under a minute. We’ll connect Git first because it requires no OAuth.

```shell
devrecall --version
```

If this fails, see Install.

```shell
devrecall setup
```

The wizard creates ~/.devrecall/config.json and the SQLite database, walks you through which directories to scan for Git repos, and offers to start OAuth flows for the other sources.

To skip the wizard and just initialize:

```shell
devrecall setup --quick
```

Then point DevRecall at your repos by editing ~/.devrecall/config.json:

```json
{
  "git": {
    "enabled": true,
    "scan_paths": ["~/Projects", "~/work"],
    "emails": ["pavel@company.com"]
  }
}
```
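After editing, it can be worth a quick sanity check that the file still parses as JSON. This is an optional step, not a DevRecall feature; any JSON validator works, and `python3` ships with one:

```shell
# Optional: confirm the edited config is still valid JSON.
# Prints the parsed document on success, an error message on failure.
python3 -m json.tool ~/.devrecall/config.json
```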

If `emails` is left empty, author emails are auto-detected from `git config user.email` across your repos.
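To preview which identity each repo would contribute, you can query git directly. This is a sketch using plain git, with `~/Projects` standing in for one of your own scan paths:

```shell
# List the author email configured in each repo under ~/Projects.
# DevRecall falls back to these values when "emails" is empty.
for gitdir in ~/Projects/*/.git; do
  [ -d "$gitdir" ] || continue
  repo=$(dirname "$gitdir")
  printf '%s -> %s\n' "$repo" "$(git -C "$repo" config user.email)"
done
```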

```shell
devrecall sync --since 7d
```

First run pulls the last 7 days of commits. Subsequent runs are incremental.

```shell
devrecall standup
```

Output:

```
Yesterday (2026-04-24):
- backend-api: 3 commits — fix auth retry on 401 (PR #423)
- backend-api: reviewed PR #418 (payment integration)
Today: no meetings on calendar (calendar not connected)
```

If no LLM is configured, you get a structured template-based standup. Configure an LLM for natural-language output — see LLM strategy.

```shell
devrecall auth slack    # opens browser for OAuth
devrecall auth google   # Google Calendar
devrecall auth github   # OAuth, or --method pat / --method gh-cli
devrecall auth jira
devrecall auth linear
```

Each auth command opens a browser for OAuth (or prompts for an API token where applicable). Tokens are stored in ~/.devrecall/tokens/ with 0600 permissions.
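If you ever copy tokens between machines, it's worth re-checking the mode bits. This uses standard POSIX tooling, nothing DevRecall-specific:

```shell
tokens="$HOME/.devrecall/tokens"
if [ -d "$tokens" ]; then
  # Each entry should show -rw------- (0600): readable/writable by you only.
  ls -l "$tokens"
  # Re-tighten anything that looks wider:
  chmod 600 "$tokens"/* 2>/dev/null || true
fi
```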

Run `devrecall sync` after connecting a new source.

```shell
devrecall chat
> what did I ship last week?
```

Chat needs an LLM configured. Local Ollama is free and private:

```shell
# Install Ollama: https://ollama.com/download
ollama pull gemma4
# `gemma4` is the default — no config needed. To override, edit ~/.devrecall/config.json:
# "llm": { "provider": "ollama", "model": "<other-model>" }
```

Or bring your own OpenAI / Anthropic key — see Configure.
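As a sketch, a hosted provider would mirror the `llm` block shown above for Ollama. The provider and model values below are illustrative assumptions, not confirmed key names; see Configure for the supported providers and for how to supply your API key:

```json
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4o-mini"
  }
}
```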