# Environment variables

Copy `.env.example` to `.env` for local development. Docker Compose deployments use additional variables documented in `README.md`.

## Core services

| Variable | Required | Description |
| --- | --- | --- |
| `DATABASE_URL` | Yes | PostgreSQL connection string |
| `REDIS_URL` | Yes | Redis for BullMQ job queue |
| `MINIO_ENDPOINT` | Yes | S3-compatible storage host |
| `MINIO_PORT` | Yes | MinIO port (e.g. 9000) |
| `MINIO_ACCESS_KEY` | Yes | MinIO access key |
| `MINIO_SECRET_KEY` | Yes | MinIO secret key |
| `MINIO_USE_SSL` | No | `true` / `false` |
| `MINIO_BUCKET` | Yes | Bucket for scan artifacts |
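Taken together, a minimal local-dev `.env` for the core services might look like the sketch below. Every host, port, and credential here is a placeholder, not a value shipped with the project; adjust them to your setup.

```shell
# Placeholder local-dev values; replace with your own.
DATABASE_URL=postgresql://user:pass@localhost:5432/app
REDIS_URL=redis://localhost:6379
MINIO_ENDPOINT=localhost
MINIO_PORT=9000
MINIO_ACCESS_KEY=replace-me
MINIO_SECRET_KEY=replace-me
MINIO_USE_SSL=false
MINIO_BUCKET=scan-artifacts
```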

## Authentication

| Variable | Description |
| --- | --- |
| `NEXTAUTH_SECRET` | Session signing / encryption secret (long random string) |
| `NEXTAUTH_URL` | Public app URL (e.g. `http://localhost:3000`) |
| `ADMIN_EMAIL` | Default admin email used by the seed script |
| `ADMIN_PASSWORD` | Default admin password used by the seed script; change it after first login |
| `NEXT_PUBLIC_SOURCE_CODE_URL` | Optional source-code link shown on the login page |
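`NEXTAUTH_SECRET` just needs to be a long random string. One common way to generate one, assuming `openssl` is available:

```shell
# Emit 32 random bytes, base64-encoded (44 characters); paste the
# output into .env as NEXTAUTH_SECRET.
openssl rand -base64 32
```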

## GitHub

| Variable | Description |
| --- | --- |
| `GITHUB_ID` / `GITHUB_SECRET` | Optional NextAuth GitHub login |
| `GITHUB_OAUTH_CLIENT_ID` | OAuth app for the Repositories integration and fix PRs (falls back to `GITHUB_ID`) |
| `GITHUB_OAUTH_CLIENT_SECRET` | OAuth secret (falls back to `GITHUB_SECRET`) |
| `TOKEN_ENCRYPTION_KEY` | Encrypts stored GitHub tokens (defaults to `NEXTAUTH_SECRET`) |
| `GITHUB_WEBHOOK_SECRET` | HMAC secret for `/api/webhooks/github` |
| `GITHUB_PR_TOKEN` / `GITHUB_TOKEN` | Legacy PAT; not used when OAuth is connected |

Authorization callback URL: `{NEXTAUTH_URL}/api/integrations/github/callback`

OAuth scopes requested: `read:user repo`
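A hypothetical `.env` fragment wiring up the GitHub OAuth app might look like this. All values are placeholders; the callback URL is configured on the GitHub side, not in `.env`.

```shell
# Placeholders only; copy the real values from your GitHub OAuth app.
GITHUB_OAUTH_CLIENT_ID=replace-with-client-id
GITHUB_OAUTH_CLIENT_SECRET=replace-with-client-secret
# A random string you choose; must match the webhook config on GitHub.
GITHUB_WEBHOOK_SECRET=replace-with-random-string
```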

## GitLab

| Variable | Description |
| --- | --- |
| `GITLAB_ID` / `GITLAB_SECRET` | Optional GitLab OAuth |
| `GITLAB_URL` | GitLab instance base URL (default `https://gitlab.com`) |

## LLM & worker

| Variable | Default | Description |
| --- | --- | --- |
| `OPENROUTER_API_KEY` | – | Cloud LLM via OpenRouter |
| `OPENROUTER_MODEL` | – | Default model when using the OpenRouter env key |
| `WORKER_CONCURRENCY` | 2 | Parallel scan jobs per worker |
| `MAX_LLM_CONCURRENCY` | 5 | Parallel LLM calls within a scan |
| `LLM_CHUNK_TOKENS` | 3000 | Code chunk size for API models |
| `LLM_CHUNK_OVERLAP_TOKENS` | 200 | Overlap between chunks |
| `LLM_MAX_RESPONSE_TOKENS` | 4096 | Max LLM response tokens |
| `OLLAMA_CHUNK_TOKENS` | 1200 | Chunk size for local Ollama |
| `OLLAMA_CHUNK_OVERLAP_TOKENS` | 100 | Overlap between Ollama chunks |
| `OLLAMA_MAX_RESPONSE_TOKENS` | 2048 | Max Ollama response tokens |
| `LLM_MIN_CONFIDENCE` | 0.7 | Drop LLM findings below this confidence |
| `OLLAMA_HOST` | – | Docker: host Ollama URL (see README) |
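As an illustration, an OpenRouter setup that keeps the documented defaults but raises LLM parallelism could look like this. The API key and model id are placeholders; check OpenRouter for current model names.

```shell
OPENROUTER_API_KEY=replace-with-your-key   # placeholder key
OPENROUTER_MODEL=vendor/model-name         # placeholder model id
WORKER_CONCURRENCY=2                       # documented default
MAX_LLM_CONCURRENCY=8                      # raised from the default 5
LLM_MIN_CONFIDENCE=0.7                     # documented default
```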

## Email (SMTP)

| Variable | Description |
| --- | --- |
| `SMTP_HOST` | Mail server hostname |
| `SMTP_PORT` | Usually 587 (STARTTLS) or 465 (implicit TLS) |
| `SMTP_USER` / `SMTP_PASSWORD` | Credentials |
| `SMTP_FROM` | From address |
| `SMTP_TLS` | `true` to use STARTTLS |

Local dev: use Mailpit or similar (SMTP on port 1025, web UI on 8025).
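With Mailpit running locally, a typical dev SMTP block would look roughly like this (the from-address is a placeholder):

```shell
SMTP_HOST=localhost
SMTP_PORT=1025             # Mailpit SMTP listener; web UI at http://localhost:8025
SMTP_TLS=false             # Mailpit accepts plain connections
SMTP_FROM=dev@example.com  # placeholder from-address
# Mailpit needs no SMTP_USER / SMTP_PASSWORD by default.
```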

## Docker-only (README)

Production Compose may also define additional variables; see `README.md` for the full list.

UI settings (LLM provider, policies, build gates) are stored in PostgreSQL per organization. Environment variables act as defaults and fallbacks for LLM keys and SMTP.