# Environment variables

Copy `.env.example` to `.env` for local development. Docker Compose deployments use additional variables documented in `README.md`.
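For a fresh checkout, bootstrapping can look like this (a sketch; assumes `openssl` is available for generating the random secret used later as `NEXTAUTH_SECRET`):

```shell
# Create a local env file from the template (falls back to an empty file if the template is absent)
cp .env.example .env 2>/dev/null || touch .env

# Generate a long random string suitable for NEXTAUTH_SECRET
openssl rand -base64 32
```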
## Core services

| Variable | Required | Description |
| --- | --- | --- |
| `DATABASE_URL` | Yes | PostgreSQL connection string |
| `REDIS_URL` | Yes | Redis for BullMQ job queue |
| `MINIO_ENDPOINT` | Yes | S3-compatible storage host |
| `MINIO_PORT` | Yes | MinIO port (e.g. 9000) |
| `MINIO_ACCESS_KEY` | Yes | MinIO access key |
| `MINIO_SECRET_KEY` | Yes | MinIO secret key |
| `MINIO_USE_SSL` | No | `true` / `false` |
| `MINIO_BUCKET` | Yes | Bucket for scan artifacts |
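A minimal `.env` fragment for the core services might look like the following (all values are illustrative placeholders, not defaults shipped by this project):

```bash
DATABASE_URL=postgresql://appuser:changeme@localhost:5432/appdb
REDIS_URL=redis://localhost:6379
MINIO_ENDPOINT=localhost
MINIO_PORT=9000
MINIO_ACCESS_KEY=minioadmin
MINIO_SECRET_KEY=minioadmin
MINIO_USE_SSL=false
MINIO_BUCKET=scan-artifacts
```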
## Authentication

| Variable | Description |
| --- | --- |
| `NEXTAUTH_SECRET` | Session signing / encryption secret (long random string) |
| `NEXTAUTH_URL` | Public app URL (e.g. `http://localhost:3000`) |
| `ADMIN_EMAIL` | Seed script default admin email |
| `ADMIN_PASSWORD` | Seed script default admin password — change after first login |
| `NEXT_PUBLIC_SOURCE_CODE_URL` | Optional link on login page |
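An illustrative authentication fragment (placeholder values; generate your own secret and change the seeded credentials after first login):

```bash
NEXTAUTH_SECRET=replace-with-a-long-random-string
NEXTAUTH_URL=http://localhost:3000
ADMIN_EMAIL=admin@example.com
ADMIN_PASSWORD=change-me-after-first-login
```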
## GitHub

| Variable | Description |
| --- | --- |
| `GITHUB_ID` / `GITHUB_SECRET` | Optional NextAuth GitHub login |
| `GITHUB_OAUTH_CLIENT_ID` | OAuth app for Repositories + fix PR (falls back to `GITHUB_ID`) |
| `GITHUB_OAUTH_CLIENT_SECRET` | OAuth secret (falls back to `GITHUB_SECRET`) |
| `TOKEN_ENCRYPTION_KEY` | Encrypts stored GitHub tokens (defaults to `NEXTAUTH_SECRET`) |
| `GITHUB_WEBHOOK_SECRET` | HMAC secret for `/api/webhooks/github` |
| `GITHUB_PR_TOKEN` / `GITHUB_TOKEN` | Legacy PAT — not used when OAuth is connected |

Authorization callback URL: `{NEXTAUTH_URL}/api/integrations/github/callback`

OAuth scopes requested: `read:user repo`
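Put together, a GitHub OAuth fragment could look like this (client ID/secret shapes are placeholders; the callback comment just substitutes the `NEXTAUTH_URL` example from above):

```bash
GITHUB_OAUTH_CLIENT_ID=your-oauth-app-client-id
GITHUB_OAUTH_CLIENT_SECRET=your-oauth-app-client-secret
GITHUB_WEBHOOK_SECRET=another-long-random-string
# With NEXTAUTH_URL=http://localhost:3000, register the OAuth app's
# authorization callback URL as:
#   http://localhost:3000/api/integrations/github/callback
```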
## GitLab

| Variable | Description |
| --- | --- |
| `GITLAB_ID` / `GITLAB_SECRET` | Optional GitLab OAuth |
| `GITLAB_URL` | GitLab instance base (default `https://gitlab.com`) |
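For a self-hosted GitLab instance, the fragment might read (placeholder values):

```bash
GITLAB_ID=your-gitlab-application-id
GITLAB_SECRET=your-gitlab-application-secret
# Only needed for self-hosted instances; defaults to https://gitlab.com
GITLAB_URL=https://gitlab.example.com
```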
## LLM & worker

| Variable | Default | Description |
| --- | --- | --- |
| `OPENROUTER_API_KEY` | — | Cloud LLM via OpenRouter |
| `OPENROUTER_MODEL` | — | Default model when using OpenRouter env key |
| `WORKER_CONCURRENCY` | 2 | Parallel scan jobs per worker |
| `MAX_LLM_CONCURRENCY` | 5 | Parallel LLM calls within a scan |
| `LLM_CHUNK_TOKENS` | 3000 | Code chunk size for API models |
| `LLM_CHUNK_OVERLAP_TOKENS` | 200 | Overlap between chunks |
| `LLM_MAX_RESPONSE_TOKENS` | 4096 | Max LLM response tokens |
| `OLLAMA_CHUNK_TOKENS` | 1200 | Chunk size for local Ollama |
| `OLLAMA_CHUNK_OVERLAP_TOKENS` | 100 | Ollama overlap |
| `OLLAMA_MAX_RESPONSE_TOKENS` | 2048 | Ollama max response |
| `LLM_MIN_CONFIDENCE` | 0.7 | Drop LLM findings below this confidence |
| `OLLAMA_HOST` | — | Docker: host Ollama URL (see README) |
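The chunking variables interact: each new chunk starts `LLM_CHUNK_TOKENS - LLM_CHUNK_OVERLAP_TOKENS` tokens after the previous one. A minimal sketch of that arithmetic (hypothetical helper, not code from this repository):

```python
def chunk_spans(total_tokens: int, chunk_tokens: int = 3000, overlap_tokens: int = 200):
    """Yield (start, end) token offsets the way an overlapping chunker would:
    each chunk advances by chunk_tokens - overlap_tokens."""
    stride = chunk_tokens - overlap_tokens
    spans = []
    start = 0
    while start < total_tokens:
        spans.append((start, min(start + chunk_tokens, total_tokens)))
        if start + chunk_tokens >= total_tokens:
            break  # last chunk already covers the end of the file
        start += stride
    return spans

# A 6000-token file with the API-model defaults (3000 / 200) needs three chunks
print(chunk_spans(6000))  # [(0, 3000), (2800, 5800), (5600, 6000)]
```

The same function with the Ollama defaults (1200 / 100) produces smaller, more numerous chunks, which is why the local settings exist separately.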
## Email (SMTP)

| Variable | Description |
| --- | --- |
| `SMTP_HOST` | Mail server hostname |
| `SMTP_PORT` | Usually 587 (TLS) or 465 |
| `SMTP_USER` / `SMTP_PASSWORD` | Credentials |
| `SMTP_FROM` | From address |
| `SMTP_TLS` | `true` to use STARTTLS |
Local dev: use Mailpit or a similar mail catcher on ports 1025 (SMTP) and 8025 (web UI).
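One way to wire that up locally (the image name and ports below are Mailpit's published defaults, not specific to this project):

```shell
# Start Mailpit: SMTP on 1025, web UI on 8025
docker run -d --name mailpit -p 1025:1025 -p 8025:8025 axllent/mailpit
```

Then point the app at it with `SMTP_HOST=localhost` and `SMTP_PORT=1025`; no credentials or TLS are needed for a local catcher.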
## Docker-only (README)

Production Compose may also define:

- `POSTGRES_PASSWORD`, `PEPPER_PORT`, `WORKER_REPLICAS`
- `PEPPER_IMAGE`, `PEPPER_VERSION` for private registries
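These could surface in a production override file along these lines (a sketch only: the file name, service names, and default values are assumptions, not taken from this project's Compose files):

```yaml
# docker-compose.prod.yml (illustrative)
services:
  db:
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
  app:
    image: ${PEPPER_IMAGE}:${PEPPER_VERSION}
    ports:
      - "${PEPPER_PORT:-3000}:3000"
  worker:
    image: ${PEPPER_IMAGE}:${PEPPER_VERSION}
    deploy:
      replicas: ${WORKER_REPLICAS:-1}
```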
UI settings (LLM provider, policies, build gates) are stored in PostgreSQL per organization.
Environment variables act as defaults and fallbacks for LLM keys and SMTP.