Per‑user LLM keys and project LLM settings
- Add per-user LLM API key management and project LLM options: users can now add their own OpenAI or Anthropic API keys and choose the provider and model for each project. Keys are stored securely, and the UI shows only masked values, enabling per-project customization of changelog generation.
New Features
- Added GET and POST endpoints for managing user LLM configurations, so users can save and retrieve their OpenAI or Anthropic API keys. (backend)
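As a rough sketch, the payloads exchanged with these endpoints might look like the following. The field names and the type guard are illustrative assumptions, not the actual API contract:

```typescript
// Hypothetical request/response shapes for the user LLM-config endpoints.
// Field names here are assumptions for illustration only.
type LlmProvider = "openai" | "anthropic";

interface LlmConfigRequest {
  provider: LlmProvider;
  api_key: string; // raw key, submitted once over TLS
}

interface LlmConfigResponse {
  provider: LlmProvider;
  api_key_masked: string; // masked form; the raw key is never echoed back
}

// Narrow an untrusted string to a supported provider before saving.
function isLlmProvider(value: string): value is LlmProvider {
  return value === "openai" || value === "anthropic";
}
```

Validating the provider on write keeps unsupported values out of stored configurations.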
Improvements
- Updated the project settings UI to let users pick an LLM provider and model, and to submit an optional per-project API key. (frontend)
- Extended the project settings PATCH endpoint to accept llm_provider and llm_model, and to forward BYOK API keys to the user key endpoint when provided. (backend)
- Added model lists and sensible defaults for OpenAI and Anthropic to simplify model selection. (frontend)
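A minimal sketch of per-provider model lists with a default pick, assuming the first entry of each list acts as that provider's default (the actual model lists and names shipped in the UI may differ):

```typescript
// Illustrative model lists; the first entry is treated as the default.
const MODELS: Record<string, string[]> = {
  openai: ["gpt-4o", "gpt-4o-mini"],
  anthropic: ["claude-3-5-sonnet-20241022", "claude-3-5-haiku-20241022"],
};

// Return the default model for a provider, or undefined if unsupported.
function defaultModel(provider: string): string | undefined {
  return MODELS[provider]?.[0];
}
```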
API
- Added a web proxy route that forwards authenticated requests to the new /v1/user/llm-config endpoints using the user's Clerk ID. (auth)
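The forwarding step could be sketched as a pure helper that builds the upstream request; the base URL and the identity header name are assumptions for illustration, not the real configuration:

```typescript
// Sketch: construct the upstream request the web proxy would forward.
// The base URL and "X-User-Id" header are hypothetical placeholders.
function buildProxyRequest(
  clerkUserId: string,
  path: string,
  base = "https://api.example.com",
): { url: string; headers: Record<string, string> } {
  return {
    url: `${base}/v1/user/llm-config${path}`,
    headers: { "X-User-Id": clerkUserId }, // identity derived from the Clerk session
  };
}
```

Keeping the user's identity server-side (taken from the verified Clerk session, not from the client payload) prevents one user from reading or writing another user's keys.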
Security
- Stored user API keys in Vault; raw keys are never returned to the web app, and the UI shows only masked key values. (integrations)
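Masking for display could work along these lines: keep a short prefix and suffix and elide the middle. This is a minimal sketch of the idea, not the exact masking rule used by the UI:

```typescript
// Sketch: display form of a stored key, e.g. "sk-a…6789".
// Short keys are fully masked so nothing recoverable leaks.
function maskKey(key: string): string {
  if (key.length <= 8) return "*".repeat(key.length);
  return `${key.slice(0, 4)}…${key.slice(-4)}`;
}
```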