- Add OpenAI-compatible API support with custom endpoints
- Implement LiteLLM Router for multi-provider routing (configuration sketched below)
- Add new config options: OPENAI_COMPATIBLE_API_KEY/BASE
- Update environment examples and self-hosting documentation
Enables support for local LLM services like Ollama, LM Studio, vLLM,
and third-party OpenAI-compatible providers.
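A minimal sketch of the routing setup described above, assuming a Python backend built on LiteLLM. The `OPENAI_COMPATIBLE_API_KEY`/`OPENAI_COMPATIBLE_API_BASE` names come from the bullets; the model aliases and the `llama3` deployment are illustrative placeholders, not the actual configuration.

```python
import os

from litellm import Router

# The new config options; they can point at Ollama, LM Studio, vLLM, or any
# other OpenAI-compatible endpoint (e.g. http://localhost:11434/v1 for Ollama).
openai_compatible_key = os.getenv("OPENAI_COMPATIBLE_API_KEY", "not-needed")
openai_compatible_base = os.getenv("OPENAI_COMPATIBLE_API_BASE")

# Hypothetical model list: one hosted provider plus the local endpoint.
router = Router(
    model_list=[
        {
            "model_name": "gpt-4o",  # alias used by the app
            "litellm_params": {
                "model": "openai/gpt-4o",
                "api_key": os.getenv("OPENAI_API_KEY"),
            },
        },
        {
            "model_name": "local-llama",  # hypothetical alias for a local model
            "litellm_params": {
                # the "openai/" prefix tells LiteLLM to speak the OpenAI protocol
                "model": "openai/llama3",
                "api_key": openai_compatible_key,
                "api_base": openai_compatible_base,
            },
        },
    ]
)

# Requests are routed to whichever deployment matches the requested alias.
response = router.completion(
    model="local-llama",
    messages=[{"role": "user", "content": "Hello"}],
)
```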
- Removed the hardcoded maximum for parallel agent runs and replaced it with a configurable property in the Configuration class (sketched below).
- Updated API and agent limit checks to use the new configuration property.
- Simplified error handling by removing the limit from error details in the frontend components.
- Adjusted UI components to reflect the changed agent limit data structure.
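A rough sketch of what the configurable limit might look like; the property name, the `MAX_PARALLEL_AGENT_RUNS` environment variable, and the default value are assumptions, not the actual implementation.

```python
import os
from dataclasses import dataclass, field


@dataclass
class Configuration:
    """Central config object; only the agent-run limit is shown here."""

    # Hypothetical name: replaces the previously hardcoded constant.
    max_parallel_agent_runs: int = field(
        default_factory=lambda: int(os.getenv("MAX_PARALLEL_AGENT_RUNS", "3"))
    )


config = Configuration()


def check_agent_run_limit(active_runs: int) -> None:
    """Limit check shared by the API and agent endpoints."""
    if active_runs >= config.max_parallel_agent_runs:
        # Error details no longer include the limit itself; the frontend
        # only needs to know the run was rejected.
        raise RuntimeError("Too many parallel agent runs")
```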
- Modified the configuration file to update the Stripe pricing IDs for yearly commitment plans in the production environment.
- Ensured consistency with the latest pricing structure for subscription tiers; a sketch of the configuration shape follows below.
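A hypothetical sketch of the shape such pricing configuration can take per environment; the tier names and price IDs below are placeholders, not the real Stripe values.

```python
# Placeholder tier names and Stripe price IDs, keyed by environment.
STRIPE_YEARLY_COMMITMENT_PRICE_IDS = {
    "production": {
        "pro": "price_prod_pro_yearly_commitment",
        "business": "price_prod_business_yearly_commitment",
    },
    "staging": {
        "pro": "price_staging_pro_yearly_commitment",
        "business": "price_staging_business_yearly_commitment",
    },
}
```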
- Added support for yearly commitment plans with associated pricing tiers in the billing service.
- Introduced validation functions to manage plan changes and restrictions based on business rules (see the sketch after this list).
- Updated configuration to include new yearly commitment pricing in both production and staging environments.
- Enhanced frontend components to handle subscription management, including cancellation and reactivation features.
- Refactored billing-related hooks and API calls to accommodate new subscription types and improve error handling.
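The business rules themselves aren't spelled out above; the sketch below shows one plausible shape for a plan-change validation function, with the tier ordering and the no-downgrade-during-commitment rule as assumptions.

```python
from datetime import datetime, timezone


def _tier_rank(plan: str) -> int:
    """Hypothetical ordering of subscription tiers."""
    return {"free": 0, "pro": 1, "business": 2}.get(plan, 0)


def validate_plan_change(
    current_plan: str,
    new_plan: str,
    commitment_ends_at: datetime | None,
) -> tuple[bool, str]:
    """Return (allowed, reason); the rules here are illustrative only."""
    now = datetime.now(timezone.utc)

    # Hypothetical rule: an active yearly commitment blocks downgrades
    # until the commitment period has ended.
    if commitment_ends_at and now < commitment_ends_at:
        if _tier_rank(new_plan) < _tier_rank(current_plan):
            return False, "Downgrades are not allowed during a yearly commitment"

    return True, "ok"
```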
- Added a new API keys service with endpoints for creating, listing, revoking, and deleting API keys.
- Integrated API key validation and caching for performance (sketched below).
- Updated backend configuration to include API key secret and throttling settings.
- Created frontend components for managing API keys, including a user interface for creating and displaying keys.
- Introduced a new database migration for the API keys table with necessary constraints and security policies.
- Enhanced authentication methods to support API key usage alongside JWT.
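A sketch of how cached API-key validation might sit alongside the existing JWT path, assuming a FastAPI backend; the header names, HMAC hashing scheme, cache TTL, and the two placeholder helpers are assumptions.

```python
import hashlib
import hmac
import os
import time

from fastapi import Header, HTTPException

API_KEY_SECRET = os.getenv("API_KEY_SECRET", "change-me")  # new backend config value
_cache: dict[str, tuple[float, str]] = {}  # key hash -> (expires_at, account_id)
_CACHE_TTL_SECONDS = 60


def _hash_key(raw_key: str) -> str:
    """Store and cache keys only as an HMAC of the raw value."""
    return hmac.new(API_KEY_SECRET.encode(), raw_key.encode(), hashlib.sha256).hexdigest()


async def _lookup_api_key(key_hash: str) -> str | None:
    """Placeholder for the real lookup against the api_keys table."""
    raise NotImplementedError


def _decode_jwt(authorization: str) -> str:
    """Placeholder for the existing JWT verification path."""
    raise NotImplementedError


async def authenticate(
    authorization: str | None = Header(default=None),
    x_api_key: str | None = Header(default=None),
) -> str:
    """Resolve the caller from an API key if present, otherwise from a JWT."""
    if x_api_key:
        key_hash = _hash_key(x_api_key)
        cached = _cache.get(key_hash)
        if cached and cached[0] > time.time():
            return cached[1]  # cache hit: skip the database round-trip
        account_id = await _lookup_api_key(key_hash)
        if account_id is None:
            raise HTTPException(status_code=401, detail="Invalid API key")
        _cache[key_hash] = (time.time() + _CACHE_TTL_SECONDS, account_id)
        return account_id
    if authorization:
        return _decode_jwt(authorization)
    raise HTTPException(status_code=401, detail="Missing credentials")
```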
- Introduced GEMINI_API_KEY in setup and environment files.
- Updated SetupWizard to include Google Gemini as a supported LLM provider.
- Adjusted model selection logic to accommodate Gemini models (see the alias sketch below).
- Modified billing and configuration files to handle Gemini model pricing and aliases.
- Updated documentation to reflect the addition of GEMINI_API_KEY.
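A sketch of how Gemini aliases and pricing might be wired in, assuming LiteLLM-style model identifiers (the `gemini/` prefix); the alias names and the per-million-token prices are placeholders rather than the project's actual figures.

```python
import os

# GEMINI_API_KEY comes from the new setup and environment files.
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")

# Hypothetical alias table: short names used by the app mapped to the
# LiteLLM identifiers (the "gemini/" prefix routes calls to Google Gemini).
MODEL_ALIASES = {
    "gemini-flash": "gemini/gemini-1.5-flash",
    "gemini-pro": "gemini/gemini-1.5-pro",
}

# Placeholder per-million-token prices used by billing; not real figures.
MODEL_PRICING = {
    "gemini/gemini-1.5-flash": {"input": 0.075, "output": 0.30},
    "gemini/gemini-1.5-pro": {"input": 1.25, "output": 5.00},
}


def resolve_model(name: str) -> str:
    """Model selection falls back to the given name when no alias exists."""
    return MODEL_ALIASES.get(name, name)
```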
- Added quick setup instructions and detailed setup steps in CONTRIBUTING.md and SELF-HOSTING.md.
- Updated environment variable configurations and added new required services for setup.
- Clarified how the setup wizard works and how it saves progress.
- Revised README files for both backend and frontend to include quick setup instructions and environment configurations.
- Updated model references to the latest version of the Anthropic model across various files.
- Removed deprecated workflow background script.
- Removed unnecessary comments and updated the Docker Compose files for backend and sandbox services.
- Changed `docker-compose` to `docker compose` in README for consistency.
- Updated Chromium dependencies and configuration in the Dockerfile for improved stability.
- Enhanced browser automation logic to handle page navigation and state recovery more effectively (see the sketch below).
- Adjusted environment variables for better performance and resource management.
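The recovery logic itself isn't shown in the bullets; below is a rough sketch of navigation with retry and state recovery, assuming a Playwright-based automation layer, with the attempt count and reload-on-timeout strategy as assumptions.

```python
from playwright.sync_api import Error as PlaywrightError
from playwright.sync_api import Page, TimeoutError as PlaywrightTimeoutError


def goto_with_recovery(page: Page, url: str, attempts: int = 3) -> None:
    """Navigate to a URL, reloading the page when navigation fails."""
    for attempt in range(1, attempts + 1):
        try:
            page.goto(url, wait_until="domcontentloaded", timeout=30_000)
            return
        except PlaywrightTimeoutError:
            if attempt == attempts:
                raise
            # Hypothetical recovery step: reload to get back to a known
            # state before retrying the navigation.
            page.reload(wait_until="domcontentloaded")
        except PlaywrightError:
            # Other navigation errors are simply retried until attempts run out.
            if attempt == attempts:
                raise
```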