Merge pull request #350 from tnfssc/chore/docs-and-cleanup

feat(setup): added setup.py and updated docs
This commit is contained in:
Marko Kraemer 2025-05-18 00:31:44 +02:00 committed by GitHub
commit fdd9181292
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
10 changed files with 1369 additions and 275 deletions

5
.gitignore vendored

@@ -190,4 +190,7 @@ supabase/.temp/storage-version
**/__pycache__/
.env.scripts
redis_data
rabbitmq_data

34
CONTRIBUTING.md Normal file

@@ -0,0 +1,34 @@
# Contributing to Suna
Thank you for your interest in contributing to Suna! This document outlines the contribution process and guidelines.
## Contribution Workflow
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/your-feature`)
3. Commit your changes (`git commit -am 'feat(your_file): add some feature'`)
4. Push to the branch (`git push origin feature/your-feature`)
5. Open a Pull Request
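The commit message format shown above (`type(scope): description`) can be checked automatically before pushing. A minimal sketch of such a check; the accepted type list below is an assumption for illustration, not an official Suna policy:

```python
import re

# Matches messages like "feat(setup): added setup.py and updated docs".
COMMIT_RE = re.compile(r"^(feat|fix|docs|chore|refactor|test)\([\w./-]+\): .+$")

def is_valid_commit_message(message: str) -> bool:
    """Return True if the first line follows the type(scope): description pattern."""
    first_line = message.splitlines()[0] if message else ""
    return bool(COMMIT_RE.match(first_line))
```

A hook like this could run in CI or as a local `commit-msg` git hook.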
## Development Setup
For detailed setup instructions, please refer to:
- [Backend Development Setup](backend/README.md)
- [Frontend Development Setup](frontend/README.md)
## Code Style Guidelines
- Follow existing code style and patterns
- Use descriptive commit messages
- Keep PRs focused on a single feature or fix
## Reporting Issues
When reporting issues, please include:
- Steps to reproduce
- Expected behavior
- Actual behavior
- Environment details (OS, Node/Docker versions, etc.)
- Relevant logs or screenshots
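For example, a report following this checklist might look like the following (all details below are illustrative):

```markdown
## Bug: agent worker crashes on startup

**Steps to reproduce**
1. Run `docker compose up redis rabbitmq`
2. Run `poetry run python3.11 -m dramatiq run_agent_background`

**Expected behavior**: the worker connects to RabbitMQ and waits for tasks.
**Actual behavior**: the worker exits with a connection error.
**Environment**: macOS 14, Docker 24.x, Node 20.x
**Logs**: (attach relevant output)
```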

246
README.md

@@ -14,10 +14,9 @@ Suna's powerful toolkit includes seamless browser automation to navigate the web
[![Discord Follow](https://dcbadge.limes.pink/api/server/Py6pCBUUPw?style=flat)](https://discord.gg/Py6pCBUUPw)
[![Twitter Follow](https://img.shields.io/twitter/follow/kortixai)](https://x.com/kortixai)
[![GitHub Repo stars](https://img.shields.io/github/stars/kortix-ai/suna)](https://github.com/kortix-ai/suna)
[![Issues](https://img.shields.io/github/issues/kortix-ai/suna)](https://github.com/kortix-ai/suna/labels/bug)
</div>
## Table of Contents
@@ -26,10 +25,8 @@ Suna's powerful toolkit includes seamless browser automation to navigate the web
- [Frontend](#frontend)
- [Agent Docker](#agent-docker)
- [Supabase Database](#supabase-database)
- [Use Cases](#use-cases)
- [Self-Hosting](#self-hosting)
- [Acknowledgements](#acknowledgements)
- [License](#license)
@@ -40,255 +37,96 @@ Suna's powerful toolkit includes seamless browser automation to navigate the web
Suna consists of four main components:
### Backend API
Python/FastAPI service that handles REST endpoints, thread management, and LLM integration with Anthropic and other providers via LiteLLM.
### Frontend
Next.js/React application providing a responsive UI with chat interface, dashboard, etc.
### Agent Docker
Isolated execution environment for every agent - with browser automation, code interpreter, file system access, tool integration, and security features.
### Supabase Database
Handles data persistence with authentication, user management, conversation history, file storage, agent state, analytics, and real-time subscriptions.
## Use Cases
1. **Competitor Analysis** ([Watch](https://www.suna.so/share/5ee791ac-e19c-4986-a61c-6d0659d0e5bc)) - _"Analyze the market for my next company in the healthcare industry, located in the UK. Give me the major players, their market size, strengths, and weaknesses, and add their website URLs. Once done, generate a PDF report."_
2. **VC List** ([Watch](https://www.suna.so/share/804d20a3-cf1c-4adb-83bb-0e77cc6adeac)) - _"Give me the list of the most important VC Funds in the United States based on Assets Under Management. Give me website URLs, and if possible an email to reach them out."_
3. **Looking for Candidates** ([Watch](https://www.suna.so/share/3ae581b0-2db8-4c63-b324-3b8d29762e74)) - _"Go on LinkedIn, and find me 10 profiles available - they are not working right now - for a junior software engineer position, who are located in Munich, Germany. They should have at least one bachelor's degree in Computer Science or anything related to it, and 1-year of experience in any field/role."_
4. **Planning Company Trip** ([Watch](https://www.suna.so/share/725e64a0-f1e2-4bb6-8a1f-703c2833fd72)) - _"Generate me a route plan for my company. We should go to California. We'll be in 8 people. Compose the trip from the departure (Paris, France) to the activities we can do considering that the trip will be 7 days long - departure on the 21st of Apr 2025. Check the weather forecast and temperature for the upcoming days, and based on that, you can plan our activities (outdoor vs indoor)."_
5. **Working on Excel** ([Watch](https://www.suna.so/share/128f23a4-51cd-42a6-97a0-0b458b32010e)) - _"My company asked me to set up an Excel spreadsheet with all the information about Italian lottery games (Lotto, 10eLotto, and Million Day). Based on that, generate and send me a spreadsheet with all the basic information (public ones)."_
6. **Automate Event Speaker Prospecting** ([Watch](https://www.suna.so/share/7a7592ea-ed44-4c69-bcb5-5f9bb88c188c)) - _"Find 20 AI ethics speakers from Europe who've spoken at conferences in the past year. Scrapes conference sites, cross-references LinkedIn and YouTube, and outputs contact info + talk summaries."_
7. **Summarize and Cross-Reference Scientific Papers** ([Watch](https://www.suna.so/share/c2081b3c-786e-4e7c-9bf4-46e9b23bb662)) - _"Research and compare scientific papers talking about Alcohol effects on our bodies during the last 5 years. Generate a report about the most important scientific papers talking about the topic I wrote before."_
8. **Research + First Contact Draft** ([Watch](https://www.suna.so/share/6b6296a6-8683-49e5-9ad0-a32952d12c44)) - _"Research my potential customers (B2B) on LinkedIn. They should be in the clean tech industry. Find their websites and their email addresses. After that, based on the company profile, generate a personalized first contact email where I present my company which is offering consulting services to cleantech companies to maximize their profits and reduce their costs."_
9. **SEO Analysis** ([Watch](https://www.suna.so/share/43491cb0-cd6c-45f0-880c-66ddc8c4b842)) - _"Based on my website suna.so, generate an SEO report analysis, find top-ranking pages by keyword clusters, and identify topics I'm missing."_
10. **Generate a Personal Trip** ([Watch](https://www.suna.so/share/37b31907-8349-4f63-b0e5-27ca597ed02a)) - _"Generate a personal trip to London, with departure from Bangkok on the 1st of May. The trip will last 10 days. Find an accommodation in the center of London, with a rating on Google reviews of at least 4.5. Find me interesting outdoor activities to do during the journey. Generate a detailed itinerary plan."_
11. **Recently Funded Startups** ([Watch](https://www.suna.so/share/8b2a897e-985a-4d5e-867b-15239274f764)) - _"Go on Crunchbase, Dealroom, and TechCrunch, filter by Series A funding rounds in the SaaS Finance Space, and build a report with company data, founders, and contact info for outbound sales."_
12. **Scrape Forum Discussions** ([Watch](https://www.suna.so/share/7d7a5d93-a20d-48b0-82cc-e9a876e9fd04)) - _"I need to find the best beauty centers in Rome, but I want to find them by using open forums that speak about this topic. Go on Google, and scrape the forums by looking for beauty center discussions located in Rome. Then generate a list of 5 beauty centers with the best comments about them."_
## Self-Hosting
Suna can be self-hosted on your own infrastructure using our setup wizard. For a comprehensive guide to self-hosting Suna, please refer to our [Self-Hosting Guide](./docs/SELF-HOSTING.md).
The setup process includes:
- Setting up a Supabase project for database and authentication
- Configuring Redis for caching and session management
- Setting up Daytona for secure agent execution
- Integrating with LLM providers (Anthropic, OpenAI, Groq, etc.)
- Configuring web search and scraping capabilities
### Quick Start
1. **Clone the repository**:
```bash
git clone https://github.com/kortix-ai/suna.git
cd suna
```
2. **Run the setup wizard**:
```bash
python setup.py
```
The wizard will guide you through all necessary steps to get your Suna instance up and running. For detailed instructions, troubleshooting tips, and advanced configuration options, see the [Self-Hosting Guide](./docs/SELF-HOSTING.md).
### Manual Setup
See the [Self-Hosting Guide](./docs/SELF-HOSTING.md) for detailed manual setup instructions.
## Contributing
We welcome contributions from the community! Please see our [Contributing Guide](./CONTRIBUTING.md) for more details.
## Acknowledgements
### Main Contributors
- [Adam Cohen Hillel](https://x.com/adamcohenhillel)
- [Dat-lequoc](https://x.com/datlqqq)
- [Marko Kraemer](https://twitter.com/markokraemer)
### Technologies
- [Daytona](https://daytona.io/) - Secure agent execution environment
- [Supabase](https://supabase.com/) - Database and authentication
- [Playwright](https://playwright.dev/) - Browser automation
- [OpenAI](https://openai.com/) - LLM provider
- [Anthropic](https://www.anthropic.com/) - LLM provider
@@ -296,8 +134,6 @@ The Docker Compose setup includes Redis and RabbitMQ services that will be used
- [Firecrawl](https://firecrawl.dev/) - Web scraping capabilities
- [RapidAPI](https://rapidapi.com/) - API services
## License
Kortix Suna is licensed under the Apache License, Version 2.0. See [LICENSE](./LICENSE) for the full license text.


@@ -3,6 +3,7 @@
## Running the backend
Within the backend directory, run the following command to stop and start the backend:
```bash
docker compose down && docker compose up --build
```
@@ -12,11 +13,13 @@ docker compose down && docker compose up --build
You can run individual services from the docker-compose file. This is particularly useful during development:
### Running only Redis and RabbitMQ
```bash
docker compose up redis rabbitmq
```
### Running only the API and Worker
```bash
docker compose up api worker
```
@@ -24,35 +27,52 @@ docker compose up api worker
## Development Setup
For local development, you might only need to run Redis and RabbitMQ, while working on the API locally. This is useful when:
- You're making changes to the API code and want to test them directly
- You want to avoid rebuilding the API container on every change
- You're running the API service directly on your machine
To run just Redis and RabbitMQ for development:
```bash
docker compose up redis rabbitmq
```
Then you can run the API and worker locally with the following commands:
```sh
# In one terminal, start the API
cd backend
poetry run python3.11 api.py
# In another terminal, start the worker
cd backend
poetry run python3.11 -m dramatiq run_agent_background
```
### Environment Configuration
When running services individually, make sure to:
1. Check your `.env` file and adjust any necessary environment variables
2. Ensure Redis connection settings match your local setup (default: `localhost:6379`)
3. Ensure RabbitMQ connection settings match your local setup (default: `localhost:5672`)
4. Update any service-specific environment variables if needed
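A small helper can make that checklist concrete. This sketch (not part of the Suna codebase) parses the simple `KEY=VALUE` lines used in the `.env` files so you can inspect the effective settings:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines; blank lines and comment lines are skipped."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # ignore lines without an '='
            config[key.strip()] = value.strip()
    return config
```

For example, `parse_env(open("backend/.env").read()).get("REDIS_HOST")` should match your local setup (`localhost:6379` by default).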
### Important: Redis Host Configuration
When running the API locally with Redis in Docker, you need to set the correct Redis host in your `.env` file:
- For Docker-to-Docker communication (when running both services in Docker): use `REDIS_HOST=redis`
- For local-to-Docker communication (when running API locally): use `REDIS_HOST=localhost`
### Important: RabbitMQ Host Configuration
When running the API locally with RabbitMQ in Docker, you need to set the correct RabbitMQ host in your `.env` file:
- For Docker-to-Docker communication (when running both services in Docker): use `RABBITMQ_HOST=rabbitmq`
- For local-to-Docker communication (when running API locally): use `RABBITMQ_HOST=localhost`
Example `.env` configuration for local development:
```sh
REDIS_HOST=localhost # use 'localhost' instead of 'redis' when the API runs outside Docker
REDIS_PORT=6379
REDIS_PASSWORD=
```


@@ -1,3 +1,5 @@
# This is a Docker Compose file for the backend service. For self-hosting, look at the root docker-compose.yml file.
version: "3.8"
services:


@@ -1,46 +0,0 @@
version: '3.8'
services:
redis:
image: redis:7-alpine
ports:
- "6379:6379"
volumes:
- redis-data:/data
command: redis-server --save 60 1 --loglevel warning
healthcheck:
test: ["CMD", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 3
backend:
image: ghcr.io/${GITHUB_REPOSITORY}/suna-backend:latest
ports:
- "8000:8000"
volumes:
- ./backend/.env:/app/.env:ro
environment:
- ENV_MODE=local
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=
- REDIS_SSL=False
depends_on:
redis:
condition: service_healthy
frontend:
image: ghcr.io/${GITHUB_REPOSITORY}/suna-frontend:latest
ports:
- "3000:3000"
volumes:
- ./frontend/.env.local:/app/.env.local:ro
environment:
- NODE_ENV=production
command: ["npm", "run", "dev"]
depends_on:
- backend
volumes:
redis-data:


@@ -1,10 +1,8 @@
services:
redis:
image: redis:7-alpine
ports:
- "6379:6379"
volumes:
- redis_data:/data
command: redis-server --save 60 1 --loglevel warning
healthcheck:
test: ["CMD", "redis-cli", "ping"]
@@ -14,8 +12,6 @@ services:
rabbitmq:
image: rabbitmq
# ports:
# - "127.0.0.1:5672:5672"
volumes:
- rabbitmq_data:/var/lib/rabbitmq
restart: unless-stopped
@@ -40,6 +36,31 @@ services:
- REDIS_PORT=6379
- REDIS_PASSWORD=
- REDIS_SSL=False
- RABBITMQ_HOST=rabbitmq
- RABBITMQ_PORT=5672
depends_on:
redis:
condition: service_healthy
rabbitmq:
condition: service_healthy
worker:
condition: service_started
worker:
build:
context: ./backend
dockerfile: Dockerfile
command: python -m dramatiq run_agent_background
volumes:
- ./backend/.env:/app/.env:ro
environment:
- ENV_MODE=local
- REDIS_HOST=redis
- REDIS_PORT=6379
- REDIS_PASSWORD=
- REDIS_SSL=False
- RABBITMQ_HOST=rabbitmq
- RABBITMQ_PORT=5672
depends_on:
redis:
condition: service_healthy
@@ -47,6 +68,7 @@ services:
condition: service_healthy
frontend:
init: true
build:
context: ./frontend
dockerfile: Dockerfile
@@ -61,5 +83,5 @@ services:
- backend
volumes:
redis_data:
rabbitmq_data:

283
docs/SELF-HOSTING.md Normal file

@@ -0,0 +1,283 @@
# Suna Self-Hosting Guide
This guide provides detailed instructions for setting up and hosting your own instance of Suna, an open-source generalist AI agent.
## Table of Contents
- [Overview](#overview)
- [Prerequisites](#prerequisites)
- [Installation Steps](#installation-steps)
- [Manual Configuration](#manual-configuration)
- [Post-Installation Steps](#post-installation-steps)
- [Troubleshooting](#troubleshooting)
## Overview
Suna consists of five main components:
1. **Backend API** - Python/FastAPI service for REST endpoints, thread management, and LLM integration
2. **Backend Worker** - Python/Dramatiq worker service for handling agent tasks
3. **Frontend** - Next.js/React application providing the user interface
4. **Agent Docker** - Isolated execution environment for each agent
5. **Supabase Database** - Handles data persistence and authentication
## Prerequisites
Before starting the installation process, you'll need to set up the following:
### 1. Supabase Project
1. Create an account at [Supabase](https://supabase.com/)
2. Create a new project
3. Note down the following information (found in Project Settings → API):
- Project URL (e.g., `https://abcdefg.supabase.co`)
- API keys (anon key and service role key)
### 2. API Keys
Obtain the following API keys:
#### Required
- **LLM Provider** (at least one of the following):
- [Anthropic](https://console.anthropic.com/) - Recommended for best performance
- [OpenAI](https://platform.openai.com/)
- [Groq](https://console.groq.com/)
- [OpenRouter](https://openrouter.ai/)
- [AWS Bedrock](https://aws.amazon.com/bedrock/)
- **Search and Web Scraping**:
- [Tavily](https://tavily.com/) - For enhanced search capabilities
- [Firecrawl](https://firecrawl.dev/) - For web scraping capabilities
- **Agent Execution**:
- [Daytona](https://app.daytona.io/) - For secure agent execution
#### Optional
- **RapidAPI** - For accessing additional API services (optional)
### 3. Required Software
Ensure the following tools are installed on your system:
- **[Git](https://git-scm.com/downloads)**
- **[Docker](https://docs.docker.com/get-docker/)**
- **[Python 3.11](https://www.python.org/downloads/)**
- **[Poetry](https://python-poetry.org/docs/#installation)**
- **[Node.js & npm](https://nodejs.org/en/download/)**
- **[Supabase CLI](https://supabase.com/docs/guides/local-development/cli/getting-started)**
## Installation Steps
### 1. Clone the Repository
```bash
git clone https://github.com/kortix-ai/suna.git
cd suna
```
### 2. Run the Setup Wizard
The setup wizard will guide you through the installation process:
```bash
python setup.py
```
The wizard will:
- Check if all required tools are installed
- Collect your API keys and configuration information
- Set up the Supabase database
- Configure environment files
- Install dependencies
- Start Suna using your preferred method
### 3. Supabase Configuration
During setup, you'll need to:
1. Log in to the Supabase CLI
2. Link your local project to your Supabase project
3. Push database migrations
4. Manually expose the 'basejump' schema in Supabase:
- Go to your Supabase project
- Navigate to Project Settings → API
- Add 'basejump' to the Exposed Schema section
### 4. Daytona Configuration
As part of the setup, you'll need to:
1. Create a Daytona account
2. Generate an API key
3. Create a Docker image:
- Image name: `kortix/suna:0.1.2`
- Entrypoint: `/usr/bin/supervisord -n -c /etc/supervisor/conf.d/supervisord.conf`
## Manual Configuration
If you prefer to configure your installation manually, or if you need to modify the configuration after installation, here's what you need to know:
### Backend Configuration (.env)
The backend configuration is stored in `backend/.env`.
Example configuration:
```sh
# Environment Mode
ENV_MODE=local
# DATABASE
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
# REDIS
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=
REDIS_SSL=false
# RABBITMQ
RABBITMQ_HOST=rabbitmq
RABBITMQ_PORT=5672
# LLM Providers
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
MODEL_TO_USE=anthropic/claude-3-7-sonnet-latest
# WEB SEARCH
TAVILY_API_KEY=your-tavily-key
# WEB SCRAPE
FIRECRAWL_API_KEY=your-firecrawl-key
FIRECRAWL_URL=https://api.firecrawl.dev
# Sandbox container provider
DAYTONA_API_KEY=your-daytona-key
DAYTONA_SERVER_URL=https://app.daytona.io/api
DAYTONA_TARGET=us
NEXT_PUBLIC_URL=http://localhost:3000
```
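After editing the file it is easy to miss a key, so a quick completeness check helps. This helper is illustrative, not shipped with Suna; the key list is taken from the example configuration above:

```python
# Keys from the example backend/.env above; adjust for your chosen providers.
REQUIRED_KEYS = [
    "SUPABASE_URL", "SUPABASE_ANON_KEY", "SUPABASE_SERVICE_ROLE_KEY",
    "REDIS_HOST", "RABBITMQ_HOST", "DAYTONA_API_KEY",
]

def missing_keys(config: dict) -> list:
    """Return required keys that are absent or empty in the parsed config."""
    return [key for key in REQUIRED_KEYS if not config.get(key)]
```

Running it against a parsed `.env` before starting the stack surfaces misconfiguration earlier than a failed container boot would.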
### Frontend Configuration (.env.local)
The frontend configuration is stored in `frontend/.env.local` and includes:
- Supabase connection details
- Backend API URL
Example configuration:
```sh
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
NEXT_PUBLIC_BACKEND_URL=http://backend:8000/api
NEXT_PUBLIC_URL=http://localhost:3000
```
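One frequent mistake is pointing `NEXT_PUBLIC_BACKEND_URL` at the wrong host for the chosen deployment mode: under Docker Compose the API is reached via the `backend` service name, while a locally running backend is reached via `localhost`. A sketch of a sanity check (`check_backend_url` is our name, not part of the frontend):

```python
from urllib.parse import urlparse

def check_backend_url(url: str, running_in_compose: bool) -> bool:
    """Verify the backend URL targets the right host for the deployment mode."""
    parsed = urlparse(url)
    expected_host = "backend" if running_in_compose else "localhost"
    return parsed.hostname == expected_host and parsed.path == "/api"
```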
## Post-Installation Steps
After completing the installation, you'll need to:
1. **Create an account** - Use Supabase authentication to create your first account
2. **Verify installations** - Check that all components are running correctly
## Startup Options
Suna can be started in two ways:
### 1. Using Docker Compose (Recommended)
This method starts all required services in Docker containers:
```bash
docker compose up -d
```
### 2. Manual Startup
This method requires you to start each component separately:
1. Start Redis and RabbitMQ (required for backend):
```bash
docker compose up redis rabbitmq -d
```
2. Start the frontend (in one terminal):
```bash
cd frontend
npm run dev
```
3. Start the backend (in another terminal):
```bash
cd backend
poetry run python3.11 api.py
```
4. Start the worker (in one more terminal):
```bash
cd backend
poetry run python3.11 -m dramatiq run_agent_background
```
## Troubleshooting
### Common Issues
1. **Docker services not starting**
- Check Docker logs: `docker compose logs`
- Ensure Docker is running correctly
- Verify port availability (3000 for frontend, 8000 for backend)
2. **Database connection issues**
- Verify Supabase configuration
- Check if 'basejump' schema is exposed in Supabase
3. **LLM API key issues**
- Verify API keys are correctly entered
- Check for API usage limits or restrictions
4. **Daytona connection issues**
- Verify Daytona API key
- Check if the container image is correctly configured
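For the port-availability check in item 1, a few lines of Python using only the standard library are enough; a sketch:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as probe:
        probe.settimeout(1.0)
        return probe.connect_ex((host, port)) == 0

# Example: check the default Suna ports before starting the stack.
for service, port in [("frontend", 3000), ("backend", 8000)]:
    if port_in_use(port):
        print(f"{service}: port {port} is already taken")
```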
### Logs
To view logs and diagnose issues:
```bash
# Docker Compose logs
docker compose logs -f
# Frontend logs (manual setup)
cd frontend
npm run dev
# Backend logs (manual setup)
cd backend
poetry run python3.11 api.py
# Worker logs (manual setup)
cd backend
poetry run python3.11 -m dramatiq run_agent_background
```
---
For further assistance, join the [Suna Discord Community](https://discord.gg/Py6pCBUUPw) or check the [GitHub repository](https://github.com/kortix-ai/suna) for updates and issues.

@@ -1,4 +1,4 @@
This is a [Next.js](https://nextjs.org) project bootstrapped with [`create-next-app`](https://nextjs.org/docs/app/api-reference/cli/create-next-app).
# Suna frontend
## Getting Started
@@ -6,20 +6,10 @@ First, run the development server:
```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
You can start editing the page by modifying `app/page.tsx`. The page auto-updates as you edit the file.
This project uses [`next/font`](https://nextjs.org/docs/app/building-your-application/optimizing/fonts) to automatically optimize and load [Geist](https://vercel.com/font), a new font family for Vercel.
## Learn More
To learn more about Next.js, take a look at the following resources:
@@ -27,8 +17,6 @@ To learn more about Next.js, take a look at the following resources:
- [Next.js Documentation](https://nextjs.org/docs) - learn about Next.js features and API.
- [Learn Next.js](https://nextjs.org/learn) - an interactive Next.js tutorial.
You can check out [the Next.js GitHub repository](https://github.com/vercel/next.js) - your feedback and contributions are welcome!
## Deploy on Vercel
The easiest way to deploy your Next.js app is to use the [Vercel Platform](https://vercel.com/new?utm_medium=default-template&filter=next.js&utm_source=create-next-app&utm_campaign=create-next-app-readme) from the creators of Next.js.

setup.py (new file)
@@ -0,0 +1,952 @@
#!/usr/bin/env python3
import os
import sys
import json
import time
import shutil
import platform
import subprocess
from pathlib import Path
import urllib.request
import configparser
from getpass import getpass
import re
import socket
import random
import string
# ANSI colors for pretty output
class Colors:
HEADER = '\033[95m'
BLUE = '\033[94m'
CYAN = '\033[96m'
GREEN = '\033[92m'
YELLOW = '\033[93m'
RED = '\033[91m'
ENDC = '\033[0m'
BOLD = '\033[1m'
UNDERLINE = '\033[4m'
def print_banner():
"""Print Suna setup banner"""
print(f"""
{Colors.BLUE}{Colors.BOLD}
Setup Wizard
{Colors.ENDC}
""")
def print_step(step_num, total_steps, step_name):
"""Print a step header"""
print(f"\n{Colors.BLUE}{Colors.BOLD}Step {step_num}/{total_steps}: {step_name}{Colors.ENDC}")
print(f"{Colors.CYAN}{'='*50}{Colors.ENDC}\n")
def print_info(message):
"""Print info message"""
print(f"{Colors.CYAN} {message}{Colors.ENDC}")
def print_success(message):
"""Print success message"""
print(f"{Colors.GREEN}{message}{Colors.ENDC}")
def print_warning(message):
"""Print warning message"""
print(f"{Colors.YELLOW}⚠️ {message}{Colors.ENDC}")
def print_error(message):
"""Print error message"""
print(f"{Colors.RED}{message}{Colors.ENDC}")
def check_requirements():
"""Check if all required tools are installed"""
requirements = {
'git': 'https://git-scm.com/downloads',
'docker': 'https://docs.docker.com/get-docker/',
'python3': 'https://www.python.org/downloads/',
'poetry': 'https://python-poetry.org/docs/#installation',
'pip3': 'https://pip.pypa.io/en/stable/installation/',
'node': 'https://nodejs.org/en/download/',
'npm': 'https://docs.npmjs.com/downloading-and-installing-node-js-and-npm',
}
missing = []
for cmd, url in requirements.items():
try:
# Check if python3/pip3 for Windows
if platform.system() == 'Windows' and cmd in ['python3', 'pip3']:
cmd_to_check = cmd.replace('3', '')
else:
cmd_to_check = cmd
subprocess.run(
[cmd_to_check, '--version'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=True
)
print_success(f"{cmd} is installed")
except (subprocess.SubprocessError, FileNotFoundError):
missing.append((cmd, url))
print_error(f"{cmd} is not installed")
if missing:
print_error("Missing required tools. Please install them before continuing:")
for cmd, url in missing:
print(f" - {cmd}: {url}")
sys.exit(1)
return True
def check_docker_running():
"""Check if Docker is running"""
try:
result = subprocess.run(
['docker', 'info'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=True
)
print_success("Docker is running")
return True
except subprocess.SubprocessError:
print_error("Docker is installed but not running. Please start Docker and try again.")
sys.exit(1)
def check_suna_directory():
"""Check if we're in a Suna repository"""
required_dirs = ['backend', 'frontend']
required_files = ['README.md', 'docker-compose.yaml']
for directory in required_dirs:
if not os.path.isdir(directory):
print_error(f"'{directory}' directory not found. Make sure you're in the Suna repository root.")
return False
for file in required_files:
if not os.path.isfile(file):
print_error(f"'{file}' not found. Make sure you're in the Suna repository root.")
return False
print_success("Suna repository detected")
return True
def validate_url(url, allow_empty=False):
"""Validate a URL"""
if allow_empty and not url:
return True
pattern = re.compile(
r'^(?:http|https)://' # http:// or https://
r'(?:(?:[A-Z0-9](?:[A-Z0-9-]{0,61}[A-Z0-9])?\.)+(?:[A-Z]{2,6}\.?|[A-Z0-9-]{2,}\.?)|' # domain
r'localhost|' # localhost
r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})' # or IP
r'(?::\d+)?' # optional port
r'(?:/?|[/?]\S+)$', re.IGNORECASE)
return bool(pattern.match(url))
def validate_api_key(api_key, allow_empty=False):
"""Validate an API key (basic format check)"""
if allow_empty and not api_key:
return True
# Basic check: not empty and at least 10 chars
return bool(api_key) and len(api_key) >= 10
def collect_supabase_info():
"""Collect Supabase information"""
print_info("You'll need to create a Supabase project before continuing")
print_info("Visit https://supabase.com/dashboard/projects to create one")
print_info("After creating your project, go to Project Settings -> Data API, where you'll find the following information:")
print_info("1. Supabase Project URL (e.g., https://abcdefg.supabase.co)")
print_info("2. Supabase anon key")
print_info("3. Supabase service role key")
input("Press Enter to continue once you've created your Supabase project...")
while True:
supabase_url = input("Enter your Supabase Project URL (e.g., https://abcdefg.supabase.co): ")
if validate_url(supabase_url):
break
print_error("Invalid URL format. Please enter a valid URL.")
while True:
supabase_anon_key = input("Enter your Supabase anon key: ")
if validate_api_key(supabase_anon_key):
break
print_error("Invalid API key format. It should be at least 10 characters long.")
while True:
supabase_service_role_key = input("Enter your Supabase service role key: ")
if validate_api_key(supabase_service_role_key):
break
print_error("Invalid API key format. It should be at least 10 characters long.")
return {
'SUPABASE_URL': supabase_url,
'SUPABASE_ANON_KEY': supabase_anon_key,
'SUPABASE_SERVICE_ROLE_KEY': supabase_service_role_key,
}
def collect_daytona_info():
"""Collect Daytona API key"""
print_info("You'll need to create a Daytona account before continuing")
print_info("Visit https://app.daytona.io/ to create one")
print_info("Then, generate an API key from 'Keys' menu")
print_info("After that, go to Images (https://app.daytona.io/dashboard/images)")
print_info("Click '+ Create Image'")
print_info("Enter 'kortix/suna:0.1.2' as the image name")
print_info("Set '/usr/bin/supervisord -n -c /etc/supervisor/conf.d/supervisord.conf' as the Entrypoint")
input("Press Enter to continue once you've completed these steps...")
while True:
daytona_api_key = input("Enter your Daytona API key: ")
if validate_api_key(daytona_api_key):
break
print_error("Invalid API key format. It should be at least 10 characters long.")
return {
'DAYTONA_API_KEY': daytona_api_key,
'DAYTONA_SERVER_URL': "https://app.daytona.io/api",
'DAYTONA_TARGET': "us",
}
def collect_llm_api_keys():
"""Collect LLM API keys for various providers"""
print_info("You need at least one LLM provider API key to use Suna")
print_info("Available LLM providers: OpenAI, Anthropic, Groq, OpenRouter")
# Display provider selection options
print(f"\n{Colors.CYAN}Select LLM providers to configure:{Colors.ENDC}")
print(f"{Colors.CYAN}[1] {Colors.GREEN}OpenAI{Colors.ENDC}")
print(f"{Colors.CYAN}[2] {Colors.GREEN}Anthropic{Colors.ENDC} {Colors.CYAN}(recommended for best performance){Colors.ENDC}")
print(f"{Colors.CYAN}[3] {Colors.GREEN}Groq{Colors.ENDC}")
print(f"{Colors.CYAN}[4] {Colors.GREEN}OpenRouter{Colors.ENDC} {Colors.CYAN}(access to multiple models){Colors.ENDC}")
print(f"{Colors.CYAN}[5] {Colors.GREEN}AWS Bedrock{Colors.ENDC}")
print(f"{Colors.CYAN}Enter numbers separated by commas (e.g., 1,2,4){Colors.ENDC}\n")
while True:
providers_input = input("Select providers (required, at least one): ")
selected_providers = []
try:
# Parse the input, handle both comma-separated and space-separated
provider_numbers = [int(p.strip()) for p in providers_input.replace(',', ' ').split()]
for num in provider_numbers:
if num == 1:
selected_providers.append('OPENAI')
elif num == 2:
selected_providers.append('ANTHROPIC')
elif num == 3:
selected_providers.append('GROQ')
elif num == 4:
selected_providers.append('OPENROUTER')
elif num == 5:
selected_providers.append('AWS_BEDROCK')
if selected_providers:
break
else:
print_error("Please select at least one provider.")
except ValueError:
print_error("Invalid input. Please enter provider numbers (e.g., 1,2,4).")
# Collect API keys for selected providers
api_keys = {}
model_info = {}
# Model aliases for reference
model_aliases = {
'OPENAI': ['openai/gpt-4o', 'openai/gpt-4o-mini'],
'ANTHROPIC': ['anthropic/claude-3-7-sonnet-latest', 'anthropic/claude-3-5-sonnet-latest'],
'GROQ': ['groq/llama-3.1-70b-versatile', 'groq/llama-3.1-405b-reasoning-preview'],
'OPENROUTER': ['openrouter/google/gemini-2.5-pro-preview', 'openrouter/deepseek/deepseek-chat-v3-0324:free', 'openrouter/openai/gpt-4o-2024-11-20'],
'AWS_BEDROCK': ['anthropic.claude-3-7-sonnet-20250219-v1:0', 'anthropic.claude-3-5-sonnet-20241022-v2:0']
}
for provider in selected_providers:
print_info(f"\nConfiguring {provider}")
if provider == 'OPENAI':
while True:
api_key = input("Enter your OpenAI API key: ")
if validate_api_key(api_key):
api_keys['OPENAI_API_KEY'] = api_key
# Recommend default model
print(f"\n{Colors.CYAN}Recommended OpenAI models:{Colors.ENDC}")
for i, model in enumerate(model_aliases['OPENAI'], 1):
print(f"{Colors.CYAN}[{i}] {Colors.GREEN}{model}{Colors.ENDC}")
model_choice = input("Select default model (1-2) or press Enter for gpt-4o: ").strip()
if not model_choice:
model_info['default_model'] = 'openai/gpt-4o'
elif model_choice.isdigit() and 1 <= int(model_choice) <= len(model_aliases['OPENAI']):
model_info['default_model'] = model_aliases['OPENAI'][int(model_choice) - 1]
else:
model_info['default_model'] = 'openai/gpt-4o'
print_warning(f"Invalid selection, using default: openai/gpt-4o")
break
print_error("Invalid API key format. It should be at least 10 characters long.")
elif provider == 'ANTHROPIC':
while True:
api_key = input("Enter your Anthropic API key: ")
if validate_api_key(api_key):
api_keys['ANTHROPIC_API_KEY'] = api_key
# Recommend default model
print(f"\n{Colors.CYAN}Recommended Anthropic models:{Colors.ENDC}")
for i, model in enumerate(model_aliases['ANTHROPIC'], 1):
print(f"{Colors.CYAN}[{i}] {Colors.GREEN}{model}{Colors.ENDC}")
model_choice = input("Select default model (1-2) or press Enter for claude-3-7-sonnet: ").strip()
if not model_choice or model_choice == '1':
model_info['default_model'] = 'anthropic/claude-3-7-sonnet-latest'
elif model_choice.isdigit() and 1 <= int(model_choice) <= len(model_aliases['ANTHROPIC']):
model_info['default_model'] = model_aliases['ANTHROPIC'][int(model_choice) - 1]
else:
model_info['default_model'] = 'anthropic/claude-3-7-sonnet-latest'
print_warning(f"Invalid selection, using default: anthropic/claude-3-7-sonnet-latest")
break
print_error("Invalid API key format. It should be at least 10 characters long.")
elif provider == 'GROQ':
while True:
api_key = input("Enter your Groq API key: ")
if validate_api_key(api_key):
api_keys['GROQ_API_KEY'] = api_key
# Recommend default model
print(f"\n{Colors.CYAN}Recommended Groq models:{Colors.ENDC}")
for i, model in enumerate(model_aliases['GROQ'], 1):
print(f"{Colors.CYAN}[{i}] {Colors.GREEN}{model}{Colors.ENDC}")
model_choice = input("Select default model (1-2) or press Enter for llama-3.1-70b: ").strip()
if not model_choice or model_choice == '1':
model_info['default_model'] = 'groq/llama-3.1-70b-versatile'
elif model_choice == '2':
model_info['default_model'] = 'groq/llama-3.1-405b-reasoning-preview'
else:
model_info['default_model'] = 'groq/llama-3.1-70b-versatile'
print_warning(f"Invalid selection, using default: groq/llama-3.1-70b-versatile")
break
print_error("Invalid API key format. It should be at least 10 characters long.")
elif provider == 'OPENROUTER':
while True:
api_key = input("Enter your OpenRouter API key: ")
if validate_api_key(api_key):
api_keys['OPENROUTER_API_KEY'] = api_key
api_keys['OPENROUTER_API_BASE'] = 'https://openrouter.ai/api/v1'
# Recommend default model
print(f"\n{Colors.CYAN}Recommended OpenRouter models:{Colors.ENDC}")
for i, model in enumerate(model_aliases['OPENROUTER'], 1):
print(f"{Colors.CYAN}[{i}] {Colors.GREEN}{model}{Colors.ENDC}")
model_choice = input("Select default model (1-3) or press Enter for gemini-2.5-flash: ").strip()
if not model_choice:
model_info['default_model'] = 'openrouter/google/gemini-2.5-flash-preview'
elif model_choice.isdigit() and 1 <= int(model_choice) <= len(model_aliases['OPENROUTER']):
model_info['default_model'] = model_aliases['OPENROUTER'][int(model_choice) - 1]
else:
model_info['default_model'] = 'openrouter/google/gemini-2.5-flash-preview'
print_warning(f"Invalid selection, using default: openrouter/google/gemini-2.5-flash-preview")
break
print_error("Invalid API key format. It should be at least 10 characters long.")
elif provider == 'AWS_BEDROCK':
print_info("For AWS Bedrock, you'll need AWS credentials and region")
aws_access_key = input("Enter your AWS Access Key ID: ")
aws_secret_key = input("Enter your AWS Secret Access Key: ")
aws_region = input("Enter your AWS Region (e.g., us-west-2): ") or "us-west-2"
if aws_access_key and aws_secret_key:
api_keys['AWS_ACCESS_KEY_ID'] = aws_access_key
api_keys['AWS_SECRET_ACCESS_KEY'] = aws_secret_key
api_keys['AWS_REGION_NAME'] = aws_region
# Recommend default model for AWS Bedrock
print(f"\n{Colors.CYAN}Recommended AWS Bedrock models:{Colors.ENDC}")
for i, model in enumerate(model_aliases['AWS_BEDROCK'], 1):
print(f"{Colors.CYAN}[{i}] {Colors.GREEN}{model}{Colors.ENDC}")
model_choice = input("Select default model (1-2) or press Enter for claude-3-7-sonnet: ").strip()
if not model_choice or model_choice == '1':
model_info['default_model'] = 'bedrock/anthropic.claude-3-7-sonnet-20250219-v1:0'
elif model_choice == '2':
model_info['default_model'] = 'bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0'
else:
model_info['default_model'] = 'bedrock/anthropic.claude-3-7-sonnet-20250219-v1:0'
print_warning(f"Invalid selection, using default: bedrock/anthropic.claude-3-7-sonnet-20250219-v1:0")
else:
print_warning("AWS credentials incomplete, Bedrock will not be configured correctly")
# If no default model has been set, check which provider was selected and set an appropriate default
if 'default_model' not in model_info:
if 'ANTHROPIC_API_KEY' in api_keys:
model_info['default_model'] = 'anthropic/claude-3-7-sonnet-latest'
elif 'OPENAI_API_KEY' in api_keys:
model_info['default_model'] = 'openai/gpt-4o'
elif 'OPENROUTER_API_KEY' in api_keys:
model_info['default_model'] = 'openrouter/google/gemini-2.5-flash-preview'
elif 'GROQ_API_KEY' in api_keys:
model_info['default_model'] = 'groq/llama-3.1-70b-versatile'
elif 'AWS_ACCESS_KEY_ID' in api_keys:
model_info['default_model'] = 'bedrock/anthropic.claude-3-7-sonnet-20250219-v1:0'
print_success(f"Using {model_info['default_model']} as the default model")
# Add the default model to the API keys dictionary
api_keys['MODEL_TO_USE'] = model_info['default_model']
return api_keys
def collect_search_api_keys():
"""Collect search API keys (required)"""
print_info("You'll need to obtain API keys for search and web scraping")
print_info("Visit https://tavily.com/ to get a Tavily API key")
print_info("Visit https://firecrawl.dev/ to get a Firecrawl API key")
while True:
tavily_api_key = input("Enter your Tavily API key: ")
if validate_api_key(tavily_api_key):
break
print_error("Invalid API key format. It should be at least 10 characters long.")
while True:
firecrawl_api_key = input("Enter your Firecrawl API key: ")
if validate_api_key(firecrawl_api_key):
break
print_error("Invalid API key format. It should be at least 10 characters long.")
# Ask if user is self-hosting Firecrawl
is_self_hosted = input("Are you self-hosting Firecrawl? (y/n): ").lower().strip() == 'y'
firecrawl_url = "https://api.firecrawl.dev" # Default URL
if is_self_hosted:
while True:
custom_url = input("Enter your Firecrawl URL (e.g., https://your-firecrawl-instance.com): ")
if validate_url(custom_url):
firecrawl_url = custom_url
break
print_error("Invalid URL format. Please enter a valid URL.")
return {
'TAVILY_API_KEY': tavily_api_key,
'FIRECRAWL_API_KEY': firecrawl_api_key,
'FIRECRAWL_URL': firecrawl_url,
}
def collect_rapidapi_keys():
"""Collect RapidAPI key (optional)"""
print_info("To enable API services like LinkedIn and others, you'll need a RapidAPI key")
print_info("Each service requires individual activation in your RapidAPI account:")
print_info("1. Locate the service's `base_url` in its corresponding file (e.g., https://linkedin-data-scraper.p.rapidapi.com in backend/agent/tools/data_providers/LinkedinProvider.py)")
print_info("2. Visit that specific API on the RapidAPI marketplace")
print_info("3. Subscribe to the service (many offer free tiers with limited requests)")
print_info("4. Once subscribed, the service will be available to your agent through the API Services tool")
print_info("A RapidAPI key is optional for API services like LinkedIn")
print_info("Visit https://rapidapi.com/ to get your API key if needed")
print_info("You can leave this blank and add it later if desired")
rapid_api_key = input("Enter your RapidAPI key (optional, press Enter to skip): ")
# Allow empty key
if not rapid_api_key:
print_info("Skipping RapidAPI key setup. You can add it later if needed.")
else:
# Validate if not empty
if not validate_api_key(rapid_api_key, allow_empty=True):
print_warning("The API key format seems invalid, but continuing anyway.")
return {
'RAPID_API_KEY': rapid_api_key,
}
def configure_backend_env(env_vars, use_docker=True):
"""Configure backend .env file"""
env_path = os.path.join('backend', '.env')
# Redis configuration (based on deployment method)
redis_host = 'redis' if use_docker else 'localhost'
redis_config = {
'REDIS_HOST': redis_host,
'REDIS_PORT': '6379',
'REDIS_PASSWORD': '',
'REDIS_SSL': 'false',
}
# RabbitMQ configuration (based on deployment method)
rabbitmq_host = 'rabbitmq' if use_docker else 'localhost'
rabbitmq_config = {
'RABBITMQ_HOST': rabbitmq_host,
'RABBITMQ_PORT': '5672',
}
# Organize all configuration
all_config = {}
# Create a string with the formatted content
env_content = """# Generated by Suna setup script
# Environment Mode
# Valid values: local, staging, production
ENV_MODE=local
#DATABASE
"""
# Supabase section
for key, value in env_vars['supabase'].items():
env_content += f"{key}={value}\n"
# Redis section
env_content += "\n# REDIS\n"
for key, value in redis_config.items():
env_content += f"{key}={value}\n"
# RabbitMQ section
env_content += "\n# RABBITMQ\n"
for key, value in rabbitmq_config.items():
env_content += f"{key}={value}\n"
# LLM section
env_content += "\n# LLM Providers:\n"
# Add empty values for all LLM providers we support
all_llm_keys = ['ANTHROPIC_API_KEY', 'OPENAI_API_KEY', 'GROQ_API_KEY', 'OPENROUTER_API_KEY', 'MODEL_TO_USE']
# Add AWS keys separately
aws_keys = ['AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY', 'AWS_REGION_NAME']
# First add the keys that were provided
for key, value in env_vars['llm'].items():
if key in all_llm_keys:
env_content += f"{key}={value}\n"
# Remove from the list once added
all_llm_keys.remove(key)
# Add empty values for any remaining LLM keys
for key in all_llm_keys:
env_content += f"{key}=\n"
# AWS section
env_content += "\n# AWS Bedrock\n"
for key in aws_keys:
value = env_vars['llm'].get(key, '')
env_content += f"{key}={value}\n"
# Additional OpenRouter params
if 'OR_SITE_URL' in env_vars['llm'] or 'OR_APP_NAME' in env_vars['llm']:
env_content += "\n# OpenRouter Additional Settings\n"
if 'OR_SITE_URL' in env_vars['llm']:
env_content += f"OR_SITE_URL={env_vars['llm']['OR_SITE_URL']}\n"
if 'OR_APP_NAME' in env_vars['llm']:
env_content += f"OR_APP_NAME={env_vars['llm']['OR_APP_NAME']}\n"
# DATA APIs section
env_content += "\n# DATA APIS\n"
for key, value in env_vars['rapidapi'].items():
env_content += f"{key}={value}\n"
# Web search section
env_content += "\n# WEB SEARCH\n"
tavily_key = env_vars['search'].get('TAVILY_API_KEY', '')
env_content += f"TAVILY_API_KEY={tavily_key}\n"
# Web scrape section
env_content += "\n# WEB SCRAPE\n"
firecrawl_key = env_vars['search'].get('FIRECRAWL_API_KEY', '')
firecrawl_url = env_vars['search'].get('FIRECRAWL_URL', '')
env_content += f"FIRECRAWL_API_KEY={firecrawl_key}\n"
env_content += f"FIRECRAWL_URL={firecrawl_url}\n"
# Daytona section
env_content += "\n# Sandbox container provider:\n"
for key, value in env_vars['daytona'].items():
env_content += f"{key}={value}\n"
# Add next public URL at the end
env_content += f"NEXT_PUBLIC_URL=http://localhost:3000\n"
# Write to file
with open(env_path, 'w') as f:
f.write(env_content)
print_success(f"Backend .env file created at {env_path}")
print_info(f"Redis host is set to: {redis_host}")
print_info(f"RabbitMQ host is set to: {rabbitmq_host}")
def configure_frontend_env(env_vars, use_docker=True):
"""Configure frontend .env.local file"""
env_path = os.path.join('frontend', '.env.local')
# Use the appropriate backend URL based on start method
backend_url = "http://backend:8000/api" if use_docker else "http://localhost:8000/api"
config = {
'NEXT_PUBLIC_SUPABASE_URL': env_vars['supabase']['SUPABASE_URL'],
'NEXT_PUBLIC_SUPABASE_ANON_KEY': env_vars['supabase']['SUPABASE_ANON_KEY'],
'NEXT_PUBLIC_BACKEND_URL': backend_url,
'NEXT_PUBLIC_URL': 'http://localhost:3000',
}
# Write to file
with open(env_path, 'w') as f:
for key, value in config.items():
f.write(f"{key}={value}\n")
print_success(f"Frontend .env.local file created at {env_path}")
print_info(f"Backend URL is set to: {backend_url}")
def setup_supabase():
"""Setup Supabase database"""
print_info("Setting up Supabase database...")
# Check if the Supabase CLI is installed
try:
subprocess.run(
['supabase', '--version'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=True
)
except (subprocess.SubprocessError, FileNotFoundError):
print_error("Supabase CLI is not installed.")
print_info("Please install it by following instructions at https://supabase.com/docs/guides/cli/getting-started")
print_info("After installing, run this setup again")
sys.exit(1)
# Extract project reference from Supabase URL
supabase_url = os.environ.get('SUPABASE_URL')
if not supabase_url:
# Get from main function if environment variable not set
env_path = os.path.join('backend', '.env')
if os.path.exists(env_path):
with open(env_path, 'r') as f:
for line in f:
if line.startswith('SUPABASE_URL='):
supabase_url = line.strip().split('=', 1)[1]
break
project_ref = None
if supabase_url:
# Extract project reference from URL (format: https://[project_ref].supabase.co)
match = re.search(r'https://([^.]+)\.supabase\.co', supabase_url)
if match:
project_ref = match.group(1)
print_success(f"Extracted project reference '{project_ref}' from your Supabase URL")
# If extraction failed, ask the user
if not project_ref:
print_info("Could not extract project reference from Supabase URL")
print_info("Get your Supabase project reference from the Supabase dashboard")
print_info("It's the portion after 'https://' and before '.supabase.co' in your project URL")
project_ref = input("Enter your Supabase project reference: ")
# Change the working directory to backend
backend_dir = os.path.join(os.getcwd(), 'backend')
print_info(f"Changing to backend directory: {backend_dir}")
try:
# Login to Supabase CLI (interactive)
print_info("Logging into Supabase CLI...")
subprocess.run(['supabase', 'login'], check=True)
# Link to project
print_info(f"Linking to Supabase project {project_ref}...")
subprocess.run(
['supabase', 'link', '--project-ref', project_ref],
cwd=backend_dir,
check=True
)
# Push database migrations
print_info("Pushing database migrations...")
subprocess.run(['supabase', 'db', 'push'], cwd=backend_dir, check=True)
print_success("Supabase database setup completed")
# Reminder for manual step
print_warning("IMPORTANT: You need to manually expose the 'basejump' schema in Supabase")
print_info("Go to the Supabase web platform -> choose your project -> Project Settings -> Data API")
print_info("In the 'Exposed Schema' section, add 'basejump' if not already there")
input("Press Enter once you've completed this step...")
except subprocess.SubprocessError as e:
print_error(f"Failed to setup Supabase: {e}")
sys.exit(1)
def install_dependencies():
"""Install frontend and backend dependencies"""
print_info("Installing required dependencies...")
try:
# Install frontend dependencies
print_info("Installing frontend dependencies...")
subprocess.run(
['npm', 'install'],
cwd='frontend',
check=True
)
print_success("Frontend dependencies installed successfully")
# Lock dependencies
print_info("Locking dependencies...")
subprocess.run(
['poetry', 'lock'],
cwd='backend',
check=True
)
# Install backend dependencies
print_info("Installing backend dependencies...")
subprocess.run(
['poetry', 'install'],
cwd='backend',
check=True
)
print_success("Backend dependencies installed successfully")
return True
except subprocess.SubprocessError as e:
print_error(f"Failed to install dependencies: {e}")
print_info("You may need to install them manually.")
return False
def start_suna():
"""Start Suna using Docker Compose or manual startup"""
print_info("You can start Suna using either Docker Compose or by manually starting the frontend, backend and worker.")
print(f"\n{Colors.CYAN}How would you like to start Suna?{Colors.ENDC}")
print(f"{Colors.CYAN}[1] {Colors.GREEN}Docker Compose{Colors.ENDC} {Colors.CYAN}(recommended, starts all services){Colors.ENDC}")
print(f"{Colors.CYAN}[2] {Colors.GREEN}Manual startup{Colors.ENDC} {Colors.CYAN}(requires Redis, RabbitMQ & separate terminals){Colors.ENDC}\n")
while True:
start_method = input("Enter your choice (1 or 2): ")
if start_method in ["1", "2"]:
break
print_error("Invalid selection. Please enter '1' for Docker Compose or '2' for Manual startup.")
use_docker = start_method == "1"
if use_docker:
print_info("Starting Suna with Docker Compose...")
try:
# TODO: uncomment when we have pre-built images on Docker Hub or GHCR
# GitHub repository environment variable setup
# github_repo = None
# print(f"\n{Colors.CYAN}Do you want to use pre-built images or build locally?{Colors.ENDC}")
# print(f"{Colors.CYAN}[1] {Colors.GREEN}Pre-built images{Colors.ENDC} {Colors.CYAN}(faster){Colors.ENDC}")
# print(f"{Colors.CYAN}[2] {Colors.GREEN}Build locally{Colors.ENDC} {Colors.CYAN}(customizable){Colors.ENDC}\n")
# while True:
# build_choice = input("Enter your choice (1 or 2): ")
# if build_choice in ["1", "2"]:
# break
# print_error("Invalid selection. Please enter '1' for pre-built images or '2' for building locally.")
# use_prebuilt = build_choice == "1"
# if use_prebuilt:
# # Get GitHub repository name from user
# print_info("For pre-built images, you need to specify a GitHub repository name")
# print_info("Example format: your-github-username/repo-name")
# github_repo = input("Enter GitHub repository name: ")
# if not github_repo or "/" not in github_repo:
# print_warning("Invalid GitHub repository format. Using a default value.")
# # Create a random GitHub repository name as fallback
# random_name = ''.join(random.choices(string.ascii_lowercase, k=8))
# github_repo = f"user/{random_name}"
# # Set the environment variable
# os.environ["GITHUB_REPOSITORY"] = github_repo
# print_info(f"Using GitHub repository: {github_repo}")
# # Start with pre-built images
# print_info("Using pre-built images...")
# subprocess.run(['docker', 'compose', '-f', 'docker-compose.ghcr.yaml', 'up', '-d'], check=True)
# else:
# # Start with docker-compose (build images locally)
# print_info("Building images locally...")
# subprocess.run(['docker', 'compose', 'up', '-d'], check=True)
print_info("Building images locally...")
subprocess.run(['docker', 'compose', 'up', '-d'], check=True)
# Wait for services to be ready
print_info("Waiting for services to start...")
time.sleep(10) # Give services some time to start
# Check if services are running
result = subprocess.run(
['docker', 'compose', 'ps'],
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
check=True,
text=True
)
if "backend" in result.stdout and "frontend" in result.stdout:
print_success("Suna services are up and running!")
else:
print_warning("Some services might not be running correctly. Check 'docker compose ps' for details.")
except subprocess.SubprocessError as e:
print_error(f"Failed to start Suna: {e}")
sys.exit(1)
return use_docker
else:
print_info("For manual startup, you'll need to:")
print_info("1. Start Redis and RabbitMQ in Docker (required for the backend)")
print_info("2. Start the frontend with npm run dev")
print_info("3. Start the backend with poetry run python3.11 api.py")
print_info("4. Start the worker with poetry run python3.11 -m dramatiq run_agent_background")
print_warning("Note: Redis and RabbitMQ must be running before starting the backend")
print_info("Detailed instructions will be provided at the end of setup")
return use_docker
def final_instructions(use_docker=True, env_vars=None):
"""Show final instructions"""
print(f"\n{Colors.GREEN}{Colors.BOLD}✨ Suna Setup Complete! ✨{Colors.ENDC}\n")
# Display LLM configuration info if available
if env_vars and 'llm' in env_vars and 'MODEL_TO_USE' in env_vars['llm']:
default_model = env_vars['llm']['MODEL_TO_USE']
print_info(f"Suna is configured to use {Colors.GREEN}{default_model}{Colors.ENDC} as the default LLM model")
if use_docker:
print_info("Your Suna instance is now running!")
print_info("Access it at: http://localhost:3000")
print_info("Create an account using Supabase authentication to start using Suna")
print("\nUseful Docker commands:")
print(f"{Colors.CYAN} docker compose ps{Colors.ENDC} - Check the status of Suna services")
print(f"{Colors.CYAN} docker compose logs{Colors.ENDC} - View logs from all services")
print(f"{Colors.CYAN} docker compose logs -f{Colors.ENDC} - Follow logs from all services")
print(f"{Colors.CYAN} docker compose down{Colors.ENDC} - Stop Suna services")
print(f"{Colors.CYAN} docker compose up -d{Colors.ENDC} - Start Suna services (after they've been stopped)")
else:
print_info("Suna setup is complete but services are not running yet.")
print_info("To start Suna, you need to:")
print_info("1. Start Redis and RabbitMQ (required for backend):")
print(f"{Colors.CYAN}   docker compose up redis rabbitmq -d{Colors.ENDC}")
print_info("2. In one terminal:")
print(f"{Colors.CYAN} cd frontend")
print(f" npm run dev{Colors.ENDC}")
print_info("3. In another terminal:")
print(f"{Colors.CYAN} cd backend")
print(f" poetry run python3.11 api.py{Colors.ENDC}")
print_info("4. In one more terminal:")
print(f"{Colors.CYAN} cd backend")
print(f" poetry run python3.11 -m dramatiq run_agent_background{Colors.ENDC}")
print_info("5. Once all services are running, access Suna at: http://localhost:3000")
print_info("6. Create an account using Supabase authentication to start using Suna")
def main():
    total_steps = 8  # The clone step is skipped, since this script runs from inside the repository
    current_step = 1

    # Print banner
    print_banner()
    print("This wizard will guide you through setting up Suna, an open-source generalist AI agent.\n")

    # Step 1: Check requirements
    print_step(current_step, total_steps, "Checking requirements")
    check_requirements()
    check_docker_running()

    # Verify that we're running from the Suna repository root
    if not check_suna_directory():
        print_error("This setup script must be run from the Suna repository root directory.")
        print_info("Please clone the repository first with:")
        print_info("  git clone https://github.com/kortix-ai/suna.git")
        print_info("  cd suna")
        print_info("Then run this setup script again.")
        sys.exit(1)
    current_step += 1

    # Collect all environment variables
    print_step(current_step, total_steps, "Collecting Supabase information")
    supabase_info = collect_supabase_info()
    # Set the Supabase URL in the environment for later use
    os.environ['SUPABASE_URL'] = supabase_info['SUPABASE_URL']
    current_step += 1

    print_step(current_step, total_steps, "Collecting Daytona information")
    daytona_info = collect_daytona_info()
    current_step += 1

    print_step(current_step, total_steps, "Collecting LLM API keys")
    llm_api_keys = collect_llm_api_keys()
    current_step += 1

    print_step(current_step, total_steps, "Collecting search and web scraping API keys")
    search_api_keys = collect_search_api_keys()
    current_step += 1

    print_step(current_step, total_steps, "Collecting RapidAPI key")
    rapidapi_keys = collect_rapidapi_keys()
    current_step += 1

    # Combine all environment variables
    env_vars = {
        'supabase': supabase_info,
        'daytona': daytona_info,
        'llm': llm_api_keys,
        'search': search_api_keys,
        'rapidapi': rapidapi_keys,
    }

    # Set up the Supabase database
    setup_supabase()

    # Install dependencies before starting Suna
    print_step(current_step, total_steps, "Installing dependencies")
    install_dependencies()
    current_step += 1

    # Configure environment files with the correct settings before starting
    print_info("Configuring environment files...")
    configure_backend_env(env_vars, True)  # Always generate the Docker configuration first
    configure_frontend_env(env_vars, True)

    # Now ask how to start Suna
    print_step(current_step, total_steps, "Starting Suna")
    use_docker = start_suna()

    # Regenerate the environment files if a non-Docker setup was chosen
    if not use_docker:
        print_info("Updating environment files for manual startup...")
        configure_backend_env(env_vars, use_docker)
        configure_frontend_env(env_vars, use_docker)

    # Final instructions
    final_instructions(use_docker, env_vars)


if __name__ == "__main__":
    try:
        main()
    except KeyboardInterrupt:
        print("\n\nSetup interrupted. You can resume setup anytime by running this script again.")
        sys.exit(1)