Revert "add cursor rules"

Marko Kraemer 2025-07-28 08:08:40 +02:00 committed by GitHub
parent 7844380ea2
commit d47dd64ff9
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
5 changed files with 156 additions and 1220 deletions


@@ -1,312 +0,0 @@
---
globs: backend/**/*
alwaysApply: false
---
# Backend Development Guidelines
## Python Standards & Best Practices
### Language Features
- Use Python 3.11+ features and type hints consistently
- Follow PEP 8 style guidelines with black formatting
- Use async/await for all I/O operations
- Leverage dataclasses and Pydantic models for structure
- Use context managers for resource management
### Type Safety
- Comprehensive type hints for all functions and classes
- Use `typing` module for complex types (Union, Optional, List, Dict)
- Define custom types for domain concepts
- Use `typing.Protocol` for interface definitions
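A minimal sketch of these conventions — domain types via `NewType` and a structural interface via `typing.Protocol` (all names here are illustrative, not from the codebase):

```python
from typing import NewType, Protocol

# Custom types for domain concepts
AgentId = NewType("AgentId", str)

class AgentRepository(Protocol):
    """Interface definition: any class with a matching get_name satisfies it."""
    def get_name(self, agent_id: AgentId) -> str: ...

class InMemoryAgentRepository:
    """Concrete implementation; no explicit inheritance from the Protocol is required."""
    def __init__(self) -> None:
        self._agents: dict[AgentId, str] = {}

    def add(self, agent_id: AgentId, name: str) -> None:
        self._agents[agent_id] = name

    def get_name(self, agent_id: AgentId) -> str:
        return self._agents[agent_id]

def describe(repo: AgentRepository, agent_id: AgentId) -> str:
    # Static checkers verify structural compatibility against the Protocol
    return f"agent {agent_id}: {repo.get_name(agent_id)}"
```

Because `Protocol` compatibility is structural, implementations stay decoupled from the interface module.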
## FastAPI Architecture Patterns
### API Design
- Use dependency injection for database connections and services
- Implement proper request/response models with Pydantic v2
- Follow RESTful API design principles
- Use FastAPI's automatic OpenAPI documentation
- Implement proper HTTP status codes and error responses
### Route Organization
- Group related routes in separate modules
- Use APIRouter for modular route organization
- Implement consistent error handling middleware
- Use dependencies for authentication and authorization
### Example Patterns
```python
# Route with proper dependency injection
@router.post("/agents", response_model=AgentResponse)
async def create_agent(
    agent_data: AgentCreateRequest,
    db: DBConnection = Depends(get_db),
    user: UserClaims = Depends(get_current_user),
) -> AgentResponse:
    try:
        agent = await agent_service.create_agent(agent_data, user.id)
        # Pydantic v2: model_validate replaces the deprecated from_orm
        return AgentResponse.model_validate(agent)
    except ValueError as e:
        raise HTTPException(status_code=400, detail=str(e)) from e
```
## Database Integration
### Supabase Patterns
- Use proper SQL migrations in `backend/supabase/migrations/`
- Follow established schema patterns with UUID primary keys
- Implement row-level security (RLS) for all user-accessible tables
- Use proper indexing for performance optimization
### Migration Best Practices
```sql
-- Idempotent migration pattern
BEGIN;

-- Create table with proper constraints
CREATE TABLE IF NOT EXISTS example_table (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES auth.users(id) ON DELETE CASCADE,
    name VARCHAR(255) NOT NULL,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW(),
    CONSTRAINT example_table_name_not_empty CHECK (LENGTH(TRIM(name)) > 0)
);

-- Create indexes for performance
CREATE INDEX IF NOT EXISTS idx_example_table_user_id ON example_table(user_id);
CREATE INDEX IF NOT EXISTS idx_example_table_created_at ON example_table(created_at);

-- Enable RLS
ALTER TABLE example_table ENABLE ROW LEVEL SECURITY;

-- Create RLS policies (drop first so the migration stays idempotent)
DROP POLICY IF EXISTS "Users can manage their own records" ON example_table;
CREATE POLICY "Users can manage their own records" ON example_table
    FOR ALL USING (auth.uid() = user_id);

-- Create trigger for updated_at
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS update_example_table_updated_at ON example_table;
CREATE TRIGGER update_example_table_updated_at
    BEFORE UPDATE ON example_table
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

COMMIT;
```
## Tool Development Framework
### Tool Base Classes
- Extend `AgentBuilderBaseTool` for agent builder tools
- Extend `Tool` for general agent tools
- Use proper inheritance patterns and method overrides
### Tool Schema Implementation
```python
class ExampleTool(AgentBuilderBaseTool):
    @openapi_schema({
        "type": "function",
        "function": {
            "name": "example_action",
            "description": "Perform an example action with detailed description",
            "parameters": {
                "type": "object",
                "properties": {
                    "param1": {
                        "type": "string",
                        "description": "First parameter with clear explanation"
                    },
                    "param2": {
                        "type": "integer",
                        "description": "Second parameter with default value",
                        "default": 0
                    }
                },
                "required": ["param1"]
            }
        }
    })
    @xml_schema(
        tag_name="example-action",
        mappings=[
            {"param_name": "param1", "node_type": "attribute", "path": "."},
            {"param_name": "param2", "node_type": "attribute", "path": "."}
        ]
    )
    async def example_action(self, param1: str, param2: int = 0) -> ToolResult:
        try:
            logger.info(f"Executing example_action with params: {param1}, {param2}")
            # Implementation logic here
            result = await self.perform_action(param1, param2)
            return self.success_response(
                result=result,
                message=f"Successfully completed action for {param1}"
            )
        except Exception as e:
            logger.error(f"Tool execution failed: {e}", exc_info=True)
            return self.fail_response(f"Failed to perform action: {str(e)}")
```
### Tool Registration
- Use `AgentBuilderToolRegistry` pattern for registering tools
- Follow `MCPToolWrapper` patterns for external tool integration
- Use `DynamicToolBuilder` for runtime tool creation
- Implement proper tool discovery and validation
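The registry classes above are project-specific; a minimal sketch of the register/discover/validate pattern they imply might look like this (all names hypothetical, not the actual `AgentBuilderToolRegistry` API):

```python
from typing import Callable, Dict

class ToolRegistry:
    """Hypothetical minimal registry illustrating registration and discovery."""
    def __init__(self) -> None:
        self._tools: Dict[str, Callable] = {}

    def register(self, name: str):
        def decorator(fn: Callable) -> Callable:
            # Validation: refuse duplicate registrations
            if name in self._tools:
                raise ValueError(f"Tool already registered: {name}")
            self._tools[name] = fn
            return fn
        return decorator

    def discover(self) -> list[str]:
        return sorted(self._tools)

    def get(self, name: str) -> Callable:
        # Fail loudly on unknown tools instead of silently doing nothing
        if name not in self._tools:
            raise KeyError(f"Unknown tool: {name}")
        return self._tools[name]

registry = ToolRegistry()

@registry.register("echo")
def echo(text: str) -> str:
    return text
```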
## LLM Integration & Agent System
### LiteLLM Usage
- Use LiteLLM for multi-provider support (Anthropic, OpenAI, etc.)
- Implement proper prompt templates and formatting
- Handle rate limits and retries gracefully
- Use structured outputs when available
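Graceful rate-limit handling can be sketched provider-agnostically. LiteLLM ships its own retry support, so treat this as the underlying backoff pattern rather than the library API (`RateLimitError` here is a stand-in for the provider's exception type):

```python
import random
import time
from typing import Callable, TypeVar

T = TypeVar("T")

class RateLimitError(Exception):
    """Stand-in for a provider rate-limit error."""

def call_with_retries(fn: Callable[[], T], max_attempts: int = 4, base_delay: float = 0.01) -> T:
    """Retry with exponential backoff and jitter; re-raise after the last attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff: base * 2^attempt, plus jitter to avoid thundering herds
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    raise RuntimeError("unreachable")
```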
### Agent Threading
- Use ThreadManager for conversation management
- Implement proper message threading and context
- Handle tool execution timeouts gracefully
- Use narrative communication style for user updates
### Background Jobs
- Use Dramatiq for async processing
- Implement proper task queuing and retries
- Use QStash for scheduled tasks
- Handle job failures and monitoring
## Security & Authentication
### JWT Validation
- Validate JWT tokens without signature verification for Supabase
- Extract user claims properly from tokens
- Implement proper role-based access control
- Handle token expiration and refresh
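The claim-extraction pattern above can be illustrated with the standard library alone; these helpers are illustrative, not the project's actual utilities, and skipping signature verification is only appropriate when the token source (here, Supabase) is already trusted:

```python
import base64
import json
import time
from typing import Optional

def decode_jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying the signature."""
    payload_b64 = token.split(".")[1]
    # Restore the base64 padding stripped by JWT encoding
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def is_expired(claims: dict, now: Optional[float] = None) -> bool:
    """Check the exp claim against the current (or supplied) time."""
    return float(claims.get("exp", 0)) <= (now if now is not None else time.time())
```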
### Credential Management
- Use environment variables for all API keys
- Encrypt sensitive data like MCP credentials using Fernet
- Implement secure credential storage in database
- Rotate credentials regularly
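A minimal Fernet sketch for the credential-encryption bullet; sourcing the key from a `CREDENTIAL_ENCRYPTION_KEY` environment variable is an assumption, and in production the key must come from configuration, never be generated at import time:

```python
import os
from cryptography.fernet import Fernet

# Assumption: the key is provided via an environment variable in production.
# The generate_key() fallback is for demonstration only.
key = os.environ.get("CREDENTIAL_ENCRYPTION_KEY") or Fernet.generate_key()
fernet = Fernet(key)

def encrypt_credential(plaintext: str) -> bytes:
    """Encrypt a sensitive value (e.g. an MCP credential) before storing it."""
    return fernet.encrypt(plaintext.encode())

def decrypt_credential(token: bytes) -> str:
    """Decrypt a previously stored credential."""
    return fernet.decrypt(token).decode()
```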
### Input Validation
- Validate all inputs using Pydantic models
- Sanitize user inputs to prevent injection attacks
- Implement rate limiting for API endpoints
- Use CORS policies appropriately
## Error Handling & Logging
### Structured Logging
```python
from utils.logger import logger

# Example usage: bind context as keyword arguments
logger.info(
    "Agent execution started",
    agent_id=agent_id,
    user_id=user_id,
    trace_id=trace_id,
)
```
### Error Handling Patterns
- Use custom exception classes for domain errors
- Implement proper error boundaries and recovery
- Log errors with appropriate context
- Return user-friendly error messages
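A compact illustration of domain exceptions plus an error boundary that maps them to user-friendly responses (class and function names are hypothetical):

```python
class DomainError(Exception):
    """Base class for domain errors; carries a user-facing message and status code."""
    status_code = 500

    def __init__(self, message: str):
        super().__init__(message)
        self.message = message

class AgentNotFoundError(DomainError):
    status_code = 404

class QuotaExceededError(DomainError):
    status_code = 429

def to_error_response(exc: Exception) -> dict:
    """Error boundary: known domain errors become structured responses."""
    if isinstance(exc, DomainError):
        return {"status": exc.status_code, "detail": exc.message}
    # Unknown errors: log with full context elsewhere, return a generic message
    return {"status": 500, "detail": "Internal server error"}
```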
### Monitoring & Observability
- Use Langfuse for LLM call tracing
- Integrate Sentry for error tracking
- Implement health checks for services
- Use Prometheus metrics for monitoring
## Performance Optimization
### Async Patterns
- Use async/await consistently for I/O operations
- Implement proper connection pooling
- Use Redis for caching frequently accessed data
- Optimize database queries with proper indexing
### Resource Management
- Use context managers for database connections
- Implement proper timeout handling
- Use connection pooling for external APIs
- Monitor memory usage and clean up resources
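The context-manager and timeout bullets can be sketched with a semaphore-bounded pool; this is a toy stand-in for real asyncpg/redis pools, showing the acquire-with-timeout and guaranteed-release shape:

```python
import asyncio
from contextlib import asynccontextmanager

class ConnectionPool:
    """Minimal pool sketch: a semaphore bounds concurrent connections."""
    def __init__(self, size: int) -> None:
        self._sem = asyncio.Semaphore(size)
        self.in_use = 0

    @asynccontextmanager
    async def acquire(self, timeout: float = 5.0):
        # Timeout handling: never wait forever for a free connection
        await asyncio.wait_for(self._sem.acquire(), timeout=timeout)
        self.in_use += 1
        try:
            yield f"conn-{self.in_use}"
        finally:
            # Release runs even if the caller raises
            self.in_use -= 1
            self._sem.release()

async def main() -> int:
    pool = ConnectionPool(size=2)
    async with pool.acquire() as conn:
        assert conn.startswith("conn-")
        return pool.in_use
```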
## Testing Strategies
### Unit Testing
- Use pytest with async support
- Mock external dependencies properly
- Test error conditions and edge cases
- Maintain high test coverage for critical paths
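A minimal async unit test in this style, with the external LLM dependency mocked; `run_agent` is a hypothetical function under test, and under pytest-asyncio the test coroutine would be collected automatically rather than run by hand:

```python
import asyncio
from unittest.mock import AsyncMock

async def run_agent(llm, prompt: str) -> str:
    """Function under test: delegates to an injected LLM client."""
    reply = await llm.complete(prompt)
    return reply.strip()

async def test_run_agent_strips_whitespace():
    llm = AsyncMock()
    llm.complete.return_value = "  hello  "  # mock the external dependency
    assert await run_agent(llm, "hi") == "hello"
    llm.complete.assert_awaited_once_with("hi")
```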
### Integration Testing
- Test API endpoints with real database
- Use test fixtures for consistent data
- Test authentication and authorization flows
- Validate tool execution and responses
## Key Dependencies & Versions
### Core Framework
- FastAPI 0.115+ for API framework
- Python 3.11+ with latest type hints
- Pydantic 2.x for data validation
- Uvicorn for ASGI server
### Database & Storage
- Supabase 2.17+ for database and auth
- Redis 5.2+ for caching and sessions
- PostgreSQL via Supabase with RLS
### Agent & LLM
- LiteLLM 1.72+ for LLM integration
- Dramatiq 1.18+ for background jobs
- Langfuse for observability
- Sentry for error tracking
### Security & Utilities
- PyJWT for token validation
- Cryptography for encryption
- APScheduler for task scheduling
- Structlog for structured logging


@@ -1,143 +0,0 @@
---
globs: frontend/**/*
alwaysApply: false
---
# Frontend Development Guidelines
## TypeScript Standards
- Use TypeScript strictly - no `any` types unless absolutely necessary
- Define proper interfaces and types for all components and functions
- Use type imports: `import type { ComponentProps } from 'react'`
- Leverage TypeScript 5+ features like `satisfies` operator
## Next.js App Router
- Use App Router with `app/` directory structure
- Follow file naming: kebab-case for files, PascalCase for components
- Organize components in feature-based folders
- Keep reusable components in `src/components/`
## UI Framework - shadcn/ui Default
### Setup
```bash
npx shadcn-ui@latest init
npx shadcn-ui@latest add button input form card dropdown-menu dialog
```
### Usage
```typescript
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";

// Use shadcn/ui components directly
const AgentCard = ({ agent }: { agent: Agent }) => (
  <Card>
    <CardHeader>
      <CardTitle>{agent.name}</CardTitle>
    </CardHeader>
    <CardContent>
      <p>{agent.description}</p>
      <Button>Run Agent</Button>
    </CardContent>
  </Card>
);
```
## State Management
- **Server State**: `@tanstack/react-query` for data fetching
- **Local State**: React hooks (`useState`, `useReducer`)
- **Forms**: React Hook Form with Zod validation
```typescript
// Query pattern
function useAgents() {
  return useQuery({
    queryKey: ["agents"],
    queryFn: () => agentService.getAgents(),
  });
}

// Form pattern with shadcn/ui
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import {
  Form,
  FormField,
  FormItem,
  FormLabel,
  FormControl,
} from "@/components/ui/form";

const form = useForm({
  resolver: zodResolver(schema),
});
```
## Supabase Integration
```typescript
// Auth hook
function useAuth() {
  const [user, setUser] = useState<User | null>(null);

  useEffect(() => {
    const {
      data: { subscription },
    } = supabase.auth.onAuthStateChange((event, session) =>
      setUser(session?.user ?? null)
    );
    return () => subscription.unsubscribe();
  }, []);

  return { user };
}
```
## Performance
- Use `lazy()` and `Suspense` for code splitting
- Use `memo()` and `useMemo()` for expensive computations
- Use `useCallback()` for stable function references
## Key Dependencies
### Core Framework
- Next.js 15+ with App Router and Turbopack
- React 18+ with TypeScript 5+
### UI & Styling
- shadcn/ui for components
- Tailwind CSS for styling
- Lucide React for icons
### State & Data
- @tanstack/react-query for server state
- @supabase/supabase-js for database
- react-hook-form + zod for forms
## Essential shadcn/ui Components
Add these commonly used components:
```bash
npx shadcn-ui@latest add button input textarea select checkbox form card dialog dropdown-menu badge table tabs toast
```
## Best Practices
- Use shadcn/ui components as the default choice
- Follow shadcn/ui patterns for consistent styling
- Use the `cn` utility for conditional classes
- Implement proper loading and error states
- Use semantic HTML elements
- Ensure keyboard navigation works


@@ -1,619 +0,0 @@
---
description: docker, infrastructure
alwaysApply: false
---
# Infrastructure & DevOps Guidelines
## Docker & Containerization
### Dockerfile Best Practices
```dockerfile
# Multi-stage build pattern for production optimization
FROM node:18-alpine AS frontend-builder
WORKDIR /app
COPY frontend/package*.json ./
# Install all dependencies (devDependencies are needed for the build step)
RUN npm ci
COPY frontend/ ./
RUN npm run build

FROM python:3.11-slim AS backend-base
WORKDIR /app
# curl is needed by the HEALTHCHECK below
RUN apt-get update && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*
COPY backend/pyproject.toml ./
RUN pip install -e .

FROM backend-base AS backend-production
COPY backend/ ./

# Security best practice: run as a non-root user
RUN groupadd --system --gid 1001 app \
    && useradd --system --uid 1001 --gid app app
USER app

# Health check implementation
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:8000/health || exit 1

EXPOSE 8000
# gunicorn has no --host/--port flags; bind explicitly and use the uvicorn worker for FastAPI
CMD ["gunicorn", "api:app", "-k", "uvicorn.workers.UvicornWorker", "--bind", "0.0.0.0:8000"]
```
### Docker Compose Patterns
```yaml
# Production-ready docker-compose.yml
version: "3.8"
services:
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: production
    ports:
      - "3000:3000"
    environment:
      - NODE_ENV=production
      - NEXT_PUBLIC_SUPABASE_URL=${SUPABASE_URL}
    depends_on:
      - backend
    restart: unless-stopped
    networks:
      - app-network

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    ports:
      - "8000:8000"
    environment:
      - DATABASE_URL=${DATABASE_URL}
      - REDIS_URL=${REDIS_URL}
    volumes:
      - ./backend/logs:/app/logs
    depends_on:
      redis:
        condition: service_healthy
    restart: unless-stopped
    networks:
      - app-network

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 3s
      retries: 3
    restart: unless-stopped
    networks:
      - app-network

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
      - ./ssl:/etc/nginx/ssl:ro
    depends_on:
      - frontend
      - backend
    restart: unless-stopped
    networks:
      - app-network

volumes:
  redis-data:

networks:
  app-network:
    driver: bridge
```
## Environment Management
### Environment Configuration
```bash
# .env.local (development)
NODE_ENV=development
NEXT_PUBLIC_SUPABASE_URL=http://localhost:54321
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key
DATABASE_URL=postgresql://user:pass@localhost:5432/suna_dev
REDIS_URL=redis://localhost:6379
LOG_LEVEL=debug
# .env.production
NODE_ENV=production
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
DATABASE_URL=postgresql://user:pass@prod-db:5432/suna_prod
REDIS_URL=redis://prod-redis:6379
LOG_LEVEL=info
SENTRY_DSN=your_sentry_dsn
```
### Tool Version Management (mise.toml)
```toml
[tools]
node = "18.17.0"
python = "3.11.5"
docker = "24.0.0"
docker-compose = "2.20.0"
[env]
UV_VENV = ".venv"
PYTHON_KEYRING_BACKEND = "keyring.backends.null.Keyring"
```
### Environment-Specific Scripts
```bash
#!/bin/bash
# scripts/start-dev.sh
set -e
echo "Starting development environment..."
# Check if required tools are installed
command -v docker >/dev/null 2>&1 || { echo "Docker is required but not installed. Aborting." >&2; exit 1; }
command -v docker-compose >/dev/null 2>&1 || { echo "Docker Compose is required but not installed. Aborting." >&2; exit 1; }
# Start services
docker-compose -f docker-compose.dev.yml up -d
echo "✅ Development services started"
# Wait for services to be healthy
echo "Waiting for services to be ready..."
sleep 10
# Run database migrations
docker-compose -f docker-compose.dev.yml exec backend python -m alembic upgrade head
echo "✅ Database migrations completed"
echo "🚀 Development environment is ready!"
echo "Frontend: http://localhost:3000"
echo "Backend: http://localhost:8000"
echo "Redis: localhost:6379"
```
## Deployment Strategies
### GitHub Actions CI/CD
```yaml
# .github/workflows/deploy.yml
name: Deploy to Production

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.11"
      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "18"
          cache: "npm"
          cache-dependency-path: frontend/package-lock.json
      - name: Install backend dependencies
        run: |
          cd backend
          pip install -e .
      - name: Install frontend dependencies
        run: |
          cd frontend
          npm ci
      - name: Run backend tests
        run: |
          cd backend
          pytest
      - name: Run frontend tests
        run: |
          cd frontend
          npm run test
      # Lint in separate steps: each step starts from the repo root,
      # so chained `cd backend && ... cd frontend` in one script would fail
      - name: Lint backend
        run: |
          cd backend
          python -m black --check .
      - name: Lint frontend
        run: |
          cd frontend
          npm run lint

  build-and-push:
    needs: test
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
      - name: Build and push Docker image
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

  deploy:
    needs: build-and-push
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main'
    steps:
      - name: Deploy to production
        run: |
          # Add your deployment commands here
          # e.g., kubectl apply, docker-compose up, etc.
          echo "Deploying to production..."
```
### Database Migration Management
```bash
#!/bin/bash
# scripts/migrate.sh
set -e

ENVIRONMENT=${1:-development}
echo "Running migrations for $ENVIRONMENT environment..."

case $ENVIRONMENT in
  development)
    docker-compose exec backend python -m alembic upgrade head
    ;;
  production)
    # Production migration with backup
    kubectl exec -it backend-pod -- python -m alembic upgrade head
    ;;
  *)
    echo "Unknown environment: $ENVIRONMENT"
    exit 1
    ;;
esac

echo "✅ Migrations completed for $ENVIRONMENT"
```
## Monitoring & Observability
### Health Check Endpoints
```python
# backend/health.py
import os
import time

from fastapi import APIRouter

# Assumed application-level clients: wire these to your actual database and
# Redis connections (the originals were referenced here but never defined).
from services.db import db
from services.cache import redis_client

router = APIRouter()

@router.get("/health")
async def health_check():
    """Comprehensive health check endpoint"""
    start_time = time.time()
    checks = {
        "status": "healthy",
        "timestamp": start_time,
        "version": "1.0.0",
        "environment": os.getenv("NODE_ENV", "development"),
        "checks": {},
    }

    # Database health check
    try:
        db.execute("SELECT 1")
        checks["checks"]["database"] = {"status": "healthy", "latency_ms": 0}
    except Exception as e:
        checks["status"] = "unhealthy"
        checks["checks"]["database"] = {"status": "unhealthy", "error": str(e)}

    # Redis health check
    try:
        redis_client.ping()
        checks["checks"]["redis"] = {"status": "healthy"}
    except Exception as e:
        checks["status"] = "unhealthy"
        checks["checks"]["redis"] = {"status": "unhealthy", "error": str(e)}

    checks["response_time_ms"] = (time.time() - start_time) * 1000
    return checks

@router.get("/metrics")
async def metrics():
    """Prometheus-style metrics endpoint (helper functions defined elsewhere)"""
    return {
        "active_connections": get_active_connections(),
        "memory_usage_mb": get_memory_usage(),
        "cpu_usage_percent": get_cpu_usage(),
        "request_count": get_request_count(),
    }
```
### Logging Configuration
```python
# backend/utils/logging.py
import structlog

def setup_logging(environment: str = "development"):
    """Configure structured logging"""
    processors = [
        structlog.stdlib.filter_by_level,
        structlog.stdlib.add_logger_name,
        structlog.stdlib.add_log_level,
        structlog.stdlib.PositionalArgumentsFormatter(),
        structlog.processors.TimeStamper(fmt="iso"),
        structlog.processors.StackInfoRenderer(),
        structlog.processors.format_exc_info,
    ]

    # JSON in production, human-readable console output in development
    if environment == "production":
        processors.append(structlog.processors.JSONRenderer())
    else:
        processors.append(structlog.dev.ConsoleRenderer())

    structlog.configure(
        processors=processors,
        wrapper_class=structlog.stdlib.BoundLogger,
        logger_factory=structlog.stdlib.LoggerFactory(),
        cache_logger_on_first_use=True,
    )
```
## Security & Compliance
### Security Headers (Nginx)
```nginx
# nginx.conf security configuration

# Rate limiting (limit_req_zone is only valid in the http context, not inside server)
limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;

server {
    listen 443 ssl http2;
    server_name your-domain.com;

    # SSL configuration
    ssl_certificate /etc/nginx/ssl/cert.pem;
    ssl_certificate_key /etc/nginx/ssl/key.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512;

    # Security headers
    add_header X-Frame-Options DENY always;
    add_header X-Content-Type-Options nosniff always;
    add_header X-XSS-Protection "1; mode=block" always;
    add_header Referrer-Policy "strict-origin-when-cross-origin" always;
    add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline';" always;
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

    location /api/ {
        limit_req zone=api burst=20 nodelay;
        proxy_pass http://backend:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
### Secrets Management
```bash
#!/bin/bash
# scripts/setup-secrets.sh
set -e

echo "Setting up secrets for production..."

# Create Kubernetes secrets (quote values so special characters survive word splitting)
kubectl create secret generic suna-secrets \
  --from-literal=database-url="$DATABASE_URL" \
  --from-literal=redis-url="$REDIS_URL" \
  --from-literal=supabase-key="$SUPABASE_SERVICE_KEY" \
  --from-literal=openai-api-key="$OPENAI_API_KEY" \
  --dry-run=client -o yaml | kubectl apply -f -

echo "✅ Secrets configured"
```
## Performance & Scaling
### Load Balancing Configuration
```yaml
# kubernetes/ingress.yml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: suna-ingress
  annotations:
    nginx.ingress.kubernetes.io/rewrite-target: /
    nginx.ingress.kubernetes.io/ssl-redirect: "true"
    nginx.ingress.kubernetes.io/rate-limit: "100"
    nginx.ingress.kubernetes.io/rate-limit-window: "1m"
spec:
  tls:
    - hosts:
        - suna.example.com
      secretName: suna-tls
  rules:
    - host: suna.example.com
      http:
        paths:
          - path: /api/
            pathType: Prefix
            backend:
              service:
                name: backend-service
                port:
                  number: 8000
          - path: /
            pathType: Prefix
            backend:
              service:
                name: frontend-service
                port:
                  number: 3000
```
### Auto-scaling Configuration
```yaml
# kubernetes/hpa.yml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: backend-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: backend-deployment
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
    - type: Resource
      resource:
        name: memory
        target:
          type: Utilization
          averageUtilization: 80
```
## Backup & Recovery
### Database Backup Strategy
```bash
#!/bin/bash
# scripts/backup-database.sh
set -e
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups"
: "${DATABASE_URL:?DATABASE_URL must be set}"
echo "Creating database backup..."
# Create backup with compression
pg_dump "$DATABASE_URL" | gzip > "$BACKUP_DIR/suna_backup_$TIMESTAMP.sql.gz"
# Upload to cloud storage (adjust for your provider)
aws s3 cp "$BACKUP_DIR/suna_backup_$TIMESTAMP.sql.gz" "s3://suna-backups/"
# Clean up local backups older than 7 days
find "$BACKUP_DIR" -name "suna_backup_*.sql.gz" -mtime +7 -delete
echo "✅ Database backup completed: suna_backup_$TIMESTAMP.sql.gz"
```
### Disaster Recovery Plan
```bash
#!/bin/bash
# scripts/restore-database.sh
set -e
BACKUP_FILE=${1}
if [ -z "$BACKUP_FILE" ]; then
echo "Usage: $0 <backup-file>"
exit 1
fi
echo "Restoring database from $BACKUP_FILE..."
# Download backup from cloud storage
aws s3 cp "s3://suna-backups/$BACKUP_FILE" "/tmp/$BACKUP_FILE"
# Restore database
gunzip -c "/tmp/$BACKUP_FILE" | psql "$DATABASE_URL"
echo "✅ Database restored from $BACKUP_FILE"
```
## Key Infrastructure Tools
### Container & Orchestration
- Docker 24+ for containerization
- Docker Compose for local development
- Kubernetes for production orchestration
- Helm for package management
### CI/CD & Automation
- GitHub Actions for CI/CD pipelines
- Terraform for infrastructure as code
- Ansible for configuration management
- ArgoCD for GitOps deployments
### Monitoring & Observability
- Prometheus for metrics collection
- Grafana for dashboards and visualization
- Jaeger for distributed tracing
- ELK stack for log aggregation
### Security & Compliance
- Vault for secrets management
- OWASP ZAP for security testing
- Trivy for container vulnerability scanning
- Falco for runtime security monitoring


@@ -1,146 +0,0 @@
---
alwaysApply: true
---
# Suna AI Agent Project - Cursor Rules
## Project Overview
Suna is an open-source generalist AI agent with a full-stack architecture:
- **Frontend**: Next.js 15+ with TypeScript, Tailwind CSS, Radix UI, React Query
- **Backend**: Python 3.11+ with FastAPI, Supabase, Redis, LiteLLM, Dramatiq
- **Agent System**: Isolated Docker environments with comprehensive tool execution
- **Database**: Supabase for persistence, authentication, real-time features, and RLS
## Architecture Components
### Core Stack
- **Frontend**: Next.js App Router + TypeScript + Tailwind + Radix UI
- **Backend**: FastAPI + Supabase + Redis + LiteLLM + Dramatiq workers
- **Database**: PostgreSQL via Supabase with Row Level Security
- **Agent Runtime**: Docker containers with browser automation, code interpreter
- **Authentication**: Supabase Auth with JWT validation
- **Monitoring**: Langfuse tracing, Sentry error tracking, Prometheus metrics
### File Organization
```txt
project/
├── frontend/           # Next.js application
│   └── src/
│       ├── app/        # Next.js app router pages
│       ├── components/ # Reusable React components
│       ├── hooks/      # Custom React hooks
│       ├── lib/        # Utilities and configurations
│       ├── providers/  # Context providers
│       └── contexts/   # React contexts
├── backend/            # Python FastAPI backend
│   ├── agent/          # AI agent core implementation
│   ├── services/       # Business logic services
│   ├── utils/          # Shared utilities
│   ├── supabase/       # Database migrations & config
│   ├── tools/          # Agent tool implementations
│   ├── auth/           # Authentication logic
│   ├── triggers/       # Event-driven triggers
│   └── api.py          # Main FastAPI application
└── docs/               # Documentation
```
## Development Principles
### Code Quality Standards
- **Type Safety**: Strict TypeScript frontend, comprehensive Python type hints
- **Error Handling**: Structured error responses, proper exception handling
- **Logging**: Structured logging with context throughout the stack
- **Testing**: Unit tests for core logic, integration tests for APIs
- **Security**: Input validation, authentication, encryption for sensitive data
### Performance Guidelines
- **Frontend**: Code splitting, lazy loading, optimized bundle size
- **Backend**: Async/await patterns, connection pooling, Redis caching
- **Database**: Proper indexing, query optimization, RLS policies
- **Agent**: Timeout handling, resource limits, sandbox isolation
### Integration Patterns
- **LLM Integration**: LiteLLM for multi-provider support, structured prompts
- **Tool System**: Dual schema decorators (OpenAPI + XML), consistent ToolResult
- **Real-time**: Supabase subscriptions for live updates
- **Background Jobs**: Dramatiq for async processing, QStash for scheduling
## Key Technologies
### Frontend Dependencies
- Next.js 15+, React 18+, TypeScript 5+
- @tanstack/react-query, @supabase/supabase-js
- @radix-ui components, @tailwindcss/typography
- @hookform/resolvers, react-hook-form
### Backend Dependencies
- FastAPI 0.115+, Python 3.11+
- Supabase 2.17+, Redis 5.2+, LiteLLM 1.72+
- Dramatiq 1.18+, Pydantic for validation
- Sentry, Langfuse, Prometheus for observability
## Advanced Patterns
### Agent System Architecture
- **Versioning**: Multiple agent versions with `agent_versions` table
- **Configuration**: JSONB config storage with validation
- **Workflows**: Step-by-step execution with `agent_workflows`
- **Triggers**: Scheduled and event-based automation
- **Builder Tools**: Dynamic agent creation and management
### Security & Authentication
- **JWT Validation**: Supabase token verification without signature check
- **Row Level Security**: Database-level access control
- **Credential Encryption**: Secure storage of sensitive API keys
- **Input Validation**: Pydantic models for all user inputs
### Database Patterns
- **Migrations**: Idempotent SQL with proper error handling
- **Indexing**: Foreign keys and query optimization
- **Triggers**: Automated timestamp management
- **Enums**: Safe enum creation with duplicate handling
## Development Workflow
### Environment Setup
- Use `mise.toml` for tool version management
- Docker Compose for local development stack
- Environment-specific configurations (LOCAL/STAGING/PRODUCTION)
### Code Standards
- Follow established naming conventions
- Implement proper error boundaries
- Use consistent logging patterns
- Handle loading and error states
### Testing Strategy
- Unit tests for business logic
- Integration tests for API endpoints
- E2E tests for critical user flows
- Performance testing for agent execution
## When in Doubt
- Follow existing patterns in the codebase
- Check similar implementations for guidance
- Use established error handling and logging
- Prioritize type safety and security
- Consult domain-specific rule files for detailed guidance

docs/GOOGLE_OAUTH_SETUP.md (new file, +156 lines)

@@ -0,0 +1,156 @@
# Google OAuth Setup Guide
This guide will help you configure Google Sign-In for your Suna application to avoid common errors like "Access blocked: This app's request is invalid".
## Prerequisites
- A Google Cloud Console account
- Your Supabase project URL and anon key
## Step 1: Create a Google Cloud Project
1. Go to [Google Cloud Console](https://console.cloud.google.com/)
2. Click "Select a project" → "New Project"
3. Enter a project name (e.g., "Suna App")
4. Click "Create"
## Step 2: Configure OAuth Consent Screen
1. In the Google Cloud Console, go to "APIs & Services" → "OAuth consent screen"
2. Select "External" user type (unless you have a Google Workspace account)
3. Click "Create"
4. Fill in the required fields:
- **App name**: Your application name (e.g., "Suna")
- **User support email**: Your email address
- **App logo**: Optional, but recommended
- **App domain**: Your domain (for local dev, skip this)
- **Authorized domains**: Add your domain(s)
- **Developer contact information**: Your email address
5. Click "Save and Continue"
6. **Scopes**: Click "Add or Remove Scopes"
- Select `.../auth/userinfo.email`
- Select `.../auth/userinfo.profile`
- Select `openid`
- Click "Update"
7. Click "Save and Continue"
8. **Test users**: Add test email addresses if in testing mode
9. Click "Save and Continue"
10. Review and click "Back to Dashboard"
## Step 3: Create OAuth 2.0 Credentials
1. Go to "APIs & Services" → "Credentials"
2. Click "Create Credentials" → "OAuth client ID"
3. Select "Web application" as the application type
4. Configure the client:
- **Name**: "Suna Web Client" (or any name you prefer)
- **Authorized JavaScript origins**:
- Add `http://localhost:3000` (for local development)
- Add `https://yourdomain.com` (for production)
- Add your Supabase URL (e.g., `https://yourproject.supabase.co`)
- **Authorized redirect URIs**:
- Add `http://localhost:3000/auth/callback` (for local development)
- Add `https://yourdomain.com/auth/callback` (for production)
- Add your Supabase auth callback URL: `https://yourproject.supabase.co/auth/v1/callback`
5. Click "Create"
6. **Important**: Copy the "Client ID" - you'll need this
## Step 4: Configure Supabase
1. Go to your [Supabase Dashboard](https://app.supabase.com)
2. Select your project
3. Go to "Authentication" → "Providers"
4. Find "Google" and enable it
5. Add your Google OAuth credentials:
- **Client ID**: Paste the Client ID from Step 3
- **Client Secret**: Leave empty (not needed for web applications)
6. **Authorized Client IDs**: Add your Client ID here as well
7. Click "Save"
## Step 5: Configure Your Application
1. Add the Google Client ID to your environment variables:
**Frontend** (`frontend/.env.local`):
```env
NEXT_PUBLIC_GOOGLE_CLIENT_ID=your-client-id-here
```
2. Restart your development server
## Step 6: Test Your Setup
1. Open your application in a browser
2. Click the "Continue with Google" button
3. You should see the Google sign-in popup
4. Select an account and authorize the application
5. You should be redirected back to your application and logged in
## Common Issues and Solutions
### "Access blocked: This app's request is invalid"
This error usually means:
- **Missing redirect URI**: Make sure all your redirect URIs are added in Google Cloud Console
- **Wrong Client ID**: Verify you're using the correct Client ID
- **OAuth consent screen not configured**: Complete all required fields in the consent screen
### "redirect_uri_mismatch"
- Check that your redirect URIs in Google Cloud Console exactly match your application URLs
- Include the protocol (`http://` or `https://`)
- Don't include trailing slashes
- For local development, use `http://localhost:3000`, not `http://127.0.0.1:3000`
### "invalid_client"
- Verify your Client ID is correct in the environment variables
- Make sure you're using the Web application client ID, not a different type
- Check that the OAuth client hasn't been deleted in Google Cloud Console
### Google button doesn't appear
- Check browser console for errors
- Verify `NEXT_PUBLIC_GOOGLE_CLIENT_ID` is set in your environment
- Make sure the Google Identity Services script is loading
## Production Deployment
When deploying to production:
1. Update Google Cloud Console:
- Add your production domain to "Authorized JavaScript origins"
- Add your production callback URL to "Authorized redirect URIs"
- Update the OAuth consent screen with production information
2. Update your production environment variables:
- Set `NEXT_PUBLIC_GOOGLE_CLIENT_ID` in your deployment platform
3. Verify Supabase settings:
- Ensure Google provider is enabled
- Confirm the Client ID is set correctly
## Security Best Practices
1. **Keep configuration in environment variables** - the Client ID is public by design, but never commit a Client Secret or service-role key to version control
2. **Use HTTPS in production** - Google requires secure connections for OAuth
3. **Restrict your OAuth client** - Only add the domains you actually use
4. **Review permissions regularly** - Remove unused test users and unnecessary scopes
5. **Monitor usage** - Check Google Cloud Console for unusual activity
## Publishing Your App
If you want to remove the "unverified app" warning:
1. Go to "OAuth consent screen" in Google Cloud Console
2. Click "Publish App"
3. Google may require verification for certain scopes
4. Follow the verification process if required
## Need Help?
If you're still experiencing issues:
1. Check the browser console for detailed error messages
2. Verify all URLs and IDs are correctly copied
3. Ensure your Supabase project is properly configured
4. Try using an incognito/private browser window to rule out cache issues