The project's detailed documentation can be found in the `/documentation` directory. This directory contains the following files with comprehensive information on various aspects of the codebase:
- `handlers.mdc` - Documentation for writing and using handlers
- `libs.mdc` - Guidelines for library construction and organization
- `rest.mdc` - REST API formatting rules and patterns
- `testing.mdc` - Testing standards, utilities, and best practices
- `tools.mdc` - Documentation for building tools
- `websockets.mdc` - WebSocket API formatting rules
When working on the codebase, please refer to these documentation files for detailed guidance on implementation patterns, code organization, and best practices.
- **Dependencies**: All dependencies must be inherited from the workspace `Cargo.toml` using `{ workspace = true }`. Never specify library-specific dependency versions; inheriting from the workspace keeps versions consistent across the entire project.
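As a sketch of what workspace inheritance looks like (crate names and versions here are illustrative, not the project's actual dependency set):

```toml
# Workspace root Cargo.toml: versions are pinned once, here.
[workspace.dependencies]
tokio = { version = "1", features = ["full"] }
serde = { version = "1", features = ["derive"] }

# Member crate Cargo.toml: inherit, never repeat the version.
[dependencies]
tokio = { workspace = true }
serde = { workspace = true }
```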
Handlers are the core business logic components that implement functionality used by both REST and WebSocket endpoints. They are also often used by the CLI package. This document outlines the structure, patterns, and best practices for working with handlers.
- Use shared test utilities from workspace when available
- Keep library-specific test helpers in the library
- Use workspace-defined test macros if available
### 5. Documentation
- Follow workspace documentation style
- Link to related workspace documentation
- Document workspace integration points
- Include examples showing workspace type usage
### 6. Integration Points
- Define clear boundaries with other workspace crates
- Use workspace traits and interfaces
- Share common utilities through workspace-level crates
- Consider cross-crate testing
### 7. Development Workflow
- Run workspace-level tests when making changes
- Update workspace documentation if needed
- Follow workspace versioning strategy
- Use workspace-level CI/CD pipelines
### 8. Dependencies
- All dependencies should be inherited from the workspace
- Never add library-specific dependency versions
- Keep dependencies minimal and focused
- The workspace will manage all dependency versions
# PRD (Product Requirements Document) Guidelines
## Overview
This document provides guidelines for creating and managing Product Requirements Documents (PRDs) in our codebase. All PRDs should follow the standardized template located at [template.md](mdc:prds/template.md).
## PRD Structure
### Location
All PRDs should be stored in the `/prds` directory with the following structure:
```
/prds
├── template.md          # The master template for all PRDs
├── active/              # Active/In-progress PRDs
│   ├── feature_auth.md
│   └── api_deployment.md
├── completed/           # Completed PRDs that have been shipped
│   ├── feature_user_auth.md
│   └── api_deployment.md
└── archived/            # Archived/Deprecated PRDs
```
### Naming Convention
- Use snake_case for file names
- Include a prefix for the type of change:
  - `feature_` for new features
  - `enhancement_` for improvements
  - `fix_` for bug fixes
  - `refactor_` for code refactoring
  - `api_` for API changes
## Using the Template
### Getting Started
1. Copy [template.md](mdc:prds/template.md) to create a new PRD
2. Place it in the `/prds/active` directory
3. Fill out each section following the template's comments and guidelines
### Key Sections to Focus On
The template [template.md](mdc:prds/template.md) provides comprehensive sections. Pay special attention to:
1. **Problem Statement**
- Must clearly articulate the current state
- Include measurable impact
- Reference any relevant metrics or data
2. **Technical Design**
- Include all affected components
- Document ALL file changes (new/modified/deleted)
- Provide actual code examples
- Include database migrations if needed
3. **Implementation Plan**
- Break down into deployable phases
- Include clear success criteria
- List dependencies between phases
- Provide testing strategy for each phase
4. **Testing Strategy**
- Unit test requirements
- Integration test scenarios
## Best Practices
### Documentation
1. Use clear, concise language
2. Include code examples where relevant
3. Document assumptions and dependencies
4. Keep diagrams up to date
5. Use Mermaid for diagrams when possible
### Lifecycle Management
1. Move PRDs between directories based on status:
- New PRDs → `/prds/active`
- Shipped PRDs → `/prds/completed`
- Deprecated PRDs → `/prds/archived`
2. Update status section regularly:
- ✅ Completed items
- ⏳ In Progress items
- 🔜 Upcoming items
- ❌ Known Issues
### Review Process
1. Technical review
- Architecture alignment
- Security considerations
- Performance implications
- Testing coverage
2. Product review
- Feature completeness
- User impact
- Business value
- Success metrics
## Common Pitfalls to Avoid
1. Incomplete technical specifications
2. Missing file change documentation
3. Unclear success criteria
4. Insufficient testing strategy
5. No rollback plan
6. Missing security considerations
7. Undefined monitoring metrics
## Example PRDs
Reference these example PRDs for guidance:
[template.md](mdc:prds/template.md)
## Checklist Before Submission
- [ ] All template sections completed
- [ ] Technical design is detailed and complete
- [ ] File changes are documented
- [ ] Implementation phases are clear (use as many phases as needed)
- [ ] Testing strategy is defined
- [ ] Security considerations addressed
- [ ] Dependencies and Files listed
- [ ] File References included
# REST API Formatting Rules
## Directory Structure
- All REST routes should be located under `src/routes/rest/routes/`
- Each resource should have its own directory (e.g., `api_keys`, `datasets`)
- Resource directories should contain individual files for each operation
- Each resource directory should have a `mod.rs` that exports and configures the routes
Example folder structure:
```
src/routes/rest/
├── routes/
│   ├── api_keys/
│   │   ├── mod.rs               # Router configuration and exports
│   │   ├── list_api_keys.rs     # GET / - Contains ApiKeyInfo type definition
│   │   ├── get_api_key.rs       # GET /:id
│   │   ├── post_api_key.rs      # POST /
│   │   └── delete_api_key.rs    # DELETE /:id
│   │
│   ├── datasets/
│   │   ├── mod.rs
│   │   ├── list_datasets.rs     # GET /
│   │   ├── get_dataset.rs       # GET /:id
│   │   ├── post_dataset.rs      # POST /
│   │   ├── update_dataset.rs    # PUT /:id
│   │   ├── patch_dataset.rs     # PATCH /:id
│   │   ├── delete_dataset.rs    # DELETE /:id
│   │   └── deploy_dataset.rs    # POST /:id/deploy (action endpoint)
│   │
│   └── users/
│       ├── mod.rs
│       ├── list_users.rs
│       ├── get_user.rs
│       ├── post_user.rs
│       ├── update_user.rs
│       └── api_keys/            # Sub-resource example
│           ├── mod.rs
│           ├── list_user_api_keys.rs
│           └── post_user_api_key.rs
```
Note: File names should be descriptive and match their HTTP operation (list_, get_, post_, update_, patch_, delete_). For action endpoints, use a descriptive verb (deploy_, publish_, etc.).
## Route Handler Pattern
- Each REST endpoint should follow a two-function pattern:
  1. REST handler (e.g., `get_api_key`) that:
     - Handles HTTP concerns: extracting and validating request parameters
     - Calls the business logic handler
     - Converts the result or error into an HTTP response
  2. Business logic handler (e.g., `get_api_key_handler`) that:
     - Contains pure business logic
     - Returns `Result<T>` where T is your data type
     - Can be reused across different routes (REST/WebSocket)
     - Handles database operations and core functionality
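A minimal sketch of the pattern, using plain std types for illustration only: real handlers would use axum extractors, `Result<T>` from the error crate in use, and the `ApiResponse` enum, and the names, types, and error mapping below are assumptions, not the codebase's actual API.

```rust
#[derive(Debug, PartialEq)]
pub struct ApiKeyInfo {
    pub id: u64,
    pub name: String,
}

// 2. Business logic handler: pure logic, reusable from REST,
//    WebSocket, and CLI code paths.
pub fn get_api_key_handler(id: u64) -> Result<ApiKeyInfo, String> {
    if id == 0 {
        return Err("api key not found".to_string());
    }
    Ok(ApiKeyInfo { id, name: format!("key-{id}") })
}

// 1. REST handler: HTTP concerns only. It calls the business logic
//    handler and maps the result to a status code and body.
pub fn get_api_key(id: u64) -> (u16, String) {
    match get_api_key_handler(id) {
        Ok(info) => (200, info.name),
        Err(e) => (404, e),
    }
}
```

Because the business logic lives in its own function, a WebSocket route or CLI command can call `get_api_key_handler` directly without going through the HTTP layer.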
## Type Definitions
- Response types should be defined in the corresponding list operation file (e.g., `ApiKeyInfo` in `list_api_keys.rs`)
- These types can be reused across different operations on the same resource
- Use strong typing with Rust structs for request/response bodies
## Router Configuration
- Each resource module should have a `mod.rs` that defines its router
- Use axum's `Router::new()` to define routes
- Group related routes with `.merge()`
- Apply middleware (like auth) at the router level where appropriate
- Follow RESTful patterns for endpoints:
- Collection endpoints (no ID):
- GET / - List resources
- POST / - Create resources (accepts single item or array)
- PUT / - Bulk update resources by criteria
- DELETE / - Bulk delete resources by criteria
- Single resource endpoints (with ID):
- GET /:id - Get single resource
- PUT /:id - Full update of resource (accepts single item or array of updates)
- PATCH /:id - Partial update of resource (accepts single item or array of patches)
- DELETE /:id - Delete resources (accepts single id or array of ids)
- Sub-resource endpoints:
- GET /:id/sub_resource - List sub-resources
- POST /:id/sub_resource - Create sub-resources (accepts single item or array)
- Action endpoints (for operations that don't fit CRUD):
- POST /:id/action_name - Perform specific action
- Example: POST /datasets/:id/deploy
- Query/Filter endpoints:
- GET /search - Complex search with query params
- GET /filter - Filtered list with specific criteria
Note: All mutation endpoints (POST, PUT, PATCH, DELETE) should accept both single items and arrays by default. The handler should handle both cases seamlessly. This eliminates the need for separate /bulk endpoints.
## Example Implementation
See @src/routes/rest/routes/api_keys/get_api_key.rs for a reference implementation that demonstrates:
- Separation of HTTP and business logic
- Error handling pattern
- Type usage and database operations
- Clean abstraction of business logic for potential reuse
## Error Handling
- Business logic handlers should return `Result<T>`
- REST handlers should convert errors to appropriate HTTP status codes
- Use `ApiResponse` enum for consistent response formatting
- Include appropriate error logging using `tracing`
## Middleware
- Most of the time, every new route should be authenticated, unless the user specifies otherwise.
- Apply authentication and other middleware at the router level
- Use `route_layer` to apply middleware to groups of routes
- Keep middleware configuration in the resource's `mod.rs`
# Testing Rules and Best Practices
## General Testing Guidelines
- All tests must be async and use tokio test framework
- Tests should be well-documented with clear test case descriptions and expected outputs
- Each test should focus on testing a single piece of functionality
- Tests should be independent and not rely on the state of other tests
- Use meaningful test names that describe what is being tested
## Unit Tests
- Unit tests should be inline with the code they are testing using `#[cfg(test)]` modules
- Each public function should have corresponding unit tests
- Mock external dependencies using mockito for HTTP calls
- Use `mockito::Server::new_async()` instead of `mockito::Server::new()`
- Test both success and error cases
- Test edge cases and boundary conditions
## Integration Tests
- Integration tests should be placed in the `/tests` directory
- Organize integration tests to mirror the main codebase structure
- Each major feature/resource should have its own test file
- Test the interaction between multiple components
- Use real dependencies when possible, mock only what's necessary
### Test Environment Setup
```rust
// tests/common/env.rs
use std::sync::Once;

static INIT: Once = Once::new();

pub fn setup_test_env() {
    INIT.call_once(|| {
        // Set any default environment variables for tests
        std::env::set_var("TEST_ENV", "test");
    });
}
```
### Example Test Fixtures
```rust
// tests/common/fixtures/users.rs
use crate::database::models::User;
use chrono::Utc;
use uuid::Uuid;
pub fn create_test_user() -> User {
    User {
        id: Uuid::new_v4(),
        email: "test@example.com".to_string(),
        name: Some("Test User".to_string()),
        config: serde_json::Value::Null,
        created_at: Utc::now(),
        updated_at: Utc::now(),
        attributes: serde_json::Value::Null,
    }
}
```
### Using Common Test Utilities
```rust
// Example integration test (tests/users_test.rs) using common utilities
mod common;

use common::{db::TestDb, env::setup_test_env, fixtures};

#[tokio::test]
async fn test_user_creation() {
    // Setup test environment
    setup_test_env();

    // Initialize test database
    let test_db = TestDb::new().await.expect("Failed to setup test database");

    // Get test user fixture
    let test_user = fixtures::users::create_test_user();

    // Run test
    let result = create_user(&test_db.pool, &test_user).await;

    // Clean up before asserting so the database is reset even on failure
    test_db.cleanup().await.expect("Failed to clean up test database");

    assert!(result.is_ok());
}
```
# Tools Documentation and Guidelines
## Overview
This document outlines the architecture, patterns, and best practices for building tools in our system. Tools are modular, reusable components that provide specific functionality to our AI agents and application.
## Core Architecture
### ToolExecutor Trait
The foundation of our tools system is the `ToolExecutor` trait. Any struct that wants to be used as a tool must implement this trait: