- Add webSearch method to FirecrawlService with WebSearchOptions, WebSearchResult, and WebSearchResponse interfaces (sketched after this list)
- Export new types from web-tools package
- Create web-search-tool in AI package with proper input/output schemas
- Add comprehensive unit tests for webSearch method (16 tests total)
- Add tool tests with mocking (4 tests)
- Add integration tests (skipped when no API key)
- Update package exports and dependencies
- Default search limit: 5, default format: markdown, content scraping enabled
- Proper error handling using CompanyResearchError
- AI package has no direct Firecrawl dependency
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
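A minimal sketch of what this surface might look like, assuming only the defaults called out above (limit 5, markdown format, scraping on); any names not listed in the bullets are illustrative, and the real CompanyResearchError is the one exported from the web-tools package:

```ts
// Illustrative shapes only; fields beyond the documented defaults are assumptions.
export interface WebSearchOptions {
  limit?: number;               // defaults to 5
  format?: 'markdown' | 'html'; // defaults to 'markdown'
  scrapeContent?: boolean;      // content scraping enabled by default
}

export interface WebSearchResult {
  url: string;
  title: string;
  content?: string;             // populated when scraping is enabled
}

export interface WebSearchResponse {
  results: WebSearchResult[];
}

// Stand-in for the error type exported by the web-tools package.
class CompanyResearchError extends Error {}

export class FirecrawlService {
  async webSearch(query: string, options: WebSearchOptions = {}): Promise<WebSearchResponse> {
    const { limit = 5, format = 'markdown', scrapeContent = true } = options;
    try {
      // Placeholder: the real method calls Firecrawl's search API with these values.
      console.debug('searching', { query, limit, format, scrapeContent });
      return { results: [] };
    } catch (cause) {
      // all failures surface as the package's error type
      throw new CompanyResearchError(`Web search failed: ${String(cause)}`);
    }
  }
}
```

Keeping the Firecrawl call behind FirecrawlService is what lets the AI package's web-search-tool depend only on these exported types rather than on Firecrawl directly.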
The Snowflake adapter implementation transforms column names to lowercase
for consistency, but the tests were expecting uppercase column names.
This commit updates the tests to match the implementation:
- Update test expectations to use lowercase column names (id, name); see the example after this list
- Fix hasMoreRows assertions to match implementation logic (only true when rowCount > limit)
- Ensure all Snowflake-related tests pass with the current adapter behavior
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
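A small illustration of the updated assertions, assuming a vitest-style test and a hypothetical result shape:

```ts
import { expect, it } from 'vitest';

// Hypothetical adapter result; only the assertion style is the point here.
it('asserts lowercase column names and the rowCount > limit rule', () => {
  const result = {
    columns: ['id', 'name'], // previously asserted as ['ID', 'NAME']
    rowCount: 5000,
    limit: 5000,
    hasMoreRows: false,
  };

  expect(result.columns).toEqual(['id', 'name']);
  // hasMoreRows is only expected to be true when rowCount exceeds the limit
  expect(result.hasMoreRows).toBe(result.rowCount > result.limit);
});
```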
- Increased allowed variance in cached query time checks to accommodate network fluctuations.
- Corrected property name in test assertions to match expected lowercase format.
- Enhanced SnowflakeAdapter to transform column names to lowercase and adjusted logic for determining if more rows are available from the stream.
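A rough sketch of the adapter-side behavior described in the last bullet; the helper names are assumptions, not the actual SnowflakeAdapter internals:

```ts
// Lowercase every column key on a returned row for consistent result shapes.
function lowercaseColumns(row: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(row).map(([column, value]) => [column.toLowerCase(), value])
  );
}

// More rows are available only when the stream produced more than the limit.
function hasMoreRows(rowCount: number, limit: number): boolean {
  return rowCount > limit;
}
```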
- Implemented a custom fetch function in both anthropic.ts and vertex.ts to modify the request body.
- Added disable_parallel_tool_use to the request body when tool_choice is present (see the sketch below).
- Included error handling for JSON parsing to ensure fallback to original request on failure.
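A plausible shape for that wrapper; tool_choice and disable_parallel_tool_use are the fields named above (this sketch nests the flag inside tool_choice, matching Anthropic's Messages API), and everything else is an assumption:

```ts
// Wrap fetch so requests that carry tool_choice also disable parallel tool
// use; any JSON parsing failure falls back to sending the original request.
const customFetch: typeof fetch = async (input, init) => {
  if (typeof init?.body === 'string') {
    try {
      const body = JSON.parse(init.body);
      if (body.tool_choice) {
        body.tool_choice = { ...body.tool_choice, disable_parallel_tool_use: true };
        return fetch(input, { ...init, body: JSON.stringify(body) });
      }
    } catch {
      // malformed or non-JSON body: leave the request untouched
    }
  }
  return fetch(input, init);
};
```

The AI SDK providers accept a custom fetch in their configuration, which is presumably how anthropic.ts and vertex.ts wire this in.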
- Added tests for handling AbortError scenarios in both memory-safe and standard streaming models.
- Updated the FallbackModel to gracefully handle intentional aborts without retrying or causing errors (see the sketch after this list).
- Ensured that the controller closes properly on AbortError and that no fallback to subsequent models occurs.
- Enhanced mock models to simulate AbortError conditions for comprehensive testing.
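A hedged sketch of the abort handling these tests exercise; the function names are hypothetical and the real FallbackModel internals will differ:

```ts
// Intentional aborts are recognized by name and never treated as provider failures.
function isAbortError(error: unknown): error is Error {
  return error instanceof Error && error.name === 'AbortError';
}

// On abort the stream controller closes cleanly; other errors propagate so the
// fallback chain can retry with the next model.
function handleStreamError(error: unknown, controller: ReadableStreamDefaultController): void {
  if (isAbortError(error)) {
    controller.close();
    return;
  }
  throw error;
}
```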
- Replace SELECT * with explicit column names in permission validation tests (example after this list)
- Update CTE test to use explicit columns in final SELECT
- Maintain test intent while aligning with new wildcard security validation
- Ensure all tests pass with new wildcard blocking behavior
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
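For example (table and column names here are placeholders, not the actual test fixtures):

```ts
// before: now rejected by the wildcard validation
const blockedQuery = 'SELECT * FROM orders';

// after: explicit columns keep the test's intent while passing validation
const allowedQuery = 'SELECT order_id, customer_id, total FROM orders';
```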
- Updated references across multiple files to reflect the new tool name.
- Adjusted unit tests and schemas to ensure compatibility with the renamed tool.
- Removed the old respondWithoutAnalysis tool implementation and its associated tests.
- Ensured backward compatibility for existing functionality using the old tool name.
- Import Select from node-sql-parser for type safety
- Replace function parameter types with Record<string, unknown> for dynamic AST objects
- Use proper type conversions through 'unknown' for incompatible types (see the pattern after this list)
- Maintain existing wildcard validation functionality
- Resolve all 8 noExplicitAny linting violations
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
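The conversion pattern in miniature; the helper name is hypothetical:

```ts
import type { Select } from 'node-sql-parser';

// Dynamic AST fragments are handled as Record<string, unknown> rather than any;
// when a concrete AST type is required, the conversion goes through 'unknown'.
function asSelect(node: Record<string, unknown>): Select {
  return node as unknown as Select;
}
```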
- Update query() method to use streamResult: true and stmt.streamRows() (sketched after this list)
- Add network-level row limiting with default 5000 row cap
- Process stream events (data, error, end) to build result set
- Maintain backward compatibility with existing adapter interface
- Update unit tests to mock streaming behavior
- Fix integration test imports and property names
- Preserve query caching by using original SQL unchanged
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
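A condensed sketch of that streaming path using the snowflake-sdk API; the surrounding adapter code and the exact result shape are assumptions:

```ts
import type { Connection } from 'snowflake-sdk';

const DEFAULT_ROW_LIMIT = 5000;

function queryWithLimit(
  connection: Connection,
  sqlText: string,
  limit = DEFAULT_ROW_LIMIT
): Promise<{ rows: Record<string, unknown>[]; hasMoreRows: boolean }> {
  return new Promise((resolve, reject) => {
    connection.execute({
      sqlText, // the original SQL is sent unchanged so query caching still applies
      streamResult: true,
      complete: (err, stmt) => {
        if (err) return reject(err);
        const rows: Record<string, unknown>[] = [];
        let rowCount = 0;
        stmt
          .streamRows()
          .on('data', (row: Record<string, unknown>) => {
            rowCount += 1;
            if (rowCount <= limit) rows.push(row); // network-level row cap
          })
          .on('error', reject)
          .on('end', () => resolve({ rows, hasMoreRows: rowCount > limit }));
      },
    });
  });
}
```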
- Add validateWildcardUsage function to sql-parser-helpers.ts (simplified sketch after this list)
- Block SELECT * and qualified wildcards on physical database tables
- Allow wildcards on CTEs and derived tables
- Integrate wildcard validation into permission-validator.ts
- Add comprehensive tests for all wildcard validation scenarios
- Revert Rust SQL analyzer changes to focus on TypeScript implementation
- Fix CTE alias handling for qualified wildcards (e.g., SELECT cte_alias.*)
This prevents bypassing column-level permissions through wildcard queries
while maintaining backward compatibility with legitimate query patterns.
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
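A deliberately simplified sketch of the check; the real validateWildcardUsage also covers derived tables and qualified wildcards against CTE aliases, and avoids the loose typing used here for brevity:

```ts
import { Parser } from 'node-sql-parser';

// Simplified: block SELECT * whenever any FROM entry is a physical table
// rather than a CTE defined in the statement's own WITH clause.
function validateWildcardUsage(sql: string): { allowed: boolean; reason?: string } {
  const parsed = new Parser().astify(sql);
  const statements = Array.isArray(parsed) ? parsed : [parsed];

  // loose typing for the sketch only; the real helpers avoid explicit any
  for (const stmt of statements as Array<Record<string, any>>) {
    if (stmt.type !== 'select') continue;

    const cteNames = new Set<string>(
      (stmt.with ?? []).map((cte: any) => cte?.name?.value ?? cte?.name)
    );
    const columns = stmt.columns;
    const hasWildcard =
      columns === '*' ||
      (Array.isArray(columns) && columns.some((col: any) => col?.expr?.column === '*'));
    const physicalTables = (stmt.from ?? []).filter(
      (item: any) => item?.table && !cteNames.has(item.table)
    );

    if (hasWildcard && physicalTables.length > 0) {
      return { allowed: false, reason: 'Wildcard selects physical table columns' };
    }
  }
  return { allowed: true };
}
```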
- Add curly braces around default case content to satisfy noSwitchDeclarations rule (example after this list)
- Prevents variable declarations from being accessible to other switch cases
- Apply Biome formatting fixes for import statement and spacing
Co-Authored-By: nate@buster.so <nate@buster.so>
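The shape of that fix, in a generic example:

```ts
// Hypothetical switch; only the braces around the default case matter here.
function describeAsset(kind: 'metric' | 'dashboard' | 'chat'): string {
  switch (kind) {
    case 'metric':
      return 'a metric';
    default: {
      // braces scope this declaration to the default case, which is what
      // Biome's noSwitchDeclarations rule requires
      const label = `an asset of kind ${kind}`;
      return label;
    }
  }
}
```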
- Add validateWildcardUsage() function to block SELECT * on physical tables (example inputs after this list)
- Allow wildcards on CTEs but block on physical database tables
- Add comprehensive tests for wildcard validation scenarios
- Integrate wildcard validation into permission validator
- Supports all SQL dialects via node-sql-parser
- Prevents permission bypass through wildcard queries
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
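Illustrative inputs for that check (table names are placeholders):

```ts
const blocked = [
  'SELECT * FROM customers',     // bare wildcard on a physical table
  'SELECT c.* FROM customers c', // qualified wildcard on a physical table
];

const allowed = [
  // the wildcard only expands a CTE whose columns were already validated
  'WITH recent AS (SELECT id, name FROM customers) SELECT * FROM recent',
];
```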
- Import database queries from main @buster/database package
- Use getUserOrganizationId() instead of user.organizationId (see the sketch after this list)
- Add proper error handling for users without organizations
- Follow existing patterns from other API handlers
Co-Authored-By: nate@buster.so <nate@buster.so>
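A hedged sketch of the handler pattern; getUserOrganizationId and the @buster/database import come from the bullets above, while its return shape and the error wording are assumptions:

```ts
import { getUserOrganizationId } from '@buster/database';

// Hypothetical helper following the described pattern: resolve the caller's
// organization from the database instead of trusting user.organizationId.
async function requireOrganizationId(userId: string): Promise<string> {
  const userOrg = await getUserOrganizationId(userId); // return shape assumed
  if (!userOrg) {
    // users without an organization get an explicit error rather than an
    // undefined organizationId leaking into downstream queries
    throw new Error('User is not a member of an organization');
  }
  return userOrg.organizationId;
}
```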
- Added the Google Vertex AI dependency at version 2.2.27 to pnpm-lock.yaml and package.json.
- Introduced new model wrappers for Sonnet4 and Haiku35 with fallback logic (sketched after this list).
- Updated various agents to use the new Sonnet4 model instead of the previous anthropicCachedModel.
- Enhanced error handling and retry logic in the fallback model implementation.
- Increased processing row limit in Snowflake query to 5000 for better performance.
- Added tests for new AI fallback models and their streaming capabilities.
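A rough sketch of the fallback wiring such wrappers might sit on, assuming the Vercel AI SDK's LanguageModel type; the real Sonnet4/Haiku35 exports and their retry behavior are defined in the package:

```ts
import type { LanguageModel } from 'ai';

// Hypothetical factory: try each model in order, retrying provider failures
// but surfacing intentional aborts untouched (see the AbortError notes above).
export function createFallbackModel(models: LanguageModel[]) {
  return {
    models,
    async run<T>(call: (model: LanguageModel) => Promise<T>): Promise<T> {
      let lastError: unknown;
      for (const model of models) {
        try {
          return await call(model);
        } catch (error) {
          if (error instanceof Error && error.name === 'AbortError') throw error;
          lastError = error; // provider failure: fall through to the next model
        }
      }
      throw lastError;
    },
  };
}
```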
- Add server-shared types for GetTitleRequest and GetTitleResponse with Zod validation
- Create database query functions for each asset type (chat, metric, collection, dashboard)
- Implement GET endpoint with zValidator middleware and exhaustive switch statement (sketched after this list)
- Add proper permission checks (publiclyAccessible OR organizationId match)
- Export new functions from database query index files
- Add title route to v2 API index
- Follow existing Hono API patterns with standardErrorHandler
Co-Authored-By: nate@buster.so <nate@buster.so>
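A hedged sketch of the route; the schema fields, asset types, and permission comment are inferred from the bullets above, and the per-type lookup is a placeholder:

```ts
import { zValidator } from '@hono/zod-validator';
import { Hono } from 'hono';
import { z } from 'zod';

// Inferred request/response shapes; the real Zod schemas live in server-shared.
const GetTitleRequestSchema = z.object({
  assetId: z.string().uuid(),
  assetType: z.enum(['chat', 'metric', 'collection', 'dashboard']),
});
const GetTitleResponseSchema = z.object({ title: z.string() });
type GetTitleResponse = z.infer<typeof GetTitleResponseSchema>;

export const titleApp = new Hono().get(
  '/title',
  zValidator('query', GetTitleRequestSchema),
  async (c) => {
    const { assetId, assetType } = c.req.valid('query');

    // Exhaustive switch: one database query function per asset type, each of
    // which enforces publiclyAccessible OR an organizationId match.
    switch (assetType) {
      case 'chat':
      case 'metric':
      case 'collection':
      case 'dashboard': {
        const response: GetTitleResponse = { title: `${assetType} ${assetId}` }; // placeholder lookup
        return c.json(response);
      }
      default: {
        const exhaustive: never = assetType;
        return exhaustive;
      }
    }
  }
);
```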