- Simplified import statements in web-search-tool.ts and web-search-tool.test.ts for better readability.
- Updated error messages in sql-parser-helpers.test.ts to provide clearer feedback regarding wildcard usage on physical tables.
- Update package.json dependencies in ai and trigger packages
- Fix import statements in web-search-tool.ts and web-search-tool.test.ts
- Update pnpm-lock.yaml to reflect correct package references
- Resolve the CI build failure caused by the missing module '@buster-tools/web-tools' (see the import sketch below)
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
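As a rough illustration of the import fix: only the package name '@buster-tools/web-tools' appears in the commit, so the named export and the broken path shown here are assumptions.

```typescript
// Hypothetical before/after for the import that broke the CI build.
// The named export is illustrative; only the package name comes from the commit.

// Before: a deep path the published package does not expose
// import { FirecrawlService } from '@buster-tools/web-tools/dist/firecrawl';

// After: import from the package root so module resolution succeeds in CI
import { FirecrawlService } from '@buster-tools/web-tools';

export const service = new FirecrawlService();
```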
- Update createWorkflowAwareHealingMessage to include next workflow step info and transition descriptions (see the sketch after this list)
- Replace SQL wildcard error message with more user-friendly text
- Maintain existing function signatures and return structures
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
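A minimal sketch of what the workflow-aware healing message could look like; the signature, step names, and transition map below are assumptions, not the actual implementation.

```typescript
// Illustrative only: the real createWorkflowAwareHealingMessage signature and
// workflow step names are not shown in the commit, so these are placeholders.
interface HealingContext {
  currentStep: string;
  nextStep?: string;
}

// Hypothetical descriptions of how one step transitions into the next.
const transitionDescriptions: Record<string, string> = {
  'analyze-sql': 'After healing, the workflow re-validates the SQL before continuing.',
  'execute-query': 'After healing, the workflow retries the query with the corrected statement.',
};

function createWorkflowAwareHealingMessage(error: string, ctx: HealingContext): string {
  const nextStepInfo = ctx.nextStep
    ? ` Next workflow step: ${ctx.nextStep}. ${transitionDescriptions[ctx.currentStep] ?? ''}`
    : '';
  // Return a plain string, since the commit notes that existing function
  // signatures and return structures are maintained.
  return `Healing required in step "${ctx.currentStep}": ${error}.${nextStepInfo}`;
}
```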
- Changed the test query from 'OpenAI GPT' to 'Buster Data' and increased the result limit to 10.
- Reorganized import statements in web-search-tool.ts for better readability.
- Enhanced formatting in permission-validator.test.ts for improved code clarity.
- Add webSearch method to FirecrawlService with WebSearchOptions, WebSearchResult, and WebSearchResponse interfaces (sketched below)
- Export new types from web-tools package
- Create web-search-tool in AI package with proper input/output schemas
- Add comprehensive unit tests for webSearch method (16 tests total)
- Add tool tests with mocking (4 tests)
- Add integration tests (skipped when no API key)
- Update package exports and dependencies
- Default search limit: 5, default format: markdown, content scraping enabled
- Proper error handling using CompanyResearchError
- AI package has no direct Firecrawl dependency
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
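A hedged sketch of the webSearch surface described above: the defaults (limit 5, markdown format, scraping enabled) and the use of CompanyResearchError come from the commit, while the exact field names and internals are assumptions.

```typescript
// Sketch of the interfaces and defaults described in the commit.
// Field names are assumptions except where the commit names them.
export interface WebSearchOptions {
  limit?: number;               // defaults to 5
  format?: 'markdown' | 'html'; // defaults to 'markdown'
  scrapeContent?: boolean;      // content scraping enabled by default
}

export interface WebSearchResult {
  url: string;
  title: string;
  content?: string; // scraped page content, when scraping is enabled
}

export interface WebSearchResponse {
  results: WebSearchResult[];
}

// Stand-in for the package's existing error type.
export class CompanyResearchError extends Error {}

export class FirecrawlService {
  async webSearch(query: string, options: WebSearchOptions = {}): Promise<WebSearchResponse> {
    const { limit = 5, format = 'markdown', scrapeContent = true } = options;
    try {
      // The real method calls the Firecrawl search API here; this placeholder
      // only demonstrates the defaults and the response shape.
      const results: WebSearchResult[] = [];
      void format;
      void scrapeContent;
      return { results: results.slice(0, limit) };
    } catch (cause) {
      // Errors are surfaced as CompanyResearchError, per the commit.
      throw new CompanyResearchError(`Web search failed for "${query}": ${String(cause)}`);
    }
  }
}
```

The test change noted earlier (query 'Buster Data', limit 10) would then read roughly as `await new FirecrawlService().webSearch('Buster Data', { limit: 10 })`.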
The Snowflake adapter implementation transforms column names to lowercase
for consistency, but the tests were expecting uppercase column names.
This commit updates the tests to match the implementation (a small behavior sketch follows the list):
- Update test expectations to use lowercase column names (id, name)
- Fix hasMoreRows assertions to match implementation logic (only true when rowCount > limit)
- Ensure all Snowflake-related tests pass with the current adapter behavior
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
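A small self-contained sketch of the behavior the updated tests assert, with illustrative helper names rather than the adapter's actual internals.

```typescript
// Sketch: column keys are lowercased, and hasMoreRows is only true when the
// source produced more rows than the requested limit.
function lowercaseKeys(row: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(row).map(([key, value]) => [key.toLowerCase(), value] as [string, unknown])
  );
}

function hasMoreRows(rowCount: number, limit: number): boolean {
  // Only true when more rows exist than the limit, not when exactly
  // `limit` rows came back.
  return rowCount > limit;
}

// Example: a Snowflake row with uppercase identifiers becomes { id, name }.
console.log(lowercaseKeys({ ID: 1, NAME: 'Acme' })); // -> { id: 1, name: 'Acme' }
console.log(hasMoreRows(5, 5), hasMoreRows(6, 5));   // -> false true
```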
- Increased allowed variance in cached query time checks to accommodate network fluctuations.
- Corrected the property name in test assertions to match the expected lowercase format.
- Enhanced SnowflakeAdapter to transform column names to lowercase and adjusted the logic that determines whether more rows are available from the stream.
- Implemented a custom fetch function in both anthropic.ts and vertex.ts to modify the request body (see the sketch below).
- When tool_choice is present, disable_parallel_tool_use is added to the request.
- Included error handling around JSON parsing so the original request is sent unchanged if parsing fails.
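A hedged sketch of the custom fetch wrapper. The commit only says disable_parallel_tool_use is added when tool_choice is present; placing it inside the tool_choice object (where the Anthropic Messages API defines it) is an interpretation, and the wiring into anthropic.ts and vertex.ts is not shown.

```typescript
// Sketch: wrap fetch so that when the outgoing JSON body contains tool_choice,
// disable_parallel_tool_use is set before forwarding the request. On any JSON
// parse failure the original request is sent unchanged.
const customFetch: typeof fetch = async (input, init) => {
  const body = init?.body;
  if (typeof body === 'string') {
    try {
      const parsed = JSON.parse(body) as Record<string, unknown>;
      if (parsed.tool_choice !== undefined) {
        const patched = {
          ...parsed,
          tool_choice: {
            ...(parsed.tool_choice as Record<string, unknown>),
            disable_parallel_tool_use: true,
          },
        };
        return fetch(input, { ...init, body: JSON.stringify(patched) });
      }
    } catch {
      // Parsing failed: fall through and send the original request untouched.
    }
  }
  return fetch(input, init);
};
```

Presumably this wrapper is handed to each provider via its fetch option; the exact wiring in anthropic.ts and vertex.ts is not shown in the commit.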
- Added tests for handling AbortError scenarios in both memory-safe and standard streaming models.
- Updated the FallbackModel to gracefully handle intentional aborts without retrying or raising errors (see the sketch after this list).
- Ensured that the controller closes properly on AbortError and that no fallback to subsequent models occurs.
- Enhanced mock models to simulate AbortError conditions for comprehensive testing.
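A minimal sketch of the abort-handling rule, assuming a simplified fallback loop rather than the real FallbackModel internals.

```typescript
// Sketch: when a call fails because the caller intentionally aborted it,
// the fallback chain stops cleanly instead of retrying the next model.
function isAbortError(error: unknown): boolean {
  return (
    (typeof DOMException !== 'undefined' && error instanceof DOMException && error.name === 'AbortError') ||
    (error instanceof Error && error.name === 'AbortError')
  );
}

async function streamWithFallback<T>(
  models: Array<(signal: AbortSignal) => Promise<T>>,
  signal: AbortSignal
): Promise<T> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await model(signal);
    } catch (error) {
      if (isAbortError(error)) {
        // Intentional abort: surface it, do not fall back to the next model.
        throw error;
      }
      lastError = error; // genuine failure: try the next model
    }
  }
  throw lastError;
}
```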
- Replace SELECT * with explicit column names in permission validation tests (example below)
- Update CTE test to use explicit columns in final SELECT
- Maintain test intent while aligning with new wildcard security validation
- Ensure all tests pass with new wildcard blocking behavior
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
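For illustration, the shape of the test change described above, with placeholder table and column names.

```typescript
// Before: blocked by the new wildcard validation
// const sql = 'SELECT * FROM customers';

// After: explicit columns keep the test's intent while passing validation,
// including in the final SELECT of a CTE.
const sql = `
  WITH active AS (
    SELECT id, name, region FROM customers WHERE active = true
  )
  SELECT id, name, region FROM active
`;
```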
- Updated references across multiple files to reflect the new tool name.
- Adjusted unit tests and schemas to ensure compatibility with the renamed tool.
- Removed the old respondWithoutAnalysis tool implementation and its associated tests.
- Ensured backward compatibility for existing functionality that uses the old tool name.
- Import Select from node-sql-parser for type safety (see the typing sketch below)
- Replace function parameter types with Record<string, unknown> for dynamic AST objects
- Use proper type conversions through 'unknown' for incompatible types
- Maintain existing wildcard validation functionality
- Resolve all 8 noExplicitAny linting violations
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
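A hedged sketch of the typing approach: the helper and the exact AST shapes (which vary across node-sql-parser versions) are illustrative, but it shows dynamic AST nodes handled as Record<string, unknown> and incompatible types converted through 'unknown'.

```typescript
import { Parser, type Select } from 'node-sql-parser';

const parser = new Parser();

// Dynamic AST nodes are treated as Record<string, unknown> instead of `any`.
function hasWildcardColumns(statement: Record<string, unknown>): boolean {
  const columns = statement.columns;
  if (columns === '*') return true; // some parser versions use a plain '*'
  if (!Array.isArray(columns)) return false;
  return columns.some((col) => {
    const expr = (col as Record<string, unknown>).expr as Record<string, unknown> | undefined;
    if (!expr) return false;
    return expr.type === 'column_ref' && expr.column === '*';
  });
}

const ast = parser.astify('SELECT * FROM users');
const first = (Array.isArray(ast) ? ast[0] : ast) as Select;

// Select has no index signature, so the conversion goes through `unknown`.
console.log(hasWildcardColumns(first as unknown as Record<string, unknown>)); // expected: true
```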
- Update query() method to use streamResult: true and stmt.streamRows() (see the streaming sketch below)
- Add network-level row limiting with default 5000 row cap
- Process stream events (data, error, end) to build result set
- Maintain backward compatibility with existing adapter interface
- Update unit tests to mock streaming behavior
- Fix integration test imports and property names
- Preserve query caching by using original SQL unchanged
Co-Authored-By: Dallin Bentley <dallinbentley98@gmail.com>
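A hedged sketch of the streaming pattern described above, using the snowflake-sdk complete-callback and streamRows flow with a minimal structural connection type; the real adapter's option handling and result shape may differ.

```typescript
import type { Readable } from 'node:stream';

const DEFAULT_ROW_LIMIT = 5000;

// Minimal structural type for the parts of the snowflake-sdk connection used
// here; the real adapter uses the SDK's own types.
interface StreamingConnection {
  execute(options: {
    sqlText: string;
    streamResult: boolean;
    complete: (err: Error | undefined, stmt: { streamRows(): Readable }) => void;
  }): void;
}

// Sketch: execute with streamResult so rows are not buffered by the driver,
// consume stmt.streamRows(), and stop pulling rows once the cap is reached.
function queryWithLimit(
  connection: StreamingConnection,
  sqlText: string,
  limit = DEFAULT_ROW_LIMIT
): Promise<{ rows: Record<string, unknown>[]; hasMoreRows: boolean }> {
  return new Promise((resolve, reject) => {
    connection.execute({
      sqlText, // original SQL unchanged, so Snowflake's query caching still applies
      streamResult: true,
      complete: (err, stmt) => {
        if (err) return reject(err);
        const rows: Record<string, unknown>[] = [];
        let hasMoreRows = false;
        const stream = stmt.streamRows();
        stream
          .on('data', (row: Record<string, unknown>) => {
            if (rows.length < limit) {
              rows.push(row);
            } else {
              hasMoreRows = true; // more rows exist than the cap allows
              stream.destroy();   // stop streaming further rows over the network
            }
          })
          .on('error', reject)
          .on('end', () => resolve({ rows, hasMoreRows }))
          .on('close', () => resolve({ rows, hasMoreRows }));
      },
    });
  });
}
```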