Agent Integration Tests
FUNCTIONAL TESTING PROMPT
Pre-requisites
- You are given a feature description.
- You are given a [file path] to the feature in the codebase.
- If any information is missing, prompt the user for the missing details.
Goal
Create integration tests for [feature name] located at [file path].
Dependency Analysis & Context
- Identify Dependencies and Usage:
  - Determine all components and services that depend on this feature.
  - Search for module imports and function/class usage across the codebase.
  - Document the public interface(s) used by other components.
- Categorize Functions/Methods:
  - Public Interface: Functions and methods used by other components.
  - Internal Implementation: Functions used only within the component.
  - External Dependencies: Third-party or system calls (e.g., databases, external APIs) that must be mocked.
- Feature Description:
  - Provide a detailed description including:
    - The inputs (e.g., codebase files, configuration parameters, standards data).
    - The processing logic (how inputs are handled).
    - The expected outputs (e.g., compliance reports, result messages).
    - Error handling and alternate flows.
  - List any supporting services or external dependencies.
- User Acknowledgment:
  - Present the complete description and dependency list to the user for confirmation before proceeding.
Testing Standards & Analysis Requirements
- Review Testing Standards:
  - Refer to the standards defined in the testing cursor rules.
  - Study existing tests for established patterns and conventions.
  - Check for existing fixtures and patterns that can be reused (e.g., in conftest.py); a sketch of such a fixture follows this section.
- Analyze the Feature:
  - Identify the entry points or public functions.
  - Determine expected input types, data structures, and output formats.
  - Identify integration points that require dependency mocking.
  - Highlight error handling scenarios.
- Reporting:
  - Output your detailed analysis in a markdown file at /reports/{a-useful-name}.md.
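
As an illustration of the kind of reusable fixture this step looks for, here is a minimal conftest.py sketch. The fixture name `app_config` and its contents are assumptions for illustration, not part of any particular codebase.

```python
# conftest.py -- illustrative only; fixture name and data shape are hypothetical.
import pytest


@pytest.fixture
def app_config(tmp_path):
    """Provide a minimal configuration dict rooted in a temporary directory."""
    return {
        "input_dir": tmp_path / "input",    # tmp_path is cleaned up by pytest automatically
        "report_dir": tmp_path / "reports",
        "strict_mode": True,
    }
```

A fixture like this keeps configuration out of individual tests and can be requested by name from any test module in the same directory tree.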
Test Scope Guidelines
- Focus on the Public Interface:
  - Test functions/methods used by other components.
  - Skip tests for purely internal implementation details.
  - Clearly document why each test exists (e.g., which dependent component relies on it).
- Integration Boundaries:
  - Only mock true external dependencies (databases, external APIs, file systems).
  - Allow internal implementations to run naturally.
  - Test complete end-to-end workflows rather than isolated steps.
- Test Coverage (an illustrative parametrized example follows this section):
  - Happy Path: Validate that valid inputs produce correctly formatted outputs.
  - Error Conditions: Ensure that invalid inputs or failing dependencies trigger proper exceptions or error responses.
  - Edge Cases: Evaluate behavior with boundary or unexpected input data.
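
A minimal sketch of how the three coverage buckets can map onto tests. `normalize_score` is a hypothetical stand-in for the feature's public function and is defined inline only to keep the example self-contained; in real tests it would be imported from the codebase.

```python
import pytest


def normalize_score(value):
    """Stand-in for the feature's public function (normally imported, not defined here)."""
    if not isinstance(value, (int, float)):
        raise TypeError("score must be numeric")
    return min(max(value, 0.0), 1.0)


# Happy path and edge cases share one parametrized test.
@pytest.mark.parametrize(
    "raw, expected",
    [
        (0.5, 0.5),   # happy path: value already in range
        (-3, 0.0),    # edge case: below lower bound is clamped
        (42, 1.0),    # edge case: above upper bound is clamped
    ],
)
def test_normalize_score_clamps_to_unit_interval(raw, expected):
    assert normalize_score(raw) == expected


def test_normalize_score_rejects_non_numeric_input():
    # Error condition: invalid input type raises a clear exception.
    with pytest.raises(TypeError):
        normalize_score("not-a-number")
```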
Test Data Management
- Test Data Setup:
  - Create realistic test data that mirrors production scenarios.
  - Include both valid and invalid cases, plus edge cases.
- Resource and Data Management:
  - Use appropriate fixture scopes (function, class, module) for setup and teardown (see the fixture sketch after this section).
  - Ensure test data is kept separate from test logic (e.g., via dedicated fixtures or test data files).
  - Document any required environment setup and resource cleanup steps.
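
The sketch below shows one way to combine fixture scopes with data kept outside the test logic. The file name `standards.json` and the record shape are assumptions made for the example.

```python
# Illustrative fixtures only; file names and record shapes are hypothetical.
import json
import pytest


@pytest.fixture(scope="module")
def standards_data(tmp_path_factory):
    """Module-scoped: write the shared test data file once per test module."""
    data_file = tmp_path_factory.mktemp("data") / "standards.json"
    data_file.write_text(json.dumps([{"rule": "R1", "max_length": 80}]))
    return data_file


@pytest.fixture
def workspace(tmp_path):
    """Function-scoped: each test gets a fresh working directory, torn down by pytest."""
    (tmp_path / "src").mkdir()
    return tmp_path
```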
Implementation Guidelines
- Test Structure:
  - Use the Given-When-Then pattern with clear inline comments (see the sketch after this section).
  - Organize tests by workflow: Happy Path, Error Handling, and Edge Cases.
  - Simulate realistic end-to-end scenarios rather than isolated unit interactions.
- Mocking Strategy:
  - Mock only external dependencies (e.g., database calls, external API invocations).
  - Use realistic mock responses and document any assumptions.
  - Leverage existing test utilities (e.g., @conftest.py, test data fixtures) as needed.
- Quality Standards:
  - Write descriptive test names and use meaningful assertions.
  - Follow the project's coding and test conventions.
  - Ensure tests are independent, repeatable, maintainable, and readable.
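
A sketch of the Given-When-Then layout with only the external dependency mocked. `ComplianceChecker` and `fetch_standards` are hypothetical stand-ins for the real feature and its external API call; the internal logic runs for real.

```python
from unittest.mock import Mock


class ComplianceChecker:
    """Stand-in for the real feature class (normally imported from the codebase)."""

    def __init__(self, standards_client):
        self._client = standards_client  # external API client: the only thing we mock

    def check(self, line):
        rules = self._client.fetch_standards()                    # external call
        return all(len(line) <= r["max_length"] for r in rules)   # internal logic runs naturally


def test_check_passes_lines_within_configured_limit():
    # Given: a checker whose external standards service is mocked with a realistic response
    client = Mock()
    client.fetch_standards.return_value = [{"max_length": 80}]
    checker = ComplianceChecker(client)

    # When: the public interface is exercised end to end
    result = checker.check("short line")

    # Then: behaviour is asserted and the external call is verified exactly once
    assert result is True
    client.fetch_standards.assert_called_once()
```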
Success Criteria
- Coverage Requirements:
  - All public interfaces must be tested.
  - Both happy path and error handling scenarios are covered.
  - All realistic usage patterns from dependent components are validated.
- Quality Checks:
  - Tests must validate behavior rather than internal implementation details (see the assertion sketch after this section).
  - Ensure tests are fully documented, including assumptions and resource management.
  - Maintain clear setup and teardown procedures.
- Documentation:
  - Provide a summary of test coverage, dependencies, and any limitations.
  - Clearly document the purpose, scope, and resource requirements for each test.
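
To make the behaviour-over-implementation check concrete, the sketch below asserts only on observable output. `build_report` and the report shape are hypothetical placeholders.

```python
def build_report(violations):
    """Hypothetical public function that summarises a list of violations."""
    return {"total": len(violations), "passed": not violations}


def test_report_marks_clean_input_as_passed():
    # Behaviour-focused: assert on the observable output of the public interface.
    report = build_report([])
    assert report["passed"] is True
    assert report["total"] == 0
    # Avoid implementation-focused checks, such as asserting which private helpers
    # were called or inspecting internal state; those break on harmless refactors.
```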
Output Format
Provide the complete test implementation in a single code block, organized as follows (a skeleton of this layout follows):
1. Test Setup: required imports, fixtures, and test data setup.
2. Happy Path Tests: main workflow tests with expected output validations.
3. Error Handling Tests: tests for edge cases, error conditions, and resource cleanup.
4. Documentation: a summary of test coverage, assumptions, and any necessary setup/teardown instructions.
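
One possible skeleton for that single code block, with comments marking the four parts; every name here is a placeholder, not a prescribed structure.

```python
# 1. Test Setup -- imports, fixtures, and test data (placeholder names throughout).
import pytest


@pytest.fixture
def sample_input():
    return {"value": 1}


# 2. Happy Path Tests
def test_feature_returns_expected_output(sample_input):
    assert sample_input["value"] == 1  # placeholder assertion on the main workflow


# 3. Error Handling Tests
def test_feature_rejects_missing_value():
    with pytest.raises(KeyError):
        _ = {}["value"]  # placeholder for an error-path call into the feature


# 4. Documentation -- in the real implementation, put the coverage summary,
#    assumptions, and setup/teardown notes in the module docstring or a trailing comment.
```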
Constraints
- Scope: Modify only the test files and supporting test utilities.
- Format: Provide complete, production-ready test code that adheres to project patterns.
- Dependencies: Leverage existing test helpers and mocks; do not reinvent them.
Review Checklist
Before finalizing the tests, verify:
- [ ] All public interfaces are covered.
- [ ] Only external dependencies are mocked.
- [ ] Test data is realistic and comprehensive.
- [ ] Resources are properly set up and cleaned up.
- [ ] Documentation is complete and clear.
- [ ] Tests adhere to project conventions.
- [ ] Error and edge cases are thoroughly covered.
- [ ] Assertions are meaningful and descriptive.
Final Instructions
- Step 1: Prompt the user for any missing information or clarifications.
- Step 2: Present the analysis and plan for user feedback before proceeding with the implementation.
- Step 3: Once the user has confirmed, provide the complete test implementation in one code block.