Agent Integration Tests

FUNCTIONAL TESTING PROMPT

Pre-requisites

  • You are given a feature description.
  • You are given a [file path] to the feature in the codebase.
  • If any information is missing, prompt the user for the missing details.

Goal

Create integration tests for [feature name] located at [file path].

Dependency Analysis & Context

  1. Identify Dependencies and Usage:
     • Determine all components and services that depend on this feature.
     • Search for module imports and function/class usage across the codebase.
     • Document the public interface(s) used by other components.

  2. Categorize Functions/Methods:
     • Public Interface: functions and methods used by other components.
     • Internal Implementation: functions used only within the component.
     • External Dependencies: third-party or system calls (e.g., databases, external APIs) that must be mocked.

  3. Feature Description:
     • Provide a detailed description including:
       - the inputs (e.g., codebase files, configuration parameters, standards data);
       - the processing logic (how inputs are handled);
       - the expected outputs (e.g., compliance reports, result messages);
       - error handling and alternate flows.
     • List any supporting services or external dependencies.

  4. User Acknowledgment:
     • Present the complete description and dependency list to the user for confirmation before proceeding.

Testing Standards & Analysis Requirements

  1. Review Testing Standards:
     • Refer to the standards defined in the testing cursor rules.
     • Study existing tests for established patterns and conventions.
     • Check for existing fixtures and patterns that can be reused (e.g., in conftest.py).

  2. Analyze the Feature:
     • Identify the entry points or public functions.
     • Determine expected input types, data structures, and output formats.
     • Identify integration points that require dependency mocking.
     • Highlight error handling scenarios.

  3. Reporting:
     • Output your detailed analysis in a markdown file at /reports/{a-useful-name}.md.

Test Scope Guidelines

  1. Focus on the Public Interface:
     • Test functions/methods used by other components.
     • Skip tests for purely internal implementation details.
     • Clearly document why each test exists (e.g., which dependent component relies on it).

  2. Integration Boundaries:
     • Only mock true external dependencies (databases, external APIs, file systems).
     • Allow internal implementations to run naturally.
     • Test complete end-to-end workflows rather than isolated steps.

  3. Test Coverage:
     • Happy Path: validate that valid inputs produce correctly formatted outputs.
     • Error Conditions: ensure that invalid inputs or failing dependencies trigger proper exceptions or error responses.
     • Edge Cases: evaluate behavior with boundary or unexpected input data.
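
The integration-boundary rule can be sketched as follows. The feature names here are hypothetical stand-ins: only the external call (`fetch_rules`, imagined as a network request) is mocked, while the internal formatting logic runs for real:

```python
from unittest.mock import patch

class ComplianceClient:
    """Stand-in for a hypothetical client that makes real network calls."""
    def fetch_rules(self):
        raise RuntimeError("network call - must be mocked in tests")

def generate_report(client):
    rules = client.fetch_rules()  # external boundary: the only thing mocked
    # Internal formatting logic runs unmocked in the test below.
    return {"count": len(rules), "ids": sorted(r["id"] for r in rules)}

def test_generate_report_happy_path():
    # Mock only the external dependency with a realistic response.
    with patch.object(ComplianceClient, "fetch_rules",
                      return_value=[{"id": "B"}, {"id": "A"}]):
        report = generate_report(ComplianceClient())
    # Assert on observable behavior, not on internal calls.
    assert report == {"count": 2, "ids": ["A", "B"]}
```

Because the counting and sorting run for real, this test would catch a regression in the internal logic that a fully mocked unit test would miss.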

Test Data Management

  1. Test Data Setup:
     • Create realistic test data that mirrors production scenarios.
     • Include both valid and invalid cases, plus edge cases.

  2. Resource and Data Management:
     • Use appropriate fixture scopes (function, class, module) for setup and teardown.
     • Keep test data separate from test logic (e.g., via dedicated fixtures or test data files).
     • Document any required environment setup and resource cleanup steps.
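
The fixture-scope and data-separation points can be sketched with pytest (the records and resource here are illustrative placeholders, not project data):

```python
import pytest

# Test data lives apart from test logic; in a real suite this might sit
# in conftest.py or a data file under the test directory.
VALID_RECORD = {"name": "widget", "qty": 3}
INVALID_RECORD = {"name": "", "qty": -1}

@pytest.fixture(scope="module")
def shared_resource():
    resource = {"open": True}   # placeholder for an expensive setup step
    yield resource              # shared by every test in this module
    resource["open"] = False    # teardown runs once, after the last test

@pytest.fixture  # default function scope: a fresh copy for every test
def valid_record():
    return dict(VALID_RECORD)   # copy so one test cannot mutate shared data

def test_valid_record_is_accepted(shared_resource, valid_record):
    assert shared_resource["open"]
    assert valid_record["qty"] > 0
```

Module scope suits resources that are expensive to create but safe to share; per-test function scope keeps mutable data isolated between tests.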

Implementation Guidelines

  1. Test Structure:
     • Use the Given-When-Then pattern with clear inline comments.
     • Organize tests by workflow: Happy Path, Error Handling, and Edge Cases.
     • Simulate realistic end-to-end scenarios rather than isolated unit interactions.

  2. Mocking Strategy:
     • Mock only external dependencies (e.g., database calls, external API invocations).
     • Use realistic mock responses and document any assumptions.
     • Leverage existing test utilities (e.g., conftest.py, test data fixtures) as needed.

  3. Quality Standards:
     • Write descriptive test names and use meaningful assertions.
     • Follow the project's coding and test conventions.
     • Ensure tests are independent, repeatable, maintainable, and readable.
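
A minimal Given-When-Then sketch, covering one happy-path and one error-handling test (the `checkout` workflow and its pricing service are hypothetical; only the external service is mocked):

```python
import pytest
from unittest.mock import MagicMock

def checkout(cart, pricing_service):
    """Hypothetical workflow: validate input, price items, format a result."""
    if not cart:
        raise ValueError("cart is empty")
    total = sum(pricing_service.price_of(item) for item in cart)
    return f"total: {total}"

def test_checkout_formats_total_for_valid_cart():
    # Given: a two-item cart and a mocked external pricing service
    pricing = MagicMock()
    pricing.price_of.side_effect = [5, 7]
    # When: the checkout workflow runs end to end
    message = checkout(["apple", "book"], pricing)
    # Then: the formatted total reflects both prices
    assert message == "total: 12"

def test_checkout_rejects_empty_cart():
    # Given: an empty cart
    pricing = MagicMock()
    # When / Then: the workflow raises a clear, specific error
    with pytest.raises(ValueError, match="cart is empty"):
        checkout([], pricing)
```

The descriptive test names state the expected behavior, and the assertions check observable output rather than internal call sequences.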

Success Criteria

  1. Coverage Requirements:
     • All public interfaces must be tested.
     • Both happy path and error handling scenarios are covered.
     • All realistic usage patterns from dependent components are validated.

  2. Quality Checks:
     • Tests must validate behavior rather than internal implementation details.
     • Ensure tests are fully documented, including assumptions and resource management.
     • Maintain clear setup and teardown procedures.

  3. Documentation:
     • Provide a summary of test coverage, dependencies, and any limitations.
     • Clearly document the purpose, scope, and resource requirements for each test.

Output Format

Provide the complete test implementation in a single code block, organized as follows:

  1. Test Setup: required imports, fixtures, and test data setup.
  2. Happy Path Tests: main workflow tests with expected output validations.
  3. Error Handling Tests: tests for edge cases, error conditions, and resource cleanup.
  4. Documentation: a summary of test coverage, assumptions, and any necessary setup/teardown instructions.

Constraints

  • Scope: Modify only the test files and supporting test utilities.
  • Format: Provide complete, production-ready test code that adheres to project patterns.
  • Dependencies: Leverage existing test helpers and mocks; do not reinvent them.

Review Checklist

Before finalizing the tests, verify:

  - [ ] All public interfaces are covered.
  - [ ] Only external dependencies are mocked.
  - [ ] Test data is realistic and comprehensive.
  - [ ] Resources are properly set up and cleaned up.
  - [ ] Documentation is complete and clear.
  - [ ] Tests adhere to project conventions.
  - [ ] Error and edge cases are thoroughly covered.
  - [ ] Assertions are meaningful and descriptive.

Final Instructions

  • Step 1: Prompt the user for any missing information or clarifications.
  • Step 2: Present the analysis and plan for user feedback before proceeding with the implementation.
  • Step 3: Once the user has confirmed, provide the complete test implementation in one code block.