
T-012-035 Software Test Run

Change history

Revision | Summary | Date

Test type

Instructions

Specify the type of test being run, such as system, user acceptance, or verification testing. This helps in understanding the context and scope of the test.

Linked activities

Instructions

List any activities that are linked to this test run, such as specific development tasks, bug fixes, or requirements. This provides traceability and context for the test.

Result

Instructions

Mark the test as passed or failed. If it is a verification test, indicate whether the software meets the specified requirements.

Description

Instructions

Provide a general description of the test run, including its purpose, objectives, and any relevant background information. This section should give an overview of what the test aims to achieve and why it is necessary.

Run environment

Instructions

Describe the environment in which the test was conducted. Include details on hardware, software, and any specific configurations required for the test. This helps in understanding the conditions under which the test was performed and ensures that results can be replicated in similar environments.
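
Where the run environment is captured alongside automated tests, recording it as structured data makes the run easier to replicate. The following is a minimal sketch in Python; the field names and example values are illustrative assumptions, not fields mandated by this template:

```python
from dataclasses import dataclass, field


@dataclass
class RunEnvironment:
    """Illustrative record of the conditions under which a test run executed."""
    hardware: str                  # e.g. machine type, CPU, RAM
    operating_system: str          # e.g. OS name and version
    software_version: str          # version of the software under test
    dependencies: dict[str, str] = field(default_factory=dict)  # package -> version


# Hypothetical example values, for illustration only.
env = RunEnvironment(
    hardware="4 vCPU, 16 GB RAM",
    operating_system="Ubuntu 22.04 LTS",
    software_version="1.1.0.0",
    dependencies={"python": "3.11", "pytest": "8.0"},
)
```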

Test checklist

Instructions

Provide a checklist of items to verify during the test run. This should include specific criteria or conditions that need to be met for the test to be considered successful. Each item should be clear and actionable, allowing the tester to systematically verify each aspect of the software.
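
As a rough illustration of what "clear and actionable" means in practice, a checklist can be held as criterion/state pairs so that overall success is mechanically checkable. A minimal sketch; the criteria below are invented placeholders, not items required by this template:

```python
# Each checklist item pairs an actionable criterion with its verification state.
checklist = [
    ("Application starts without errors", True),
    ("All API endpoints respond within the agreed latency budget", True),
    ("Audit log records every user action", False),
]

# The test run only passes the checklist when every item is verified.
checklist_passed = all(verified for _, verified in checklist)
print(f"Checklist passed: {checklist_passed}")
```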

Test case runs

Instructions

List the individual test case runs executed during this test run. Each entry should include the following (a sketch of such a record follows this list):

  • Result: The outcome of the test case (e.g., passed, failed).
  • Expected results: The expected outcome of the test case.
  • Actual results: The actual outcome observed during the test.
  • Remarks: Any additional comments or observations related to the test case run.
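
A sketch of such an entry as a Python record, mirroring the fields above; note that the test_case_id field is an assumption added for traceability back to the test description and is not listed in this template:

```python
from dataclasses import dataclass


@dataclass
class TestCaseRun:
    """One executed test case; fields mirror the bullet list above."""
    test_case_id: str       # assumed identifier linking back to the test description
    result: str             # e.g. "passed" or "failed"
    expected_results: str
    actual_results: str
    remarks: str = ""
```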

Summary of results

Instructions

Provide a summary of the results from the test run. This should include an overview of the overall performance of the software, any significant issues encountered, and whether the software meets the requirements as specified in the software requirements specification (SRS). This section should give a clear indication of the software's readiness for deployment or further development.
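
If test case outcomes are recorded as in the sketch above, the headline figure for this summary can be derived mechanically rather than transcribed by hand. A minimal helper, assuming outcomes are recorded as the strings "passed" and "failed":

```python
def summarize_results(results: list[str]) -> str:
    """Compute the headline pass figures from per-case outcomes."""
    passed = sum(1 for result in results if result == "passed")
    total = len(results)
    rate = 100 * passed / total if total else 0.0
    return f"{passed}/{total} test cases passed ({rate:.1f}%)"


print(summarize_results(["passed", "passed", "failed"]))  # 2/3 test cases passed (66.7%)
```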

Defects and issues

Instructions

List any defects or issues identified during the test run. Each entry should include the following (a sketch of such a record follows the table below):

  • Defect ID: A unique identifier for the defect.
  • Description: A brief description of the defect or issue.
  • Severity: The severity of the defect (e.g., critical, major, minor).
  • Status: The current status of the defect (e.g., open, resolved, closed).
  • Reported by: The person or team who reported the defect.
  • Assigned to: The person or team responsible for addressing the defect.
  • Activities generated: Any activities or tasks generated as a result of the defect.
  • Remarks: Any additional comments or observations related to the defect.
Defect ID | Description | Severity | Status | Reported by | Assigned to | Activities generated | Remarks
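
Where defects are tracked in a system rather than filled in by hand, the same columns map naturally onto a structured record. A sketch under that assumption; field names follow the table above:

```python
from dataclasses import dataclass, field


@dataclass
class Defect:
    """One row of the defect table above; column names map directly to fields."""
    defect_id: str
    description: str
    severity: str                       # e.g. "critical", "major", "minor"
    status: str                         # e.g. "open", "resolved", "closed"
    reported_by: str
    assigned_to: str
    activities_generated: list[str] = field(default_factory=list)
    remarks: str = ""
```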

Observations and recommendations

Instructions

Provide any observations made during the test run that are not directly related to defects or issues. This may include performance observations, usability feedback, or suggestions for improvement. Recommendations for future testing or development should also be included in this section. This helps in continuous improvement of the software and the testing process.

All the information contained in this QMS is confidential. The recipient agrees not to transmit or reproduce the information, either directly or through third parties, by any means, without obtaining the prior written permission of Legit.Health (AI LABS GROUP S.L.)