CAPA Plan - Response to Quantificare Audit

Document Information​

| Field | Value |
| --- | --- |
| Document Title | Corrective and Preventive Action Plan - Quantificare Audit Response |
| Document Reference | CAPA-QF-2025-001 |
| Audit Date | November 6, 2025 |
| Audit Report Reference | Audit report - 2025-11-06 - AI provider Legit Health [Compatibility Mode] |
| Document Date | January 15, 2026 |
| Client/Auditor | Quantificare |
| Auditee Organization | Legit Health |
| Document Owner | Quality Management Department |
| Status | Submitted for Client Review |
| Version | 1.0 |

Purpose and Scope​

This Corrective and Preventive Action (CAPA) Plan has been prepared by Legit Health in response to the audit conducted by our valued client, Quantificare, on November 6, 2025. The purpose of this document is to:

  1. Acknowledge and address all findings identified during the audit
  2. Demonstrate our commitment to continuous improvement and quality excellence
  3. Provide detailed corrective actions to resolve identified non-conformities
  4. Implement preventive measures to prevent recurrence
  5. Establish clear timelines and responsibilities for implementation
  6. Maintain transparent communication with Quantificare throughout the resolution process

This CAPA Plan covers all findings from the Quantificare audit and applies to all relevant departments within Legit Health, including Quality Management, AI Development, Human Resources, IT, and Regulatory Affairs.


Executive Summary​

Legit Health thanks Quantificare for the comprehensive audit conducted on November 6, 2025, and values the findings as opportunities for enhancing our quality management system and operational excellence.

The audit identified 1 major finding and 9 minor findings primarily related to:

  • AI Development documentation completeness
  • Quality Management System procedures clarity
  • Training and competency management
  • Supplier evaluation processes
  • Software validation and SOUP management
  • IT security and audit trail management

Our Commitment:

Legit Health takes these findings seriously and has conducted a thorough root cause analysis for each non-conformity. We have developed comprehensive corrective and preventive actions with clear ownership, timelines, and verification methods. All actions are scheduled for completion by April 30, 2026, with the major finding prioritized for completion by March 31, 2026.

We are committed to implementing these improvements and maintaining open communication with Quantificare throughout the process, including progress updates and final verification evidence.


Summary of Findings​

The following table provides an overview of all findings identified by Quantificare during the audit:

| Finding ID | Severity | Category | Status | Target Date |
| --- | --- | --- | --- | --- |
| Finding Major 6 | Major | AI Development Documentation | Planned | 2026-01-31 |
| Finding Minor 1 | Minor | Quality Management - Customer Communication | Planned | 2026-03-15 |
| Finding Minor 2 | Minor | Regulatory Compliance - ICH E6 R3 | Planned | 2026-03-15 |
| Finding Minor 3 | Minor | Training Effectiveness | Planned | 2026-03-15 |
| Finding Minor 4 | Minor | GCP Training Update | Planned | 2026-03-15 |
| Finding Minor 5 | Minor | Supplier Evaluation | Planned | 2026-03-15 |
| Finding Minor 7 | Minor | AI Explainability | Planned | 2026-03-15 |
| Finding Minor 8 | Minor | Audit Log Review | Planned | 2026-03-15 |
| Finding Minor 9a | Minor | Software Validation Documentation | Planned | 2026-03-15 |
| Finding Minor 9b | Minor | SOUP Management Process | Planned | 2026-03-15 |

Overall CAPA Completion Target: April 30, 2026


Evidence-Based Analysis and Action Plan​

This section provides a detailed analysis of each finding based on existing QMS documentation, identifying what is already implemented versus what requires action.

Finding Major 6 - AI Development Report Completeness​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| AI Development Reports exist | ✅ IMPLEMENTED | AI Development Report structure exists in R-TF-028-005-development-report.mdx with comprehensive sections for each model | ❌ NO ADDITIONAL ACTION - Structure is complete | R-TF-028-005 AI/ML Development Report | N/A |
| Performance results missing | ⚠️ PARTIALLY IMPLEMENTED | Multiple sections show "[TO FILL]" or "[PENDING]" for performance metrics (e.g., Erythema, Desquamation, ICD models) | ✅ ACTION REQUIRED - Complete all pending performance metrics with actual test results | R-TF-028-005 AI/ML Development Report | 2026-01-31 |
| Dataset statistics missing | ⚠️ PARTIALLY IMPLEMENTED | Some models lack complete dataset statistics in Data Requirements sections | ✅ ACTION REQUIRED - Add dataset statistics (training/validation/test splits, demographics) for all models | R-TF-028-005 AI/ML Development Report | 2026-01-31 |
| Documentation process | ✅ IMPLEMENTED | GP-028 AI Development procedure defines comprehensive AI Development Report requirements including Algorithm Evaluation, bias analysis, performance metrics | ✅ PREVENTIVE ACTION - Integrate mandatory checkpoints in development workflow as per GP-028 | GP-028 AI Development | 2026-01-31 |

Conclusion: Infrastructure and procedures exist. Main issue is completing pending sections in existing reports.


Finding Minor 1 - Customer Communication Timeframe for Non-Conformities​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| NC management process | ✅ IMPLEMENTED | GP-006 Non-conformity, Corrective and Preventive actions procedure is comprehensively documented | ❌ NO ADDITIONAL ACTION - Process exists | GP-006 Non-conformity, Corrective and Preventive Actions | N/A |
| Customer support response time | ✅ IMPLEMENTED | Maximum response time of 48 hours defined for customer tickets (FIFO methodology) | ❌ NO ADDITIONAL ACTION - Already defined | GP-006 Non-conformity, Corrective and Preventive Actions | N/A |
| Communication timeframes by severity | ❌ NOT IMPLEMENTED | No specific timeframes defined for notifying customers based on NC criticality (High/Medium/Low) | ✅ ACTION REQUIRED - Add specific communication timeframes to GP-006: high criticality within 24 hours; medium criticality within 72 hours; low criticality within 5 business days | GP-006 Non-conformity, Corrective and Preventive Actions | 2026-03-15 |
| NIS2 incident notification | ✅ IMPLEMENTED | Customer notification procedures exist in T-030-005 NIS2-Compliant Incident Response Plan with defined timelines by impact level | ❌ NO ADDITIONAL ACTION - Cybersecurity incidents covered | T-030-005 NIS2-Compliant Incident Response Plan | N/A |

Conclusion: General NC process and cybersecurity notifications exist. Need to add explicit communication timeframes for product-related NCs by criticality level in GP-006.


Finding Minor 2 - ICH E6 R3 Risk Analysis​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| GCP training exists | ✅ IMPLEMENTED | GCP training included in Training Matrix for relevant personnel (JD-003, JD-005, JD-007) | ❌ NO ADDITIONAL ACTION - Training framework exists | R-005-003 Training Plan 2025 | N/A |
| ICH GCP in R-001-005 | ❌ NOT IMPLEMENTED | ICH GCP E6 not explicitly listed in R-001-005 standards list | ✅ ACTION REQUIRED - Add ICH GCP E6 R3 to R-001-005. Once added, it will be reviewed annually per the GP-002 Management Review process, where the impact of any changes is analyzed | R-001-005 List of Applicable Standards and Regulations | 2026-03-15 |

Conclusion: Training infrastructure exists. ICH GCP guideline needs to be added to R-001-005 standards list, then will be systematically reviewed annually during Management Review (GP-002) where impact analysis is performed.


Finding Minor 3 - Training Effectiveness Assessment​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| Training evaluation process | ✅ IMPLEMENTED | T-005-004 Training evaluation and record template exists with evaluation by employee and manager | ❌ NO ADDITIONAL ACTION - Evaluation framework exists | T-005-004 Training Evaluation and Record Template | N/A |
| Training effectiveness metrics | ✅ IMPLEMENTED | Quality indicator #29 "Satisfactory training report results" with target 80% and result 94.4% (2024) | ❌ NO ADDITIONAL ACTION - Already tracked | R-002-003 Quality Indicators 2024 | N/A |
| Knowledge assessment (tests/quizzes) | ❌ NOT IMPLEMENTED | Current evaluation is subjective (manager assessment); no formal knowledge tests or practical assessments found | ✅ ACTION REQUIRED - Add to GP-005 and T-005-004: knowledge tests for theoretical training; practical assessments for procedure training; minimum passing scores (e.g., 80%); re-training protocol for failures | GP-005 Human Resources and Training; T-005-004 Training Evaluation and Record Template | 2026-03-15 |
| Competency verification | ⚠️ PARTIALLY IMPLEMENTED | GDPR training has evaluation and sign-off (T-005-006) but the generic approach is not standardized | ✅ ACTION REQUIRED - Standardize competency verification across all training types, not just GDPR | T-005-006 GDPR Training Evaluation Template | 2026-03-15 |

Conclusion: Training evaluation framework exists but lacks objective assessment mechanisms (tests, practical demonstrations) to verify knowledge transfer.


Finding Minor 4 - GCP Training Update (ICH E6 R3)​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| GCP training frequency | ✅ IMPLEMENTED | Training plan shows GCP training for relevant roles | ❌ NO ADDITIONAL ACTION - Framework exists | Training records | N/A |
| ICH E6 R3 training (July 2025) | ❌ NOT COMPLETED | ICH E6 R3 released July 2025; training not yet provided (as per Quantificare finding) | ✅ IMMEDIATE ACTION - Schedule and complete ICH E6 R3 training for all relevant personnel (JD-003, JD-005, JD-007, clinical team) by March 15, 2026 | To be created: R-005-XXX ICH E6 R3 Training Record | 2026-03-15 |
| Version update protocol | ❌ NOT IMPLEMENTED | No documented protocol for responding to new GCP versions | ✅ ACTION REQUIRED - Add to GP-005: notification system for new GCP releases; 90-day maximum response time; interim communication protocol; temporary training if official training is not available | GP-005 Human Resources and Training | 2026-03-15 |

Conclusion: Same root cause as Finding Minor 2 - need proactive monitoring and response protocol for regulatory updates.


Finding Minor 5 - Supplier Evaluation Methodology Clarity​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| Supplier evaluation process | ✅ IMPLEMENTED | GP-010 Purchases and suppliers evaluation procedure with scorecard system (0-2 points per criterion) | ❌ NO ADDITIONAL ACTION - Process exists | GP-010 Purchases and Suppliers Evaluation | N/A |
| Evaluation criteria defined | ✅ IMPLEMENTED | 7 evaluation facets defined: Quality, QMS Cert, ISMS Cert, Affordable price, Experience, Technical capacity, International reach | ❌ NO ADDITIONAL ACTION - Criteria clear | GP-010 Purchases and Suppliers Evaluation | N/A |
| Scoring methodology | ✅ IMPLEMENTED | Min/max scores defined (0-2 for each criterion), minimum required scores by supplier type | ❌ NO ADDITIONAL ACTION - Scoring system exists | GP-010 Purchases and Suppliers Evaluation | N/A |
| "Value" vs "Score" columns | ⚠️ UNCLEAR IMPLEMENTATION | Supplier evaluation records (e.g., AWS S3 in R-010-001) show two columns: "Value" (raw score 0-10?) and "Score" (weighted 0-2); relationship not documented in GP-010 | ✅ ACTION REQUIRED - Clarify in GP-010 Section "Supplier scorecard": define "Value" = raw assessment (1-10 scale); define "Score" = weighted/normalized score (0-2); add weighting factors if applicable; provide calculation examples | GP-010 Purchases and Suppliers Evaluation | 2026-03-15 |
| Evaluation template clarity | ⚠️ NEEDS IMPROVEMENT | Template T-010-001 has columns but no guidance on how to score | ✅ ACTION REQUIRED - Add instructions/guidance to the T-010-001 template explaining the scoring methodology | T-010-001 Supplier Evaluation Template | 2026-03-15 |

Conclusion: Process and criteria exist but documentation lacks clarity on the relationship between "Value" and "Score" columns used in practice.


Finding Minor 7 - AI Explainability Methodology​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| Explainability mentioned | ✅ IMPLEMENTED | GP-028 mentions XAI techniques (Grad-CAM, SHAP) used during development | ⚠️ INSUFFICIENT - Mentioned but not detailed | R-TF-028-002 AI/ML Development Plan | N/A |
| Explainability methodology documented | ❌ NOT DOCUMENTED | No formal documentation of explainability methods, validation endpoints, or model attention verification in AI Development Reports | ✅ ACTION REQUIRED - Add to each model section in R-TF-028-005: explainability method used (Grad-CAM, attention maps, etc.); validation endpoints for explainability; evidence that the model focuses on correct features; visual examples of attention/saliency maps | R-TF-028-005 AI/ML Development Report | 2026-03-15 |
| Multi-model consistency verification | ❌ NOT DOCUMENTED | No documentation on how consistency between multiple AI models is verified for high-level functionalities | ✅ ACTION REQUIRED - Document in R-TF-028-005 or create a new record: inter-model consistency verification process; consistency metrics and thresholds; integration testing for multi-model workflows | R-TF-028-005 AI/ML Development Report or a new verification record | 2026-03-15 |
| Adjudication process | ❌ NOT DOCUMENTED | No documented adjudication process for conflicting model outputs or edge cases | ✅ ACTION REQUIRED - Document the adjudication process: decision trees for conflict resolution; escalation paths; human oversight requirements; model disagreement handling | R-TF-012-023 Software Development Plan or a specific procedure | 2026-03-15 |

Conclusion: Explainability exists in practice but lacks formal documentation, validation endpoints, and multi-model consistency verification.


Finding Minor 8 - Audit Log Review​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| Audit logs collected | ✅ IMPLEMENTED | Audit logs saved in AWS database | ❌ NO ADDITIONAL ACTION - Collection exists | AWS Infrastructure | N/A |
| Log review process | ❌ NOT IMPLEMENTED | No periodic audit log review performed by Legit Health | ✅ ACTION REQUIRED - Establish in relevant SOPs (IT Security/GP-030): monthly minimum review frequency; types of events to monitor; response procedures; documentation requirements; review records as evidence | GP-030 Incident Response or a new IT Security procedure | 2026-03-15 |
| Automated monitoring | ❌ NOT IMPLEMENTED | No automated log analysis tools mentioned | ✅ ACTION REQUIRED - Implement automated tools: anomaly detection; security event alerts; access pattern analysis | Technical implementation + documentation in GP-030 or IT Security procedure | 2026-03-15 |
| Roles and responsibilities | ❌ NOT DEFINED | No defined roles for log review | ✅ ACTION REQUIRED - Define in SOP: IT Security Manager responsibility; review schedule; escalation procedures | GP-030 Incident Response or IT Security procedure | 2026-03-15 |

Conclusion: Logs are collected, but a review process, assigned responsibilities, and monitoring procedures are completely lacking.


Finding Minor 9a - Software Validation Risk Classification Clarity​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| Software validation procedure | ✅ IMPLEMENTED | GP-019 Software validation plan procedure exists with a risk-based approach | ❌ NO ADDITIONAL ACTION - Procedure exists | GP-019 Software Validation Plan | N/A |
| Risk-based approach | ✅ IMPLEMENTED | GP-019 describes high-risk vs non-high-risk classification and different testing approaches | ❌ NO ADDITIONAL ACTION - Concept implemented | GP-019 Software Validation Plan | N/A |
| Risk classification criteria | ⚠️ NEEDS CLARIFICATION | High-risk is defined, but "non-risked software" is not explicitly stated as "validation not required, only issue tracking" | ✅ ACTION REQUIRED - Add an explicit statement in GP-019: define the "non-risked software" category; state clearly that non-risked software does not require validation, only issue tracking; provide examples | GP-019 Software Validation Plan | 2026-03-15 |
| External software list justifications | ❌ NOT DOCUMENTED | R-019-002 External software list shows software but has no risk class justification column | ✅ ACTION REQUIRED - Update: (1) add "Risk Class" and "Justification" columns to R-019-002; (2) add risk justification for all listed software; (3) update the T-019-002 template | R-019-002 External Software List; T-019-002 External Software List Template | 2026-03-15 |

Conclusion: Risk-based validation exists, but the "non-risked" category is not explicitly documented and risk justifications are missing from the software inventory.


Finding Minor 9b - SOUP Management Process Documentation​

| Aspect | Current State | Evidence | Required Action | Document Reference | Deadline |
| --- | --- | --- | --- | --- | --- |
| SOUP documentation template | ✅ IMPLEMENTED | T-012-019 SOUP template exists with comprehensive sections (description, requirements, system requirements, related risks, etc.) | ❌ NO ADDITIONAL ACTION - Template exists | T-012-019 SOUP Documentation Template | N/A |
| SOUP in development plan | ✅ IMPLEMENTED | R-TF-012-023 Software Development Plan describes SOUP management: identification, classification, verification, review process | ❌ NO ADDITIONAL ACTION - Process described | R-TF-012-023 Software Development Plan | N/A |
| SOUP in GP-012 procedure | ✅ IMPLEMENTED | GP-012 mentions SOUP management in Phase 2 (Software Design), including verification requirements | ❌ NO ADDITIONAL ACTION - Mentioned in procedure | GP-012 Design, Redesign and Development | N/A |
| SOUP request form | ❌ NOT IMPLEMENTED | No formal SOUP request form found | ✅ ACTION REQUIRED - Create a T-012-XXX SOUP Request Form including: software identification; intended use and justification; initial risk assessment; vendor information; licensing details | To be created: T-012-XXX SOUP Request Form Template | 2026-03-15 |
| SOUP approval workflow | ⚠️ PARTIALLY DOCUMENTED | Software Development Plan mentions JD-007 approval needed for SOUP updates, but the initial approval workflow is not clearly defined | ✅ ACTION REQUIRED - Add an explicit SOUP approval workflow to GP-012: request submission process; evaluation criteria checklist; approval authorities (JD-007); documentation requirements; deployment gates | GP-012 Design, Redesign and Development | 2026-03-15 |
| SOUP register | ✅ IMPLEMENTED | SOUP components documented per model and SBOM exists (T-030-002 Software Bills Of Materials) | ❌ NO ADDITIONAL ACTION - Register exists | T-030-002 Software Bills Of Materials and per-model SOUP documentation | N/A |

Conclusion: SOUP management infrastructure largely exists but lacks formal request form and explicit approval workflow documentation in GP-012.


Detailed CAPA Actions​

MAJOR FINDING​


Finding Major 6 - AI Development Report Incomplete Documentation​

Finding ID: Finding Major 6

Severity: MAJOR Non-Conformity

Quantificare's Finding Description:

"For multiple models, AI Development Report shows that performance results or dataset statistics values are missing for most models, and some models have completely empty sections ('to be completed' or 'pending')."

Legit Health's Response:

We acknowledge this finding and recognize the critical importance of complete and comprehensive AI development documentation. This is our highest priority action item.

Root Cause Analysis:

  • Documentation process not fully integrated into AI model development lifecycle
  • Incomplete data collection during model development phases
  • Lack of verification checkpoints before report finalization
  • Rapid development cycles resulted in documentation lag
  • No automated quality gates to prevent incomplete documentation release

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 6.CA.1 | Conduct comprehensive audit of all existing AI Development Reports | AI Development Team Lead | 2026-01-15 |
| 6.CA.2 | Identify all models with incomplete sections | AI Development Team Lead | 2026-01-15 |
| 6.CA.3 | Prioritize completion based on model deployment status and clinical impact | AI Development Team Lead + CTO | 2026-01-17 |
| 6.CA.4 | Assign dedicated resources to complete all pending sections | CTO | 2026-01-20 |
| 6.CA.5 | Compile performance results and dataset statistics for all models | AI Development Team | 2026-01-27 |
| 6.CA.6 | Review and validate all completed documentation for accuracy and completeness | Quality Manager + Technical Documentation Lead | 2026-01-29 |
| 6.CA.7 | Submit completed reports to Quantificare for verification | Quality Manager | 2026-01-31 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 6.PA.1 | Process Integration: Implement mandatory documentation checkpoints at each development phase (Design phase: complete design rationale and requirements; Development phase: document architecture and implementation details; Validation phase: complete performance results before model release; Deployment phase: final report review and sign-off) | AI Development Team Lead + Quality Manager | 2026-01-31 |
| 6.PA.2 | Template and Tooling: Create comprehensive AI Development Report templates with all required sections, implement automated checks in the development pipeline to flag incomplete documentation (see the sketch after this table), integrate documentation status into project management dashboards | Technical Documentation Lead + AI Development Team Lead | 2026-01-31 |
| 6.PA.3 | Quality Gates: Implement a policy that no AI model proceeds to the next phase without documentation completion, require management review and approval before marking reports as complete, establish a peer review process for technical documentation accuracy | CTO + Quality Manager | 2026-01-31 |
| 6.PA.4 | Training and Awareness: Train all AI developers on documentation requirements and importance, include documentation quality in performance evaluations, establish documentation champions within teams | AI Development Team Lead + Quality Manager | 2026-01-31 |
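
The automated documentation check mentioned in 6.PA.2 could be implemented as a small CI step that scans the AI Development Report sources for unresolved placeholders and fails the pipeline when any remain. The sketch below is illustrative only: the placeholder strings come from the finding itself ("[TO FILL]", "[PENDING]", "to be completed", "pending"), while the script layout and default directory are assumptions rather than part of the QMS.

```python
#!/usr/bin/env python3
"""Illustrative CI check: list unresolved placeholders in AI Development Report
sources and exit non-zero if any remain (file layout and defaults are assumed)."""

import re
import sys
from pathlib import Path

# Placeholder wording taken from the audit finding; tune to the exact strings used in the reports.
PLACEHOLDER_PATTERN = re.compile(r"\[TO FILL\]|\[PENDING\]|to be completed|pending", re.IGNORECASE)

def find_incomplete_sections(report_dir: str) -> list[tuple[str, int, str]]:
    """Return (file, line number, line text) for every placeholder hit in *.mdx files."""
    hits = []
    for path in sorted(Path(report_dir).rglob("*.mdx")):
        for lineno, line in enumerate(path.read_text(encoding="utf-8").splitlines(), start=1):
            if PLACEHOLDER_PATTERN.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits

if __name__ == "__main__":
    # The default directory is a placeholder; pass the real report location in CI.
    directory = sys.argv[1] if len(sys.argv) > 1 else "records/R-TF-028-005"
    findings = find_incomplete_sections(directory)
    for file_path, lineno, text in findings:
        print(f"{file_path}:{lineno}: {text}")
    # A non-zero exit code makes the pipeline's documentation quality gate fail.
    sys.exit(1 if findings else 0)
```

Run as a pre-merge check against the report directory; the non-zero exit code is what would enforce the quality gate described in 6.PA.3.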

Responsible Persons:

  • Primary: AI Development Team Lead
  • Support: Quality Manager, Technical Documentation Lead
  • Oversight: CTO

Verification Method:

  • Submission of completed AI Development Reports to Quantificare
  • Internal audit of documentation completeness (100% target)
  • Quarterly review of documentation process effectiveness
  • Tracking of documentation completion rates in development pipeline

Success Criteria:

  • 100% of AI Development Reports have all sections completed
  • No "pending" or "to be completed" placeholders remain
  • All performance metrics and dataset statistics documented
  • Documentation review process integrated into development workflow

Target Completion Date: March 31, 2026

Progress Milestones:

  • February 15, 2026: Audit complete, prioritization established
  • March 1, 2026: 50% of reports completed
  • March 15, 2026: 80% of reports completed
  • March 31, 2026: 100% completion and submission to Quantificare

Status: Planned - High Priority

Client Communication: We will provide bi-weekly progress updates to Quantificare on documentation completion status, including detailed metrics on completed sections and remaining work.


MINOR FINDINGS​


Finding Minor 1 - Customer Communication Timeframe​

Finding ID: Finding minor 1

Category: Minor Non-Conformity

Description: Legit Health's process does not indicate the timeframe for communicating with the customer in the event of non-compliance that impacts the customer.

Root Cause Analysis:

Why does the process not indicate customer communication timeframes?

  • Because GP-006 (Non-conformity, Corrective and Preventive actions) does not specify timeframes for notifying customers about NCs that affect them

Why doesn't GP-006 specify these timeframes?

  • Because when GP-006 was written, focus was on internal NC management and regulatory notifications (GP-004 Vigilance), customer communication timing was assumed implicit

Why was it assumed implicit?

  • Because existing process defines "maximum 48h response" for customer support tickets (GP-006 line 109), but this is for general inquiries, not NC notifications

Root Cause: GP-006 does not differentiate between customer support response times (general inquiries) and NC notification requirements (safety/performance issues). No explicit requirement exists for communicating product/service NCs to affected customers within defined timeframes based on NC criticality.

Corrective Actions:

Note: Not applicable - this is a procedure gap (potential NC), not a detected occurrence of a failure to communicate with a customer. However, to address the identified gap:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 1.CA.1 | Review the past 12 months of customer-affecting NCs (from the T-006-002 List) - verify whether any required customer notification and whether the timing was appropriate | Quality Manager (JD-004) | 2026-02-15 |
| 1.CA.2 | Document findings - if delays are found, implement immediate notifications to affected customers | Quality Manager (JD-004) | 2026-02-20 |
| 1.CA.3 | If no past NCs required customer notification, document this as evidence that the gap has not yet caused an actual NC | Quality Manager (JD-004) | 2026-02-20 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 1.PA.1 | Update GP-006 Section "Non-conformity management" to add customer notification timeframes based on NC criticality: high criticality NC within 24 hours of NC confirmation; medium criticality NC within 72 hours (3 business days) of NC confirmation; low criticality NC within 5 business days of NC confirmation. Notification shall include: NC description, impact on customer, actions taken/planned, expected resolution date, contact for questions. Note: this is separate from GP-004 Vigilance System notifications (for serious incidents to authorities) and GP-014 customer support response times (for general inquiries). See the sketch after this table for an illustration of the timeframes | Quality Manager (JD-004) | 2026-03-15 |
| 1.PA.2 | Create T-006-004 "Customer NC Notification Template" with a standardized format including all required elements | Quality Manager (JD-004) | 2026-02-20 |
| 1.PA.3 | Add a "Customer Notification Required?" field to the T-006-001 Non-conformity report template with a Y/N checkbox and a "Date Customer Notified" field | Quality Manager (JD-004) | 2026-02-20 |
| 1.PA.4 | Update T-006-002 List of non-conformities to include a "Customer Notification Date" column for tracking compliance with timeframes | Quality Manager (JD-004) | 2026-02-20 |
| 1.PA.5 | Train the Quality team (JD-004, JD-005) and Customer Support team on the new notification requirements and templates | Quality Manager (JD-004) | 2026-03-15 |
| 1.PA.6 | Add customer notification compliance as a quality indicator in monthly management reviews | Quality Manager (JD-004) | 2026-03-15 |
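
As a minimal sketch of the notification timeframes proposed in 1.PA.1, the snippet below computes the latest customer notification deadline from the NC confirmation time and criticality. The 24-hour, 72-hour, and 5-business-day windows are the ones proposed above; the function names and the simple weekend-only business-day logic are illustrative assumptions (no holiday calendar).

```python
from datetime import datetime, timedelta

# Timeframes proposed in 1.PA.1; "low" criticality is expressed in business days.
NOTIFICATION_WINDOWS = {"high": timedelta(hours=24), "medium": timedelta(hours=72)}

def add_business_days(start: datetime, days: int) -> datetime:
    """Advance by `days` business days, skipping Saturdays and Sundays (no holiday calendar)."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

def customer_notification_deadline(confirmed_at: datetime, criticality: str) -> datetime:
    """Latest time by which the affected customer must be notified of the NC."""
    criticality = criticality.lower()
    if criticality in NOTIFICATION_WINDOWS:
        return confirmed_at + NOTIFICATION_WINDOWS[criticality]
    if criticality == "low":
        return add_business_days(confirmed_at, 5)
    raise ValueError(f"Unknown NC criticality: {criticality!r}")

# Example: a medium-criticality NC confirmed on Friday 2026-03-13 at 16:00 must be
# notified by Monday 2026-03-16 at 16:00 (72 hours later).
print(customer_notification_deadline(datetime(2026, 3, 13, 16, 0), "medium"))
```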

Verification Method:

Preventive Action Effectiveness:

  • Immediate verification (2026-03-15): Quality Manager confirms GP-006 updated, template created, and training completed
  • 3 months after implementation (2026-05-31): Review all customer-affecting NCs from past 3 months - verify 100% notified within defined timeframes
  • Evidence: Updated GP-006 section, T-006-004 template, training records, NC notification tracking log

Impact Assessment:

  • Verify customer notification process does not delay NC resolution or regulatory notifications
  • Confirm customer satisfaction with communication (review feedback)

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 2 - ICH E6 R3 Risk Analysis​

Finding ID: Finding minor 2

Category: Minor Non-Conformity

Description: No risk analysis was found regarding the new GCP version ICH E6 R3.

Root Cause Analysis:

Why was no risk analysis found for ICH E6 R3?

  • Because ICH E6 R3 (released July 2025) was not added to R-001-005 "List of applicable standards and regulations", so no impact analysis was triggered

Why wasn't ICH E6 R3 added to R-001-005?

  • Because R-001-005 still lists ICH E6 R2 (2016) as the applicable GCP version, and has not been updated to reflect the new R3 version

Why wasn't R-001-005 updated?

  • Because the annual review of applicable standards and regulations (performed as part of GP-002 Management Review) had not yet detected the ICH E6 R3 release, OR the review was performed but ICH GCP was not included in the scope

Why was ICH E6 R3 release not detected?

  • Because ICH GCP guideline is not explicitly listed in R-001-005 current version (it may be implicitly covered but not tracked), so it was not reviewed during the annual regulatory requirements review process

Root Cause: ICH GCP (ICH E6) guideline is not explicitly included in R-001-005 "List of applicable standards and regulations", therefore it is not systematically reviewed during the annual management review process (GP-002) where impact analysis of new/revised regulatory requirements is performed.

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 2.CA.1 | Perform ICH E6 R3 Impact Analysis as required by the GP-002 Management Review process: compare ICH E6 R2 (2016) vs R3 (2025) and identify key changes; assess the impact on Legit Health (does R3 affect our clinical trials processes, SOPs, informed consent procedures, data management?); determine required actions (SOP updates? training updates? process changes?); document the impact analysis in R-002-004 Management Review 2026 section "New or revised applicable regulatory requirements"; if significant impact is found, define an action plan with responsibilities and deadlines; if minimal/no impact, document the justification (e.g., "Legit Health clinical trial procedures already compliant with R3 requirements") | JD-004 (Quality Manager) + JD-005 (Regulatory Affairs Manager) | 2026-02-20 |
| 2.CA.2 | Present the impact analysis to JD-001 (CEO) in the Management Review meeting for approval of any required actions | JD-004 (Quality Manager) | 2026-03-15 |

Preventive Actions:

Note about Training: GCP training already exists in the Training Matrix (R-005-003-training-plan-2025, line 12) with frequency "At least every 3 years". Training requirements are adequate - the issue was the detection and impact analysis of the regulatory update, not training delivery.

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 2.PA.1 | Update R-001-005 "List of applicable standards and regulations" to explicitly include the ICH GCP guideline. Add to the appropriate section (likely "Guides" or "Clinical studies"): Code XX_XX, Name "Guideline for Good Clinical Practice E6", Source ICH/EMA, Version E6 R3, Date 2025-07-XX, Review Frequency Annual. Note: verify the current version listed - update from R2 to R3 | JD-004 (Quality Manager) | 2026-02-15 |
| 2.PA.2 | Ensure the R-001-005 review process includes all applicable guides/guidelines: verify R-001-005 includes all standards, regulations, and guidelines applicable to Legit Health operations; specifically verify ICH GCP E6 is now listed (see action 2.PA.1); add any other missing clinical/regulatory guidelines if identified | JD-004 (Quality Manager) | 2026-02-15 |
| 2.PA.3 | Document in the GP-002 Management Review procedure that R-001-005 shall be reviewed annually: GP-002 already requires review of "New or revised applicable regulatory requirements" as a Management Review input; clarify that this review includes checking each item in R-001-005 for new versions/updates; when a new version is detected, perform an impact analysis as per the R-002-004 Management Review template; update R-001-005 version information when changes are identified | JD-004 (Quality Manager) | 2026-02-20 |
| 2.PA.4 | Add ICH GCP monitoring to the Quality Calendar: include "Review ICH GCP guideline for updates" as an annual task in R-002-005 Quality Calendar; assign to JD-004 (Quality Manager) or JD-005 (Regulatory Affairs Manager); task timing: before the annual Management Review meeting | JD-004 (Quality Manager) | 2026-02-20 |

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-15): ICH E6 R3 impact analysis completed and documented in R-002-004 Management Review 2026
  • Evidence: R-002-004 section "New or revised applicable regulatory requirements" includes ICH E6 R3 with impact assessment and action plan (if needed)

Preventive Action Effectiveness:

  • Immediate verification (2026-02-20): R-001-005 updated to include ICH GCP E6 R3 explicitly listed
  • Next annual review (2027-01): Verify R-001-005 review performed as part of Management Review, with documented check of all listed standards/guidelines for updates
  • Evidence: Updated R-001-005 with ICH E6 R3, R-002-005 Quality Calendar with ICH GCP review task, next year's R-002-004 showing systematic review

Impact Assessment:

  • Verify R-001-005 review process can be completed within annual Management Review cycle (no excessive workload)
  • Confirm process enables timely detection and response to regulatory updates affecting clinical operations

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 3 - Training Effectiveness Assessment​

Finding ID: Finding minor 3

Category: Minor Non-Conformity

Description: There are no practical applications to check the efficacy of a training or the understanding of a procedure.

Root Cause Analysis:

Why are there no practical applications to check training efficacy?

  • Because current training evaluation uses only T-005-004 (Training evaluation and record) which includes subjective assessments by employee and manager, not objective knowledge tests

Why does T-005-004 use only subjective assessment?

  • Because GP-005 Section "Evaluation" (line 314+) describes evaluation as manager/employee feedback, but does not require objective competency verification

Why doesn't GP-005 require objective verification?

  • Because when GP-005 was written, focus was on documenting training completion and satisfaction, not on measuring knowledge retention or practical competence

Why was knowledge verification not prioritized?

  • Because T-005-004 template and GP-005 followed basic HR training documentation approach, without considering ISO 13485:2016 clause 6.2 competence requirements

Root Cause: GP-005 and T-005-004 lack requirements for objective assessment of training effectiveness (knowledge tests, practical demonstrations, competency verification) to ensure personnel actually understand and can apply trained procedures.

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 3.CA.1 | Retroactive competency assessment for critical GCP and QMS training completed in the past 12 months: identify all personnel who completed GCP training (JD-003, JD-005, JD-007, clinical staff); identify all personnel who completed critical QMS procedure training (GP-006, GP-012, GP-028); develop and administer a short competency quiz (10-15 questions) for each training type; document results - if a score is below 80%, schedule re-training; document as "R-005-XXX Retroactive Training Competency Assessment 2026" | HR Manager + Quality Manager | 2026-03-20 |
| 3.CA.2 | Results analysis: calculate the pass rate; identify knowledge gaps; adjust future training content based on findings | HR Manager + Quality Manager | 2026-03-25 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 3.PA.1 | Update GP-005 Section "Evaluation" (lines 314+) to add "Competency Verification Requirements": all training shall include objective competency verification appropriate to the training type (theoretical: written test/quiz, minimum 80% pass; procedure: practical demonstration evaluated by a qualified observer; software/tool: practical exercises with defined success criteria); verification within 30 days of training completion; re-training required if below 80%; training records (T-005-004) shall document the results | HR Manager + Quality Manager | 2026-03-15 |
| 3.PA.2 | Update the T-005-004 Training evaluation and record template to add sections: "Competency Assessment Method", "Assessment Date", "Assessment Score/Result", "Pass/Fail", "Re-training Required?", "Assessor Name and Signature" | HR Manager + Quality Manager | 2026-03-15 |
| 3.PA.3 | Create an assessment library: develop a quiz for each standard training module (GCP, GDPR, GP-005, GP-006, GP-012, GP-028, etc.); create a practical assessment checklist for procedure trainings; store in the GP-005 Training-Assessments records folder; initial set: minimum 5 training types covered | HR Manager + Quality Manager | 2026-03-15 |
| 3.PA.4 | Expand the assessment library over time: add assessments for new training modules as they are developed | HR Manager | Ongoing |
| 3.PA.5 | Train the HR Manager and training coordinators on the new competency verification process | Quality Manager | 2026-03-20 |
| 3.PA.6 | Add training competency metrics to R-002-003 Quality Indicators: "Training first-time pass rate" target ≥85%; "Training re-assessment completion rate" target 100% | Quality Manager | 2026-03-15 |

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-25): R-005-XXX Retroactive Assessment document completed with all identified personnel assessed
  • Evidence: Assessment results spreadsheet, re-training records for those who failed

Preventive Action Effectiveness:

  • 3 months after implementation (2026-06-30): Verify all training completed in past 3 months includes competency verification with documented results in T-005-004
  • Evidence: Sample 10 recent training records - confirm 100% have competency assessment documented

Impact Assessment:

  • Verify competency verification does not delay training closure beyond 30 days
  • Confirm competency pass rates meet ≥85% target (if below, review training content quality)

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 4 - GCP Training Update (ICH E6 R3)​

Finding ID: Finding minor 4

Category: Minor Non-Conformity

Description: The GCP training is performed every 3 years with the platform Global Health Network, or each time there is a new GCP version. The training was not performed for the new version of ICH E6 R3 (released in July 2025).

Root Cause Analysis:

Why was ICH E6 R3 training not performed?

  • Because ICH E6 R3 was released July 2025, but training was not initiated by November 2025 audit date (4+ month delay)

Why was training not initiated?

  • Because no one at Legit Health was aware that ICH E6 R3 had been released, OR awareness existed but training scheduling was delayed

Why was Legit Health not aware / why was scheduling delayed?

  • Because no defined process exists for monitoring GCP releases and triggering training actions

Why is monitoring process missing?

  • Same root cause as Finding Minor 2 - GP-005 does not define responsibility for GCP regulatory monitoring

Root Cause: No organizational process for monitoring GCP updates and triggering required training within defined timeframes. (This is same systemic root cause as Finding Minor 2 - both findings stem from lack of proactive regulatory monitoring.)

Note: Since Finding Minor 2 and Minor 4 share the same root cause (lack of a GCP monitoring process), the Preventive Actions are identical - see Finding Minor 2. The section below addresses only the specific corrective need (delivering the missing ICH E6 R3 training).

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 4.CA.1 | Immediately schedule ICH E6 R3 training for all relevant personnel: JD-003 (Regulatory Affairs Manager), JD-005 (Quality Manager - clinical oversight), JD-007 (Clinical Operations team members), any other personnel with "GCP Training" in their training requirements; training provider: Global Health Network (preferred); format: online or in-person GCP ICH E6 R3 course with certificate | HR Manager + JD-005 | 2026-03-15 |
| 4.CA.2 | Document training completion using T-005-004 Training evaluation and record for each person trained | HR Manager | 2026-03-15 |
| 4.CA.3 | Assess the impact of the training delay on current clinical investigations: review all active clinical investigations (if any); determine whether ICH E6 R3 changes affect ongoing studies; document the assessment as "R-005-XXX ICH E6 R3 Training Delay Impact Assessment"; if impact is identified, implement corrective actions per study protocols | JD-005 (Regulatory/Clinical lead) | 2026-02-15 |
| 4.CA.4 | If there are currently no active clinical investigations, document as "No active studies, therefore no patient impact from training delay" | JD-005 (Regulatory/Clinical lead) | 2026-02-15 |

Preventive Actions:

Note: See Finding Minor 2 - Preventive actions are identical (update GP-005 with GCP monitoring process, establish quarterly checks, 90-day response timeframe, etc.). No separate preventive actions needed here to avoid duplication.

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-15): All relevant personnel have completed ICH E6 R3 training with certificates/records filed
  • Evidence: T-005-004 training records for each person, GCP certificates, impact assessment document

Preventive Action Effectiveness:

  • See Finding Minor 2 (same preventive action - will be verified once, covers both findings)

Impact Assessment:

  • Verify training delay did not compromise patient safety in any active clinical investigations
  • Confirm personnel understand ICH E6 R3 updates (per competency verification in Finding Minor 3)

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 5 - Supplier Evaluation Clarity​

Finding ID: Finding minor 5

Category: Minor Non-Conformity

Description: Supplier evaluation: The example of AWS S3 was checked during the audit. Several criteria are checked and for each, a value and a score must be completed. In this example, for the criterion "affordable price", the value is 9 and the score is 2. For the criterion "quality of services", the value is 7 and the score is 1. The column "value" is unclear and could not be explained by the auditee.

Root Cause Analysis:

Why could the "value" column not be explained by auditee?

  • Because during the audit review of the AWS S3 evaluation, the auditee could not clearly explain the difference between the "Value" (e.g., 9, 7) and "Score" (e.g., 2, 1) columns

Why couldn't auditee explain it?

  • Because GP-010 Section "Supplier scorecard" (lines 105-125) defines 7 evaluation facets and 0-2 point scoring scale, but does not explain "Value" column that appears in actual practice (R-010-001 records)

Why doesn't GP-010 explain the "Value" column?

  • Because GP-010 describes scoring as "0-2 points per criterion" but supplier evaluation records show two-tier system (Value 1-10, then Score 0-2) that is not documented in procedure

Why is two-tier system not documented?

  • Because either:
    • System evolved in practice but GP-010 was not updated to match, OR
    • "Value" and "Score" are same thing but terminology inconsistency exists

Root Cause: GP-010 supplier evaluation methodology description does not match actual practice documented in R-010-001 records. Specifically, relationship between "Value" column (appears to be 1-10 raw score) and "Score" column (0-2 final score) is not defined, causing confusion during audits.

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 5.CA.1 | Investigate current supplier evaluation practice: interview personnel who perform supplier evaluations (Procurement Manager, Quality Manager); review the last 5 supplier evaluation records (R-010-001); document the actual methodology being used (what does "Value" represent? what does "Score" represent? how is the conversion calculated?); document findings in "R-010-XXX Supplier Evaluation Methodology Clarification 2026" | Quality Manager + Procurement Manager | 2026-03-05 |
| 5.CA.2 | Re-evaluate AWS S3 with the clarified methodology: apply the documented methodology to the AWS S3 evaluation; verify scores are justified and consistent; update the R-010-001 AWS record with clear explanations if needed | Quality Manager + Procurement Manager | 2026-03-10 |
| 5.CA.3 | Present findings to the Quality Manager for approval | Quality Manager | 2026-03-10 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 5.PA.1 | Update GP-010 Section "Supplier scorecard" (lines 105-125) to add a clear explanation: if the two-tier system is correct, document Value Assessment (1-10 scale) and Score Conversion (0-2 scale) with a conversion table (Score 2 = Value 8-10, Score 1 = Value 5-7, Score 0 = Value 1-4), as illustrated in the sketch after this table; if single-tier, remove the "Value" column from templates and use only "Score" | Quality Manager | 2026-03-15 |
| 5.PA.2 | Update the T-010-001 Supplier evaluation template to match the GP-010 explanation: add a conversion table reference or remove the "Value" column; add an "Evaluator Notes/Evidence" column to document the justification for each score | Quality Manager | 2026-03-15 |
| 5.PA.3 | Add scoring examples to GP-010: provide an example evaluation for a fictitious supplier; show how each facet is scored with justification; include the total score calculation | Quality Manager | 2026-03-15 |
| 5.PA.4 | Create training material "Supplier Evaluation Methodology": a 1-page guidance document with scoring table and examples; store in the GP-010 Templates folder | Quality Manager | 2026-03-15 |
| 5.PA.5 | Train personnel who perform supplier evaluations (Procurement Manager, Quality Manager, relevant JDs) on the clarified methodology | Quality Manager | 2026-03-15 |
| 5.PA.6 | Add supplier evaluation methodology to the internal audit checklist - verify evaluations include clear score justifications | Quality Manager | 2026-03-15 |
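
If the two-tier interpretation in 5.PA.1 is confirmed, the Value-to-Score conversion could be expressed as a simple mapping. The sketch below encodes the conversion table proposed in 5.PA.1 (Value 8-10 → Score 2, Value 5-7 → Score 1, Value 1-4 → Score 0); it is a hypothetical illustration, not the documented GP-010 methodology.

```python
def value_to_score(value: int) -> int:
    """Convert a raw 1-10 supplier assessment ("Value") into 0-2 scorecard points ("Score"),
    following the conversion table proposed in 5.PA.1 (illustrative, pending confirmation)."""
    if not 1 <= value <= 10:
        raise ValueError("Value must be on the 1-10 scale")
    if value >= 8:
        return 2
    if value >= 5:
        return 1
    return 0

# Consistency check against the AWS S3 record cited in the finding:
# "affordable price" Value 9 -> Score 2, "quality of services" Value 7 -> Score 1.
assert value_to_score(9) == 2
assert value_to_score(7) == 1
```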

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-10): R-010-XXX Clarification document exists explaining current methodology, AWS S3 re-evaluation completed
  • Evidence: Investigation report, updated AWS S3 evaluation with clear justifications

Preventive Action Effectiveness:

  • 3 months after implementation (2026-06-15): Review 3 new supplier evaluations completed after GP-010 update - verify scoring is clear, consistent, and includes justifications
  • Evidence: Updated GP-010 section, updated T-010-001 template, training records, sample evaluations with clear scoring rationale

Impact Assessment:

  • Verify clarified methodology does not change supplier qualification results significantly (if major changes, may need to re-evaluate existing suppliers)
  • Confirm evaluators can consistently apply methodology

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 7 - AI Explainability Methodology​

Finding ID: Finding minor 7

Category: Minor Non-Conformity

Description: Although explainability is described as being integrated into AI development, no reported methodology or endpoint clearly supports this claim. For example, in erythema intensity quantification, the reported endpoints do not allow assessment of whether the model is actually focusing on the redness area when making its prediction. High-level functionalities within Legit.Health Plus are done using multiple AI models, each specific to a certain task. Documents and answers are only partly convincing with respect to how Legit.Health controls and ensures the consistency of the different model responses for the same final high-level functionality. Each AI model is independently assessed with respect to the sourced state-of-the-art performances. No details are given on the adjudication process.

Root Cause Analysis:

Why is there no reported methodology for explainability?

  • Because R-TF-028-005 (AI Development Report) mentions explainability techniques (GradCAM, SHAP) exist (line 211 in R-TF-028-002), but does not document: (1) specific explainability methodology per model, (2) validation endpoints showing model focuses on correct features, (3) multi-model consistency verification, (4) adjudication process

Why doesn't R-TF-028-005 document these elements?

  • Because GP-028 (AI Development procedure) and R-TF-028-002 (AI/ML Development Plan template) mention explainability as development practice but do not require formal documentation of explainability validation as part of model verification

Why is explainability validation documentation not required?

  • Because when GP-028 and AI Development Report template were created, focus was on performance metrics (accuracy, sensitivity) and clinical validation, not on documenting HOW model decision-making was validated (explainability endpoints)

Why was explainability endpoint documentation not prioritized?

  • Because ISO 13485 and MDR do not explicitly require "explainability" documentation, so it was considered sufficient to implement explainability techniques without formal validation/documentation framework

Root Cause: GP-028 and AI Development Report template (R-TF-028-005) do not require documentation of:

  1. Explainability methodology per model (which XAI technique used and why)
  2. Explainability validation endpoints (how we verify model focuses on correct features)
  3. Multi-model consistency verification (for high-level functionalities using multiple AI models)
  4. Adjudication process (how conflicts/disagreements between models are resolved)

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 7.CA.1 | Complete explainability documentation for all deployed AI models in R-TF-028-005: for each model, add an "Explainability Validation" section including: explainability technique used; validation endpoints (specific tests to verify model attention); visual evidence (sample Grad-CAM/attention map images); edge cases tested; reviewer (AI developer + clinical expert); conclusion statement | AI/ML Team Lead + AI Developers + Clinical SME | 2026-03-10 |
| 7.CA.2 | Document multi-model consistency verification for high-level functionalities: identify all Legit.Health Plus features using multiple AI models; document which models are involved, how outputs are combined, consistency checks performed, and test cases showing coherent combined results; create a "Multi-Model Integration and Consistency Verification" section | AI/ML Team Lead | 2026-03-12 |
| 7.CA.3 | Document the adjudication process: create a "Model Adjudication Process" section defining scenarios requiring adjudication, decision rules, escalation path, and documentation requirements; review the past 6 months of production data for adjudication cases | AI/ML Team Lead | 2026-03-12 |
| 7.CA.4 | Present completed documentation to the Quality Manager for review and approval | AI/ML Team Lead | 2026-03-14 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 7.PA.1 | Update GP-028 (AI Development) Section "Algorithm Evaluation" to add a mandatory "Explainability Validation Requirements" subsection: technique selection and justification; validation endpoints (heatmap/attention review of a minimum of 100 samples, ≥90% correct feature focus target; see the sketch after this table); visual documentation; clinical validation; edge case testing | Quality Manager + AI/ML Team Lead | 2026-03-15 |
| 7.PA.2 | Update the R-TF-028-005 AI Development Report template to include mandatory sections: "Explainability Validation" (per model), "Multi-Model Integration" (for features using >1 model), "Adjudication Process" (decision rules for conflicts/edge cases) | Quality Manager + AI/ML Team Lead | 2026-03-15 |
| 7.PA.3 | Create explainability validation checklist template "T-028-XXX Explainability Validation Checklist": technique documented? endpoints executed? visual evidence attached? clinical review completed? pass/fail criteria met? Must be completed before model release | Quality Manager | 2026-03-08 |
| 7.PA.4 | Integrate explainability validation into the AI development workflow: add "Explainability Validation" as a mandatory gate in the Jira workflow; a model cannot proceed to "Model Release" without a completed checklist | AI/ML Team Lead | 2026-03-10 |
| 7.PA.5 | Train the AI development team on explainability documentation requirements | Quality Manager | 2026-03-08 |
| 7.PA.6 | Add explainability validation to the internal AI model audit checklist | Quality Manager | 2026-03-15 |
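
One way to turn the "evidence that the model focuses on correct features" endpoint from 7.PA.1 into a measurable check is to compute how much of the saliency mass (e.g., a Grad-CAM heatmap from an erythema intensity model) falls inside the clinically annotated region. The sketch below assumes the heatmap and a binary annotation mask are already available as NumPy arrays; the per-sample "majority of mass inside the region" criterion and the ≥90%-of-samples threshold mirror the 7.PA.1 target but are illustrative assumptions, not the validated endpoint definition.

```python
import numpy as np

def attention_inside_mask(heatmap: np.ndarray, region_mask: np.ndarray) -> float:
    """Fraction of total saliency (e.g. Grad-CAM) mass falling inside the annotated region.

    heatmap: non-negative saliency map, same HxW shape as region_mask.
    region_mask: binary mask (1 = clinically relevant area, e.g. annotated redness).
    """
    heatmap = np.clip(heatmap, 0, None).astype(float)
    total = heatmap.sum()
    if total == 0:
        return 0.0
    return float((heatmap * (region_mask > 0)).sum() / total)

def passes_focus_endpoint(overlaps: list[float], sample_threshold: float = 0.5,
                          cohort_threshold: float = 0.90) -> bool:
    """Illustrative endpoint: a sample 'passes' if most of its saliency mass lies inside the
    annotated region, and the model passes if at least 90% of reviewed samples pass
    (both thresholds are assumptions mirroring the 7.PA.1 target, not validated criteria)."""
    passing = sum(1 for o in overlaps if o >= sample_threshold)
    return (passing / len(overlaps)) >= cohort_threshold

# Synthetic example with two samples (real use: one overlap value per reviewed image).
rng = np.random.default_rng(0)
mask = np.zeros((224, 224))
mask[60:160, 60:160] = 1
focused = attention_inside_mask(rng.random((224, 224)) * mask, mask)  # saliency only on the lesion
diffuse = attention_inside_mask(rng.random((224, 224)), mask)         # saliency spread everywhere
print([round(focused, 2), round(diffuse, 2)], passes_focus_endpoint([focused, diffuse]))
```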

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-15): Quality Manager audits R-TF-028-005 - confirms all models have completed explainability, multi-model, and adjudication documentation
  • Evidence: Updated R-TF-028-005 with new sections for all models, sample GradCAM/attention visualizations, multi-model test results, adjudication process document

Preventive Action Effectiveness:

  • 6 months after implementation (2026-10-31): Review next AI model developed after preventive action - verify complete explainability validation documented from start
  • Evidence: Updated GP-028, updated R-TF-028-005 template, T-028-XXX checklist, training records, sample new model with complete explainability docs

Impact Assessment:

  • Verify explainability validation does not delay model development beyond 2 weeks
  • Confirm explainability validation improves clinical confidence in model outputs
  • Check if explainability testing identifies any models with incorrect feature focus (if yes, retrain model)

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 8 - Audit Log Review​

Finding ID: Finding minor 8

Category: Minor Non-Conformity

Description: The audit logs are saved in AWS database. No audit log review is performed by Legit Health.

Root Cause Analysis:

Why is no audit log review performed?

  • Because audit logs are collected in AWS database (CloudWatch, CloudTrail, RDS logs, etc.) but Legit Health does not have established process or schedule for reviewing them

Why is there no review process?

  • Because no SOP defines audit log review requirements - GP-030 (Incident Response/NIS2) addresses incident RESPONSE but not proactive log MONITORING

Why doesn't an SOP define log review?

  • Because when AWS infrastructure was implemented, focus was on log COLLECTION (for regulatory compliance and incident investigation capability), not on PROACTIVE REVIEW for security event detection

Why was proactive review not implemented?

  • Because ISO 13485 and MDR require audit trail capability and data integrity controls, but do not explicitly mandate periodic review schedules - so logs were collected "for when needed" but not monitored routinely

Root Cause: No documented process or assigned responsibility for periodic review of audit logs collected in AWS. While logs are available for incident investigation (reactive), no proactive monitoring occurs to detect security events, unauthorized access, or anomalies before they cause incidents.

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 8.CA.1 | Immediate retrospective log review: assign the IT Security Manager to perform a one-time review of the past 6 months of audit logs; scope: AWS CloudTrail (unusual API calls, privilege escalations, failed auth), application logs (error spikes, unusual access), database logs in RDS (failed queries, unauthorized access), network logs; document findings in "R-030-XXX Retrospective Audit Log Review Jan-Nov 2025"; if security events are found, assess impact and act per GP-030; if no issues, document as "No security events detected" | IT Security Manager / System Administrator | 2026-03-20 |
| 8.CA.2 | Present findings to JD-001 (CEO) and JD-005 (Regulatory/Quality) for review | IT Security Manager | 2026-03-25 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 8.PA.1 | Update GP-030 (Incident Response/NIS2) to add an "Audit Log Monitoring and Review Process": responsible person, monthly review frequency, scope of review (CloudTrail, application logs, database logs, security alerts), review process steps, documentation requirements, 5-year retention | Quality Manager + IT Security Manager | 2026-03-25 |
| 8.PA.2 | Create a T-030-XXX "Monthly Audit Log Review Record" template: review date/reviewer, period reviewed, logs reviewed, summary, findings (or "No security events detected"), actions taken, reviewer signature; store in the GP-030 Log-Reviews folder | Quality Manager | 2026-03-25 |
| 8.PA.3 | Evaluate and implement automated log monitoring tools: AWS GuardDuty, CloudWatch alarms, Security Hub; start with basic CloudWatch alarms for critical events; configure email/Slack alerts to the IT Security Manager; document configuration and thresholds (see the sketch after this table) | IT Security Manager | 2026-03-15 |
| 8.PA.4 | Add IT Security Manager or System Administrator to organizational roles if not already defined (update job descriptions) | Quality Manager | 2026-03-25 |
| 8.PA.5 | Train the designated person(s) on the log review process and tools | Quality Manager | 2026-03-15 |
| 8.PA.6 | Set a monthly calendar reminder for the log review | IT Security Manager | 2026-03-25 |
| 8.PA.7 | Add "Audit Log Review Completion Rate" to quality indicators: target 100% of months with a completed review | Quality Manager | 2026-03-25 |
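
As an illustration of the kind of automated check referenced in 8.PA.3 and of the monthly review in 8.PA.1, the sketch below uses boto3 to list failed AWS console sign-ins recorded by CloudTrail over the review period. It assumes credentials with CloudTrail read access; the ConsoleLogin field names follow the standard CloudTrail event format but should be verified against the actual logs before relying on the output.

```python
"""Illustrative helper for the monthly audit log review (8.PA.1): list failed AWS console
sign-ins recorded by CloudTrail over the review period."""

import json
from datetime import datetime, timedelta, timezone

import boto3

def failed_console_logins(days: int = 30) -> list[dict]:
    """Return ConsoleLogin events from CloudTrail whose response indicates a failure."""
    client = boto3.client("cloudtrail")
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days)
    paginator = client.get_paginator("lookup_events")
    pages = paginator.paginate(
        LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": "ConsoleLogin"}],
        StartTime=start,
        EndTime=end,
    )
    failures = []
    for page in pages:
        for event in page["Events"]:
            detail = json.loads(event["CloudTrailEvent"])
            # Failed console sign-ins report {"ConsoleLogin": "Failure"} in responseElements.
            if (detail.get("responseElements") or {}).get("ConsoleLogin") == "Failure":
                failures.append({
                    "time": str(event["EventTime"]),
                    "user": (detail.get("userIdentity") or {}).get("arn", "unknown"),
                    "source_ip": detail.get("sourceIPAddress", "unknown"),
                })
    return failures

if __name__ == "__main__":
    events = failed_console_logins()
    print(f"Failed console sign-ins in the last 30 days: {len(events)}")
    for item in events:
        print(item)
```

The same query pattern could feed the T-030-XXX monthly review record proposed in 8.PA.2, with additional event names added as the review scope is defined.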

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-25): R-030-XXX Retrospective Review document completed, findings presented to management
  • Evidence: Retrospective review report with documented findings or "no issues detected"

Preventive Action Effectiveness:

  • 3 months after implementation (2026-06-30): Verify 3 consecutive months have completed monthly log review records (March, April, May 2026)
  • Evidence: Updated GP-030 section, T-030-XXX template, 3 monthly log review records with reviewer signatures, automated alert configuration documentation, training records

Impact Assessment:

  • Verify log review process takes less than 4 hours per month (acceptable workload)
  • Confirm automated alerts reduce manual review burden
  • Check if log review detects any previously unknown security events (if yes, assess impact)

Target Completion Date: 2026-03-25

Status: Planned


Finding Minor 9a - Software Validation Documentation​

Finding ID: Finding minor 9a

Category: Minor Non-Conformity

Description: In the GP-019, it is not clearly explained that the non-risked software does not need to undergo a validation. In this case, only issues are tracked. Furthermore, in the external software list, there are no justifications for the choice of the risk class.

Root Cause Analysis:

Why is it not clearly explained that non-risked software doesn't need validation?

  • Because GP-019 describes risk-based approach with "high-risk vs non-high-risk" classification (lines 62-71) but does not explicitly state "non-risked software = no validation, only issue tracking"

Why doesn't GP-019 explicitly state this?

  • Because GP-019 was written focusing on validation requirements FOR software that needs validation, not on documenting what happens to software that DOESN'T need validation

Why are justifications missing from external software list?

  • Because R-019-002 (External Software List) lists software names and general info, but does not include "Risk Class" and "Justification" columns showing WHY each software received its risk classification

Why don't templates require risk justification?

  • Because when R-019-002 and T-019-002 were created, focus was on listing software inventory, not on documenting risk assessment rationale

Root Cause:

  1. GP-019 Section "Software classification" does not explicitly state that "non-risked" category exists and requires only issue tracking (no validation)
  2. R-019-002 External Software List and T-019-002 template lack "Risk Class" and "Risk Justification" fields to document assessment rationale

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 9a.CA.1 | Review and document risk classifications for all software in R-019-002: for each software, determine the risk class (high-risk / non-high-risk / non-risked) and the justification (patient data? safety/performance? clinical decisions? infrastructure only?); document in an updated R-019-002 or create "R-019-XXX Software Risk Classification Justifications 2026" | IT Manager + Quality Manager | 2026-03-10 |
| 9a.CA.2 | Identify any software lacking validation that should have been validated based on the risk assessment: if high-risk software is found without validation, initiate validation per GP-019 immediately; document a separate NC if validation was required but not performed | IT Manager + Quality Manager | 2026-03-12 |

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 9a.PA.1 | Update GP-019 Section "Software classification" (around line 122) to add "Software Risk Categories and Validation Requirements": high-risk software (patient data, safety/performance, regulatory) requires full validation per GP-019; non-high-risk software (business processes, QMS tools) requires simplified validation or issue tracking; non-risked software (infrastructure, generic office tools) requires only issue tracking; see the classification sketch after this table | Quality Manager | 2026-03-10 |
| 9a.PA.2 | Update the R-019-002 External Software List to add the columns "Risk Class", "Risk Justification", "Validation Status", and "Validation Reference" | Quality Manager | 2026-03-10 |
| 9a.PA.3 | Update the T-019-002 External Software List template to include the new columns | Quality Manager | 2026-03-10 |
| 9a.PA.4 | Create a "T-019-XXX Software Risk Assessment Form" for new software: software info, intended use, risk assessment questions (patient data? safety? clinical decisions? regulatory?), risk classification decision, validation requirements, approver signatures | Quality Manager | 2026-03-15 |
| 9a.PA.5 | Add software risk assessment as a mandatory step in the software procurement/implementation process (update the relevant SOP or GP-019) | Quality Manager | 2026-03-15 |
| 9a.PA.6 | Train the IT Manager and Quality Manager on the software risk assessment process | Quality Manager | 2026-03-15 |
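
To make the decision logic intended by 9a.PA.1 concrete, a minimal illustrative sketch; the questions mirror the criteria listed above, but the function and field names are hypothetical and not part of GP-019:

```python
# Illustrative sketch of the risk classification logic described in 9a.PA.1.
# The questions mirror the proposed T-019-XXX assessment criteria; all names
# here are hypothetical.
from dataclasses import dataclass


@dataclass
class SoftwareAssessment:
    handles_patient_data: bool
    affects_safety_or_performance: bool
    supports_clinical_decisions: bool
    regulatory_relevant: bool           # e.g., produces regulatory records
    business_process_or_qms_tool: bool  # e.g., ticketing or document control


def classify(a: SoftwareAssessment) -> tuple[str, str]:
    """Return (risk class, validation requirement) per the proposed categories."""
    if (a.handles_patient_data or a.affects_safety_or_performance
            or a.supports_clinical_decisions or a.regulatory_relevant):
        return "High-risk", "Full validation per GP-019"
    if a.business_process_or_qms_tool:
        return "Non-high-risk", "Simplified validation or issue tracking"
    return "Non-risked", "Issue tracking only"


# Example: an infrastructure-only tool falls into the non-risked category.
print(classify(SoftwareAssessment(False, False, False, False, False)))
```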

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-12): R-019-002 or R-019-XXX document includes risk class and justification for all listed software
  • Evidence: Updated External Software List with Risk Class, Justification, Validation Status columns filled for all entries

Preventive Action Effectiveness:

  • 3 months after implementation (2026-06-15): Review any new software added after 2026-03-15 - verify risk assessment form completed and risk class documented
  • Evidence: Updated GP-019 section, updated R-019-002 and T-019-002 templates, T-019-XXX risk assessment form, training records, sample completed risk assessments

Impact Assessment:

  • Verify clarified risk categories do not cause unnecessary validation workload for non-risked software
  • Confirm risk assessment form can be completed in less than 30 minutes per software

Target Completion Date: 2026-03-15

Status: Planned


Finding Minor 9b - SOUP Documentation Process​

Finding ID: Finding minor 9b

Category: Minor Non-Conformity

Description: There is no document for a new Software Of Unknown Provenance (SOUP) request, and the process is not yet documented.

Root Cause Analysis:

Why is there no SOUP request document?

  • Because no formal template or form exists for requesting approval to use new SOUP (Software of Unknown Provenance - third-party libraries, frameworks, tools)

Why is the SOUP process not documented?

  • Because GP-012 (Design, Redesign and Development) and GP-019 (Software Validation) address software in general, but do not specifically document SOUP intake, evaluation, approval, and monitoring process

Why isn't SOUP process specifically documented?

  • Because when GP-012 and GP-019 were written, focus was on Legit Health's own software development and off-the-shelf software tools, not on third-party libraries/frameworks integrated into medical device software

Why was SOUP (libraries/frameworks) not specifically addressed?

  • Because SOUP management is an IEC 62304 requirement (clauses 5.3, 7.1, 8.1.2) specific to medical device software development, and GP-012/GP-019 may have been written before full IEC 62304 implementation, or SOUP was not recognized as a concern separate from general "external software"

Root Cause:

  1. No documented SOUP intake and approval process in GP-012 or GP-019
  2. No SOUP request form template requiring risk assessment, security evaluation, and approval before use
  3. No SOUP register distinguishing SOUP (integrated libraries/frameworks) from general external software tools

Corrective Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 9b.CA.1 | Identify all SOUP currently used in Legit.Health Plus: review package.json, requirements.txt, build files, and dependency lists; distinguish SOUP (code libraries integrated into the product) from external software tools (Jira, AWS, etc.); create an initial list with library name, version, purpose, and where used (e.g., React, TensorFlow, NumPy, Express.js); see the extraction sketch after this table | IT Manager + Development Team Lead | 2026-03-08 |
| 9b.CA.2 | Create retroactive SOUP request/approval records for the identified SOUP: for each SOUP, document identification, intended use, risk assessment (patient data? clinical algorithms? security-critical?), security evaluation (vulnerabilities? update frequency? community support?), license compliance, and approval signatures; document as "R-012-XXX SOUP Inventory and Retroactive Approval 2026" | IT Manager + Development Team Lead | 2026-03-12 |
| 9b.CA.3 | Identify any high-risk SOUP requiring validation: if critical SOUP is found (e.g., a medical image processing or cryptography library), initiate validation documentation | IT Manager + Quality Manager | 2026-03-12 |
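
The initial inventory in 9b.CA.1 can be seeded from the dependency manifests. A minimal sketch, assuming `requirements.txt` and `package.json` sit at the repository root; the output file name and columns are placeholders, and build files and transitive dependencies would still need manual review:

```python
# Hypothetical seed script for 9b.CA.1: list declared dependencies from the
# Python and Node manifests as a starting point for the SOUP inventory.
import csv
import json
import re
from pathlib import Path

rows = []

# Python dependencies, e.g. "numpy==1.26.4" or "tensorflow>=2.15".
req = Path("requirements.txt")
if req.exists():
    for line in req.read_text().splitlines():
        line = line.split("#")[0].strip()  # drop comments and blank lines
        if not line:
            continue
        match = re.match(r"([A-Za-z0-9_.\-]+)\s*(.*)", line)
        if match:
            rows.append({"ecosystem": "PyPI",
                         "name": match.group(1),
                         "version": match.group(2) or "unpinned"})

# Node runtime dependencies from package.json.
pkg = Path("package.json")
if pkg.exists():
    data = json.loads(pkg.read_text())
    for name, version in data.get("dependencies", {}).items():
        rows.append({"ecosystem": "npm", "name": name, "version": version})

# Write a CSV that can be pasted into the SOUP inventory draft.
with open("soup_inventory_draft.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["ecosystem", "name", "version"])
    writer.writeheader()
    writer.writerows(rows)

print(f"{len(rows)} declared dependencies written to soup_inventory_draft.csv")
```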

Preventive Actions:

| Action ID | Action Description | Responsible Person | Target Date |
| --- | --- | --- | --- |
| 9b.PA.1 | Update the GP-012 section on external software/components to add a "SOUP Management Process" (IEC 62304 compliance): SOUP intake process (T-012-XXX request form required, IT + Quality Manager approval, SOUP Register); SOUP evaluation criteria (functionality, risk, security, license, maintenance, validation needs); SOUP monitoring (quarterly review for security updates and CVEs) | Quality Manager + IT Manager | 2026-03-12 |
| 9b.PA.2 | Create the T-012-XXX "SOUP Request and Evaluation Form" template: SOUP identification, intended use, risk assessment checkboxes, security evaluation (CVE check), license compliance, alternatives considered, validation required Y/N, requester signature, approval signatures, date | Quality Manager | 2026-03-12 |
| 9b.PA.3 | Create the R-012-XXX "SOUP Register": a master list of all approved SOUP with the columns Name, Version, Vendor, Purpose, Risk Level, Validation Status, Approval Date, Last Review Date, Status (Active/Deprecated); store in the GP-012 SOUP-Register folder and update quarterly | Quality Manager | 2026-03-12 |
| 9b.PA.4 | Establish a SOUP monitoring process: quarterly, check the SOUP Register against CVE databases and security advisories; document the review in "R-012-XXX SOUP Quarterly Security Review"; if a critical vulnerability is found, assess impact and act per GP-006; see the vulnerability-check sketch after this table | IT Manager | 2026-03-15 |
| 9b.PA.5 | Integrate SOUP approval into the development workflow: update developer onboarding/training ("do not add new dependencies without SOUP request approval") and add a SOUP check to the code review checklist ("any new dependencies? approved?") | IT Manager + Development Team Lead | 2026-03-15 |
| 9b.PA.6 | Train the development team and QA on the SOUP process | Quality Manager | 2026-03-15 |
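
For the quarterly check in 9b.PA.4, one option is to query a public vulnerability database for each register entry. A hedged sketch against the OSV API (https://api.osv.dev), assuming the R-012-XXX register is exported with ecosystem, name, and pinned version; the example entries are placeholders, and a dedicated SCA tool could serve the same purpose:

```python
# Hypothetical quarterly check for 9b.PA.4: query OSV for known vulnerabilities
# affecting each pinned SOUP version. Requires the 'requests' package.
import requests

# In practice these rows would come from the R-012-XXX SOUP Register export;
# the entries below are placeholders for illustration only.
SOUP_REGISTER = [
    {"ecosystem": "PyPI", "name": "numpy", "version": "1.26.4"},
    {"ecosystem": "npm", "name": "express", "version": "4.18.2"},
]

OSV_URL = "https://api.osv.dev/v1/query"

for entry in SOUP_REGISTER:
    payload = {
        "package": {"name": entry["name"], "ecosystem": entry["ecosystem"]},
        "version": entry["version"],
    }
    response = requests.post(OSV_URL, json=payload, timeout=30)
    response.raise_for_status()
    vulns = response.json().get("vulns", [])
    ids = ", ".join(v["id"] for v in vulns) or "none known"
    print(f'{entry["name"]} {entry["version"]}: {ids}')
```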

Verification Method:

Corrective Action Effectiveness:

  • Immediate verification (2026-03-12): R-012-XXX SOUP Inventory document exists with all current SOUP identified and retroactively approved
  • Evidence: SOUP inventory list, retroactive approval records

Preventive Action Effectiveness:

  • 3 months after implementation (2026-06-15): Review development activity - verify any new SOUP added after 2026-03-15 has completed T-012-XXX request form with approval
  • Evidence: Updated GP-012 section, T-012-XXX template, R-012-XXX SOUP Register, quarterly security review record, training records, sample approved SOUP requests

Impact Assessment:

  • Verify SOUP approval process does not delay development beyond 2-3 business days
  • Confirm no unapproved SOUP in production code
  • Check if SOUP security reviews identify any vulnerabilities requiring action

Target Completion Date: 2026-03-15

Status: Planned



Implementation Timeline​

Q1 2026 (January - March)​

February 2026:

  • Finding Minor 1: Update Non-Conformities SOP with communication timeframes (2026-03-15)
  • Finding Minor 2: Complete ICH E6 R3 risk analysis and update HR SOP (2026-03-15)
  • Finding Minor 4: Complete ICH E6 R3 training and update HR SOP (2026-03-15)

March 2026:

  • Finding Minor 5: Update Supplier Management SOP and evaluation templates (2026-03-15)
  • Finding Minor 9a: Update GP-019 with software risk classification clarity (2026-03-15)
  • Finding Minor 9b: Update GP-012 with SOUP management process (2026-03-15)
  • Finding Major 6: Complete all AI Development Reports (2026-03-15)
  • Finding Minor 3: Implement training effectiveness assessments (2026-03-15)
  • Finding Minor 8: Establish audit log review process (2026-03-15)

Q2 2026 (April - June)​

April 2026:

  • Finding Minor 7: Complete AI explainability methodology documentation (2026-03-15)

Monitoring and Verification​

Internal Verification Activities​

1. Monthly Progress Reviews (Quality Management Team)

  • Review CAPA implementation status
  • Identify and address roadblocks
  • Update completion percentages

2. Quarterly Effectiveness Assessments

  • Verify preventive actions are effective
  • Review metrics and indicators
  • Adjust actions if needed

3. Document Review Checkpoints

  • All updated SOPs to be reviewed by Quality Manager
  • Cross-functional review for AI-related documentation
  • Management approval for major process changes

Key Performance Indicators​

| KPI | Target | Measurement |
| --- | --- | --- |
| CAPA on-time completion | 100% | % of CAPAs completed by deadline |
| Major finding resolution | 100% by 2026-03-15 | Completion status |
| Minor findings resolution | 100% by 2026-03-15 | Completion status |
| Training completion (ICH E6 R3) | 100% of relevant personnel | Training records |
| AI documentation completeness | 100% of models | Report audit |
| Audit log review frequency | Monthly | Review records |

Risk Assessment​

Risks to CAPA Implementation​

| Risk | Probability | Impact | Mitigation |
| --- | --- | --- | --- |
| Resource constraints delay AI documentation | Medium | High | Prioritize Major Finding 6; allocate dedicated resources |
| ICH E6 R3 training availability delayed | Low | Medium | Contact the Global Health Network proactively; consider alternative training providers |
| Complexity of explainability documentation | Medium | Medium | Engage external AI experts if needed; take a phased approach |
| Resistance to new assessment procedures | Low | Low | Involve personnel in development; provide adequate training |

Resource Requirements​

Personnel​

  • Quality Manager: 40 hours
  • AI/ML Team Lead: 80 hours (Major Finding 6 and Minor Finding 7)
  • HR Manager: 30 hours
  • IT Manager: 40 hours
  • Regulatory Affairs: 20 hours
  • Training Coordinator: 25 hours

External Resources​

  • ICH E6 R3 training materials (Global Health Network or alternative)
  • Potential AI explainability expert consultation (if needed)
  • Automated log analysis tools (evaluation and implementation)

Budget Estimate​

  • Training: €2,000
  • External consultation (if required): €5,000
  • Software tools (log analysis): €3,000
  • Total estimated: €10,000

Communication Plan​

Internal Communication​

1. Management Briefing (Immediate)

  • Present CAPA Plan to management
  • Confirm resource allocation
  • Obtain formal approval

2. Team Meetings (Monthly)

  • Progress updates to all affected teams
  • Discussion of challenges and solutions
  • Coordination of interdependent activities

3. Training Sessions (As scheduled)

  • New procedures and updated SOPs
  • Assessment methodologies
  • Tool usage (log analysis, etc.)

External Communication​

1. Quantificare Notification

  • Submit CAPA Plan for review (within 30 days of audit)
  • Provide progress updates (quarterly)
  • Notify upon completion with evidence

2. Customer Communication (if applicable)

  • Inform affected customers per updated communication timeframes
  • Provide reassurance on quality commitment

Approval​

| Role | Name | Signature | Date |
| --- | --- | --- | --- |
| Quality Manager | | | |
| General Manager | | | |
| Regulatory Affairs | | | |

Document History​

| Version | Date | Author | Description |
| --- | --- | --- | --- |
| 1.0 | 2026-01-15 | Quality Management | Initial CAPA Plan creation following the Quantificare audit |

Appendices​

Appendix A: Referenced Documents​

  • GP-003: Non-Conformities Management
  • GP-012: Design, Redesign and Development
  • GP-019: Software Validation
  • HR SOPs: Human Resources Procedures
  • Supplier Management SOP
  • AI Development SOP
  • IT Security Procedures

Appendix B: Supporting Evidence​

  • Quantificare Audit Report dated 2025-11-06
  • Current process flowcharts
  • Existing supplier evaluation templates
  • Current AI Development Report templates

Appendix C: Training Materials​

  • To be developed as part of CAPA implementation
  • Training effectiveness assessment templates
  • ICH E6 R3 training materials

End of Document
