
Use Case 001 - Referral application using Diagnostic-Support API (Top-5 pathologies)

1. Identifier

  • UC ID: VAL-UC-001
  • Title: Referral application using Diagnostic-Support API (Top-5 pathologies)

2. Purpose and Coverage

  • Intended use covered: Provide diagnostic support for referral by returning the five most probable dermatological pathologies for an uploaded lesion image set.
  • System requirements covered: SRS-013, SRS-014, SRS-015, SRS-016, SRS-017, SRS-018, SRS-019, SRS-020, SRS-021, SRS-022, SRS-023, SRS-024, SRS-027, SRS-028, SRS-038, SRS-039, SRS-042, SRS-045, SRS-046, SRS-047, SRS-067, SRS-068, SRS-077, SRS-078, SRS-087.
  • Validation scope: End-to-end flow (frontend → backend → medical-device API → backend → frontend), including quality screening, authentication, analysis, reporting and presentation.

3. Method & Acceptance

  • Method: Functional testing with predefined datasets.
  • Input information:
    • Non-dermatological image (book photo).
    • Blurry lesion photo (low-quality rejection).
    • Oversized lesion photo (>10 MB).
    • Three photos of melanoma; three of benign nevus; three of psoriasis; three of healthy skin.
  • Objective acceptance criteria:
    1. Images below the DIQA threshold are rejected and not analyzed; frontend shows a quality error next to each rejected image.
    2. For analyzed requests, the response contains an aggregated probability distribution over conditions that sums to 100% across all classes; the frontend displays the top 5 only.
    3. Displayed sensitivity and specificity match the validated metrics of the model version used; entropy is shown in [0,1].
    4. Per-image quality score is displayed for each uploaded image.
    5. Backend obtains a valid JWT via /login before calling /diagnosis-support; protected calls without a valid token are rejected with 401.
    6. 95th-percentile latency for the end-to-end request under nominal load is < 10 000 ms.
    7. Minimum top-5 diagnostic coverage ≥ 95% against the ground-truth dataset.
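Criteria 2, 3, and 7 lend themselves to automated checks during the validation run. The sketch below is illustrative, not the actual validator script: the field names (`probabilities`, a per-case ground-truth label) and the 0.01-point tolerance are assumptions for this example, not part of the API contract.

```python
def check_distribution(probabilities: dict[str, float], tol: float = 0.01) -> list[str]:
    """Criterion 2: the full distribution over conditions must sum to 100%
    (within a small tolerance); return the top-5 pathologies by probability."""
    total = sum(probabilities.values())
    assert abs(total - 100.0) <= tol, f"distribution sums to {total}, not 100%"
    return sorted(probabilities, key=probabilities.get, reverse=True)[:5]

def check_entropy(entropy: float) -> None:
    """Criterion 3: the displayed entropy must lie in [0, 1]."""
    assert 0.0 <= entropy <= 1.0, f"entropy {entropy} outside [0, 1]"

def top5_coverage(cases: list[tuple[str, list[str]]]) -> float:
    """Criterion 7: fraction of cases whose ground-truth diagnosis
    appears among the returned top-5 pathologies."""
    hits = sum(1 for truth, top5 in cases if truth in top5)
    return hits / len(cases)
```

A run passes criterion 7 when `top5_coverage(...) >= 0.95` over the predefined melanoma, nevus, psoriasis, and healthy-skin cases.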

4. Enablers & Environment

  • Operating environment(s):
    • Frontend: Next.js 16 (Vercel).
    • Backend: Node.js service.
    • Medical-device API: Docker on AWS, release v1.1.0.0.
  • IT-NETWORK specifics:
    • HTTPS only, TLS ≥ 1.2 (AES-256-GCM), port 443.
    • JSON payloads; images base64-encoded.
    • JWT expiry: 1 hour.
    • Max request size: 10 MB.
    • Information flow: Frontend → Backend → API → Backend → Frontend.
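The transport constraints above (JSON payloads, base64-encoded images, bearer-token authentication, 10 MB request cap) can be illustrated with a minimal request-building sketch. The endpoint paths come from section 6; the `images` field name is an assumption for illustration, as the real request schema is defined in the device's Technical Description.

```python
import base64
import json

MAX_REQUEST_BYTES = 10 * 1024 * 1024  # 10 MB cap enforced by the API

def build_analysis_request(image_bytes_list: list[bytes], jwt: str) -> tuple[dict, bytes]:
    """Build headers and JSON body for a POST to /diagnosis-support.
    The 'images' field name is illustrative; the JWT is obtained earlier via /login."""
    body = json.dumps({
        "images": [base64.b64encode(b).decode("ascii") for b in image_bytes_list],
    }).encode("utf-8")
    if len(body) > MAX_REQUEST_BYTES:
        raise ValueError(f"request is {len(body)} bytes, exceeds the 10 MB limit")
    headers = {
        "Authorization": f"Bearer {jwt}",   # expires 1 hour after issuance
        "Content-Type": "application/json",
    }
    return headers, body
```

Note that base64 encoding inflates image data by roughly a third, so the oversized-photo test sample from section 3 is rejected well before reaching the model.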

5. Roles & Independence

  • Validation personnel: JD-007 (Technology Manager, experienced in software validation for medical devices).
  • Independence level: Independent of the development team within the organization.

6. Preconditions & Setup

  • Preconditions: Software release v1.1.0.0 frozen; accompanying Instructions for Use and Technical Description available; baseline risk file approved for this release.
  • Configuration: API endpoints /login and /diagnosis-support; HTTPS enforced; JSON content type; frontend does not require user login.

7. Procedure

  1. Launch the referral application.
  2. Upload up to three images per case (include invalid samples: non-dermatological, blurry, oversized).
  3. Observe backend obtaining JWT via /login and invoking /diagnosis-support.
  4. Confirm low-quality or invalid images are rejected with on-screen quality errors; valid images proceed to analysis.
  5. Verify returned report shows:
    • Top-5 pathologies with probabilities; full distribution sums to 100%.
    • Model sensitivity, specificity, and entropy in [0,1].
    • Per-image quality scores.
  6. Repeat for melanoma, nevus, psoriasis and healthy cases and compare to ground truth to compute top-5 coverage.
  7. Record performance (p95 latency) under nominal load.
  • Data capture: Request/response IDs and timestamps; model and API version; JWT issuance/expiry events; per-image quality scores; probability vectors; HTTP status codes; screenshots (rejections and final reports); backend/API logs; audit log entries.
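The p95 figure in step 7 can be derived directly from the captured request/response timestamps. A minimal sketch using the nearest-rank percentile method (other interpolation methods are acceptable if stated in the validation report):

```python
import math

def p95_latency_ms(latencies_ms: list[float]) -> float:
    """95th-percentile end-to-end latency, nearest-rank method:
    with n samples, report the value at 1-based rank ceil(0.95 * n)."""
    if not latencies_ms:
        raise ValueError("no latency samples recorded")
    ordered = sorted(latencies_ms)
    rank = math.ceil(0.95 * len(ordered))
    return ordered[rank - 1]
```

For example, with 100 samples the 95th value in sorted order is reported; acceptance criterion 6 requires that value to be below 10 000 ms under nominal load.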

8. Risks & Constraints

  • Known limitations/constraints: None identified for this validation run.
  • IT-NETWORK hazardous situations considered: None identified for this validation run.

9. Traceability

  • Links: Intended use (diagnostic support for referral) → SRS (as listed) → VAL-UC-001.

10. Expected Evidence

  • Results to record: Test conditions and outcomes per dataset; anomalies (if any) with references; validator identity and date.
  • Residual risk status: Acceptable if all acceptance criteria are met and no unresolved anomalies remain.

11. Deviation & Re-validation Rules

  • Deviation handling: Any deviation is justified and logged in the validation report.
  • Anomaly handling & re-validation trigger: Repeat affected parts if criteria are not met; re-validate upon changes to the API, AI model, or configuration affecting functionality or performance.

12. Validation Checklist

  • Dataset readiness: All predefined samples available (non-derm, blurry, >10 MB, melanoma×3, nevus×3, psoriasis×3, healthy×3).
  • Release freeze: Target API release v1.1.0.0 and accompanying docs are controlled and referenced.
  • Security/transport: HTTPS with TLS ≥ 1.2 in effect; JSON and base64 images confirmed.
  • Auth flow: Backend retrieves JWT via /login; protected calls without valid token return 401.
  • Quality gate: Images below DIQA threshold are rejected; frontend shows per-image quality error.
  • Per-image scores: Valid images display per-image quality score in the UI.
  • Top-5 output: Report presents top-5 pathologies; full probability distribution sums to 100%.
  • Model metrics: Sensitivity, specificity match validated values of the active model; entropy in [0,1].
  • Performance: End-to-end p95 latency under nominal load is < 10 000 ms.
  • Coverage: Top-5 diagnostic coverage ≥ 95% against ground truth over the specified cases.
  • Logging/evidence: Captured request/response IDs, timestamps, model/API version, HTTP status codes, screenshots (errors/results), backend/API logs, audit entries.
  • Report & anomalies: Validation report completed with any deviations/anomalies documented and resolved; residual risk assessed as acceptable.
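The residual-risk rule in section 10 ("acceptable if all acceptance criteria are met and no unresolved anomalies remain") reduces to a simple conjunction over the checklist outcomes. A sketch with illustrative criterion identifiers:

```python
def residual_risk_acceptable(criteria: dict[str, bool], unresolved_anomalies: int) -> bool:
    """Residual risk is acceptable only when every acceptance criterion
    passed and no anomalies remain unresolved (section 10)."""
    return all(criteria.values()) and unresolved_anomalies == 0
```

Any `False` entry or open anomaly triggers the deviation handling and re-validation rules in section 11.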
All the information contained in this QMS is confidential. The recipient agrees not to transmit or reproduce the information, either directly or through third parties, by any means, without the prior written permission of Legit.Health (AI LABS GROUP S.L.).