# M1: Diagnostic Function
Code: 2650005-202501-M1
Status: Open (awaiting manufacturer response)
## Requirements and references
- GSPR 1, 15.1, 17.2
- Annex II, 6.1(a), (b); 6.2(f)
- EN 62304
- EN 82304-1
## Documents reviewed by BSI

- ifu.pdf
- master.csv
- C515_T355/case-001/response.json
- C537_T377/case-001/response.json
## Regulatory context
This section is for internal planning only. It will not be included in the final response to BSI.
BSI raised four questions under Major NC 2650005-202501-M1. The regulatory requirements cited are GSPR 1, 15.1, 17.2; Annex II 6.1(a), (b), 6.2(f); EN 62304; and EN 82304-1.
## Cited requirements — detailed breakdown
Each response must explicitly tie back to the specific requirement it satisfies. The cited requirements are:
| Requirement | What it demands | Relevant questions |
|---|---|---|
| GSPR 1 | Devices shall achieve intended performance, be safe and effective, with acceptable benefit-risk ratio | Q1 (performance claims must be understandable) |
| GSPR 15.1 | Devices providing diagnostic information must indicate accuracy limits, precision, and stability; limits of accuracy shall be indicated by the manufacturer | Q1 (IFU must clearly communicate accuracy metrics and their applicability) |
| GSPR 17.2 | Software shall be developed per state of the art: lifecycle, risk management, V&V | Q2, Q3, Q4 (V&V evidence must be complete and correct) |
| GSPR 23.4(c) | IFU must state residual risks and undesirable side-effects | Q1 (performance limitations must be communicated) |
| GSPR 23.4(e) | IFU must state warnings and precautions | Q1 (accuracy limitations per condition/context) |
| GSPR 23.4(h) | IFU must provide specifications for appropriate use, including degree of accuracy claimed | Q1 (core of BSI's concern — accuracy claims need context) |
| MDR Annex II 6.1(a) | Technical documentation must include device description and intended purpose, including performance characteristics | Q1 (performance characteristics must be documented in the technical file, not just the IFU) |
| MDR Annex II 6.1(b) | Technical documentation must include design and manufacturing information sufficient to understand design stages and V&V | Q2, Q3, Q4 (test environment and evidence must be traceable) |
| MDR Annex II 6.2(f) | Technical documentation must include verification and validation results and evidence | Q2 (test environment), Q3 (wrong evidence path), Q4 (evidence mismatch) |
| EN 62304 §5.5 | Software integration and integration testing — requires documented test plan including environment | Q2 (test environment), Q3, Q4 (integration test evidence) |
| EN 62304 §5.6 | Software system testing — requires documented system test plan and evidence | Q2 (test environment specification) |
| EN 82304-1 §6.2 | Health software validation — evidence must include environment specification | Q2 (test environment) |
## GSPR 15.1 — our position on diagnostic function
Our GSPR analysis (R-TF-008-001) states that "the device does not have a diagnostic nor a measuring function but it is intended to provide information to support practitioners in decision making for diagnostic purposes." Despite this, GSPR 15.1 is marked as applicable because the device provides information that influences diagnostic decisions.
This position must be proactively clarified in our response. The device is a clinical decision-support tool (Rule 11, MDR Annex VIII), not a diagnostic device. It does not make diagnoses — it provides supplementary information (severity scores, ICD probability distributions, image analysis) that practitioners use alongside their clinical judgement. GSPR 15.1 applies because the accuracy and precision of that supplementary information must be communicated to users so they can weigh it appropriately. Our response should frame the IFU updates as fulfilling GSPR 15.1's requirement to "indicate limits of accuracy" for decision-support information, not as an admission that the device has a diagnostic function.
## Response approach for a Major NC
This is a Major NC, so our responses must be thorough and demonstrate corrective action. For each question:
- State the regulatory requirement being addressed — cite specific GSPR/Annex II clauses
- Acknowledge the gap — don't be defensive
- Explain what was already in place — demonstrate maturity of the system
- Describe corrective action taken — with specific document references
- Provide evidence — red-lined documents and supplementary evidence PDF
## Cross-cutting: evidence packaging CAPA
Questions 3 and 4 both involve incorrect or mismatched evidence in the integration test package submitted to BSI. This suggests a systemic issue with evidence packaging and compilation, not isolated errors. If we only fix the individual cases, BSI may raise the systemic issue in Round 2.
### The systemic issue

Two out of 76 integration test evidence packages had errors:

- T355 (Q3): Evidence URI pointed to the wrong S3 path (data entry error in the CSV)
- T377 (Q4): Evidence `response.json` does not match the expected output (structural mismatch or compilation error)

This is a 2.6% error rate in evidence packaging. For a Major NC under Annex II 6.2(f), BSI will expect confidence that the remaining 74 test cases are correct.
### Proposed corrective and preventive action

- Audit all 76 evidence URIs against their test case definitions. For each test case, verify:
  - The Evidence URI path contains the correct model name
  - The `response.json` at that path has the same JSON structure as the expected result in the CSV
  - Numerical values are within the specified acceptance criteria
- Add automated validation to the evidence collection pipeline:
  - Schema validation: check that the evidence `response.json` has the same top-level keys as the expected result
  - URI validation: check that the evidence URI path matches the model name in the test case
  - Tolerance check: verify that all numerical values are within acceptance criteria
- Document the audit results as supplementary evidence (a table showing all 76 test cases verified, with pass/fail status and any corrections made).
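The three pipeline checks above could be sketched as a single validation function. This is a minimal illustration only; the field names (`model`, `evidence_uri`, `expected`, `tolerance`) are hypothetical placeholders for whatever columns ai-models-integration-tests.csv actually uses, and the real tooling would load `response.json` from S3 rather than receive it as a dict.

```python
def validate_evidence(test_case: dict, evidence: dict) -> list[str]:
    """Run the three proposed checks on one integration test case.

    test_case uses hypothetical keys: 'model', 'evidence_uri',
    'expected' (the expected response dict from the CSV), 'tolerance'.
    evidence is the parsed response.json found at the evidence URI.
    Returns a list of failure messages; an empty list means pass.
    """
    failures = []

    # URI validation: the evidence path must reference the model under
    # test; this check would have caught the T355 wrong-path error.
    if test_case["model"] not in test_case["evidence_uri"]:
        failures.append(f"URI does not reference model {test_case['model']!r}")

    # Schema validation: the evidence must have the same top-level keys
    # as the expected result; this would have caught the T377 mismatch.
    if set(evidence) != set(test_case["expected"]):
        failures.append("top-level keys differ from expected result")

    # Tolerance check: every numerical value must sit within the
    # acceptance criteria specified for the test case.
    for key, expected in test_case["expected"].items():
        if isinstance(expected, (int, float)):
            actual = evidence.get(key)
            if not isinstance(actual, (int, float)) or \
                    abs(actual - expected) > test_case["tolerance"]:
                failures.append(
                    f"{key}: {actual!r} outside tolerance {test_case['tolerance']}"
                )

    return failures
```

Running this over all 76 cases and tabulating the returned failure lists would also produce the audit table proposed for the supplementary evidence PDF.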
### Action items for CAPA
| # | Action | Owner | Document affected | Priority |
|---|---|---|---|---|
| 12 | Audit all 76 integration test evidence URIs and content | Gerardo | Supplementary evidence PDF | High |
| 13 | Add automated evidence validation to integration pipeline | Gerardo | Integration test tooling | Medium |
## Summary of action items
| # | Action | Owner | Document affected | Priority |
|---|---|---|---|---|
| 1 | Add clinical studies reference section to IFU | Taig + Jordi | ifu.pdf (eu-ifu-mdr) | High |
| 2 | Add ICD-11 codes to performanceClaims.ts data model | Taig + Jordi | performanceClaims.ts | High |
| 3 | Surface ICD-11 codes in IFU performance claims | Taig + Jordi | ClinicalBenefitsList.tsx, ifu.pdf | High |
| 4 | Add clinical benefit glossary/legend table to IFU | Taig | clinical-benefits-and-performance-claims.mdx | High |
| 5 | Add consolidated test environment section to Test Report | Gerardo | R-TF-012-035 | High |
| 6 | Verify correct biofilm evidence exists at correct S3 path | Gerardo | — | High |
| 7 | Fix Evidence URI for T355 in CSV | Gerardo | ai-models-integration-tests.csv | High |
| 8 | Export correct biofilm/slough evidence from S3 | Gerardo | Supplementary evidence PDF | High |
| 9 | Add evidence URI validation to integration test pipeline | Gerardo | Integration test tooling | High |
| 10 | BLOCKER: Investigate T377 response.json in S3 | Gerardo | — | Critical |
| 11 | Provide corrected T377 evidence | Gerardo | Supplementary evidence PDF | High |
| 12 | Audit all 76 integration test evidence URIs and content | Gerardo | Supplementary evidence PDF | High |
| 13 | Add automated evidence validation to integration pipeline | Gerardo | Integration test tooling | Medium |
| 14 | Compile red-lined documents | Taig | BSI response package | High |
| 15 | Compile supplementary evidence PDF with bookmarks | Taig | BSI response package | High |
## Dependencies
## Timeline
- Deadline: 2026-03-17 (Round 1 response)
- Blockers must be resolved by: 2026-03-05 (items 6, 10)
- Documentation updates: 2026-03-07 (items 1-5, 7)
- Evidence compilation: 2026-03-10 (items 8, 11, 12)
- Internal review deadline: 2026-03-14 (items 14, 15)
- Submission: 2026-03-17