# BSI Non-Conformities
This section documents the non-conformities (NCs) raised by BSI (Notified Body NB 2797) during the MDR technical documentation assessment for the device (assessment reference T0088560).
## Purpose
These MDX pages are our internal working documents to prepare responses. The final responses must be written directly in BSI's Word document for submission. Supplementary evidence should be compiled as a single bookmarked PDF, and red-lined documentation should be provided for any document changes.
## Where to find the information to answer BSI
All the documentation BSI reviewed lives in this turborepo. When researching answers, look in these directories:
- `/apps/eu-ifu-mdr/`: The EU IFU (Instructions for Use) under MDR. This is the `ifu.pdf` BSI references.
- `/apps/qms/docs/legit-health-plus-version-1-1-0-0/`: The product-specific technical file (design and development, software requirements, V&V, risk management, etc.).
- `/apps/qms/docs/procedures/`: Quality management procedures (GP-XXX).
- `/apps/qms/docs/records/`: Quality records (R-XXX-XXX).
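When researching an answer, a plain recursive grep over those directories is usually the fastest first pass. A minimal sketch — the search term is a placeholder, and the directory list is the one above:

```shell
# List files under the QMS procedure and record trees that mention a term
# from the NC ("residual risk" is an illustrative placeholder).
grep -rl "residual risk" apps/qms/docs/procedures/ apps/qms/docs/records/
```

Run it from the repo root; swap in the exact wording of the standard BSI cites, since the QMS should mirror that terminology.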
## Folder structure
```
bsi-non-conformities/
├── CLAUDE.md                         # This file
├── _category_.json
├── technical-review/                 # Software, usability & regulatory NCs
│   ├── _category_.json
│   └── round-1/
│       ├── _category_.json
│       ├── index.mdx                 # Overview: summary table, timeline, all referenced docs
│       ├── m1-diagnostic-function/   # Multi-question NC → per-question subfolders
│       │   ├── _category_.json
│       │   ├── index.mdx             # NC overview: code, status, requirements, shared context
│       │   ├── q1-ifu-performance-claims/
│       │   │   ├── _category_.json
│       │   │   ├── question.mdx
│       │   │   ├── research-and-planning.mdx
│       │   │   └── response.mdx
│       │   ├── q2-test-environment/
│       │   │   └── (same structure)
│       │   ├── q3-biofilm-slough-verification/
│       │   │   └── (same structure)
│       │   └── q4-t377-test-results/
│       │       └── (same structure)
│       ├── m2-software-vv/           # Multi-question NC → per-question subfolders
│       │   ├── _category_.json
│       │   ├── index.mdx
│       │   ├── q1-waf-verification/
│       │   │   ├── _category_.json
│       │   │   ├── question.mdx
│       │   │   ├── research-and-planning.mdx
│       │   │   └── response.mdx
│       │   ├── q2-labeling-ifu-vv/
│       │   │   └── (same structure)
│       │   └── q3-hl7-fhir-verification/
│       │       └── (same structure)
│       ├── n1-information-supplied/  # Single-question NC → flat structure
│       │   ├── _category_.json
│       │   ├── question.mdx
│       │   ├── research-and-planning.mdx
│       │   └── response.mdx
│       ├── n2-usability/             # Multi-question NC → per-question subfolders (a–d)
│       │   ├── _category_.json
│       │   ├── index.mdx
│       │   ├── qa-rca-residual-risk/
│       │   │   └── (same structure)
│       │   ├── qb-intended-use-misunderstanding/
│       │   │   └── (same structure)
│       │   ├── qc-ifu-usability-results/
│       │   │   └── (same structure)
│       │   └── qd-safety-information-effectiveness/
│       │       └── (same structure)
│       └── n3-risk-management/
│           └── (same flat structure)
├── ai-ml-review/                     # AI/ML-specific NCs (separate BSI form)
│   └── round-1/
└── clinical-review/                  # Clinical evidence NCs (separate BSI form)
    └── round-1/
```
Each review type can have up to 3 rounds (`round-1/`, `round-2/`, `round-3/`) per BSI policy.
## Folder naming convention
Each NC folder is named using the code BSI assigned:
- M = Major non-conformity
- N = Minor non-conformity
- Number is sequential within the round (M1, M2, N1, N2, N3...)
- Followed by a kebab-case descriptor of the NC topic
The full BSI code format is `{job}-{date}-{code}`, e.g. `2650005-202501-M1`.
## NC folder structure
NCs use one of two structures depending on how many questions they contain:
### Single-question NCs (flat structure)
For NCs with a single question (e.g., N1 and N3), the NC folder contains exactly three files:

- `question.mdx`: BSI's verbatim question. Contains requirements and references, documents reviewed, and BSI's question reproduced verbatim. This file is the legal record of what BSI asked and must never be modified after initial creation.
- `research-and-planning.mdx`: Internal working document (marked with an `:::info` admonition) containing the analysis, gap analysis, and response strategy. Not included in the final response to BSI.
- `response.mdx`: Our response, addressing every sub-item from the question.
### Multi-question NCs (per-question subfolders)
For NCs with 2+ questions (e.g., M1 with 4 questions, M2 with 3), each question gets its own subfolder:
- `index.mdx` (NC level): NC overview with code, status, requirements, documents reviewed, shared regulatory context, cross-cutting analysis (e.g., CAPA), summary of action items, and timeline.
- `qN-descriptive-name/` (per question): contains the same three files as the flat structure, but scoped to a single question:
  - `question.mdx`: BSI's verbatim text for this specific question only.
  - `research-and-planning.mdx`: analysis, gap analysis, and strategy for this question only.
  - `response.mdx`: our response to this question only.
Question subfolder naming: `q{N}-kebab-case-descriptor`, where `N` matches BSI's question number.
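Given the layout above, a new question subfolder can be scaffolded in one step. A sketch — the folder name is a made-up example, and `_category_.json` still needs real content afterwards:

```shell
# Scaffold a question subfolder with the three standard files plus the
# sidebar category file ("q5-example-topic" is a hypothetical name).
Q="q5-example-topic"
mkdir -p "$Q"
touch "$Q/_category_.json" "$Q/question.mdx" \
      "$Q/research-and-planning.mdx" "$Q/response.mdx"
```

Remember that `question.mdx` must be filled with BSI's verbatim text immediately and then left untouched.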
## File contents

`question.mdx`: BSI's verbatim questions reproduced exactly. This is the legal record and must never be modified after initial creation.

`research-and-planning.mdx`: Internal working document with:
- Analysis of BSI's questions: What exactly BSI is asking, what standard/GSPR is at stake, what the underlying concern is.
- Relevant QMS documents and sections: Specific documents, sections, and content from our QMS and technical file that are relevant to answering. Include file paths and section references.
- Gap analysis: What we already had vs what BSI couldn't find vs what genuinely needs updating.
- Response strategy: How to structure the response: whether to point to existing evidence, provide additional evidence, or acknowledge a gap and describe corrective action.
`response.mdx`: Our responses, numbered to match BSI's questions exactly. Each response must address every sub-item from the corresponding question.
## Kanban board

Each round's `index.mdx` contains a Mermaid kanban board that tracks the status of every question/sub-item. You must update the kanban whenever you work on a question. Move the card to the column that reflects its new status:
| Column | Meaning |
|---|---|
| `received` | Question received, no work started yet |
| `analysed` | `research-and-planning.mdx` is complete (gap analysis and strategy done) |
| `answered` | `response.mdx` is complete (draft response written) |
| `sent` | Response has been included in the final Word document sent to BSI |
To move a card, cut its line from the current column and paste it under the target column header in the mermaid block. For example, after completing research-and-planning for M1.1:

```mermaid
  analysed[Analysed]
    m1q1[M1.1: IFU performance claims clarity]@{ ... }
```
## Method and goal
This is not a Q&A exercise. We don't just write answers — we fix the underlying documentation first, and then our response to BSI explains what we fixed and where. Non-conformities are resolved by making actual changes to the QMS/technical file, not by arguing in prose.
The workflow for each NC is:
- Analyse what BSI is asking and what standard/GSPR is at stake.
- Make the fix in the actual documentation (IFU, risk management, V&V records, etc.).
- Write the response referencing the specific document and section we added or changed.
### Never say we created a new process as a result of the non-conformity
Never admit we created a new process as a result of the non-conformity, even when it's true. Instead, argue that the process was already in place and has been followed correctly, but that the documentation needed to be updated to better reflect what we were doing.
### Auditors care about the letter, not the spirit
BSI auditors work from checklists derived from the relevant standards (MDR, IEC 62366-1, ISO 14971, IEC 62304, etc.). They want to see boxes checked. This means:
- Use the exact language of the standards. If ISO 14971 says "residual risk", write "residual risk" — not "remaining risk" or "leftover risk". Mirror the standard's terminology precisely.
- Don't over-share. Adding information beyond what is required can open new lines of questioning. Be precise and concise. Address exactly what was asked, nothing more.
- Don't argue or justify. If something was missing, fix it and say it's fixed. Don't explain why it was missing or defend the previous state.
- Show compliance, don't prove truth. The goal is demonstrating that the QMS and technical file meet the requirements of the applicable standards. This is a documentation and process exercise, not a scientific debate.
## Key rules
- BSI questions are verbatim. Never paraphrase, reorder, or omit text from BSI's original questions in `question.mdx`. They are the legal record of what was asked.
- Response numbering must match question numbering. If BSI asks questions 1-4, responses must be numbered 1-4. If a question has sub-items a-d, the response should address each sub-item explicitly.
- Sub-items within a question use bold letters: `**a.**`, `**b.**`, etc.
- BSI notes/observations that are not formal questions go in a `:::note BSI observation` admonition in `question.mdx`.
## Writing responses

When writing a response in `response.mdx`:
- Address every sub-item explicitly. If BSI raised points a, b, c, d: respond to each one. Use matching bold letters.
- Reference specific documents and sections. Don't say "see the IFU"; say "see IFU, § 4.2 Clinical Benefits and Performance Claims, Table 3."
- Explain what was already there (if BSI missed it) vs what has been updated (if we need to fix something). Be explicit about which case applies.
- For updated documentation, briefly describe the change and note that red-lined versions will be provided. Reference the specific QMS record or TF document being updated (e.g., `R-TF-012-037`).
- For evidence, describe concretely what evidence is being provided and where it can be found in the bookmarked PDF.
- Be factual and direct. Don't be defensive or verbose. BSI reviewers read hundreds of these: concise, well-referenced responses are more effective.
- Never use the company name or product name: use "we" and "the device" per QMS CLAUDE.md conventions.
- Keep responses copy-paste friendly. Responses will be copied into BSI's Word document. Use only formatting that transfers cleanly to Word: paragraphs, bold, italic, numbered lists, and bullet lists. Do NOT use tables, admonitions, code blocks, markdown links, or any MDX-specific syntax in `response.mdx`. If you need to present structured data (e.g., a gap-to-fix mapping), use a bullet list instead of a table.
## Red-lined documents
BSI requires "red-lined documentation" (tracked changes) for any document we modify. Since our documentation is built from source code (MDX/React), we generate red-lined documents using a script that diffs the git-committed version against the working copy.
It's not yet clear exactly how we are going to tackle this.
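One candidate starting point, assuming git's word-level diff markers are an acceptable input for whatever later step produces the tracked-changes artefact (the file path and output name below are illustrative, not the real pipeline):

```shell
# Word-level diff of the working copy against the last committed version.
# Insertions appear as {+text+} and deletions as [-text-], which maps
# naturally onto tracked-changes styling in a later conversion step.
FILE="apps/eu-ifu-mdr/docs/example-section.mdx"   # hypothetical path
git diff --word-diff=plain -- "$FILE" > redline.diff
```

An unmodified file yields an empty diff, so the same command can also confirm which documents actually changed before we promise red-lines to BSI.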