Healthcare · AI Readiness

AI readiness assessment for UK healthcare suppliers

Healthcare suppliers face DTAC, DSPT, and NHS supplier expectations that are increasingly extending to AI-touched workflows. This four-minute readiness check tells you where you stand.

Healthcare supplier organisations sit one degree removed from patient care, but their compliance posture is shaped by the organisations one degree closer to it. NHS trusts, integrated care boards, primary care networks, and the procurement frameworks that govern them all increasingly ask one question of suppliers: how do you handle AI?

The expected answer has moved fast. Three years ago, "we use AI internally for productivity" was an acceptable response. Today, suppliers are being asked to demonstrate that AI use in their service is governed, evidenced, and bounded against patient-data exposure they may or may not technically have.

The healthcare supplier context

Three frameworks shape supplier conversations on AI:

DTAC — the NHS Digital Technology Assessment Criteria. The threshold framework for digital health technologies. AI features increasingly trigger DTAC scope, even where a product's core function pre-dated AI.

DSPT — the Data Security and Protection Toolkit. Mandatory for organisations processing NHS patient data. AI usage interacts with DSPT in ways the toolkit was not originally written to capture; supplier interpretations vary.

NHS supplier expectations. Trusts and ICBs are increasingly asking suppliers — even those who do not directly process patient data — for evidence of AI governance, data residency, and human-in-the-loop arrangements. "We don't see patient data" is rarely the full story; deidentified data, telemetry, and operational data all live in scope.

Why "we don't see patient data" is rarely the full story

Three patterns surface repeatedly:

Pattern one. The supplier processes patient data in a controlled way (DPA, DSPT, encryption at rest and in transit) — and uses AI tools internally, off the patient-data path, where staff are paste-prone. The staff path is the leak path. The DSPT compliance posture is real; the AI exposure is real too.

Pattern two. The supplier handles "deidentified" data only — but the AI tooling has a context window long enough to re-identify individuals, especially when combined with other data sources accessible through the same AI assistant. Deidentification was sufficient before AI; it is sometimes not sufficient after.

Pattern three. The supplier sells software to NHS clients but does not host patient data themselves. They build with embedded AI features from a third party (Microsoft, OpenAI, etc.) and have not asked whether their clients' data is being processed by that third party in ways that breach the clients' own DSPT submissions.

Each pattern is solvable. None is solvable without explicit assessment.
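The three patterns share one mechanism: data crosses from a governed tier into an AI tool that was never approved for that tier. A minimal sketch of a tiering gate might look like the following — the tier names, detection patterns, and tool lists are entirely hypothetical illustrations, not drawn from DTAC, DSPT, or any real supplier policy:

```python
# Illustrative sketch only: tiers, patterns, and tool allowlists are
# hypothetical, not taken from DTAC or DSPT guidance.
import re

# Hypothetical data tiers, most restrictive listed first.
TIER_PATTERNS = {
    "patient-identifiable": [
        r"\b\d{3}[ -]?\d{3}[ -]?\d{4}\b",  # NHS-number-like digit run
        r"\bNHS\s*number\b",
    ],
    "deidentified-clinical": [r"\bdiagnosis\b", r"\bmedication\b"],
}

# Which hypothetical AI tools may receive each tier.
TOOL_ALLOWLIST = {
    "patient-identifiable": set(),              # no AI tool approved
    "deidentified-clinical": {"internal-llm"},  # isolated tenancy only
    "general": {"internal-llm", "saas-assistant"},
}

def classify(text: str) -> str:
    """Return the most restrictive tier whose patterns match the text."""
    for tier, patterns in TIER_PATTERNS.items():
        if any(re.search(p, text, re.IGNORECASE) for p in patterns):
            return tier
    return "general"

def may_send(text: str, tool: str) -> bool:
    """Gate: is this tool approved for this text's data tier?"""
    return tool in TOOL_ALLOWLIST[classify(text)]
```

A gate like this does not replace assessment — pattern two above is precisely the case where pattern-matching on a single snippet misses re-identification risk — but it makes the approved-versus-prohibited boundary executable rather than aspirational.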

Five readiness dimensions in healthcare context

The Arx Certa scorecard weights Governance and Security heavily in healthcare supplier contexts.

Governance. An AI policy explicit on patient-data tiering; clear delineation of approved versus prohibited tools; DTAC-aligned evidence trail; documented vendor management for the AI components in your stack.

Data. Classification of all data the AI components in your product or service can touch — including telemetry, operational, and audit data, not just clinical content. Retention controls aligned with NHS expectations. Vendor data residency contractually confirmed.

Infrastructure. UK-hosted, isolated tenancy, encrypted at rest and in transit, capacity-tested for clinical-grade availability where applicable. Operational resilience evidence for AI dependencies.

Security. The healthcare baseline applied to the AI layer — MFA, RBAC, comprehensive logging, monitoring, and vendor SOC 2 / ISO 27001 / Cyber Essentials Plus evidence as applicable to scope. Right-to-audit clauses in AI vendor agreements.

Use case. Each AI use case in your product or operation mapped to its risk surface, decision authority, and human-in-the-loop arrangement. Use cases that touch clinical workflows carry their own additional review.

The supplier audit conversation, and how AI use changes it

NHS supplier audits — whether trust-specific, framework-specific, or DSPT-renewal — have always centred on data handling, security, and contractual compliance. AI use changes the conversation in three ways:

First, it expands the population of vendors in scope. Where five years ago a supplier audit touched perhaps a handful of subprocessors, AI integration can put another five or six AI-specific vendors in scope.

Second, it changes the data flow narrative. "Data goes from us to our subprocessor for processing" was the standard description. Now it is "data goes from us to our AI orchestration layer to the model provider, with telemetry going to a fourth location for monitoring".

Third, it changes the human-in-the-loop conversation. NHS expectations on AI-influenced decisions affecting patients are still evolving, but the trajectory is clear: human accountability remains with the qualified clinician and the operational owner, not with the AI tooling, and the supplier's product must make that boundary explicit.
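The second change — the multi-hop data flow — is easier to defend in an audit when it is written down explicitly. A minimal sketch of such a flow map, with hypothetical vendor names and hops chosen purely to contrast the old single-hop narrative with an AI-era chain:

```python
# Illustrative only: parties, data categories, and purposes are
# hypothetical, not a description of any real supplier's stack.
AI_DATA_FLOW = [
    # (source, destination, data category, purpose)
    ("supplier-app",     "ai-orchestration",  "operational", "prompt assembly"),
    ("ai-orchestration", "model-provider",    "operational", "inference"),
    ("ai-orchestration", "monitoring-vendor", "telemetry",   "usage monitoring"),
]

def vendors_in_scope(flows) -> set[str]:
    """Every party that sends or receives data — the audit scope."""
    return {party for src, dst, _, _ in flows for party in (src, dst)}
```

Even this toy map surfaces the point made above: the monitoring vendor receives telemetry and is therefore in audit scope, whether or not the supplier thought of it as a subprocessor.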

Suppliers that have a current readiness assessment walk into these audits with the right vocabulary, evidence, and answers ready. Suppliers without one scramble.

Test your healthcare supplier AI readiness in 4 minutes

Twelve questions weighted for the DTAC, DSPT, and trust supplier-audit context. A personalised report identifies the gaps most likely to surface in your next supplier review.

Get your AI readiness score → 4 minutes · 12 questions · Personalised report