– By Dean Mawson, Clinical Director and Founder, DPM Digital Health Consultancy
"There is a clear and growing expectation that practices adopting digital and AI solutions will need to have a named, trained Clinical Safety Officer... In my view, this is utter madness."
Without meaning to sound clichéd, 2025 was a pivotal year for the use of AI within the NHS, promising a leap forward in productivity, a reduction in clinician burnout, and more time for patient care.
AI tools spread like wildfire across both primary and secondary care. Notably, AI "Ambient Voice Technology" (AVT), also known as AI scribes or ambient scribes, has become a hotspot of implementation, discussion, and concern. Whilst adoption of AI tools is a key component of the NHS's digital transformation and productivity goals, it is also an area where regulation has at times lagged behind.
We’ve seen a significant disconnect between regulatory expectation and real-world capacity, creating unmanaged risk directly to the front line of care.
Regulatory expectation vs real-world capacity
Take primary care, for example: responsibility for assuring digital health technologies is set to shift from Integrated Care Boards (ICBs) directly onto GP practices.
Whilst there isn’t (yet) an NHS England mandate that formalises this shift, there is a clear and growing expectation that practices adopting digital and AI solutions will need to have a named, trained Clinical Safety Officer (CSO) to manage clinical safety. This person will need to perform risk analysis covering both technical and clinical elements — including output errors, data loss, missing or incorrect information, or delayed outputs — as well as complete DCB0160 compliance, before deploying AI scribes or other digital tools.
In my view, this is utter madness.
Practices are already under immense strain. They simply do not have the capacity, the expertise, or the governance structure needed to assure complex digital tools — whether that’s an AI triage system in this example, or healthtech tools more widely.
For many GP practices or Primary Care Networks (PCNs), this means needing CSO cover immediately. The reality is that many already have access to at least one AI scribe tool, and some may be using them without the full clinical safety assurances in place.
In secondary care, both the capability and the capacity are there in theory, and most, if not all, NHS Trusts have a CSO or equivalent role in place. Even so, recent research shows that more than 70% of digital health technologies in trusts lack safety assurance, so it isn't all a bed of roses in secondary care either.
Both environments need support or intervention to meet their responsibilities.
A set-up for failure?
To make the shift work in GP practices, we need a robust national governance structure. Without this, primary care is being set up for systemic failure, and potentially serious patient safety incidents.
Expecting individual practices to shoulder full clinical safety responsibility for increasingly complex AI and other digital health systems is unrealistic. Many do not have access to trained clinical safety professionals, established risk management processes, or the time and headspace required to implement DCB0160 properly. This isn’t a capability gap that can be solved with goodwill or guidance notes.
We risk creating a postcode lottery of safety assurance. Some practices will be fully supported with this process, some will do their best to comply. Others will carry on using tools without the necessary oversight, not through negligence, but because the system has pushed responsibility downward without providing the infrastructure to support it.
What we need at national level
Three things need to happen quickly:
- Clarity: NHS England must clearly state where clinical safety accountability sits for primary care digital tools, and what “good” looks like in practice. Ambiguity is itself a safety risk.
- Capacity: Shared, properly funded clinical safety capability must be made available to primary care, as well as ongoing support and maintenance through mentorship and peer support. Training alone is not enough.
- Proportionate governance: Risk management approaches must reflect the realities of general practice, without diluting safety standards or creating bureaucracy.
What GP practices can do now
While national policy catches up, GP practices still need practical, defensible ways to meet clinical safety obligations now.
The starting point is understanding what digital and AI tools are already in use — formally and informally — where clinical safety cases exist, and where there are gaps.
From there, practices need realistic arrangements in place to meet the expectation of having Clinical Safety Officer (CSO) oversight.
For many practices, this will mean working with experienced clinical safety specialists — such as DPM Digital Health Consultancy — to provide immediate support while longer-term models are established:
- Independent clinical safety leadership that supports risk assessment, safety case development, and decision-making
- DCB0160 implementation, tailored to the realities of general practice
- Hands-on review of AI tools, including ambient scribes, covering workflow impact, data risk, and clinical output risk
- Clear accountability models that work at practice, PCN, or system level
In the short to medium term, this will provide some security and assurance.
In the long term, if NHS England expects clinical safety responsibility to sit closer to the point of care, then it must also provide shared models that work at scale — whether that’s CSOs operating at PCN, federation, or ICB-hosted primary care hub level, with clear lines of accountability and indemnity.
2026 must be the year we grow up about clinical safety
AI has real potential to support clinicians and improve care — but only if it is implemented safely and with the right governance in place.
Right now, we are asking primary care to absorb clinical safety responsibilities that the system has not fully designed or resourced. That is not innovation. It is risk transfer.
If 2025 was the year AI spread across the NHS, then 2026 must be the year we grow up about clinical safety — especially in primary care.
Where to get help
DPM provides practical clinical safety, regulation, and governance training and support to help GP practices and other healthcare organisations achieve clinical safety compliance.
Everything we deliver is grounded in real NHS and frontline experience, built on proportionate, defensible governance, and focused on practical implementation that teams can sustain.
Clinical Safety Officer (CSO) Practitioner 2-Day Course
Join DPM on one of eight spring dates.
Led by Dean Mawson, Clinical Director and Founder of DPM Digital Health, this course equips teams and individuals with practical knowledge and tools to manage clinical risk effectively in healthcare projects and digital health delivery.
AI safety training
Turn the potential of AI in health and care into safely governed capability.
Healthcare-specific AI governance and risk management, grounded in patient safety.
Best for all healthcare organisations and digital health developers using or exploring AI.
CSO-as-a-service and CSO mentoring
Access senior, independent Clinical Safety Officers without the cost or delay of permanent recruitment.
If you already have a CSO, DPM provides the UK’s only structured CSO mentoring programme, designed to support CSOs navigating complex, high-pressure roles.

