When it comes to implementing ‘new’ technology in the NHS, the process is frequently protracted and beset with delays.
This is most clearly demonstrated by the NHS’s plan to go “paperless”, replacing paper notes with a digital health record that would follow patients from one healthcare setting to the next across the health and social care system. The plan was first announced in 2013, with an implementation target of 2018. This was pushed back to 2020 before being delayed once again, owing to the COVID-19 pandemic, to March 2025.
Whether this target, which could finally enable pharmacists to access more patient information, is met next month remains to be seen. In the meantime, another, newer technology also has the power to transform healthcare, potentially even more rapidly and without the need to wait for government plans and deadlines: artificial intelligence (AI).
To the average user, AI in the form of ‘large language models’ (LLMs), such as ChatGPT, Gemini or Copilot, presents an easily accessible, and often free, panacea for the time scarcity we all face. This makes it particularly attractive to those working in the NHS, and more specifically in pharmacy, where time pressure is an acute issue. A survey of 1,243 pharmacists, carried out by The Pharmaceutical Journal in April 2024, found that 61% of respondents felt ‘moderately stressed’ or ‘very stressed’ in their job, with high demand for services and a lack of staff cited as the most common reasons.
AI is already being used to alleviate some of that burden. Darren Powell, chair of the Royal Pharmaceutical Society (RPS) Digital Pharmacy Expert Advisory Group, wrote in a blog for the RPS that AI is supporting tasks including “prescription accuracy checking, clinical decision support and appointment scheduling” to help pharmacists “spend less time on repetitive processes and more on direct patient care”.
On the surface, the time-saving benefits of AI are clear and easy to grasp. Using AI tools seems like low-hanging fruit for a very busy profession.
Less obvious, and harder for some to grasp, are the risks associated with AI tools.
Any AI model, of which LLMs are just one type, can return a ‘hallucination’ in its answer: output that looks plausible in the context in which it is presented but actually contains incorrect information. Hallucinations stem from the way AI models analyse data, producing output that mimics the patterns of their source dataset. An AI model does not check its own work, meaning it is still up to the user to verify that the information is correct. This presents a possible patient safety risk if AI is used to access clinical information, and it may negate any time-saving advantages.
There are some checks and balances when it comes to regulating the routine use of AI systems in a clinical setting. Before a company can put any new technology into practice, the device must be registered with the Medicines and Healthcare products Regulatory Agency and meet medical device regulations. These require the company to show, using clinical evidence, that the benefits of the device outweigh its risks and that it achieves the level of performance it claims.
However, the nature of AI models and the hallucinations they produce mean that some level of error is still inevitable, regardless of the software’s use case. The question, then, is how pharmacists should approach these errors.
Doctors, for example, have been told by their regulator, the General Medical Council (GMC), in the latest edition of ‘Good medical practice’, that they “must report adverse incidents involving medical devices (including software, diagnostic tests and digital tools) that put the safety of a patient or another person at risk, or have the potential to do so”. The GMC clarifies that this includes AI tools.
There is no similar guidance for pharmacists. This is where the General Pharmaceutical Council must step in with prompt and agile regulation of the pharmacy profession’s use of AI tools, establishing clear guidance to protect patient safety and professional accountability. Such guidance will help pharmacists understand their professional responsibilities when using AI, while proactive regulation will ensure that AI enhances the role of pharmacists in patient care rather than creating additional liabilities.
Any delays could see the regulator on the back foot, as it is with online pharmacy regulation, reacting to problems that have already happened rather than preventing them in the first place. PJ