The Medicines and Healthcare products Regulatory Agency (MHRA) said doctors should report all suspected inaccuracies caused by AI to the regulator’s Yellow Card scheme, which is used to monitor adverse incidents and safety concerns involving medicines and medical devices.
As with all AI-enabled tools, there is a “risk of hallucination”, where the tool produces information that is factually wrong, misleading, or entirely made up, but presents it in a way that sounds confident and plausible. It is called a hallucination because, like a human experiencing one, the AI is perceiving (or in this case, generating) something that does not match reality. The AI has no awareness that it has made something up; it simply works from its training.
Manufacturers should actively seek to minimise and mitigate the potential harm caused by hallucinations. The MHRA therefore recommends that GPs and healthcare professionals only use tools that are registered medical devices and meet the required standards of performance and safety.
MHRA guidance defines an ‘adverse incident’ involving software as ‘an event that caused (or almost caused) an injury to someone or affected the treatment or diagnosis one could receive’.