SLOPPY handwriting has been a complaint levelled at doctors through the ages, whether fairly or not, and perhaps less so now with the advent of digital record keeping.
“But to a computer, their typing isn’t much better,” says Dr James Teo, a consultant neurologist at King’s College Hospital NHS Foundation Trust and King’s Health Partners.
By this he means the way clinicians tend to add information to electronic health records as free-text narrative – for example: “Abdo cramps, vomiting, dysuria and urinary frequency. Off food. Examination: apyrexial, abdo soft, mild generalised tenderness. Diagnosis: likely gastroenteritis. Appendicitis? Seek urgent advice if more unwell.”
Such data is important in the treatment of individual patients but can also feed into research to improve healthcare in general. However, unstructured narrative text is very difficult for a computer to analyse.
Individual health records must be reviewed by clinical coders, who assign standardised codes based on particular words, conditions or treatments. A staggering volume of health information is recorded in the NHS each day, which makes clinical coding time-consuming, expensive and prone to error.
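To give a flavour of what that coding step involves, here is a minimal sketch in Python of a crude keyword-to-code lookup applied to the narrative above. It is purely illustrative: it is not CogStack, and the ICD-10 codes shown are approximate examples rather than coding guidance.

```python
# Illustrative only: a toy keyword-to-code lookup, not CogStack and not how a
# trained clinical coder works. The ICD-10 codes are approximate examples.
TOY_CODEBOOK = {
    "abdo cramps": "R10.4",      # abdominal pain, other/unspecified
    "vomiting": "R11",           # nausea and vomiting
    "dysuria": "R30.0",          # dysuria
    "gastroenteritis": "K52.9",  # non-infective gastroenteritis, unspecified
}

def toy_code(narrative: str) -> list[tuple[str, str]]:
    """Return (phrase, code) pairs for each codebook phrase found in the note."""
    text = narrative.lower()
    return [(phrase, code) for phrase, code in TOY_CODEBOOK.items() if phrase in text]

note = ("Abdo cramps, vomiting, dysuria and urinary frequency. Off food. "
        "Examination: apyrexial, abdo soft, mild generalised tenderness. "
        "Diagnosis: likely gastroenteritis. Appendicitis? Seek urgent advice if more unwell.")

print(toy_code(note))
# [('abdo cramps', 'R10.4'), ('vomiting', 'R11'), ('dysuria', 'R30.0'), ('gastroenteritis', 'K52.9')]
```

Even at this toy scale the difficulty is clear: a simple lookup cannot tell that “Appendicitis?” is a question rather than a diagnosis, or separate “likely gastroenteritis” from “no evidence of gastroenteritis”.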
A study published in the Bulletin of the Royal College of Surgeons of England states that coding errors for primary discharge diagnosis are believed to range from 8.8 to 31 per cent, with similar error rates reported for primary procedure coding.
Such errors have a range of impacts, from underestimating the need for some clinical services to invalid research conclusions.
Teaching computers to “read”
In his other role as joint director for Data Science and Artificial Intelligence (AI) at King's College Hospital and Guy's & St Thomas' Hospital NHS Foundation Trust, Dr Teo has a particular interest in how clinical data is managed. Here he has led the development and deployment of a clinical analytics system known as CogStack.
CogStack is an “information retrieval, extraction and natural language processing platform” that has been developed by researchers at the NIHR Maudsley BRC and King’s College Hospital in partnership with the University College London Hospitals NHS Foundation Trust BRC.
In plain terms, it is an AI system designed to “read” electronic health records, summarise the data and suggest standardised codes for clinical coders to verify.
“When you visit your doctor or attend hospital, a lot of information is collected about you on computers including your symptoms, tests, investigations, diagnosis, and treatments,” explains Professor Richard Dobson, group lead for CogStack and head of the Department of Biostatistics & Health Informatics at NIHR Maudsley BRC.
“Across the NHS this represents a huge amount of information that could be used to help us learn how to tailor treatments more accurately for individual patients and to offer them better and safer healthcare.
“The challenge we face is that most of the information held within these records is in written form which is difficult to use and learn from. We have developed the CogStack AI tools to read and understand this information and as part of this project will scale this capability across the NHS.”
CogStack was one of 38 projects awarded development funding from NHSX and the Accelerated Access Collaborative (AAC) in the latest round of the Artificial Intelligence in Health and Care Awards in June 2021.
The system has been tested at four London NHS Foundation Trusts and is being adopted by trusts in Manchester and East Anglia, as well as internationally. It has also been awarded £16m in funding from the Office for Life Sciences (OLS) for dissemination to 11 further trusts.
The system has already been used to process over 12 million free text documents and over 250 million diagnostic results and reports at King’s College Hospital alone.
Complex algorithms
CogStack works by combining natural language processing (NLP) with analytics and visualisation technologies. NLP is already used to interpret written language in other industries, but the technology must be tailored for healthcare to contend with medical terms, jargon, acronyms and local dialects.
“NLP algorithms translate the language used in free text into a standardised, structured set of medical terms that can be analysed by a computer,” Professor Dobson and Dr Teo explain in an article in The Conversation.
“These algorithms are extremely complex. They need to understand context, long strings of words and medical concepts, distinguish current events from historical ones, identify family relationships and more.
“We teach them to do this by feeding them existing written information so they can learn the structure and meaning of language – in this case, publicly available English text from the internet – and then use real medical records for further improvement and testing.”
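As a rough illustration of why that context matters, the sketch below (plain Python rules, not CogStack's actual algorithms, which learn these distinctions from data) flags whether a detected concept is negated, historical or refers to a family member:

```python
import re

# Illustrative sketch only: after a concept is spotted in free text, crude
# context rules decide whether it is negated, historical or about a relative.
NEGATION = ("no ", "denies ", "no evidence of ")
HISTORY = ("history of ", "previous ", "past ")
FAMILY = ("mother ", "father ", "family history of ")

def annotate(text: str, concept: str) -> dict:
    """Find a concept and attach context flags based on the words just before it."""
    match = re.search(re.escape(concept), text, flags=re.IGNORECASE)
    if not match:
        return {"concept": concept, "present": False}
    window = text[max(0, match.start() - 30):match.start()].lower()
    return {
        "concept": concept,
        "present": True,
        "negated": any(cue in window for cue in NEGATION),
        "historical": any(cue in window for cue in HISTORY),
        "family": any(cue in window for cue in FAMILY),
    }

print(annotate("No evidence of stroke. Mother had diabetes.", "stroke"))
# {'concept': 'stroke', 'present': True, 'negated': True, 'historical': False, 'family': False}
print(annotate("No evidence of stroke. Mother had diabetes.", "diabetes"))
# {'concept': 'diabetes', 'present': True, 'negated': False, 'historical': False, 'family': True}
```

Even this toy version shows the gap between a condition being mentioned and a condition being present; the difficulty of getting such judgements right at scale is exactly why the algorithms described above need so much training text.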
They believe that using NLP algorithms to analyse and extract data from health records has huge potential in healthcare research, as much of what’s captured in narrative text in a patient’s notes is normally never seen again.
“This could be important information such as the early warning signs of serious diseases like cancer or stroke,” they explain. “Being able to automatically analyse and flag important issues could help deliver better care and avoid delays in diagnosis and treatment.”
Covid application
In 2020 the system was used to investigate whether drugs commonly prescribed to treat high blood pressure, diabetes and other conditions could increase the chances of becoming severely ill with Covid-19. Much of the information needed to answer this question was recorded both as structured prescriptions and in free text in medical records.
NLP tools were used to interrogate the anonymised records of 1,200 Covid-19 patients, comparing clinical outcomes against use of angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs). The analysis found that patients taking ACEIs or ARBs were no more likely to become severely ill than those not taking the drugs.
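At its core this is a cohort comparison: once the NLP tools have identified which anonymised records mention ACEIs or ARBs, the rate of severe illness in exposed and unexposed patients can be compared. A minimal sketch of that calculation, using purely hypothetical counts rather than the study's data (the real analysis would also adjust for age, sex and other confounders):

```python
# Purely hypothetical counts for illustration; not the study's actual figures.
exposed_severe, exposed_total = 40, 400        # patients on ACEIs/ARBs
unexposed_severe, unexposed_total = 82, 800    # patients not on ACEIs/ARBs

risk_exposed = exposed_severe / exposed_total          # 0.100
risk_unexposed = unexposed_severe / unexposed_total    # 0.1025
risk_ratio = risk_exposed / risk_unexposed

print(f"Risk ratio: {risk_ratio:.2f}")  # ~0.98: no meaningful difference in this toy example
```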
CogStack has also been used to investigate the effectiveness of the NEWS2 hospital early warning score system to predict 14-day outcomes for the most seriously ill Covid-19 patients, to improve safety in prescribing methotrexate for rheumatology patients, and to identify outpatient orthopaedic procedures missed by manual coding, leading to a reported £1.25m gain in annual NHS Trust revenue.
Speaking to the Healthtech Alliance, former Secretary of State for Health and Social Care Matt Hancock cited the potential of systems like CogStack.
He said: “It’s a clear example of the latest AI helping us fix the basics, because once you’ve coded up and digitised your patient records, you can start to solve fundamental problems, like how to share those records across different parts of the NHS.
“You’re also freeing up staff time and capacity to do more of what people came into healthcare to do: looking after others and solving problems.”
Jim Killgore is publications editor at MDDUS