IT'S no accident that the title of this article sounds like one of Ian McEwan’s latest novels, Machines Like Me. I read the book recently and it set me thinking about human factors. Interestingly, it seems to imply that rather than trying to create robots which act like humans, we should pay closer attention to our own ‘circuitry’, along with the external factors which influence or affect why we do what we do.
No clinician goes to work with the intention of making a mistake - apart perhaps from the few who end up in the criminal courts, and even they may have been well-intentioned in their misguided actions. Human factors come into play in everything we do: our environment, the people around us, the systems in which we work and the design of the equipment and technology we use. These aspects can help or hinder, sometimes leading to mistakes.
Human factors are now more widely recognised and are starting to be given greater weight by regulators. In a 2016 review, the Care Quality Commission recognised that skilled analysis was needed for serious incident investigations to "move the focus of investigation from the acts and omissions of staff, to identifying the underlying causes of the incident" and "use human factors principles to develop solutions that reduce the risk of the same incident happening again".
Clinical human factors can be characterised as "enhancing clinical performance through an understanding of the effects of teamwork, tasks, equipment, workspace, culture and organisation on human behaviour and abilities, and application of that knowledge in clinical settings" (Professor Ken Catchpole). These factors help us understand not only why things go wrong, but also why they go right.
MINIMISING RISK
Human factors analysis has been utilised for some time in the aviation industry to minimise risk and error, with the implementation of specific training, changes to protocol, standardised terminology and combined crew input during aircraft operations. Similarly, the World Health Organisation's (WHO) Surgical Safety Checklist has been introduced to medicine to minimise adverse incidents. One example is the 'time out' prior to operating, which allows team members to confirm the patient's identity, the procedure and known risks. Clinicians are only human, and systems or checklists such as this can help avoid important steps being missed when the pressure is on.
Consider the high-profile case of Dr Hadiza Bawa-Garba, who was found guilty of manslaughter in her treatment of six-year-old Jack Adcock and later removed from the GMC register (though reinstated on appeal). Campaigners argued on her behalf that the junior doctor was hampered by system and technical failures in the hospital. In response to its handling of the case, the GMC said it would consider the backdrop of such failings when reviewing conduct, and its case examiners, clinical experts and decision makers were to receive human factors training.
A 2018 white paper published by the Chartered Institute of Ergonomics and Human Factors explained that: "Human Factors uses measurements, observations, conversations and understanding about human physical and cognitive capabilities to make practical improvements to tools, software, furniture, workplaces and environments to initiate and support change for processes, techniques, interactions and communications".
In general dental practice the opportunity for error is high. The Swiss cheese model of safety incidents demonstrates the impact of human factors. If we can increase the layers of defence (the solid cheese) and reduce latent conditions such as poor design, incomplete procedures and flawed decision-making (the holes in the cheese), we can reduce the frequency of active failures (patient safety incidents).
Perhaps the most obvious patient safety incident in dentistry is wrong site extraction. During a busy day, time pressures, stress and fatigue can contribute to errors. Teamwork protocols, somewhat similar to the WHO surgical 'time out', can be implemented to check patient details and the tooth to be removed in order to prevent wrong site surgery. An example of this type of protocol is the Local Safety Standards for Invasive Procedures (LocSSIPs) toolkit for wrong site extraction in dentistry, which implements the principles of the National Safety Standards for Invasive Procedures.
AUTHORITY GRADIENTS
Another important area of research in human factors has been the part that ‘authority gradients’ can play in adverse incidents. The term was first defined in aviation where it was observed that pilots and crew may not communicate effectively in stressful situations and where there is a significant difference in experience, perceived expertise or authority.
An Institute of Medicine report, To Err is Human, first explored the concept in the practice of medicine, yet relatively little to date has been published regarding the potential role of authority gradients in clinical errors. In any organisation with different levels of professional stature and seniority, authority gradients can be intrusive – especially when senior staff have influence over job security and progression in those being supervised. This can make it extremely difficult to speak up and challenge the decisions of people in positions of power or authority.
Some organisations recognise these risks and seek to maintain what is known as a ‘shallow authority gradient’, whereby everyone is actively encouraged to contribute opinions/suggestions so that an overall consensus emerges which is then acted upon. This can be a desirable approach for managing more routine, non-critical decision processes where there is the luxury of time. The downside to a shallow authority gradient is that in times of stress or crisis, where leadership and decisiveness are required, critical decisions may not be taken promptly, with adverse consequences resulting from delay.
Conversely, other leaders and managers may opt for a 'steep authority gradient' where they are seen as the decision-makers and expect instructions to be acted on without question or further discussion. This may be desirable in times of crisis, but it does not foster shared responsibility and decision-making, nor does it empower staff to speak up and challenge transparently flawed decisions. Dealing with authority gradients effectively requires situational awareness and flexibility within organisations in order to adjust to prevailing conditions and threats. Openness should be viewed as a positive attribute to minimise errors and poor decision-making – with all team members encouraged to speak up and challenge decisions without fear of recrimination.
IN CONCLUSION
Human factors analysis is concerned with the interplay between the clinician and the other elements of a system. Taking this system-based approach, it is important to consider equipment, buildings, spaces, patients and team members. Human error may initially seem to be the cause of an incident, but a human factors approach looks wider at root causes.
Layers of defence to reduce errors will include education and training, practice policies, healthcare technology, co-ordinated teamwork, communication and checklists. Open dialogue when things do go wrong must be encouraged, with the emphasis on continual learning and the improvement of patient care.
Sarah Harford is a dental adviser at MDDUS
SOURCES
- Care Quality Commission. Learning from serious incidents in NHS acute hospitals. 2016
- Chartered Institute of Ergonomics and Human Factors. Human Factors for Health & Social Care (White Paper). 2018
- Institute of Medicine. To Err is Human: Building a Safer Health System. 2000