Patient safety: the CAREFUL view

We believe that patient safety is of paramount importance in healthcare.

For many people – patients in particular – that might seem an unnecessary statement.

A platitude, perhaps.

We need to state this boldly because, despite many decades of work by dedicated staff and leaders, avoidable harm is caused to patients all over the world, in all settings, at a level that should not be tolerated.

Photo by JC Gellidon on Unsplash

Avoidable patient harm is the unacceptable face of modern healthcare

At least 7–10% of all hospital admissions result in an adverse event, according to the WHO. ‘Never events’, despite their name, occur regularly.

In the USA, medical error is considered by many to be a leading cause of death. Fully 15% of all hospital expenditure worldwide is wasted on such errors.

In comparison to other high-risk industries, such as transport and energy, healthcare lags far behind. There are many reasons for this: medicine is partly science and partly art; there is much uncertainty in any diagnosis; expertise relies on gestalt and takes years to acquire.

Most importantly, healthcare until relatively recently suffered from gross under-investment in technology. Before 2011, healthcare was last-but-one in the league table of IT spending as a proportion of revenue. Only mining did worse.

It is well proven that there are many ways to improve the odds. Some hospitals and healthcare facilities are clearly safer than others. Technology plays an important part in this – but it’s not the only answer.

Patient safety culture

The most important factor in any safe environment – whether in an aeroplane, an oil refinery, or a hospital – is the attitude of staff. Patient safety culture strongly correlates with the degrees and likelihood of harm. And patient safety culture is not wholly ‘soft’ or aspirational. It can be measured and benchmarked by industry standard tools, the most widely used being those produced by the AHRQ.

For our part, here at CAREFUL, we believe that all institutions should adopt a programme of patient safety culture improvement. We support the STEP-up programme, devised and implemented by our Founder, Dr DJ Hamblin-Brown, a write-up of which can be found here.

Improving a patient safety culture using a programme such as STEP-up is a prerequisite to the successful use of technology. So, before we talk about technology, let us examine a definition of patient safety culture and some of the aspects of the STEP-up programme.

What is a patient safety culture?

A culture of safety comprises four elements. This definition is based on the work of James Reason in his well-recognised book Managing the Risks of Organizational Accidents.

  1. A culture of reporting – recognising and identifying both actual incidents and opportunities for harm or near misses
  2. A culture of openness – talking about patient safety issues without fear of recrimination or blame and to make this part of the normal conversation within the organisation
  3. A culture of justice – ensuring that errors are seen and treated as products of the environment, context, and system, and not about individual culpability
  4. A culture of improvement – creating the desire, capacity and capability to change the organisation’s systems and context to make error less likely in the future.

Patient safety specialists: a worthy investment

STEP-up – and almost any other successful change programme – requires some form of internal champion. In STEP-up we refer to these as STEP-up Champions. Their role is to bring the idea of patient safety culture to the level of the ward and department and to encourage the individual to change their behaviour.

These champions are themselves specialists in patient safety. They understand the risks and how to approach them.

Many institutions feel that dedicating the time of staff – often senior staff – is not good value for money. Yet the cost of poor patient safety – which the WHO estimates at 15% of all hospital healthcare spending – is so large that any institution should find investing in such champions financially worthwhile.

Monitoring patient safety incidents

It may seem self-evident that counting the number of patient safety incidents is important (“if you can’t measure it you can’t manage it”, being a well-known adage). However, many healthcare institutions do not measure the real number of such incidents. The reason is straightforward: it is usually only when a patient is actually harmed that something is recorded. And yet a patient safety incident is more likely to be a near-miss. Such near misses are often simply ignored, or seen as business as usual.

Training staff to see such ‘no-harm’ incidents as worth reporting – and worth acting on – is hard. Unlike in other high-risk industries, reporting errors is often seen negatively.

Human factors in avoidable harm

One of the most insidious aspects of patient harm and patient safety events – which are usually errors of some sort – is that they are multifactorial. Put another way, they are caused by many things.

There is, contrary to popular myth, no such thing as a single ‘root cause’ – and the purpose of a root cause analysis (RCA) is not to find the ‘smoking gun’, the single event that caused the error. Adverse events and incidents involving harm are caused by several sources of error ‘lining up’, creating an opportunity for those errors to reach the patient.

This, now famous, ‘Swiss cheese’ idea, promoted by Professor James Reason, identifies these opportunities for error as ‘holes’ in systemic barriers. Many of these barriers are human in nature: whether someone is tired; whether there are enough staff on duty; whether they have been properly trained; whether the team is cohesive and works together well; whether someone in a position of authority is overly assertive or aggressive, such that people won’t speak up; whether everyone speaks the same language…the list of such human barriers is dauntingly long.

Training staff to understand such human factors is essential to bringing the causes (plural) of avoidable harm into the light. For institutions that lack a cohesive process to ‘catch’ avoidable harm, tools like the National Early Warning Score (NEWS) are a good basic starting point, but they are not a long-term solution for more ingrained problems of poor communication and handover.
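To illustrate why such scores make a useful starting point: NEWS is simply a sum of banded vital-sign scores, so it can be computed consistently by any member of staff (or any system) from a routine set of observations. The sketch below, in Python for illustration only, follows our understanding of the published NEWS2 bandings on the standard (scale 1) oxygen-saturation chart; any real implementation must use the official Royal College of Physicians chart and be clinically validated.

```python
def news2_score(resp_rate, spo2, on_oxygen, systolic_bp, pulse, alert, temp_c):
    """Simplified NEWS2 aggregate score (scale 1 SpO2 banding) - illustrative only."""
    score = 0
    # Respiratory rate (breaths/min)
    if resp_rate <= 8: score += 3
    elif resp_rate <= 11: score += 1
    elif resp_rate <= 20: score += 0
    elif resp_rate <= 24: score += 2
    else: score += 3
    # Oxygen saturation (%), scale 1
    if spo2 <= 91: score += 3
    elif spo2 <= 93: score += 2
    elif spo2 <= 95: score += 1
    # Supplemental oxygen in use
    if on_oxygen: score += 2
    # Systolic blood pressure (mmHg)
    if systolic_bp <= 90: score += 3
    elif systolic_bp <= 100: score += 2
    elif systolic_bp <= 110: score += 1
    elif systolic_bp >= 220: score += 3
    # Pulse (beats/min)
    if pulse <= 40: score += 3
    elif pulse <= 50: score += 1
    elif pulse <= 90: score += 0
    elif pulse <= 110: score += 1
    elif pulse <= 130: score += 2
    else: score += 3
    # Consciousness (ACVPU: anything other than Alert scores 3)
    if not alert: score += 3
    # Temperature (degrees C)
    if temp_c <= 35.0: score += 3
    elif temp_c <= 36.0: score += 1
    elif temp_c <= 38.0: score += 0
    elif temp_c <= 39.0: score += 1
    else: score += 2
    return score
```

A patient with entirely normal observations scores zero; rising scores trigger escalating responses on the official chart.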

The avoidance of blame

We will add here one small note. Given the complexity of the human factors and the systemic, multi-faceted nature of patient safety incidents, the avoidance of direct blame – the holding of individuals to account for larger failures – is imperative.

This has been called a ‘culture of justice’. Without it, there is no chance that reporting will happen. Issues will be hidden and more harm will come to patients. Note that ‘blame’ does not equate to bullying. A senior doctor, for instance, may ‘shoulder responsibility’ for an error and apologise to the patient or family.

But that implies the doctor was solely the cause, and therefore that nothing can be done to prevent it happening again. Even when blame is good-natured, it is insidious. In the absence of malfeasance, it is the system that needs examination and correction, not the individual.

How to improve patient safety continuously and deliver effective and sustainable change

Which brings us to the main reason for addressing patient safety culture in the first place: to improve performance in safety and delivery.

The STEP-up programme is primarily intended to reach this common goal: to encourage and allow staff to make changes to working practices, in order that patients’ lives and health are better secured. Which is why reporting near misses is so important. They are free lessons.

But improvement is hard because changing working practices is hard. It takes long periods of communication, engagement, planning, testing, and documenting – even before such changes are monitored, audited and measured.

To add complexity, some changes will unintentionally make matters worse. Technology is a particular culprit: once data is digitised, it can be hard to find what you need quickly enough to make a decision.

The important thing therefore is to be able to engage staff in this to the extent that they want to change working practices. Developing skills in change management is a topic in itself – suffice to say that continuous improvement in patient safety requires a dedicated effort.

Safety measures, patient outcomes and patient experience

Before we move on to discuss digital technologies, we might pause a moment to talk about how patient safety and patient outcomes are related. Outcomes are, in the end, what the patient wants; they want to be made better if they are ill and to remain well for as long as possible. Outcome measures vary according to the type and level of disease as well as patient factors: age, co-morbidities, expectations.

What is clear is that there must be a causal relationship between poor safety and poor outcomes. If nothing else, for the small number of patients who are severely harmed, the outcomes can be terrible. We know of one patient who, after robotic surgery, was left partially paralysed after developing bilateral iliac clots. This is not what they expected.

What is less widely appreciated, however, is that patient experience also has a direct bearing on patient safety. Safety and outcomes can be compromised if staff don’t listen to patients effectively, or if patients are frightened, rushed or separated from those who may better account for their illness. Healing is still as much an art as a science, and a good patient experience has repeatedly been shown to improve outcomes.

If patients feel disempowered, evidence shows they will fail to speak up when they know they are being mistreated. It is imperative therefore to regard the patient experience – and the experience of their family and loved-ones – as being an integral part of improving patient safety and therefore patient outcomes.

Reducing patient harm with new digital technologies

Not all digital technology is an improvement

Let us now turn to the subject of digital technologies as an adjunct to improving patient safety in NHS culture. It should be apparent from the preceding discussion that we believe that digital technologies play a secondary part in the development of safer healthcare. To slightly modify a well-known phrase: “culture eats technology for breakfast”.

We would also counsel against the default assumption that digital technologies inevitably make things better. The fact is that they introduce their own risks as well as changing, sometimes reducing, the availability of data.

One example is the Intensive Care Unit, where the progress of a patient has for years been captured on large (A0) sheets of paper at the end of the bed. All relevant information for a single day – medication, ventilation, blood results, fluid intake and output, vital signs, level of consciousness, even family visits – is dutifully recorded on one huge sheet over a metre wide. A stack of these, one for each 24-hour period, documents the patient day by day at the end of the bed.

As a result, the critical care patient’s entire status and history can be seen, quite literally, at a glance. With a quick flick of the wrist, the same information for the previous day is available in full. The system even survives a power outage.

Even with large LCD screens, nothing comes close to providing this level of detail in this concise, easily summarised form. The capacity for such forms to create a gestalt for an experienced clinician is unparalleled. 

Clearly, there is more manual work involved (although arguably that may help ICU nurses and doctors attend to important data), and auditing it is a manual task no one enjoys. It is arguable, nonetheless, that ICU patients are better monitored on paper.

Three types of error in patient safety

There are three types of error that lead to patient harm: slips, lapses and mistakes.

‘Slips’ are when an action (or object) is replaced by another, incorrect one, or actions are performed in the wrong order.

‘Lapses’ are when an action (or object) is missed or forgotten.

‘Mistakes’ are when incorrect procedures, or a lack of – or wrong – information, lead to an incorrect decision.

Photo by Volodymyr Hryshchenko on Unsplash

Medication errors and safety

Let’s look at these in the context of medication errors – one of the most common causes of both error and harm.

How can digital technologies reduce medication errors?

Reducing slips

This means preventing Drug B from being administered instead of Drug A. Currently, systems exist, although they are not in widespread use, that link the patient with the individual drug package dispensed.

This is called ‘closed loop’ dispensing. Barcodes on an individualised drug package match barcodes on the patient – and these are both linked to the EMR/medicines management system. 

Drug administration can only take place if both match. This pushes error reduction further back into the dispensary, where the individual packages are created. It removes, almost completely, the opportunity for a slip on a busy ward where there are many distractions.
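The matching logic at the bedside is deliberately simple: the scanned package must correspond to an active order, and that order must belong to the scanned patient. A minimal sketch in Python follows; the data shapes and field names are illustrative assumptions, not any particular vendor’s API.

```python
def safe_to_administer(patient_wristband_id, package, emr_orders):
    """Closed-loop check: allow administration only when the scanned
    package matches an active EMR order for the scanned patient."""
    order = emr_orders.get(package["order_id"])
    if order is None:
        return False, "No matching order in the EMR"
    if order["patient_id"] != patient_wristband_id:
        return False, "Package was dispensed for a different patient"
    if order["drug"] != package["drug"]:
        return False, "Package contents do not match the order"
    return True, "Match confirmed: administration may proceed"
```

Because every mismatch blocks administration with an explicit reason, a slip at the bedside becomes a recorded near-miss rather than a silent error.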

Reducing lapses

The failure to administer a drug on time (missed dose) is the primary lapse in medicines management. Digital technologies can address this by issuing reminders to the drug administrator (often a nurse, but also the patient or family member), which act as alarms and which collect data to ensure that all doses are correctly administered.
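In essence, such a reminder system just compares the medication schedule against the record of administrations. The sketch below shows the core check in Python; the 30-minute grace period and the data shapes are illustrative assumptions.

```python
from datetime import datetime, timedelta

GRACE = timedelta(minutes=30)  # assumed tolerance before a dose counts as late

def overdue_doses(schedule, administered_ids, now):
    """Return scheduled doses whose due time (plus a grace period) has
    passed without a recorded administration."""
    return [dose for dose in schedule
            if dose["due"] + GRACE < now
            and dose["dose_id"] not in administered_ids]
```

Run periodically, the overdue list drives alarms to the nurse (or patient), and over time it yields the missed-dose data needed for audit.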

Reducing mistakes

This is a more difficult area. A medicines mistake (as a specific form of error) would, for instance, involve prescribing the wrong antibiotic against the recommendations of prevailing microbiology advice – say, amoxicillin administered rather than nitrofurantoin in an asymptomatic UTI.

Mistakes are much more difficult to catch, but digital technologies can clearly provide additional decision support. Recently such systems have been augmented with AI (although we recognise that this is controversial). Other systems can, if linked for instance to a list of allergies, provide further safeguards against accidental misprescribing. 
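An allergy safeguard of this kind can be sketched very simply: the proposed drug is checked against the patient’s recorded allergies, both directly and via known cross-sensitivities. In the Python sketch below, the cross-sensitivity table is a tiny illustrative sample, not clinical guidance; a real system would draw on a maintained drug-interaction knowledge base.

```python
# Hypothetical cross-sensitivity table: a recorded allergy (key) also
# contraindicates the listed related drugs. Illustrative sample only.
CROSS_SENSITIVE = {
    "penicillin": {"amoxicillin", "ampicillin", "flucloxacillin"},
}

def prescribing_alerts(proposed_drug, allergies):
    """Return warnings if the proposed drug matches a recorded allergy,
    directly or via a known cross-sensitivity."""
    drug = proposed_drug.lower()
    alerts = []
    for allergy in (a.lower() for a in allergies):
        if drug == allergy:
            alerts.append(f"Direct allergy match: {allergy}")
        elif drug in CROSS_SENSITIVE.get(allergy, set()):
            alerts.append(f"Cross-sensitivity with recorded {allergy} allergy")
    return alerts
```

The value of such a check is that it fires at prescribing time, before the mistake can reach the dispensary, let alone the patient.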

Choosing patient safety technology partners

From the previous section, it is clear that digital technologies can create risks, errors and harm as well as mitigate and improve patient safety. The question is how users, providers, payors and patients can ensure that this balance is well understood and that measures are taken to ensure the balance is clearly in favour of the patient.

Our answer is twofold. The first – as already stated – is to ensure that there is a clear and well-supported patient safety culture. The second is to choose digital-technology partners who understand and promote patient safety as part of their offering.

Understanding the approach that such digital partners take to patient safety, and working with those partners actively to seek out and address potential risks, is arguably more important than understanding and implementing the technology itself.

What does a patient safety strategy look like?

For most healthcare organisations, patient safety is given priority in presentations and in annual reports; improvements are trumpeted, and poor performance is hidden. Despite this, patient safety is often not considered a focus of strategic thinking.

In our view, this is a mistake. To continuously improve patient safety is of strategic importance for all healthcare organisations. Every organisation should have a strategy that sets out its context, objectives and – most importantly – clear measurable goals that support patient safety.

As an example, it is worth considering that the UK NHS has taken this seriously, producing clear ambitions from the very top.

The current NHS patient safety strategy – a future view

Originally set out in 2019, the post-Covid 2021 update provides a good template for ‘who, what, when and how’ with a clear context. What it lacks is measurable targets for NHS improvement.

Health Education England and patient safety improvements

HEE has set out a complete syllabus for NHS staff that will ensure that everyone has a clear understanding of why patient safety is important – and how staff can be educated.

World Patient Safety Day

We will finish with one thought. World Patient Safety Day is on 17 September each year. For every organisation in every part of the world, this should be a time to celebrate the fact that patient safety issues are now reaching the top of the healthcare agenda.

It is also a day to recommit to reducing avoidable harm and drive improvements in outcomes across all care settings.

For too long, patient harm has been ignored, misinterpreted, misunderstood or misreported. Now is the time for all organisations – of any size – to think clearly about how to benefit all patients, the world over.