Hospital uses AI platform to achieve ‘extreme transparency’
‘Extreme transparency’ is helping clinical teams at Karolinska University Hospital to deliver some of the best outcomes in the world. Gunnar Sandersjöö, head of the largest trauma centre in Sweden, explains the mechanisms behind a relentless pursuit of quality.
Karolinska University Hospital is one of the most transparent hospitals in the world. High survival rates and very low complication levels, revealed in publicly available outcomes data, go some way to explaining its openness.
Supported by a technology platform called CRAB, the hospital can demonstrate precisely where lives have been saved and harm avoided for some of its most unwell and injured patients. By using the platform to measure the expected outcomes for its patients alongside the outcomes actually observed, Karolinska has been able to show survival rates well above, and complication rates well below, those predicted for its orthopaedic, multi-trauma and emergency surgeries. These results reinforce Karolinska’s media ranking as one of the best hospitals globally.
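The comparison of observed against expected outcomes can be illustrated with a simple observed-to-expected (O/E) ratio. The sketch below is a minimal illustration of that idea only; the patient risk figures are hypothetical, and it does not represent CRAB’s actual risk models.

```python
# Minimal sketch of an observed-to-expected (O/E) outcome comparison.
# The risk probabilities here are hypothetical; a real system such as
# CRAB derives per-patient risk from validated clinical models.

patients = [
    # (predicted probability of a complication, complication observed?)
    (0.30, False),
    (0.55, True),
    (0.10, False),
    (0.72, True),
    (0.40, False),
]

expected = sum(p for p, _ in patients)             # expected complication count
observed = sum(1 for _, seen in patients if seen)  # observed complication count

oe_ratio = observed / expected
print(f"Expected: {expected:.2f}, Observed: {observed}, O/E ratio: {oe_ratio:.2f}")
# An O/E ratio below 1.0 suggests fewer complications than the case mix
# predicts; a ratio above 1.0 flags a potential systemic problem to review.
```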
Now we report our problems, and as many as possible on a systemic level, to see the reasons for something going wrong. We are using that as a way of improving healthcare rather than punishing the one person
Gunnar Sandersjöö
But the approach of “extreme transparency” is not just an exercise in proving to the world the excellence of the care being delivered, says Gunnar Sandersjöö, head of the hospital’s trauma centre, the largest centre in Sweden for severely injured adults and children. “This has become more than just a bragging tool,” he says. “We are trying to use this as a way of finding where we are bad. We want to find all the things that really are sub-optimal, and see if we can do something about it.”
A drive to move away from a culture of individual blame created the conditions for Karolinska University Hospital to become one of the first hospitals anywhere to use AI to continuously assess the quality of its care.
One senior surgeon in the department, a former Swedish Air Force pilot and colleague of Sandersjöö, had grown accustomed during his military career to a culture of reporting problems before they escalated. So, back in the 2000s, the hospital became an early adopter of CRAB, an AI-supported analysis tool, to highlight areas of concern early. “Comparing the healthcare system and the air force, we realised we had to do something differently,” says Sandersjöö. “This actually has changed the Swedish system. Now we report our problems, and as many as possible on a systemic level, to see the reasons for something going wrong. We are using that as a way of improving healthcare rather than punishing the one person.”
Rather than individual doctors assessing performance in isolation, a collaborative culture of learning has emerged. Sandersjöö explains: “Every section has doctors who are responsible for taking care of complication rates, which are registered in CRAB. They are shadowing CRAB reports as part of their job, trying to find ways of improving the healthcare we're giving.”
The system might show instances of, or rises in, pressure ulcers or pulmonary embolism, for example. Doctors can then drill into the system to find the individual patients behind the numbers, and assess their treatment and care journeys to determine whether sub-optimal outcomes might be down to human error, or whether something more systemic might need to change. “It all comes down to identifying each patient behind every complication, and then going through charts and records to see if anything stands out,” he says.
The system informs this process by showing clinical teams the expected complications and severity of illness patients face. “If someone is severely ill and you know that the expectancy rate of survival is very low, clinicians know to expect complications. If someone has heart failure, they will continue to have heart failure. But if something pops up that shouldn’t be there, then we can identify if we are causing a complication, and when to act.

“We have a very low rate of so-called easy patients, and CRAB is helping us to see if outcomes are within reason or not, and finding complications that shouldn’t be there. This is extreme transparency,” he adds. “Transparency used in a good way is improving healthcare and allowing us to learn from each other.”
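In code, that screening logic might resemble the hypothetical sketch below, which flags complications that were unlikely given a patient’s predicted risk so clinicians can review whether care caused them. The threshold, records and field names are illustrative assumptions, not CRAB’s actual logic.

```python
# Hypothetical per-patient screening: surface complications that were
# unlikely given the patient's predicted risk, so clinicians can review
# whether care caused them. Threshold and records are illustrative only.

records = [
    {"patient": "A", "complication": "pulmonary embolism", "predicted_risk": 0.65},
    {"patient": "B", "complication": "pressure ulcer", "predicted_risk": 0.08},
    {"patient": "C", "complication": "heart failure", "predicted_risk": 0.80},
]

SURPRISE_THRESHOLD = 0.20  # below this predicted risk, a complication is "unexpected"

for rec in records:
    if rec["predicted_risk"] < SURPRISE_THRESHOLD:
        print(f"Review patient {rec['patient']}: "
              f"{rec['complication']} was unexpected "
              f"(predicted risk {rec['predicted_risk']:.0%})")
```

A high-risk patient who develops a complication is, in this framing, within expectation; it is the low-risk patient’s complication that warrants a chart review.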
‘Patients have needs other than surviving’
Karolinska was one of the world’s earliest adopters of CRAB, a system developed by Cambridge-based health tech provider C2-Ai to help hospitals find hidden risks and identify performance issues. More than 15 years later, surgical teams are still planning new ways to use the system.
In lots of parts of the world, really bad results mean being shut down. What’s needed is the permission to say that you need help, and a system that listens to help you overcome the problems you are having
Gunnar Sandersjöö
The ability to understand what looks normal for specific patients is expected to help Karolinska’s trauma centre better serve and understand the needs of a growing number of geriatric patients, for example. Discussions are also beginning on how the system might be used to measure the impact of sustainability initiatives.
A closer examination of patient outcomes beyond the immediate aftermath of surgery is also a key priority. “Trauma survivors might have other needs than surviving,” says Sandersjöö. “But we have little idea about the wellbeing of a patient a year after trauma. We don’t know how many are working, or if they’re married, or if they are in pain. We want to go beyond understanding survival and complications in measuring our outcomes and how we are really performing.”
With few organisations showing their outcomes to the world in such detail, but with so much to be gained, Sandersjöö says wider adoption of such an approach requires a society that accepts when help is needed. “It is very easy to be transparent when you have good numbers,” he says. “In lots of parts of the world, really bad results mean being shut down. What’s needed is the permission to say that you need help, and a system that listens to help you overcome the problems you are having.”
Source: C2-Ai
19.11.2024