If air travel were 99.99% safe, there would be a fatal plane crash every four days. In the U.S., there’s been only one in 19 years. The aviation industry has maintained a mind-bogglingly consistent safety record because of rigorous adherence not only to technical best practices, but to a universal communication standard called crew resource management. Known as crisis resource management, or CRM, in the healthcare industry, it’s been around in various forms since the 1990s, but its application is inconsistent. One doctor is working directly with United Airlines’ National Flight Training Center in Denver to change that.
The origins of crew resource management
Just before midnight on December 29, 1972, Eastern Flight 401 deferred its landing at Miami International Airport and went into a holding pattern over the Everglades. The light indicating deployment of the landing gear hadn’t gone on. Outside, it was pitch dark.
The captain sent the flight engineer to manually check the status of the landing gear. The co-pilot engaged the autopilot and rejoined the captain in trying to fix the lightbulb, but in doing so he made a disastrous error: He bumped the control column, tipping the plane into a descent so slight that no one noticed until three minutes later, when he realized the altimeter read just 250 feet.
Ten seconds later, the plane crashed, killing more than 100 people.
In the wake of that incident, among others, NASA developed a set of standardized communication practices designed to minimize human error. First adopted by United Airlines in 1981, CRM is now standard everywhere from commercial airlines to the military.
CRM: Applicable to medicine, but inconsistently applied
In the healthcare industry, teams and researchers have been deploying CRM concepts in various forms since 1990, but with inconsistent results.
“In my team trainings I found there was a consistent performance gap when it came to non-technical skills,” says Robert Bishop, MD, Director of In-Situ Simulation in Children’s Hospital Colorado’s Heart Institute. CRM offered a model for improvement, but there wasn’t much in the medical literature on how to make it work successfully. For Dr. Bishop, it made sense to look to the source.
“United Airlines’ National Flight Training Center is five miles from the hospital,” he says. “There was a phone number on Google Maps.”
The unacceptable truth of medical error
Medical error is the third leading cause of death in the U.S. It kills some 250,000 people annually — the equivalent of three planes crashing every day of every year with no survivors.
“That’s unacceptable,” Dr. Bishop says.
He sees the dynamics play out in the Simulation Lab. Teams execute their technical skills perfectly, exactly as trained. But when simulations go south, communications break down. Roles are unclear. Teams don’t know who’s in charge, who’s delegating and what to delegate. Confusion sets in. Mistakes get made.
If medical professionals get CRM training at all, it’s generally voluntary, typically a one-time session of varying length. The airline industry, by contrast, mandates CRM training every 6 to 12 months. The frequency is key.
“Practitioners have to take CPR every 2 years,” Dr. Bishop says. “But if you don’t practice it every 3 to 6 months, you’re horrible at it. The data on that is very strong.”
CRM is the same, but the situations that most demand it — high-stress, high-risk, high-acuity — are also low-frequency.
“Sometimes these are seen as one-off events that may happen once in a career,” says Joe Grubenhoff, MD, Associate Medical Director of Clinical Effectiveness at Children’s Colorado. “But the data tells a different story.”
Managing crisis situations through data
“Here’s a phrase you used to hear all the time: ‘Let me tell you how I manage that,’” says Lalit Bajaj, MD, MPH/MSPH. “I don’t hear that too much anymore.”
As Chief Quality and Outcomes Officer, Dr. Bajaj leads Children’s Colorado’s long-running pursuit to put data-derived best practices into clinical action. Those efforts aren’t limited to CRM, but CRM is playing a role.
Tara Neubrand, MD, leads this effort in the pediatric Emergency Department, where more than 500,000 possible team combinations might need to perform resuscitation for cardiac arrest during a given shift. Now, four times a week (twice on day shift, twice on night shift), the designated resuscitation team for that shift runs a 5-minute simulation.
“When we started doing these, the room was pretty chaotic,” says Dr. Neubrand. “Now it’s silent. Everyone knows exactly what they’re doing.”
Better yet, the ED has improved its cardiac resuscitation rate from 36% to more than 90%.
Elsewhere, Dr. Grubenhoff is employing the CRM concept of shared mental model to build timeouts that minimize diagnostic error. Dr. Grubenhoff’s team is beginning to work timeouts into clinical pathways designed to call out critical neurologic conditions that may appear, at first glance, to be simple migraines.
“Providers as a group tend to succumb to similar faults in diagnostic reasoning, leading to similar errors,” he says. “So now we’re working on creating a process to alert clinicians, ‘Hey, you may want to consider this as well.’”
Building crisis systems that correct error
The Google Maps phone number didn’t work. Repeated calls to United’s headquarters didn’t either. Dr. Bishop knew where the Flight Training Center was and considered just going, but it was pretty well gated in. He’d just about given up when a colleague mentioned her husband was a United pilot.
“Within a week I was connected,” he says.
The connection was Rob Strickland, Senior Manager of Human Factors and Pilot Development, and he was enthusiastic enough to arrange an initial visit for Dr. Bishop and his team, which has led to two more since. They’ve sat through the airline’s CRM trainings and flown in their flight simulators. Dr. Bishop and Strickland even co-authored a research paper.
One key learning point Dr. Bishop has taken away is that mistakes are inevitable. The idea of CRM is to build systems capable of absorbing them and restoring smooth operations before one mistake leads to more.
To do that, teams need a shared mental model based on three basic facts: what has happened, what is happening, and what needs to happen next. That model starts with a meticulous look at risk.
Drawing out behaviors that lead to mistakes
Dr. Bishop and his team have started by reviewing the Heart Institute’s cardiac arrest data from the last five years, categorizing cases and looking for patterns of failure. They’re already using those cases to build simulation scenarios that test teams’ ability to cope.
“The idea is to draw out behaviors that cause mistakes, so you can identify them and understand why they’re made,” he says.
Then you can build systems that absorb them. Dr. Bishop’s team is currently considering options that include color-coded badges that correspond to roles and zoned room layouts that designate positions. If you walk into a room with a resuscitation already underway, Dr. Bishop wants you to be able to immediately grasp the situation and your role.
Eventually, the plan is to develop a standardized CRM training model that extends beyond the operating room to every team member in the hospital. The codes will be different across disciplines, he says, but the skills are fundamentally the same.
“In the aviation industry the skill set of CRM is held to the same standard as technical skills, exactly the same level of importance,” he says. “I want to see that happening here.”