WFH NETWORK

Lessons from NASA: What is an appropriate level of risk?

Since the 1980s, when about 10,000 people with hemophilia tested positive for HIV after receiving contaminated blood, assessing and managing risk has been a core principle in the bleeding disorders community. This applies not only to blood supplies that can be infected with both known and unknown viruses, but also to the development of new treatment products. 

In the Wednesday morning plenary “Managing the Risk of Human Space Flight: Lessons From 50 Years of NASA Human Space Flight,” Michael Lutomski, a former international space station risk manager for NASA, noted that both the hemophilia community and NASA operate in environments that do not tolerate risk or failure. And yet, risk is unavoidable. So what is the best way to deal with these types of situations?

One key is continuous risk management, Lutomski said. We already practice it almost every day of our lives, he noted, citing the example of choosing a flight to Orlando to attend this Congress: you may have managed the risk of missing a connecting plane by choosing a longer connection time.

NASA operates with a mind-boggling level of risk, Lutomski said, citing a 3.4 percent annual launch failure rate. “Can you imagine crossing the street or driving a car with those odds—you’d never do it,” he said.

And yet, Lutomski said the common perception—even at NASA—was that space flight had the same level of risk as flying on an airplane. However, in 1986, the Challenger explosion changed that—much like the AIDS crisis changed the common perception that blood transfusions were inherently safe.

NASA responded to the Challenger crisis by rethinking how it handled risk, instituting a risk-based decision-making framework and a risk threshold. Astronauts now sign statements acknowledging that they understand the risks, such as a death rate of one in every 270 crew members on a six-month expedition to the International Space Station. “We now have a more healthy realization of what risks we’re really taking,” Lutomski said.

One of the best ways to mitigate risk is to self-report, Lutomski said. But people have many reasons for not participating in risk assessment and reporting: they think they have no risk; their programs are too small; making risk public will kill a program; they prefer to deal with problems as they arise; identifying risks is bad for their careers; it’s not their job to fill out bureaucratic forms; or they can’t assess risk because they can’t predict the future.

Not only does a successful risk-management system need to overcome those arguments, but it also needs to be humble and open to new information, Lutomski said. At NASA, that translates into continually questioning performance, looking at risks, and responding appropriately to failures when they occur.