
RISK & NEAR-MISSES

The ghosts of future disasters hide in near-misses we're trained to ignore

Organizations from manufacturing to healthcare to aviation now use incident reporting systems to catch warning signals before they escalate.


When we narrowly avoid catastrophe, we often learn the wrong lesson—seeing success instead of warning. This research, commissioned by NASA after shuttle disasters and applied by the Department of Homeland Security for terrorism analysis, reveals how near-misses quietly normalize danger and what organizations must do to recognize these signals before disaster strikes.

 

The National Academy of Sciences appointed me to three separate committees—on intelligence analysis, military culture, and decision-making under uncertainty—to apply these insights to government agencies where errors cost lives. During the COVID-19 pandemic, Fox News sought my expert analysis on why risk perception alone wasn't enough to change behavior. On ITV News during the 2016 presidential election, I discussed the implications of this discounting for high-stakes political decision-making.

Professor Tinsley appeared on CNBC's Power Lunch to discuss why companies must change their approach toward sexual harassment. She outlined the risks of normalizing such deviant behavior and explained why employees are reluctant to report small infractions before they escalate. She has also written about another risk companies take when they neglect difficult or inconvenient conversations.

“At NASA, mistakes can be fatal. That is why we took Dr. Tinsley’s research on normalization of deviance and near-miss learning seriously enough to commission workshops, incorporate her findings into leadership programs, and bring NASA engineers into her Georgetown classroom. Her work directly shaped tools we now use to catch warning signals before they escalate. In a high-risk, high-tech environment, the ability to recognize a near-miss as a warning—not a success—is the difference between a close call and a disaster. Dr. Tinsley helped us see that difference.”
 

— Ed Rogers, Chief Knowledge Officer (retired), NASA Goddard Space Flight Center
