Conference: B-Sides Las Vegas
Title: Cultural Cues from High Risk Professions
Speaker: Gal Shpantzer
In this B-Sides LV talk, Gal Shpantzer employed the Swiss cheese model of catastrophe as a parallel for the information security industry. The model was originally developed by James Reason of the University of Manchester, with Dante Orlandella, and used to analyze the causes of systemic failures in aviation, engineering, and healthcare. The model likens organizational defenses to slices of Swiss cheese: each layer of systems and processes is designed to catch mistakes before they become catastrophic, but every layer has holes, each one a latent weakness. If the holes in each layer align, serious problems can result, much like a hole going all the way through the stack of cheese.
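The intuition behind the model (many imperfect layers are far safer than any one layer, but alignment of holes is always possible) can be illustrated with a minimal Monte Carlo sketch. The layer probabilities below are hypothetical numbers chosen for illustration, not figures from the talk:

```python
import random

def catastrophe_occurs(layer_hole_probs, rng):
    """A hazard becomes a catastrophe only if it passes through a
    hole in every defensive layer, i.e. all the holes align."""
    return all(rng.random() < p for p in layer_hole_probs)

def catastrophe_rate(layer_hole_probs, trials=100_000, seed=42):
    """Estimate the fraction of hazards that defeat every layer."""
    rng = random.Random(seed)
    failures = sum(
        catastrophe_occurs(layer_hole_probs, rng) for _ in range(trials)
    )
    return failures / trials

# Three independent layers, each missing a hazard 10% of the time:
# the expected catastrophe rate is 0.1 ** 3 = 0.001, far lower than
# any single layer alone.
rate = catastrophe_rate([0.1, 0.1, 0.1])
```

Note the independence assumption: the model's warning is precisely that cultural problems (fear of speaking up, deference to authority) correlate the layers, so holes line up far more often than the naive multiplication suggests.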
For example, Korean Air at one point had 17 times as many catastrophic incidents per million miles as United Airlines. Investigation revealed that the gap came down to differences in processes and protocols. At United Airlines, volunteering information and seizing the controls under emergency circumstances were incorporated into the official cockpit protocols: the captain was the authority but could be questioned. At Korean Air, by contrast, there was an atmosphere of over-deference in the cockpit where one does not question the captain. This was discussed in depth, in the context of cultural influence, in Malcolm Gladwell's book Outliers. And it wasn't just Korean Air; the same pattern appeared at other airlines headquartered in countries where respect for authority is deeply ingrained in the culture, such as Colombia.
In the info security space, Gal Shpantzer proposed protocols where there is responsibility, but people are not afraid of being penalized for volunteering information. Pain and hostility shut people down and let the holes in the Swiss cheese align. In the medical profession, it was found that the more expert the physician, the more likely that physician was to miss simple things, like administering aspirin before and after operations to reduce the probability of cardiac complications.
Summary: I find little to disagree with. This is one of those common-sense, obvious-when-you-hear-it talks that is nonetheless worth mentioning, because when you don't hear it, it tends not to get done. No product ideas, but good general security philosophy.