How cyber security can learn from wingsuit flying: the inconvenience halo

Humans are an integral part of cyber security, as security engineers, as risk managers, or, ultimately, as the biggest source of cyber risk: the end user. No matter how structured and analytical an approach to cyber risk management is, it involves these humans, and humans are inherently irrational. It is a fallacy to think that we can always decide rationally; we are all subject to strong psychological effects in how we perceive risk. They are known as cognitive biases – I call them irrationality pitfalls. Having dealt with risk both as a wingsuit basejumper and as a cyber security professional, my goal is to unveil these pitfalls and create awareness of them in the cyber security domain.

Why I think this is important

I have been flying wingsuits for 12 years now. Back in the day you would find me flying low and dirty between gullies and trees, counting the leaves passing by at 200 km/h. Then that got boring, so my focus shifted to exploring new mountains to jump from, working with film productions to bring my passion to the big screen, and lately combining flying with mountaineering. This is more sustainable and provides a continuous challenge. I fly wingsuits because for me it is the purest form of flying I have ever experienced. And I very much dislike walking down after scaling a mountain. Besides that, I like soaring in more analytical fields: after graduating from ETH Zurich, I have been working in IT security for eight years. Currently I make my living as a public speaker and self-employed cyber security consultant, specializing in identity and access management, web application security, and risk management.

These two seemingly antagonistic activities actually benefit each other and produce great synergies. On the one hand, an analytical and structured approach to extreme sports has prevented me from jumping head-first into many overly risky situations. On the other hand, the lessons learned about psychology, decision making under pressure and knowing when and how to listen to your gut have helped me a great deal in my professional life.

In my opinion, the human aspect and its inherent irrationality do not receive enough attention in IT risk management. Correctly recognizing, or perceiving, risk is the first step in risk management; if this step is flawed, the rest is largely pointless. Therefore, I will begin this risk perception blog series by covering several irrationality pitfalls. One of them, and a solid contender for the throne, is the inconvenience halo.

The Matterhorn situation

Before we start, and in order to fully appreciate the analogy presented hereafter, please do me a favor: think of my comfort level in wingsuit basejumping as what riding a bicycle is for you. Now picture the following setting. It is September 2016 and the two of us want to jump off the famous Swiss postcard mountain and bucket-list summit, the Matterhorn, 4,478 m. Conditions on the mountain are good, apart from it being crowded as usual. The forecast announces stable weather and very low winds at high altitude – the number one success factor for high-alpine wingsuit jumping. We could not be more stoked. After spending the night in the Hörnli hut, we climb to the summit early in the morning without any issues. On the summit, however, we come to a sudden halt: the winds are much stronger than anticipated and forecasted – gusty crosswind, among the worst kinds.

We are optimistic and decide to wait for an hour in the hope that the winds will calm down. Unfortunately, after an hour there is no change. Despite all optimism and waiting, we must now accept the fact that we are dealing with marginal conditions for a jump. This crosswind, gusty and unpredictable, poses an additional risk to a jump that is already quite challenging. But how big, actually, is this additional risk?

The rationale

In retrospect, from the comfort of a chair, it is easy: the crosswind is so strong and gusty that jumping in these conditions considerably increases the probability that the launch, the initial couple of seconds, goes wrong. And since this is a jump with a below-average margin for error, that could have a high, fatal, impact. On the other hand, we have ample time to climb down, the weather is stable, we are in good shape, and the chance of rockfall or a tumble on the descent is ever-present but not out of the ordinary. It is a clear case for climbing down and coming back for the jump some other time.

The mind under the inconvenience halo

In the field, right there, in that moment, the risks are perceived quite differently. Aided by the thin air, the human mind is now playing tricks on our judgement. First, there is the reward situation, which feels somewhat like this: “This is what we came for, let’s do it, it’s the freaking Matterhorn, we will be down in Zermatt in three minutes, it will be a great flight and a great achievement for us, yay this will be so much fun, can’t wait to post a video of this. Not jumping and climbing down, however, sucks big time: it is long and tedious, we will be back late, and we will have failed.” Down-climbing is screaming inconvenience as loud as it gets. This sets a very strong emotional backdrop for the risk analysis and skews the perception heavily in favor of jumping, far beyond its inherently higher reward. 1:0 for jumping.

Second, the skewed risk analysis sounds like this: “The wind is gusty, so I’m sure we can time it right and get a few calm seconds right after we jump. Plus, I mean, we still have a margin and we are well-trained, so even if a gust hits us hard, the chance that there will be a real problem is tiny.” We consider it an extremely low probability. And the impact, potentially fatal, is abstract and seems far-fetched. Overall, we heavily downplay the risk. “On the other hand, dude, let me tell you about downclimbing. Some rockfall is almost certain, and it could hit us. I might twist my ankle. Also, there’s a big chance that the downclimb will take longer and be even more tedious than expected, so we might even miss the last train home.” Those are all risks with a higher probability, yet with much lower impact. But our mind is clinging to every possible excuse to jump – remember the emotional backdrop set by the inconvenience. Additionally, another irrationality pitfall comes into play: “high probability – low impact” risks (the downclimb in our case) are much more tangible for humans than “low probability – high impact” risks (the jump). So even if mathematically the latter yields a bigger risk rating, humans are more afraid of the high-probability risk. 2:0 for jumping. All in all, we have a perfect hotbed for irrationality and severely impaired decision making, the total opposite of the rational analysis above.
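To see why the jump mathematically yields the bigger risk rating, here is a back-of-the-envelope sketch using the classic formula, risk = probability × impact. All numbers are hypothetical, picked purely for illustration; they are not real accident statistics:

```python
# Classic risk rating: expected loss = probability x impact.
# All numbers below are hypothetical and for illustration only.

def risk_rating(probability: float, impact: float) -> float:
    """Expected loss of a single risk scenario."""
    return probability * impact

# Impact on an arbitrary 0-100 scale, where 100 means a fatal accident.
downclimb = risk_rating(probability=0.30, impact=5)    # likely but minor: grazed by rockfall, twisted ankle
jump      = risk_rating(probability=0.02, impact=100)  # unlikely but fatal: botched launch in a gust

print(f"downclimb: {downclimb:.1f}")  # 1.5
print(f"jump:      {jump:.1f}")       # 2.0, mathematically the bigger risk
```

Even with made-up numbers, the rare-but-fatal scenario dominates the rating, while the mind fixates on the 30% nuisance.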

The inconvenience halo in cyber security

The very same humans are subject to irrationality pitfalls when taking decisions in daily IT life. It would be wrong to assume that formal risk management covers that. Such systems are necessary and great, yet they have their limits. There are so many daily situations where decisions need to be made on the spot, in time, and without guidance, that we are constantly at risk of stepping into these psychological traps. And the situations where we think we are being the most rational might be the ones where we actually get fooled the most.

The perfect example of how the inconvenience halo affects IT security is the everlasting issue of enforcing strong passwords. From a rational point of view, it is rather simple: pick a password of at least 8 characters with a mix of lower-case and upper-case letters, numbers, and special characters, so that brute-force attacks are thwarted; avoid entire words, so that dictionary attacks are mitigated; and do not pick anything related to you personally. Additionally, do not write it down in clear text, neither offline nor digitally, change it regularly, and use a different password for every website or service, in order to contain the damage if one gets compromised. Sounds like a no-brainer.
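In fact, the rational rules are so simple that they fit into a few lines of code. Here is a minimal sketch of such a policy check; it is my own toy example with a made-up blocklist, not any particular product’s implementation:

```python
import re

# Toy check for the policy described above: minimum length,
# character mix, and a (tiny, illustrative) dictionary blocklist.
DICTIONARY_WORDS = {"password", "matterhorn", "zermatt"}  # made-up sample

def is_strong(password: str) -> bool:
    if len(password) < 8:
        return False
    # Require lower case, upper case, a digit, and a special character.
    for required in (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"):
        if not re.search(required, password):
            return False
    # Reject passwords containing known dictionary words.
    lowered = password.lower()
    return not any(word in lowered for word in DICTIONARY_WORDS)

print(is_strong("Summit2016"))     # False: no special character
print(is_strong("G!pfel4x_Wind"))  # True
```

That is the rational view: a handful of checks, problem solved.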

From a user’s point of view, it looks quite different. The risk of getting hacked when using weak passwords seems quite abstract: the impact may be high, but the probability is low. The inconvenience of using strong passwords, however, is huge. And to be honest, even then the risk of getting hacked is still there, right? The adverse effects of strong passwords are highly tangible; the inconvenience is real. The Matterhorn state of mind says hello.

What can be done? First and foremost, educating users and creating awareness are key; there is no way around that. Yet on its own that might not be enough. There is a saying that security without usability is worthless, and I believe in it very much. So embrace usability and empower your users: implement single sign-on, deploy easy-to-use password managers, or forego the flawed password altogether and switch to biometric authentication where possible.
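The password-manager route works precisely because it removes the inconvenience: generating a unique random password per service is trivial to automate, so the user never has to memorize or reuse one. A minimal sketch of that core idea, using Python’s standard secrets module (my own illustration, not how any specific manager is implemented):

```python
import secrets
import string

# The core of what a password manager automates: a unique,
# cryptographically random password per service.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 16) -> str:
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())  # e.g. 'p7!Rk-_2xQzA9bTu', different every run
```

The strong password still exists; the user just never has to feel its inconvenience.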

Now wait a second. Is this a disease that solely affects our users, or are we cyber security professionals sinners too? Obviously, we are paid to push beyond inconvenience. But have we not all, at some point, held on to well-known but possibly outdated threat models and their mitigations, because they are tangible and we feel we can control them – instead of shifting our paradigm to new risks and threat models, even though that means leaving our comfort zone? Or have we not all, at some point, simply relied on best practices and popular thinking, instead of differentiating and identifying the risks that are actually relevant and specific to us – just because it is convenient?

I for one plead guilty to these charges. But in the end, that is the whole point: accepting the fact that we are all subject to psychological effects. Over time, we can then get better at identifying the situations where they can severely impact our sound and rational judgement, and thus counteract our genetically imprinted, emotional, and mostly subconscious processing of information. That is not to say that the famous gut feeling is a bad thing. Listening to your gut is important – but you have to know when you can trust it, and when it is trying to trick you.

What’s next?

You are probably wondering what happened to us on the Matterhorn. We ended up climbing down. It was the gut that told us: “Hey, watch out, you might be thinking irrationally.” Obviously, the downclimb initially led to some frustration. Very soon, however, we realized we were dealing with a luxury problem: we had just safely climbed and descended the Matterhorn, a feat many people dream of their whole life. And we could be proud of another good decision. We ended up having a beer in Zermatt after a day in the mountains, and that is all that counts.

There are more tales and lessons from many years of wingsuit flying that I look forward to writing about in my next blog articles on risk perception. I would also be happy to share my experience with you in the form of a tailored keynote or seminar – check out my services and feel free to get in touch.

This is my entry into the world of blogging, and obviously I am anxious about how it will turn out and whether my words will fall on open ears. If you are reading this, you bothered to bear with me this far – I am flattered. If you liked this article and would like to see more, you would be doing me a huge favor by subscribing to my blog newsletter or sharing it – thank you very much.