No Expert, No Cry

Why you shouldn’t trust (awareness) experts, what you should trust instead, and my 2018 New Year’s resolution.

By Eh’den Biber



Prologue – SANS

During the SANS European awareness summit, I ended up in an interesting debate on Twitter with one of the attendees (John Scott). The debate was about an observation I made: science was not part of the agenda at this major awareness summit. There was not a single scientist on stage talking about their breakthrough research, and none of the tweets about the event (#SecAwareSummit) mentioned any science.

My observation didn’t go down well with John, who seems to have taken it a bit personally. To show me I was wrong, he pointed out that Jessica Barker gave a talk. Yes, she did, and yes: she holds a doctorate (in civic design), and I barely finished kindergarten.

When SANS finally posted the presentation slides from the event (including the workshops that preceded it), it turned out that the only speaker who provided external references in their slides was Jess (well done). She cited five academic papers (from 1996, 1999, 2008, 2008 and 2009), one TED talk (2012) and one book (2017). Only one of the papers focused on information security (2009, “Self-efficacy in information security: Its influence on end users’ information security practice behaviour”). It used social cognitive theory, and its results suggested that simply listing what not to do, and the penalties associated with wrongdoing, in the users’ information security policy will have a limited impact on the effective implementation of security measures.

I’ll let Iago express my feelings about that one:


Show me the science

If we wish to change behaviour, we need to be able to measure it. What to measure, when to measure, where to measure, how to measure – for all of these questions, the scientific method is the most effective approach we humans have found so far.

Yet when it comes to awareness of infosec and privacy, we seem to ignore science entirely. The SANS summit is just one sad reflection of this. Very few, if any, vendors of “information security awareness training” material or services will provide deep details of the scientific approach they used to develop their solutions, let alone any conversation about evidence. Is there any double-blind, placebo-controlled trial that shows the effectiveness of one method over another? Not that I know of.

In many ways, it’s astonishing. We are in 2018, for Christ’s sake! Awareness of the security and privacy elements of information systems matters to many technology-dependent societies, as well as to companies. If that’s how little science is involved, no wonder nothing really changes.

The Single Most Important Measurement in Awareness Training

The book “How to Measure Anything in Cybersecurity Risk”, by Douglas W. Hubbard and Richard Seiersen, was written to explain why the methods most organizations currently use to measure cyber security risk are not fit for purpose, and to suggest a quantified risk management approach. Chapter 4 of the book was originally called “The Single Most Important Measurement in Cybersecurity”, and I will follow its structure to talk about the single most important measurement in awareness training.


First point – awareness training matters.

Doing awareness training just to be “compliant” is simply insufficient these days. Being able to prove that employees passed awareness training will not protect your organization from the current risk landscape. Take, for example, GDPR. Without awareness of information security and privacy across all stakeholders, organizations cannot achieve “privacy by design” and “security by design”. As a result, their risk level increases: a future loss event caused by a breach will be larger in magnitude, because lack of GDPR compliance increases the secondary risk from the regulator. The checkbox days, in which organizations only had to prove they kept records of employees completing a CBT on information security and privacy, are over. An organization with a high level of awareness of information security and privacy will face a lower risk of regulatory action, and will outperform organizations that only perform awareness activities for compliance reasons.


But how do you know which method of awareness training works? What do you measure? Is it possible that your awareness training doesn’t work at all? Even more importantly, how can you measure the awareness training method itself?


“We often hear that a given method is ‘proven’ and is a “best practice.” It may be touted as a “rigorous” and “formal” method—implying that this is adequate reason to believe that it improves estimates and decisions…Some satisfied users will even provide testimonials to the method’s effectiveness. But how often are these claims ever based on actual measurements of the method’s performance?” (How to Measure Anything in Cybersecurity Risk, Douglas W. Hubbard & Richard Seiersen)


If your organization uses an awareness training method that can’t show meaningful, measurable improvement – or, even worse, one that causes awareness levels to drop – then the method itself becomes the single biggest awareness-related risk, and improving the method becomes the single most important awareness priority.

We need to find a measurement that has already been proven to work or, if none exists, propose one that will allow us to identify a good awareness training method – as well as which measurements we shouldn’t be using.


Regardless of the method you currently use to educate people about information security and privacy, the first questions you must ask yourself are: does it work, and how do I measure its success?
How can you tell whether the baseline you measured at first was the right baseline, and how can you tell whether your measurements were accurate? Take, for example, the typical “phishing” exercise so many organizations use as part of their baseline analysis – what exactly are you measuring there? If someone didn’t click on a phishing email, does that mean they will not click on the next one when they are checking email on their mobile phone? If someone reported a phishing email, does that mean they will design information systems that follow the “privacy by design” principles?
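If you do run a phishing exercise anyway, at least treat it as an experiment rather than a checkbox: compare a trained group against an untrained control group, and test whether the difference in click rates could plausibly be chance. Here is a minimal sketch (the function name and all the numbers are hypothetical, and a click rate remains only a proxy for awareness):

```python
from math import erf, sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is the difference between two click
    rates larger than random variation would explain?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical exercise: 40 of 200 untrained users clicked,
# versus 22 of 200 trained users.
z, p = two_proportion_z(40, 200, 22, 200)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Even a “significant” p-value here only tells you that clicks dropped on this one simulated email – it says nothing about behaviour on a mobile phone, under time pressure, or six months later.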


The Awareness Placebo

Meet the “analysis placebo”, also known as the “overconfidence effect” – the feeling that some analytical method has improved decisions and estimates even when it has not. Numerous studies across various fields have shown that training can improve confidence without improving actual performance. Here is one example: in a 1999 study, some participants were trained in lie detection and others were not. When both groups were shown video tapes of investigations, the trained group was more confident in its lie detection skills than the untrained group – yet it actually performed worse. Another study showed that clinical psychologists became more confident in their diagnoses and their prognoses for various risky behaviours as they gathered more information about patients, even though the observed patient outcomes did not actually improve.


If you work in the field of awareness training, the fact that you are exposed to relatively more information on the subject than others will not make you an expert, nor improve your ability to judge whether the awareness training you choose will deliver what it promised. Actually, even the term “awareness placebo” is too generous – in medicine, placebo medication has shown positive effects on patients who took it believing it would help them, while the “awareness placebo” has zero benefit for the state of awareness and in fact reduces it. Remember the phishing exercise I mentioned before? The fact that someone successfully detected a phishing email can actually make that person act in a less secure way, because it may inflate their perception of their own judgement about information security and privacy when, in fact, there has been no real improvement.
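One way to put a number on this gap between confidence and performance is a calibration measure such as the Brier score, a standard tool from forecasting research: it penalizes people who state high confidence and turn out to be wrong. A toy sketch (all numbers hypothetical):

```python
def brier_score(confidences, outcomes):
    """Mean squared error between stated confidence (0..1) and what
    actually happened (1 = correct call, 0 = wrong call). Lower is better."""
    return sum((c - o) ** 2 for c, o in zip(confidences, outcomes)) / len(outcomes)

# Hypothetical: two analysts each get the same 6 of 10 calls right,
# but one claims 90% confidence on every call, the other 60%.
results = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]
overconfident = brier_score([0.9] * 10, results)  # confidence exceeds hit rate
calibrated = brier_score([0.6] * 10, results)     # confidence matches hit rate

print(f"overconfident: {overconfident:.2f}, calibrated: {calibrated:.2f}")
```

The overconfident analyst scores worse (0.33 versus 0.24) despite identical accuracy – exactly the pattern the lie detection study describes.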


In Science We Trust

The take-home message is: do not rely on “experts” just because of their credentials or experience. If you are the one in charge of delivering awareness training, remember your biases, study them, and always insist on using reason and evidence to reach conclusions about awareness training methods and their capabilities. In other words, use critical thinking, ask for the scientific methods behind the awareness training, challenge the numbers, and never trust your own perception – you are most likely unaware of the biases that blind you.


Which brings me to my New Year’s resolution:


International Cyber Security and Privacy Awareness Coalition (ICSPAC)

Cyber security and privacy awareness training should provide a measurable increase in people’s ability to act and react correctly in information security and privacy-related decisions and actions, and to maintain that ability over a pre-defined window of time and across different states of being.

The challenge is that the current platforms “awareness experts” use to share and exchange their work are not provided by an objective body. They are provided either by vendors (e.g. SANS) who have their own methodology, by information security and privacy professional bodies that are biased by their internal politics, or by governments, who have no clue what awareness is.


Since I have written extensively on the subject of awareness (see the references below), and since science continues to be ignored, I have decided to found a new organization called the International Cyber Security and Privacy Awareness Coalition (ICSPAC). It will be an open, non-partisan, non-profit organization that aims to educate policy makers, organizations, professionals and the public about conclusive science that can be used to improve the level of cyber security and privacy awareness and culture – and, where science is absent, to encourage additional research.


Please join if:

  • You wish to find out what effective, evidence-based approaches to cyber security and privacy awareness training can look like.
  • You wish your awareness-related metrics to deliver a meaningful indication of the state of cyber security and privacy awareness.
  • You wish to encourage conversations and debates that are agnostic to vendors, politics and professional bodies.


In my next article, I will provide some science-based insights into awareness training, some of which might be surprising.


Everything gonna be alright…




I have published about 35 articles on the subject of awareness and culture. Here is the list:


  • Collective Corporate Judgement – a suggestion to tackle social network risk with a concept I call collective corporate judgement.
  • Killing Social Engineering – talking about human manipulation as a neurological phenomenon.
  • Amygdalala-land – understanding the neurological limitations and advantages of our human brain.
  • Play Dead – “helping your user and friends’ community can only be done if you find a way to empower them, not scare them to death.”
  • The Metrics – biological, biochemical and neurological examples of why people might say they will behave responsibly, and believe it – but will not act responsibly.
  • Men without hats are living on the edge – how to solve the Clash between ethics, personal integrity, “the system” and hacking?

  • Awareness Myth Busting – Why attempts to raise the level of awareness to information security are failing, and what to do in order to change it.
  • The Revolution – How I became part of an invisible hacking revolution.
  • GDPR “Unknown Unknowns” – The art of privacy, and why what you don’t know (about the GDPR) WILL kill you.
  • #Cyberblind – Why salaries and job ads are superb indicators of your organisation’s cyber security maturity, how it can be improved, and why your organisation won’t do anything to fix it.
  • Uber and Under the Breach – Everything you need to know about the Uber data breach, and much more on Uber culture.


© All rights reserved 2018
