Cybersecurity awareness training is definitely NOT one-size-fits-all. But how do you customize it to fit varying needs while still being able to scale? Join us as we speak with Alexander Kharlamov, Behavioural Data Scientist and Engineer, to learn how insights on human behavior can inform a more customizable approach.
To scale cybersecurity awareness campaigns and any relevant interventions, there first needs to be some way to categorize and understand the personas within an organization. Alexander worked with colleagues to create the Cyber Domain-Specific Risk Taking (CyberDoSpeRT) scale, which aims to measure individual risk taking and risk perception towards cyber risks across five different areas.
One insight he noted is that the relationship between risk perception and risk taking is inverse - the more risk one perceives, the less likely that individual is to take the perceived risk. However, he noted the nuance that culture also plays an important role: "There is an influence of your own experiences as expected but [more so] there is a contextual cultural set of values that will drive your relationship in the cybersecurity space." In this paper, they focused on cultures in the United States versus the United Kingdom. In short, they show that, culturally, the US population tends to exhibit higher levels of risk taking in cyberspace than the UK population.
Alexander commented that "Cultures that are oriented towards self are very different from those that are oriented towards the community. They will respond differently to how regulations are imposed and policies formed in the cyberspace. That needs to be taken into account. More often than not we have very diverse organizations...it will be good to take [into account] that diversity into the design of your interventions and your communications...and design a mix of strategies that you noted that will pique someone's attention."
Culture aside, the CyberDoSpeRT focuses on four personas:
Relaxed - those who exhibit high risk taking, but perceive low risk
Ignorant - those who perceive low risk because they are unaware of it (but in reality engage in significant risk)
Opportunistic - those who perceive the risk and yet still take the risk
Anxious - those who perceive risk everywhere and are thus low risk taking (Alex noted that risk averse cultures tend to exhibit this persona more pervasively).
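The four personas above sit on two axes: how much risk a person takes and how much risk they perceive. As a rough illustration only, the mapping can be sketched as a simple 2x2 classifier; the 0.5 threshold and the placement of the Ignorant persona in the low/low cell are assumptions for this sketch, not details from the CyberDoSpeRT paper itself.

```python
def classify_persona(risk_taking: float, risk_perception: float,
                     threshold: float = 0.5) -> str:
    """Map normalized (0-1) risk-taking and risk-perception scores to one
    of the four CyberDoSpeRT personas.

    The threshold and the low/low -> 'Ignorant' mapping are illustrative
    assumptions, not taken from the original scale.
    """
    takes_risk = risk_taking >= threshold
    perceives_risk = risk_perception >= threshold
    if takes_risk and not perceives_risk:
        return "Relaxed"        # high risk taking, low perceived risk
    if not takes_risk and not perceives_risk:
        return "Ignorant"       # low perceived risk, unaware of risk
    if takes_risk and perceives_risk:
        return "Opportunistic"  # sees the risk, takes it anyway
    return "Anxious"            # high perceived risk, low risk taking
```

For example, someone with high self-reported risk taking but low risk perception would fall into the Relaxed cell.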
He noted that it's important to understand that no persona is 'better' or 'worse' than another. Rather, he encouraged managers to accept that the personas simply 'are' and to adapt their approach to those different personalities for more effective communication. He also added that "more often than not, there is a strong relationship between a risk attitude and [a person's] job nature or role type."
For instance, those who work in highly regulated industries typically have a zero risk appetite, which leads to systems and workforces that tend towards the more Anxious type. From a risk perspective this can seem positive. However, too much anxiety can induce 'operational paralysis', which inhibits productivity and the flow of operations, becoming detrimental to overall performance. Kharlamov observes, "It's important to balance awareness with anxiety."
Utilizing fear as a motivator in cybersecurity awareness programs is falling out of favor, as it proves largely ineffective at changing behavior. However, it's important to note that fear, used in a limited capacity with the group of individuals known as Ignorant according to the CyberDoSpeRT scale, can actually produce positive change in behavior initially.
Kharlamov observed that when introducing the topic of risk to this audience type, these individuals generally benefit from messaging that leans towards more fear-inducing content, and performance tends to improve initially. But after the initial awareness has been awakened, positive approaches will have a longer-lasting impact.
For the other three personas, however, utilizing fear as a motivator will produce the inverse effect: these individuals, with their heightened risk perception, will perform worse if you use fear as the mechanism.
He went on to explain that their research showed that any type of negative motivation had an adverse effect on people's ability to perform.
The natural inclination is to think that if we want people to stop engaging in risk, they should also have a heightened perception of risk. But if that perception is unreasonable, unbalanced, or heightened for no real reason, it actually leads to underperformance.
In instances where negative messaging was used, they found it actually led to worse performance than if the messaging was simply neutral.
Positive messaging, on the other hand - messaging that praised employees for keeping the organization safe through their awareness and that used fun and engaging interactions - saw performance significantly better than the other two methods.
"You are trying to make people better than their previous selves."
Alexander recommends maintaining a regular sample of employees by working with HR to embed the collection of profiles into the normal onboarding procedure for new hires. If it is not possible to obtain information on every employee, work to get at least a representative sample across the organization.
For the organization in general, performing a critical mass of sampling can allow you to map out the risk profile across the business to identify hotspots and create more safeguards for those who might be more prone to risk behaviors.
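Once a representative sample of persona profiles exists, mapping hotspots is essentially an aggregation exercise. A minimal sketch, assuming hypothetical (department, persona) records and an illustrative choice of which personas count as risk-prone and what share makes a department a hotspot:

```python
from collections import Counter, defaultdict

# Which personas count as risk-prone, and what share of a department
# flags it as a hotspot, are illustrative assumptions for this sketch.
RISKY = frozenset({"Relaxed", "Ignorant"})

def risk_hotspots(profiles, risky=RISKY, min_share=0.5):
    """Given (department, persona) pairs from a representative sample,
    return departments where risk-prone personas reach min_share."""
    by_dept = defaultdict(Counter)
    for dept, persona in profiles:
        by_dept[dept][persona] += 1
    hotspots = {}
    for dept, counts in by_dept.items():
        share = sum(counts[p] for p in risky) / sum(counts.values())
        if share >= min_share:
            hotspots[dept] = round(share, 2)
    return hotspots
```

A department where two of three sampled employees fall into Relaxed or Ignorant would surface as a hotspot, signaling where extra safeguards may pay off.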
From an intervention perspective, Kharlamov encourages managers to think about feedback loops continuously and stressed the importance of the moment between an interaction and the feedback to an individual - it must be as short as possible.
For instance, if you're deploying a phishing simulation, ensure the feedback is explanatory, meaningful, and immediate as individuals engage. Going back to the importance of positive reinforcement, feedback should also be given when someone succeeds in an interaction, as it will help them understand what happened - they might have succeeded by pure luck, or it might have been intuitive, and the feedback will help to distinguish the two.
When creating cybersecurity training, ensure it stays relevant to the screens and environments representative of the tools and platforms that employees use in their day-to-day activities. Pick the top 10 or 20 tools, keep the same logic and approach, but change the 'skin' of the interaction to cultivate better situational awareness of their own habits through familiar environments in the simulation.
It's difficult to measure the impact of interventions without 'longitudinal thinking and design' around impact assessment. Often it can be hard to see whether the absolute number of incidents is increasing or decreasing, because it's all relative to the baseline. But what is often overlooked is that the baseline has to do with the industry as a whole.
Threats in general are rising, but the question to consider is: "How is my organization faring against that baseline?" Alexander emphasizes that this point is extremely important to understand and to convey when summarizing to leadership.
While there may be an increase in incidents within the organization, if that increase is much slower than the general trend across the market, this demonstrates positive performance. Typically, the general trend and the organizational trend are treated as two separate occurrences, but it's critical to remember they are dependent - even if a particular trend increases for your business, you want it to increase less than the industry baseline.
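The baseline comparison above is a simple growth-rate calculation. As a sketch (the incident counts are made-up numbers for illustration):

```python
def relative_incident_trend(org_prev, org_curr, industry_prev, industry_curr):
    """Compare the organization's incident growth against the industry
    baseline. Returns (org_growth, industry_growth, outperforming),
    where outperforming means the org is growing slower than the market."""
    org_growth = (org_curr - org_prev) / org_prev
    industry_growth = (industry_curr - industry_prev) / industry_prev
    return org_growth, industry_growth, org_growth < industry_growth

# Hypothetical numbers: incidents rose 10% internally while the
# industry baseline rose 30% - an increase, yet a positive result.
org_g, ind_g, outperforming = relative_incident_trend(100, 110, 1000, 1300)
```

Framing the internal number against the industry number in this way is what makes a rising incident count defensible when reporting to leadership.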
While it's easy to think that providing individual feedback to users is impractical for large enterprises, Alexander counters that it is possible. He recommends utilizing a semi-automated solution built on a tree-based structure that applies conditionals to different situations.
For example, if there are several layers or points of failure for the person who takes a phishing simulation, create responses with ranges and then mix and match based on their performance; or take the known metadata of what just occurred and use it to provide a segmented response, helping the individual understand the broader picture of what just happened.
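The tree-based idea can be sketched as nested conditionals that walk the metadata of a simulation event and assemble a tailored message; the event field names and the feedback wording here are illustrative assumptions, not a real product's API:

```python
def phishing_feedback(event: dict) -> str:
    """Assemble segmented feedback from phishing-simulation metadata.

    Tree-based sketch: each branch checks one point of failure (clicked?
    entered credentials? reported?) and contributes a message fragment.
    Field names and wording are hypothetical examples.
    """
    parts = []
    if event.get("clicked_link"):
        parts.append("You clicked the link in a simulated phish.")
        if event.get("entered_credentials"):
            parts.append("Entering credentials would have handed an attacker "
                         "your account; check the sender's domain first.")
        else:
            parts.append("You stopped before entering credentials - "
                         "a good instinct worth making deliberate.")
    elif event.get("reported"):
        parts.append("Great catch: you reported the simulation. "
                     "Reporting is exactly the behavior that protects others.")
    else:
        parts.append("You ignored the message. Reporting it as well "
                     "helps the security team protect your colleagues.")
    return " ".join(parts)
```

Note that the success branch still explains why the catch was right, in line with the point above that feedback matters even when someone succeeds.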
1. Design with impact assessment in mind - not only to reduce incidents but also to include a measurement system that tracks performance and can be redeployed and compared over time.
2. Keep it positive and try to avoid fear - even if you're unable to drill down into each individual within an org, making it positive will have a greater ROI in terms of behavioral change and adaptation.
3. Close the gap between the feedback and the action - the smaller the gap the higher the chance of each individual learning from their mistake or their success. Success is equally important to show and celebrate. Meaningful feedback can help drive impact at scale.
If you had to choose just one, focus on positivity.
Connect with Alexander on LinkedIn and while you're there check out our Security Awareness Manager community.
Resources from Alexander's Research:
Predicting the performance of users as human sensors of security threats in social media
A CYBER DOMAIN-SPECIFIC RISK ATTITUDES SCALE TO ADDRESS SECURITY ISSUES IN THE DIGITAL SPACE
Other Resources:
Looking for awareness training that is short, relevant and engaging? Check out Wizer’s free security awareness video library.