Information security is often treated as a technical matter: patch management, firewalls, logging, MFA. All of these are essential, but the reality is that most incidents can ultimately be traced back to one weak link: human behaviour. Not because employees are ‘stupid’ or ‘careless’, but because behaviour is shaped under pressure, through routine and within context. The well-known Portuguese neurologist António Damásio, renowned for his groundbreaking research into the relationship between emotion, consciousness and decision-making, captured this succinctly: “We are not thinking machines that feel, but feeling machines that think.” In other words, behaviour is not purely rational. It is driven by emotions, habits, social norms and context.
If you truly want to address the human factor in information security, awareness alone is insufficient. Awareness training can increase knowledge, but behavioural change requires a broader, systemic approach. To achieve this, you can use five behavioural lenses: a practical behavioural model that helps you to better understand and effectively influence human behaviour.
In the upcoming security blogs, my colleagues Hediye Kamalizade and Kim Keenswijk and I will reflect on the human factor within information security. This blog is the first part of a three-part series, in which we show why information security in practice rarely fails because of technology, but far more often because of human behaviour. We also address the tension between policy and practice. We show how behaviour comes about, why awareness or policy alone is insufficient, and how organisations can gain real control over information security behaviour through targeted behavioural interventions.
We often judge behaviour in information security by the outcome. Someone clicked on a phishing link. Someone reused a password. Someone did not report an incident. Behind these outcomes are layers such as motivation, habits, emotions, social pressure and environment. Behaviour is visible, but it arises from internal drivers and external factors. Behaviour is always logical, if you understand the context in which it occurs.
This is why behavioural change often fails when we approach it as an “information problem”. Knowledge is rarely the bottleneck. The key conditions for behavioural change are that people know what is expected of them, want to act on it, and are able to do so.
That is precisely why small interventions, such as nudges (subtle steering without coercion), reminders, feedback and social norms, are sometimes more effective than yet another awareness e-learning.
People are not unpredictable risk factors. Their behaviour is largely explainable and influenceable. The question is therefore not whether people make mistakes, but how the organisation is designed around human behaviour. By approaching the human factor as a design challenge, the focus shifts. Instead of correcting employees, you start to look at the environment in which they work. Which choices are facilitated? Which routines are rewarded? Where is secure behaviour made unnecessarily complex?
The use of behavioural lenses can help to analyse that environment systematically. They make visible how habits, knowledge, awareness, motivation and repetition come together in everyday work behaviour. In doing so, they form a bridge between policy and practice, and between technical measures and human action.
The obvious question is: how can you design human behaviour? During a large-scale project on the origins and persistence of problems in the fields of health, energy and the environment, a practical, theory-based behavioural model was developed at HU University of Applied Sciences Utrecht: the Persuasive by Design model. The model translates scientific behavioural principles into five practical, non-overlapping perspectives for analysing and influencing behaviour. These lenses are particularly applicable to information security, because secure information behaviour is often a mix of automatism, knowledge, motivation and context. Below, the lenses are translated into concrete information security applications.
“Clicking often happens on autopilot.”
Much security behaviour is routine-based. People respond reflexively to notifications, pop-ups and emails. This is comparable to the well-known Stroop test: our brain automatically chooses the “fast route”, and it takes effort to interrupt it. The more often someone performs the same action in the same situation, the greater the chance that this action becomes a habit. The same applies to impulses: fast, often emotionally driven reactions that occur before someone consciously reflects. The problem is that information security incidents are often addressed as if they result from conscious choices, while this type of behaviour takes place largely unconsciously.
Many information security incidents arise precisely in this automatic domain: clicking a phishing link on autopilot, reusing a familiar password, or responding to a notification without checking it first.
How do you break this pattern?
In short: make secure behaviour easier and more automatic than insecure behaviour.
“Knowing what to do is not the same as actually doing it.”
This lens focuses on knowledge, beliefs and misconceptions. Behaviour is often influenced by a mix of facts and assumptions. People may know that phishing is dangerous but not understand why a specific behaviour matters, or they may experience resistance.
Secure behaviour is often hindered because people do not know what is expected of them, do not understand why it matters, or hold misconceptions about the risks.
Awareness plays a role here. Without insight into the why and the how, change is difficult. How can you break through this?
This is where the “know-want-can” model comes into play: people first need to understand (know) before they are willing to change (want) and able to act (can).
“People overestimate themselves, until you make it visible.”
A major pitfall in information security is optimism bias: employees believe they are less susceptible to mistakes than others. People are often barely aware of their own behaviour because it feels self-evident. According to this lens, you break through that pattern by providing feedback based on measurable data, so that people start to recognise their own behaviour.
Many organisations invest in awareness, but forget to provide feedback on it. As a result, information security remains abstract, something that belongs to IT or Compliance. When you make behaviour visible, it becomes more personal and a sense of urgency emerges.
How do you break through this?
It can help, for example, to show teams or departments how often phishing is reported versus opened, to provide personal feedback after simulations, or to create team- or department-level incentives for secure information behaviour.
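As an illustration, below is a minimal sketch of how such team-level feedback could be derived from phishing-simulation results. The data structure, field names and departments are hypothetical assumptions for the example, not the output of any specific tool.

```python
from collections import defaultdict

# Hypothetical phishing-simulation results: one record per recipient.
# Field names, values and departments are illustrative assumptions only.
simulation_results = [
    {"department": "Finance", "opened": True,  "reported": False},
    {"department": "Finance", "opened": False, "reported": True},
    {"department": "HR",      "opened": True,  "reported": True},
    {"department": "HR",      "opened": False, "reported": False},
]

def feedback_per_department(results):
    """Summarise how often simulated phishing mails were opened versus reported, per department."""
    counts = defaultdict(lambda: {"total": 0, "opened": 0, "reported": 0})
    for record in results:
        stats = counts[record["department"]]
        stats["total"] += 1
        stats["opened"] += int(record["opened"])
        stats["reported"] += int(record["reported"])
    for department, stats in sorted(counts.items()):
        opened_pct = 100 * stats["opened"] / stats["total"]
        reported_pct = 100 * stats["reported"] / stats["total"]
        print(f"{department}: {opened_pct:.0f}% opened, {reported_pct:.0f}% reported")

feedback_per_department(simulation_results)
```

Feedback like this makes the gap between opening and reporting visible at a glance, which is exactly the kind of measurable signal this lens calls for.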
“Motivation and skills must be aligned.”
This lens focuses on the combination of motivation and skills required to perform the desired behaviour. Motivation can be intrinsic or extrinsic. Intrinsic motivation is often more sustainable, but without sufficient motivation, behavioural change will not occur.
Even when people are motivated, they may still be unable to perform the behaviour. Think of employees who want to report incidents but do not know how, or who become anxious because of the reporting procedure. That is why it is important to investigate why desired behaviour is not followed. This reveals where skills or conditions are lacking.
Policy often fails because organisations assume that employees “do not want to”, while the problem is often that they do not know how to report, find the procedure too complicated, or are unsure what counts as suspicious.
What works?
If reporting an incident is complicated, or employees do not know what counts as “suspicious”, motivation alone is not enough. You need to train skills and remove the barriers.
“Intention without repetition is empty.”
Behavioural change only becomes sustainable when it is embedded in daily routines and the working environment. This requires repetition, practice, and support from colleagues and managers: not a one-off intervention, but a routine. If the daily environment continues to reward old behaviour, people will revert. Behavioural change is not a linear process; it requires repetition, feedback and time.
What works?
For example, work with security role models (people who lead by example), make information security a fixed agenda item in team meetings, and design processes so that working securely does not require extra effort.
When security measures are not followed, the first reflex is often: “employees are not sufficiently aware” or “they do not take security seriously”. In practice, however, secure information behaviour is rarely a purely motivational issue. Most employees want to do their job well and understand the importance of information security.
What is often missing is not the willingness, but the right context. Behaviour is influenced by workload, routines, unclear processes, social norms and system design. If secure behaviour takes more time, requires extra steps or creates uncertainty, it is logical that people fall back on familiar patterns. Not out of unwillingness, but because our brain chooses the path of least resistance.
By approaching secure information behaviour solely as an awareness or motivation issue, organisations continue to invest in solutions that have only limited effect. Behavioural change requires a broader approach: insight into where behaviour gets stuck and why that happens. The behavioural lenses offer exactly that perspective. You need to look through multiple lenses and design interventions that fit the everyday reality of employees.
Behavioural change does not have to be a large or complex programme. In fact, small, well-chosen interventions are often more effective than extensive initiatives. A good first step is to select one specific security behaviour within your organisation that regularly goes wrong in practice.
Analyse this behaviour using the behavioural lenses and determine one improvement action per lens. This can range from simplifying a reporting process to making feedback visible or disrupting existing routines. It is important to monitor the effect and adjust interventions where necessary.
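To make this tangible, here is a minimal sketch of what such a per-lens analysis could look like for one example behaviour. The lens labels paraphrase the five lenses described above, and the observations and improvement actions are purely illustrative assumptions, not prescriptions.

```python
# Illustrative per-lens analysis of one example behaviour ("reporting
# suspicious emails"). Lens labels paraphrase the lenses above; the
# observations and actions are hypothetical examples, not prescriptions.
behaviour = "Reporting suspicious emails"

lens_analysis = {
    "Habits and impulses": {
        "observation": "Suspicious mails are deleted or clicked on autopilot",
        "action": "Make reporting a single, visible click in the mail client",
    },
    "Knowledge and beliefs": {
        "observation": "Unclear what counts as 'suspicious'",
        "action": "Share short, concrete examples of red flags",
    },
    "Awareness and feedback": {
        "observation": "Staff assume others are more at risk than they are",
        "action": "Show team-level reported-versus-opened figures",
    },
    "Motivation and skills": {
        "observation": "The reporting procedure feels cumbersome",
        "action": "Simplify the procedure and confirm every report",
    },
    "Repetition and embedding": {
        "observation": "Attention fades once a campaign ends",
        "action": "Make reporting a fixed agenda item in team meetings",
    },
}

print(f"Behaviour under analysis: {behaviour}")
for lens, plan in lens_analysis.items():
    print(f"- {lens}: {plan['observation']} -> {plan['action']}")
```

Even kept this simple, writing the analysis down per lens forces you to name one concrete improvement action for each, which is what makes the effect measurable afterwards.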
By working structurally with the behavioural lenses, a more realistic and effective information security policy emerges step by step. Not through more rules or training, but by making secure information behaviour logical, feasible and more self-evident in daily work.
Concretely, this could involve:
- treating behavioural change not as a one-off project but as a continuous improvement process
In the next part of this blog series, Hediye Kamalizade builds on this exploration of the human factor in information security. She examines why rules and policy alone often fail to achieve the desired effect and what this means for the way organisations influence employee behaviour: a more in-depth perspective that helps organisations put secure information behaviour into practice and influence it more effectively.
Would you like to learn more about security? Read our security compliance blog.