Why your best employee still clicks that wrong link

It’s Monday afternoon, 4:45 p.m. The deadline for a crucial acquisition is approaching, and your inbox is filling up faster than you can manage. While you’re on the phone with a demanding client, a notification pops up: “Final changes to the purchase agreement, please approve immediately.” Without really thinking, you click the attachment. You’re an expert. You know the risks of phishing, and you’ve aced every internal training. Yet in the heat of the moment, urgency overrides common sense. Why does this happen, even to the very best?

The vulnerability of our biological hardware

In a previous blog, my colleague Laurens Rüpp laid an important foundation: we are not “thinking machines that feel,” but “feeling machines that think.” Of course, technology forms the backbone of solid security. With firewalls, strong encryption, and Multi-Factor Authentication (MFA), the digital walls of most organizations stand firm. Still, technology is not a cure-all.

Systems are rarely resilient against the unpredictability of human behavior. International research by Verizon¹ shows that the human element still plays a role in no less than 74% of all data breaches. Our biological “hardware” has vulnerabilities that simply cannot be fixed with a software update. You cannot “patch” the human brain. That’s why the solution rarely lies in yet another training session, but rather in designing smarter systems and workflows.

Even the most loyal and knowledgeable employee can unintentionally pose a risk when the work environment does not support secure behavior. No matter how sound the rules may be, in the heat of the moment they often prove to be little more than a paper tiger. This has three underlying causes that we frequently overlook.

1. The brain’s autopilot

Research shows that as much as 95% of our daily behavior is unconscious and automatic. The brain is a remarkable instrument, but it is extremely energy-efficient. To process the constant flood of information, it relies on mental shortcuts. We react reflexively to what we recognize. An incoming email from a “colleague” triggers immediate action, before the conscious part of our brain has a chance to critically assess the sender.

The data is clear. According to research from Stanford University², “cognitive load” is one of the strongest predictors of workplace errors. When we multitask excessively or operate under high pressure, our ability to recognize risks declines. In that state, autopilot takes over. We don’t click because we don’t know the rules; we click because our brain chooses the quickest route to conserve energy.

Take the notorious Christmas hamper phishing campaigns, for example. During the holiday season, our minds are often not fully on work. We’re busy wrapping things up, juggling personal commitments, and planning vacations. At such times, emails about “a company gift” or a “Bol.com voucher to claim” slip through more easily. Dutch organizations such as Maastricht University and various municipalities have fallen victim to such campaigns when vigilance was lower.

In those moments, the routine of “quickly taking care of something” outweighs the abstract warning from security training. We train people to make conscious decisions, yet mistakes occur when we operate on autopilot. Effective information security therefore starts not with stricter rules, but with understanding how people actually make decisions under pressure.

2. The gap between policy and practice

“Be alert to suspicious signals.” It’s a wonderful statement for an annual report, but on a busy work floor it becomes meaningless noise. Security policy fails when rules are not workable in daily practice: rules must not only be correct, they must also fit the way people actually work.

When rules are imposed from the top down as an obligation, without employees understanding how they relate to their specific tasks, friction arises. Employees feel hindered rather than supported by security measures. People want to understand why something is necessary. Imagine a lawyer required to use a secure environment that is slow or doesn’t function on mobile.

Under client pressure, that lawyer may still send the attachment via regular email. That’s not malicious intent or sabotage; it’s a professional trying to do their job efficiently and well. In that moment, the rule becomes the enemy of productivity.

We believe in policies designed with usability in mind, so that information security becomes a logical part of daily operations rather than an unnecessary burden. That’s why the policies we write are not only legally sound, but also tested in practice. If the secure method is not the easiest route, it is not sustainable policy.

3. Work pressure as a security underminer

The greatest enemy of a secure organization is excessive workload. When overwhelmed with information, we lose the ability to make sound judgments. Our mental battery gradually drains throughout the day, leaving no room to properly assess risks. In a culture where speed and results are the highest priority, attention to security quietly slips away. That’s why it is crucial for company culture to actively foster an environment in which employees have the time and space to make conscious, well-considered decisions.

Information security is only truly effective when it is a shared responsibility across the organization. If we demand full focus on content from employees, while designing an environment filled with constant distractions and high pressure, we are essentially asking the impossible. You can only stay sharp if the context allows you to be sharp.

An organization is therefore never “done” with a firewall or a training session. Real security emerges when everyone contributes to a safety net that catches mistakes. By putting the human factor at the center and collaborating to build a supportive environment, we create a practice that is genuinely resilient to everyday challenges.

Reflection: Do you dare to look in the mirror?

As an employer, you trust in the expertise and commitment of your team. But knowledge alone is never the complete solution. To become truly resilient, we must shift the focus from the employee to the organizational context. Ask yourself and your team these critical questions:

  • Does our process make it easy for employees to work securely? How can we make the safe route the easiest route? If the unsafe option is three steps shorter, people will eventually take it.

  • Do employees feel safe raising concerns when workload threatens their carefulness? Or is there a taboo around setting boundaries?

  • Are our security rules written in the language of the lawyer, the consultant, and the secretary? Or are they checkboxes primarily meant for the auditor?

  • How do we respond when something almost goes wrong? Do we immediately look for someone to blame, or do we examine which environmental factors, such as time pressure or unclear processes, triggered the error?

Information security is a journey, not a destination

Information security is not a project with a clear beginning and end. It is not an annual training to check off before returning to business as usual. It is an ongoing process that requires continuous attention to the human dimension. By accepting that mistakes are human, and often logical within the context in which we work, you can begin designing an environment that absorbs errors rather than merely punishing them.

Sustainable security arises when you approach people, processes, and technology as one interconnected system. In the next blog, my colleague Kim Keenswijk will explore the crucial role of organizational culture in greater depth. Because how do you ensure that a culture of security is truly embedded in your firm’s DNA, even when life suddenly throws unexpected challenges your way?

If you have any further questions regarding this matter, please do not hesitate to contact us.



¹ Verizon, Data Breach Investigations Report (DBIR), 2023.

² Stanford University, Human Error in Information Security, 2020.
