Seeky

How deepfakes turn employees into a weakness

Date of issue: 3 October 2025

The morning that turned into a crisis

Petra, the management assistant, came into the office with the usual to-do list. Prepare contracts, arrange payment documents, go through emails. Nothing out of the ordinary.

When a new email from the CEO appeared on her monitor, she didn’t pause. The address was correct and the company signature was in place. The text was brief and urgent:
“Petra, I’m at a conference and I don’t have access to the system. I need you to make an immediate deposit to the vendor. I am sending the account number in the attachment. It is urgent.”

Petra didn’t hesitate. She knew that situations like this sometimes happened. She clicked on the attachment, checked the details and prepared the payment.

Shortly thereafter, her phone rang. The director’s number was on the display. The voice she heard was unmistakable. The same intonation, the same turns of phrase, even the same way he pronounced the contractor’s name. “Petra, please, let’s not let this stall. We need that payment today.”

A familiar voice. Authority. Urgency. Petra completed the payment.

Reality: the perfect illusion

None of these contacts were real. The attackers combined:

  • Spear-phishing: precisely targeted email based on knowledge of company processes.
  • Deepfake voice: generated from publicly available recordings of the director.
  • Social engineering: a combination of authority and time pressure that eliminated room for doubt.

The payment went to the attackers’ account, where it vanished within a few hours. The company firewall, EDR system and multi-factor authentication never came into play. The defense was bypassed through a human.

Why technology is not enough

At first glance, it seems incomprehensible that a company with advanced security measures would fall for such a simple trick. But that’s exactly what makes deepfake attacks so dangerous.

  • Technological barriers are not the target of the attack. Attackers don’t need malware if they can convince a human.
  • Petra didn’t break any rules. On the contrary, she acted in the interests of the management.
  • Processes were not set up for crisis situations. Double verification was not required for exceptional payments.

Deepfakes use psychology: time pressure, trust in authority and the natural inclination to help. These factors can disable critical thinking in even experienced employees.

How does deepfake work in practice?

The technology for generating synthetic voice and video has advanced so much that a few minutes of recording is sufficient for basic imitation. In practice, this means that anyone who has ever spoken at a conference or given an interview can be exploited as the “voice of the attacker”.

  • Voice cloning: neural networks train a model of the voice from available recordings. The result is so convincing that it is virtually indistinguishable to the human ear.
  • Synthetic video: more advanced attacks also use video chat, where the face and facial expressions are copied in real time.
  • The OSINT connection: attackers analyze public information about the company – who has what authority, who signs contracts, who has access to accounts.

The result is an attack that is not technically complex but psychologically devastating.

Questions every company must ask itself

  1. Is it possible to stop a deepfake attack with technology alone?
    No. While tools exist to detect deepfake voices and video, reliability is still limited. Processes and human habits are critical.
  2. Who is the real first line of defense?
    Not company firewalls, but employees. They need to be given clear instructions on how to respond to suspicious requests – and, most importantly, the support of being able to say “stop” to the CEO.

How to defend yourself: strategies for modern companies

Multi-channel verification
No emergency payment should be made without double confirmation – ideally by two independent persons and through two different channels.
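The multi-channel rule above can be sketched as a simple policy check. This is a minimal illustration of the idea, not a real payment system; the threshold, the channel names and the `Approval` type are all hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Approval:
    approver: str   # person who confirmed the request
    channel: str    # e.g. "email", "phone_callback", "in_person"

def payment_allowed(amount: float, approvals: list[Approval],
                    threshold: float = 10_000.0) -> bool:
    """Release a payment above the threshold only if it was confirmed
    by at least two independent people over two different channels."""
    if amount <= threshold:
        return True
    approvers = {a.approver for a in approvals}
    channels = {a.channel for a in approvals}
    return len(approvers) >= 2 and len(channels) >= 2
```

Under this policy, Petra’s payment would have been blocked: an email “from the CEO” plus a call that arrived *to* her counts as one person on one initiated channel, whereas a callback she places herself to a known number, confirmed by a second employee, would satisfy both conditions.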

Training based on real-life scenarios
Phishing training is no longer enough. Employees must experience simulated attacks involving deepfake calls and video conferencing. This is the only way to develop a “stop and check” reflex.

Security culture
If an employee feels that questioning a management request is a sign of disloyalty, the company has lost before the attack even begins. The culture must state clearly: “Security comes before obedience.”

Zero Trust mindset
The principle of “never trust, always verify” no longer applies only to network traffic. It must also apply to voice and video. Every request must be validated.

What this means for management

Petra’s story is not about the failure of an individual. It is an example that even a company with strong technology can fail if it underestimates the human factor and process security.

Deepfake attacks exemplify a new era of cyber threats – an era where it’s no longer just about code and viruses, but about manipulating emotions, trust and time.

Are you interested in this topic? Jan Marek from Cyber Rangers will guide you through practical examples at the Cybersecurity Summit!
