
Estimate the business impact of a fake executive

Model how a cloned CEO, CFO, board member, investor, or legal advisor could pressure employees into payments, disclosures, access changes, or confidential actions.

Executive impersonation attacks use authority, urgency, secrecy, and trust to move employees into unsafe decisions. AI voice and video tools make those attacks more believable by letting attackers imitate leaders in calls and meetings. Use this calculator to estimate your exposure across finance, HR, legal, IT, and operations workflows.

01

Organization profile

Podcasts, interviews, earnings calls, and conference video give attackers source material.

02

Impersonation targets

Select all that apply.

03

Sensitive actions

Select all that apply.

04

Communication surfaces

Risk is highest on phone, video, Teams, or Zoom — the live channels where AI voice and video impersonation works best.

05

Current controls

Employees must verify urgent executive requests out of band
Payment or access exceptions require dual approval
Executive requests are never approved based only on a live call
Staff are trained on AI voice and video impersonation
Employees can challenge executive requests without penalty
Sensitive requests use pre-agreed verification phrases
The company monitors live calls for AI voice or video risk
Unusual confidentiality demands are treated as risk signals

06

Business impact


Estimated annual exposure: High risk
$557K

Modeled range: $278K–$1.1M

Modeled attempts / yr
9.36
Expected successful incidents / yr
0.98
Estimated cost per incident
$569K
Top risk drivers
  • Live voice and video approval paths
  • Severe gaps in verification controls

Illustrative estimates based on public reporting, government data, and modeling assumptions. Not financial, legal, or security advice.

Detailed risk report

Request a Diopter risk test above and a team member will follow up with a detailed risk report tailored to your organization and schedule an authorized voice and video impersonation simulation against your executive workflows.

Methodology

How we model executive impersonation exposure.

We start from a company-size base attempt rate informed by public BEC, voice clone, and deepfake executive fraud reporting, then multiply by executive public exposure, urgency-exception frequency, the mix of high-authority targets that could be impersonated, and the live communication channels where those requests arrive.

A control-gap multiplier amplifies exposure when out-of-band verification, dual approval, training, and live call monitoring are inconsistent. Incident severity combines the largest payment an employee could approve under exception, the value of sensitive data exposure, and incident response cost — weighted by which actions an attacker could plausibly extract.

Results are returned as a conservative, expected, and high range — not a single number — because real exposure depends on attacker targeting, employee judgment under pressure, control consistency, and how quickly the organization detects and stops a live impersonation attempt.
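The structure of the model described above can be sketched in a few lines. Every rate, multiplier, and success probability below is a hypothetical placeholder chosen for illustration, not Diopter's actual calibration; the function simply shows how a base attempt rate, the exposure multipliers, a control-gap factor, and a per-incident cost combine into a conservative/expected/high range.

```python
def annual_exposure(base_attempts, exposure_mult, urgency_mult,
                    target_mult, channel_mult, control_gap_mult,
                    success_rate, cost_per_incident):
    """Return (conservative, expected, high) annual exposure in dollars."""
    # Attempts scale with executive public exposure, urgency-exception
    # frequency, the target mix, live channels, and control gaps.
    attempts = (base_attempts * exposure_mult * urgency_mult
                * target_mult * channel_mult * control_gap_mult)
    incidents = attempts * success_rate        # expected successes per year
    expected = incidents * cost_per_incident   # expected annual loss
    # Report a band rather than a point estimate; a simple 0.5x / 2x
    # spread stands in for the conservative and high scenarios here.
    return 0.5 * expected, expected, 2.0 * expected

# Hypothetical inputs, picked so the example lands near the figures
# shown on this page (~9.4 attempts/yr, ~$560K expected exposure).
low, mid, high = annual_exposure(
    base_attempts=4.0,        # base rate for this company size
    exposure_mult=1.3,        # executives highly visible in public media
    urgency_mult=1.2,         # frequent urgency-driven exceptions
    target_mult=1.0,
    channel_mult=1.25,        # live phone/video approval paths
    control_gap_mult=1.2,     # inconsistent verification controls
    success_rate=0.105,       # fraction of attempts that succeed
    cost_per_incident=569_000,
)
```

In practice each multiplier would be derived from the answers you give in steps 01 through 05, and the severity term would itself be a weighted blend of payment, data-exposure, and response costs rather than a single number.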

This calculator produces illustrative estimates based on public reporting, government data, security industry research, and modeling assumptions. It is for educational planning only and does not constitute financial, legal, insurance, or security advice.

FAQ

Executive impersonation, answered.

More calculators

See the AI Wire Fraud Calculator and the Deepfake Hiring Fraud Calculator, or browse all Diopter risk calculators.

Walkthrough · 30 min · NDA-safe

Walk an attack arc with Diopter.

In 30 minutes, we will replay a real deepfake incident, show the signals Diopter would score, and map the verdict your team could act on.