Deepfakes are images, videos, or audio recordings that have been digitally altered or manipulated, whether with conventional editing tools or artificial intelligence (AI), to impersonate a real person doing or saying something that was never actually done or said. Financial scammers use deepfakes to trick people into believing they are someone else, with the intention of obtaining their personally identifiable information (Social Security number, passcodes, etc.) or enticing them to send funds (digital transfers, wires, gift cards, etc.) to "fix" a made-up problem, "help" someone who's supposedly in trouble, or "invest" in a fake opportunity.

Potential targets of this growing, increasingly sophisticated threat may be approached directly (by email, text, or phone or video call) or indirectly, via social media or the internet. Common scams that manipulate victims into revealing private financial data include:

  • Executive or celebrity impersonation occurs when deepfaked senior executives request urgent wire transfers, or when images of well-known celebrities are used to lend credibility to fraudulent promotions.
  • Social engineering scams use videos or cloned voices of close relatives (often a grandchild) to make believable, urgent monetary requests for bail money, investments, or other emergencies.
  • Customer service exploits often involve fake video chats from scammers posing as bank personnel. Even customers who would otherwise distrust unsolicited outreach tend to believe a caller is a legitimate bank representative when the scammer has convincingly replicated a voice and face.

To help spot deepfake videos, look for clues that are:

Visual: unnatural eye movements, blinking, or facial expressions; frozen, glitchy, or blurry edges around the eyes, mouth, or face; lip-sync mismatches; unnatural lighting and shadows

Auditory: robotic or monotone voices that lack the usual inflection and emotion of a real person speaking, along with awkward, unnatural pauses

Behavioral: urgent or emotional asks (e.g., "send money now!"), refusal to switch to a phone call or meet in person

Actions you can take to help protect yourself

Be vigilant and exercise an abundance of caution to keep criminals at bay by:

  • Not trusting videos from unknown senders, especially those requesting money, login credentials, credit card or bank account information, or remote access to your phone or computer. Even if the person looks and sounds familiar, it could be an interactive, realistic deepfake created by cybercriminals using AI. Call the person on a trusted phone number to verify before taking action. Pause and confirm every request, every time.
  • Creating a safe word, phrase, or answer to a question for family members. If a caller doesn't know it, treat the call as a scam.
  • Asking the caller to perform a specific action (e.g., hold up today's newspaper, blink three times, or hum a tune). If they avoid it or freeze, end the call immediately. When in doubt, hang up and disengage.
  • Never sharing information or sending money in response to a single video or audio call, especially one from an unknown number. Verify the person's identity by hanging up and calling them back on a trusted number. [Note: M&T Bank employees will never call you and ask you over video to reveal your unique PIN code or display your debit/credit card.]
  • Being wary of uploading personal videos or voice recordings online, as these can be used to train deepfake models.

If you suspect deepfake fraud, escalate and report it to your local police department and to consumer protection agencies, such as the Federal Trade Commission (FTC) and the FBI's Internet Crime Complaint Center (IC3).

We're here to help. If you believe you have been a victim of fraud related to your M&T accounts, notify us immediately at 1-800-724-2440 (24 hours a day, 7 days a week).

Stay informed. Stay cautious. Stay safe.

For more information, go to www.mtb.com/scams.