AI Deepfake Scams Target Working-Age Adults in Thailand

Jeff Tomas - Freelance Journalist

BANGKOK – As more of daily life moves online, scammers keep getting smarter with AI deepfake software. The Anti-Online Scam Operation Centre (AOC 1441) is warning the public about rising online fraud risks in 2026, with working-age adults now one of the main targets.

Fraud groups are mixing AI tools with old scam tactics. That combo helps them create convincing tricks, move money quickly, and leave very little behind.

Chaichanok Chidchob, the caretaker Minister of Digital Economy and Society (DE), told The Nation that stopping online crime remains a top priority for the ministry. The goal is simple: help people spot new scam patterns early, reduce financial damage, and cut the number of victims.

Scams today go far beyond basic calls and texts. AI can produce fake content that looks real in seconds. If people don’t recognize the signs, losses can happen quickly, and recovery is often difficult, Chaichanok said.

Working-Age Group Losses Exceed 23 Billion Baht

AOC 1441 data from 2025 points to a clear pattern. People ages 20 to 49, a major part of the workforce and a group with strong buying power, were hit most often. There were 405,929 reported cases in this age range, with total losses reaching 23.40 billion baht.

Many cases involved fake online products or services, fake side-income offers, and fake investment platforms. Scammers often use mule accounts and layered transfers to make the money harder to track and recover.

Four High-Risk Deepfake Scams to Watch in 2026

AOC 1441 expects these four scams to keep spreading through 2026.

SMS and Line link scams

Scammers pose as government offices or private companies. They may claim you have an unpaid bill or promise a refund, then send a link. Clicking it can install harmful apps or send you to a fake website built to steal personal details and empty your bank account.

A key reminder: government agencies and financial institutions don’t send payment links through SMS or social media.

AI deepfake scams that copy faces and voices

AI can now copy a familiar face or voice with scary accuracy. Criminals may pretend to be a friend, a family member, or even an official. They might call or video chat, ask for money, or threaten you with a made-up legal issue to push you into sending funds.

These scams depend on trust, and trust alone isn’t enough protection anymore.

Fake QR codes in public places

Some scammers swap real QR codes at restaurants, shops, or gas stations with fake ones. Others send QR codes through phishing emails disguised as promos. When scanned, the code can send you to a fake page or app that steals your login details or triggers an unauthorized transfer.

Check where the QR code comes from, and stop if anything looks off before entering sensitive information.

Online investment scams with unrealistic returns

Fraud tied to crypto and online investing continues to grow. Scammers build fake profiles, act like experts or well-known investors, and then push people to transfer money. Stolen funds often move through mule accounts or are converted into digital assets to hide the trail.

Protect Yourself With the “4 No’s”

AOC 1441 says awareness is your best defense. Follow the “4 No’s”:

  • Don’t click on links.
  • Don’t trust offers that sound too good to be true.
  • Don’t rush decisions.
  • Don’t transfer money without checking first.

Before you pay or share information, confirm the request through official channels.

Related News:

Meta Steps Up Fight Against Scams and Deepfakes in the Asia Pacific

Deepfakes Explained: How the AI Tech Works and Why It’s a Threat to Trust

Jeff Tomas is an award-winning journalist known for his sharp insights and no-nonsense reporting style. Over the years, he has worked for Reuters and the Canadian Press, covering everything from political scandals to human-interest stories. He brings a clear and direct approach to his work.