AI Voice Cloning Fuels Holiday Voice Phishing Scams

Finance | Seoul Economic Daily (South Korea)
By Shin Joong-seop

Last week, three days before the Lunar New Year holiday, a woman in her 70s living in Busan received a phone call. The voice on the other end was unmistakably her son's. He said he urgently needed to send money before the holiday but could not access his bank account because of phone trouble, and asked her to transfer the funds on his behalf. He then pressed her to hurry, saying he was too busy with work to stay on the line. Alarmed, the woman began writing down the account number but sensed something was wrong. She hung up and called her son back. The voice she had just heard was not her son's at all; it had been synthesized using artificial intelligence.

According to data submitted by the Financial Supervisory Service to Rep. Lee Yang-soo of the People Power Party on the National Assembly's Political Affairs Committee on Feb. 15, a total of 44,883 voice phishing cases occurred during January-February and September-October between 2020 and 2025, the periods surrounding the Lunar New Year and Chuseok holidays. Total damages reached 465 billion won ($323 million). Average damage per case rose 2.3-fold, from 9.4 million won in January-February 2020 to 21.5 million won. Impersonation of institutions was the most common type of fraud.

Voice phishing has become increasingly sophisticated through the exploitation of AI technology. Advances in AI now enable criminals to replicate a specific person's voice, speech patterns, and intonation from just a short audio sample. This has led to a rise in crimes that manipulate the voices of children or family members to apply psychological pressure. Even a simple "hello" in response to a call from an unknown number can provide enough audio for voice cloning. In 2021, a bank in the United Arab Emirates transferred $35 million after receiving a call from someone using deepfake voice technology to impersonate a corporate executive.

With voice phishing crimes expected to surge around the Lunar New Year holiday, financial authorities have distributed "Ten Commandments for Preventing Voice Phishing Damage" and urged vigilance. Authorities emphasized that the following are all scams, and that such calls should be ended immediately: calls from people impersonating investigators who mention identity theft or arrest warrants, contacts from those posing as family or acquaintances requesting money, and demands for advance payments or transfers to others' accounts under the pretext of arranging loans.

Authorities also warned against installing malicious apps or clicking links from unverified sources, advising people to report suspicious activity to the police (112) or the Financial Supervisory Service (1332) if needed.

So-called "card delivery scams," where fraudsters claim an unrequested credit card has been shipped and direct victims to call a specific number, have also been increasing recently. Authorities advised hanging up and verifying through the "View All My Cards" service or official card company customer centers.

Using security blocking services is another protective measure. Financial authorities operate security blocking services for credit transactions, non-face-to-face account openings, and open banking. These services can be applied for in person at a financial institution branch, or through the Korea Financial Telecommunications & Clearings Institute's Account Info app or mobile banking applications.


AI-translated from Korean. Quotes from foreign sources are based on Korean-language reports and may not reflect exact original wording.
