Company out $35M after scammers stage video call with deepfake CFO, coworkers

A multinational firm in Hong Kong is out 200 million Hong Kong dollars (around $34.5 million Canadian) after a financial worker at the company was targeted by scammers using artificial intelligence, culminating in a phony video conference call with numerous deepfake colleagues.

Police discussed details of the incident at a press conference, without naming the company or the worker involved, to warn the public about the novel scam.

Acting senior superintendent Baron Chan of the Hong Kong Police Force’s Cyber Security and Technology Crime Bureau said the scam began last month when the worker received an email, purportedly from the company’s U.K.-based chief financial officer (CFO).

The email concerned a “secret transaction” that needed to be carried out, according to the South China Morning Post. The employee had an early “moment of doubt,” as the email appeared to be a phishing scam, but they were ultimately fooled after the fake CFO invited them to a video conference call.

The call appeared to include numerous other coworkers the employee recognized, and they even sounded like the real thing. But they weren’t the person’s coworkers; they weren’t people at all. They were deepfakes, a form of synthetic media created with machine learning that can mimic a person’s appearance and speech. A digitally recreated CFO and a few outsiders were also present on the conference call.


“Because the people in the video conference looked like the real people, the (employee)… made 15 transactions as instructed to five local bank accounts, which came to a total of HK$200 million,” Chan said during the press conference, broadcast by Radio Television Hong Kong.

“I believe the fraudster downloaded videos in advance and then used artificial intelligence to add fake voices to use in the video conference,” Chan added.

Experts have warned that AI voice and video generation is becoming easier to access as the technology improves. Creating a believable cloned voice once required extensive recordings, but it now takes only seconds of recorded speech. Any public video can feasibly be used to train an AI model to mimic a person’s voice and appearance.

“We want to alert the public to these new deception tactics. In the past, we would assume these scams would only involve two people in one-on-one situations, but we can see from this case that fraudsters are able to use AI technology in online meetings, so people must be vigilant even in meetings with lots of participants,” Chan said.

The employee who was scammed didn’t realize the mistake until a week later when they checked in with the company’s headquarters. During that time, the scammers stayed in contact with the victim through instant messaging, emails and one-on-one video calls.

Police also revealed that a handful of other workers at the firm were contacted by the scammers, though details of those interactions were not released.

Chan said that during the fake video call, the scammers had the employee introduce themselves, but there was never any direct conversation between the deepfake coworkers and the victim. The AI-generated colleagues mainly gave orders before the brief meeting ended, the South China Morning Post reports.

To avoid such AI scams, Hong Kong police advise employees to confirm the details of business dealings through regular communication channels and to be suspicious as soon as money is involved.

Police are still investigating the incident and, so far, no arrests have been made.

© 2024 Global News, a division of Corus Entertainment Inc.
