Artificial intelligence is all the rage these days, not only for boosting productivity at work and finding answers to questions and solutions to problems, but increasingly also for creating fake photos (remember the picture of the pope in a Balenciaga puffer jacket?) and videos, and even for recreating the voices of celebrities and long-deceased people to make them say things they would probably never knowingly have uttered in a lifetime: so-called deepfakes.
In this respect, consider a series of recent scams in Asia, in which, for example, employees were duped into making large financial transfers, supposedly on the instructions of a member of their firm's executive team. Earlier this year, a Hong Kong-based finance worker for Arup, a British engineering firm, logged in to what he thought was a routine team meeting. On the screen he saw several colleagues, including the firm's chief financial officer, who instructed him to transfer $26m to five different bank accounts. He complied. But the man on the call was not Arup's CFO: it was a deepfake.
It was one of the costliest deepfake scams reported globally. Such scams are increasingly common. Deepfake technology, which manipulates images and video using AI, has become increasingly realistic. It is rapidly being adopted by transnational criminals mostly based in South-East Asia, now the epicentre of online scams targeting people around the world. Victims in East and South-East Asia lost up to $37bn from online scams of all sorts in 2023, according to a new report from the United Nations Office on Drugs and Crime.

Technologies such as generative AI and machine learning are making scams even more effective. Meanwhile, the rise of cryptocurrencies and the spread of social media have decentralised and democratised transnational crime. Criminals now use social media platforms such as Facebook and WhatsApp to conduct their nefarious activities. Scams have surpassed more traditional crimes, such as burglary, to become the most common felony in Singapore. Several deepfake videos have been released of prominent Singaporeans, including the current and former prime ministers, promoting investment scams. We have even seen some of these in Europe, where celebrities such as Roger Federer and popular TV presenters 'promote' dubious investment opportunities in pop-up ads on the internet.
Increasingly sophisticated malware is another challenge. In September last year at least 43 Singaporean victims lost almost $1m to malware-enabled scams on social media, according to Singapore’s police force. When one woman inquired about a Facebook post advertising a one-day trip to a durian fruit farm in Malaysia, the seller contacted her over WhatsApp with instructions to download a smartphone app to browse tour offers. This app infected her Android phone with malware that enabled scammers to steal more than $80,000 from her online bank accounts.
China and Singapore have now started to raise awareness of such scams and have taken steps to make it slower and harder for people to make instant, online payments of large sums (for once, I suppose, it is definitely better to be slower).
It can only be a matter of time before these scams become widespread in Europe and the US. And with the use of social media and online meetings pervasive, the potential pickings for the criminals are no doubt huge. It might be a good idea for the authorities in the west to learn from their Asian counterparts – and maybe for once, and no doubt for a brief period only, to be ahead of the scammers.
And as far as we users are concerned, remember the saying: if something looks too good to be true, it probably is.
These unscrupulous souls [scammers and con men] have no social conscience. Right? They cannot be embarrassed or otherwise deterred by the threat of prison. If the punishment were adjusted to be death, which at present most would agree is out of the question, that might still not be a sufficient deterrent. Perhaps what we Westerners consider barbaric forms of punishment [cutting off hands or unpardonable life at hard labor, for example] might be a deterrent. Whatever the case, it seems that until the punishment for such crimes is a deterrent, this will be a problem. A career computist myself, I am embarrassed that first computers, and now AI and its manifestations, brought this about [to say nothing of other computer-instigated ills: suicide, depression, bullying, for example]. Can anyone suggest a solution?
Don't be too hard on yourself; there are all the positive aspects of computers and the internet as well, which weigh much more than the scammers. I suppose, as with everything, there are two sides to this coin…