Special Reports
A special report is content that is edited and produced by the special reports unit within The Irish Times Content Studio. It is supported by advertisers who may contribute to the report but do not have editorial control.

Beware the new kids on the financial crime block

AI enables sophisticated scams by people with little or no technical expertise, but it can be a source of solutions too

Fraudsters typically reach out to people via a website or app to try to lure them into building a rapport. Photograph: iStock

Financial criminals are highly innovative and are constantly coming up with new and more ingenious schemes to part us from our hard-earned cash. One of the worst, and worst-sounding, is “pig butchering”.

“Here what the fraudster seeks to do is reach out to you via a website or app and lure you into building a rapport,” explains Justin Moran, head of governance and security at mobile phone company Three.

Justin Moran, head of governance and security, Three Ireland

Whether through fostering an online romance or offering investment opportunities, “they prey on people’s weaknesses and build a level of trust to get you to transfer money”, he says.

Artificial intelligence (AI) is significantly reshaping such crimes, fuelled by the rise of the dastardly “fraud as a service”, or FaaS, sector.

“AI-driven ‘fraud as a service’ platforms enable individuals without technical skills to conduct highly advanced cyberattacks,” explains Colm O’Flaherty, director, financial crime, at Deloitte Ireland.

Colm O'Flaherty, director, financial crime, Deloitte

“Criminals use AI to automate phishing campaigns, create realistic fake profiles and generate convincing deepfake videos and voices, deceiving victims into fraudulent transactions or investments.”

The scale and accessibility of fraud as a service make it all the more dangerous, points out Alessia Paccagnini, associate professor at the University College Dublin School of Business.

Alessia Paccagnini, associate professor at UCD School of Business

“FaaS describes how skilled hackers rent or sell their knowledge, software, and infrastructure – typically on the dark web, like mercenaries – as part of the commercialisation of cybercrime tools and services,” she says.

“This business model enables sophisticated fraud operations to be carried out by people with little to no technical expertise. These services can range from malware, phoney websites, and phishing kits to credit card data theft, money laundering, and even customer service for cybercriminals.”

But while AI-driven, FaaS-backed scams such as “pig butchering” enable fraudsters to use realistic fake profiles to build trust with victims online, tricking them into investing large sums in bogus cryptocurrency schemes, AI can help provide the solution too.

“Businesses can protect themselves by adopting a proactive cybersecurity strategy incorporating AI-driven threat detection systems. Leveraging AI-powered analytics can help companies rapidly identify suspicious patterns and respond swiftly to threats,” says O’Flaherty.

“Continuous employee training focused on recognising AI-enabled scams is also essential. In addition, organisations should enhance authentication procedures and consistently verify transactions through secure, multi-step verification processes. Staying informed about evolving threats and collaborating closely with cybersecurity experts can significantly strengthen a company’s defences against these increasingly sophisticated AI-driven financial crimes.”

Sandra O'Connell

Sandra O'Connell is a contributor to The Irish Times