AI is yet again wreaking havoc on the internet, and this time it's being used by fraudsters to scam vulnerable people on dating websites and apps. While romance scams and catfishing are nothing new, this AI technology is becoming dangerously hard to spot – leaving even the most tech-savvy of us at risk of being fooled.
Can we trust anything any more? Police also believe that super-convincing AI voice-cloning technology is on the rise. In one instance it was used by scammers to commit identity theft and trick seniors out of thousands of dollars by making them believe that their actual grandchildren had called them for help!
The internet used to be a dark, scary, yet still kind-of-safe space for us all to communicate and create content online; now it's become straight-up horrifying. Everything we own has the potential to be hacked – especially if you're a Ring doorbell user – which is a scary thought in the digital age.
Shouting that the sky is falling and the robots are taking over might have seemed ridiculous two years ago, but now even the likes of Elon Musk and a thousand other experts in their respective fields are calling for a pause on AI development before it gets out of control.
As reported by PetaPixel, a romance scammer recently made off with a staggering $430,000 (approximately £349,000 / AU$643,000) from a victim who met the fraudster via online dating. The scammer used AI images as well as deepfaked video technology to convince her of a legitimate relationship and eventual proposal, even generating an AI image of a man holding a "Will you marry me?" sign that, of course, wasn't real.
This encounter led to the victim transferring a hefty sum of money to the fraudster and dipping into her pension pot early, after learning that her online love interest had supposedly been "held hostage and tortured" by those he was indebted to.
The concerning thing is that this isn't a rare or isolated case, and the Daily Mail has even put together a list of seven ways to determine whether your online fling is actually an artificially intelligent chatbot attempting to scam you.
Recent reports from the Washington Post have also highlighted the dangers of realistic AI voice-cloning scams, after police in Newfoundland discovered that the technology had been used to take money from an elderly couple who believed their grandson was calling because he needed bail, as well as in a similar case involving a supposed car accident.
The Federal Trade Commission has also issued a warning about these cheap online tools, which can create audio files replicating someone's voice in a matter of minutes.
AI-generated text, images and now audio make for a pretty worrying combination, and it's even more alarming that all it takes is a 30-second (if that) clip of you speaking on TikTok or elsewhere on social media for a scammer to impersonate your voice.