AI-Powered Voice Scams on the Rise as Consumers Lose $10 Billion to Fraud in 2023

Thieves Use AI Voice-Cloning Technology to Steal Funds

Thieves are increasingly using AI voice cloning to deceive people into sending money quickly. In 2023, consumers lost a staggering $10 billion to fraud, with imposter scams the most commonly reported type. Scammers typically ask victims to wire money or pay through transfer services such as Venmo or Cash App. To protect yourself, the BBB suggests being cautious about any unsolicited call requesting money: question the legitimacy of the request and secure your accounts before taking action.

Nicole Cordero of BBB Eastern Carolinas warns that scammers can sound like someone you know and create a sense of urgency to pressure you into sending money without thinking. With this technology, they mimic a familiar voice and its phrasing to make the ploy convincing. Paul Cerkez, a computer science professor at Coastal Carolina University who specializes in artificial intelligence, explains that scammers build a database from recorded snippets of a person's voice and can then have any text they type spoken in that cloned voice, making the call sound authentic.

AI voice cloning has made imposter scams easier to carry out, and they have become more prevalent in recent years. As the technology advances, it is increasingly difficult to tell a real voice from a fake one, so victims are more easily persuaded to send money under false pretenses. To stay protected, consumers should remain vigilant and question any unsolicited request for payment before acting.
