


Cybercriminals have taken up newly forged artificial intelligence (AI) voice cloning tools and created a new breed of scam. With a small sample of audio, they can clone the voice of nearly anyone and send bogus messages by voicemail or voice messaging texts. The aim, most often, is to trick people out of hundreds, if not thousands, of dollars.

Our recent global study found that out of 7,000 people surveyed, one in four said that they had experienced an AI voice cloning scam or knew someone who had. Further, our research team at McAfee Labs discovered just how easily cybercriminals can pull off these scams.

Cybercriminals create the kind of messages you might expect. They will use the cloning tool to impersonate a victim’s friend or family member with a voice message that says they’ve been in a car accident, or maybe that they’ve been robbed or injured. Either way, the bogus message often says they need money right away. With a small sample of a person’s voice and a script cooked up by a cybercriminal, these voice clone messages sound convincing. In fact, 70% of people in our worldwide survey said they weren’t confident they could tell the difference between a cloned voice and the real thing.

In all, the approach has proven quite effective so far. One in ten people surveyed in our study said they received a message from an AI voice clone, and 77% of those victims said they lost money as a result. Of the people who reported losing money, 36% said they lost between $500 and $3,000, while 7% got taken for sums anywhere between $5,000 and $15,000.

Cybercriminals have no difficulty sourcing original voice files to create their clones.
