(PatriotPostNews.com) — Many scammers are committing the same crimes as they always have, but now AI does the heavy lifting.
Scammers can now clone voices, including the voices of friends and family. In April 2024, a high school athletic director in Maryland faced criminal charges after he used AI to clone the voice of his boss, Pikesville High School Principal Eric Eiswert, and create a fake recording in which Eiswert appeared to make racist comments. The athletic director, Dazhon Darien, was arrested and charged with disruption of school operations, theft, and stalking.
AI-cloned voices can be nearly indistinguishable from the real thing. The only practical way to confirm that a voice call is authentic is to hang up and contact the supposed caller yourself through a number you already know.
Video messages can also be faked by AI. Runway's vivid five-second clips have gone viral on social media, and many of them show people in scenes nearly indistinguishable from reality. For the moment, AI videos still have some obvious glitches.
AI can also collect a person's data from across the web and use it to make scams believable. AI-written phishing emails are among the hardest scams to detect. The best defense is to stay vigilant about which emails and attachments you open and, once again, to verify with the sender directly through a channel you trust.
AI can render convincing images of you, your friends, your coworkers, and your family. Scammers can place you in any scene, including explicit adult scenarios, and these uncannily realistic images can be used for blackmail.
AI is still in its infancy, yet it can already imitate nearly every aspect of a person. It can call your bank with your voice and your personal information. AI is a scammer's dream because many AI-assisted crimes are extremely difficult to detect. Short of confirming with someone directly in real life, there is virtually no reliable way to tell AI from the real thing.
Copyright 2024, PatriotPostNews.com