Celebrities at Highest Risk of AI ‘Voice Cloning,’ Experts Say: Tips to Avoid Falling for Vishing Scams

CELEBRITIES have the highest chance of falling victim to identity theft attempts, thanks to the growing popularity of AI voice cloning software.

However, a lack of star power is not enough to keep you safe.

Cybercriminals find audio clips online and feed them into commercially available software to produce words and even full sentences in someone’s voice.

This procedure is known as voice cloning, and the result is called deepfake audio.

The term “deepfake” was coined in 2017 to describe explicit photographs and videos in which celebrities’ faces appeared superimposed on other bodies.

And it turns out that those in the spotlight are once again at risk.

A new survey from Podcastle, an AI-powered podcasting platform, asked 1,000 Americans which celebrities they believe are most at risk of voice cloning.

Respondents believed Arnold Schwarzenegger faced the greatest danger, deeming his the “easiest voice to replicate.”

Some 86% of respondents felt that the former California governor’s “distinctive and immediately recognizable accent” puts him at risk.

Schwarzenegger was followed by Donald Trump, Kim Kardashian, Sylvester Stallone, and Christopher Walken.

Nearly one in four (23%) said Kardashian has a “consistent tone,” making her voice easy to replicate.

Meanwhile, 39% said Trump’s voice is easy to mimic because of the public’s familiarity with it from his regular media appearances.

Gen Z respondents considered Trump to be at the highest risk, with their views likely shaped by the unprecedented political turmoil in the media landscape.

Celebrities and politicians have become the most common victims of deepfakes on social media.

A wave of doctored photos on X, formerly Twitter, led the platform to ban searches for Taylor Swift’s name in January.

And last week, Elon Musk posted a fake video of presumptive Democratic presidential nominee Kamala Harris on X.

Deepfakes are not a new phenomenon. The U.S. Department of Homeland Security mentioned them in a 2019 report, saying the threat comes not from the technology itself “but from people’s natural inclination to believe what they see.”

As a result, the report continues, “deepfakes and synthetic media do not need to be advanced or believable to be effective in spreading disinformation.”

While respondents in the survey were asked to weigh in on the potential misuse of AI voice cloning technology, corporate executives have also expressed apprehension.

Podcastle CEO Artavazd Yeritsyan told The U.S. Sun that he is well aware of the use of AI voice cloning technology by malicious actors.

“No matter what technology you introduce, there will be people who will use it for bad things and others who will use it for good things,” Yeritsyan said.

Users can record and edit audio without leaving the Podcastle platform. This includes using AI to generate words or phrases they haven’t recorded.

Yeritsyan says the platform’s goal is to “automate” the production process rather than “replace a human being.”

The platform has also put safeguards in place to prevent users from creating audio deepfakes.

A user must record a set of spoken phrases to verify that a genuine person is speaking, rather than a cybercriminal feeding clips of someone else’s voice into the system.

“That content is then stored and encrypted securely so that no one else can hear your voice,” Yeritsyan explained.

While they are positive about potential long-term applications such as text-to-speech accessibility features, Podcastle’s representatives are clearly aware of the risks.

“I think the biggest threats are similar to phishing, where a criminal requests your bank account details using the voice of a family member or friend,” Yeritsyan said, describing a phenomenon known as voice phishing.

All a cybercriminal needs is a few seconds of audio (often found on social media) to create a deepfake, which is then weaponized to trick unsuspecting victims into divulging their personal information over the phone.

Cybersecurity experts refer to this phenomenon as “voice phishing” or “vishing.”

An effective defense against this emerging form of cyberattack starts with recognizing the signs of a scam.

Criminals often ask their victims to act urgently to dispute fraudulent charges or verify personal information. Any high-pressure tactic should set off warning bells.

You should exercise caution, as caller ID alone is not enough to verify a caller’s identity.

Security experts suggest hanging up and calling the organization or person directly if you receive a call you suspect is fraudulent.

As a general rule, never give out sensitive information such as passwords, credit card numbers, or bank account details over the phone.

Here, Mackenzie Tatananni, science and technology reporter for The U.S. Sun, explains how a scammer can get your information.

Scammers typically obtain phone numbers through data breaches, which occur when a hacker gains access to a database of personal information maintained by companies such as service providers and employers.

This data may then be shared and disseminated online, including on the dark web, where forums are dedicated to sharing leaked information.

Another common tactic, called wardialing, uses an automated system that targets specific area codes.

A recorded message will prompt the listener to enter information, such as a card number and PIN.

There’s also a more troubling possibility: your phone number may be listed online without your knowledge.

Data brokers are eager to buy and sell your data. These companies gather information from public sources online, including social media and public records.

Their main objective is to build profiles of individuals and use this data for tailored advertising and marketing.

Much of this data ends up on public records sites, which display information such as your phone number, email address, home address, and date of birth for anyone to see.

In the United States, those sites are legally required to delete your information upon your request.

Locate your profile and follow the instructions to opt out, but be warned: these sites do not make it easy and are designed to discourage you from completing the removal process.

For simplicity’s sake, you can also use a service that scrubs your data from the internet.

Norton offers one such service. The tool, called Privacy Monitor Assistant, finds your data online and requests its removal on your behalf.

It is also possible that your phone number is linked to a social media account and publicly displayed on your profile; this happens quite often on Facebook.

Review your privacy settings and make sure this information is hidden from prying eyes.

Podcastle representatives expect voice cloning technology to be used to boost productivity and automate tedious processes.

However, they know that much of the task of fighting bad actors falls on them.

“We need to get to the point where we just don’t give people a chance to use it for the wrong reasons,” Yeritsyan explained.

“I think most products should be regulated so that this type of thing does not happen.”
