Now scammers are making people their victims through AI voice cloning; disconnect such calls immediately, otherwise they can cause real harm.

Technical News Office – Artificial intelligence (AI)-based tools have made many tasks easier, but their misuse has also begun. Every day, new scams come to light in which users are deceived using AI. Bharti Airtel Chairman Sunil Mittal recently said that scammers imitate the voices of others with the help of AI and ask for money to be transferred. Here is why this new type of scam is dangerous and how to avoid it.

What is AI voice cloning?
AI voice cloning is a technology that creates an exact copy of a person's voice. A short sample of the person's voice is used as the audio clip, and after training, the AI model can speak in that same voice. The cloned voice sounds so real that it is nearly impossible to distinguish it from the original. Such tools were designed to provide a personalized experience in areas such as text-to-speech. In creative work especially, voice cloning offers many ways to use your own voice without recording it over and over at the microphone. However, scammers use it with bad intentions.

How to clone someone’s voice?
With the help of advanced artificial intelligence (AI) based tools, cloning someone's voice has become very easy. A quick Google search turns up many tools and websites for cloning a person's voice. Apart from a few free tools, some platforms create high-quality voice clones for as little as $5 (around Rs 420). All you have to do is upload a roughly 30-second clip of someone's voice, and the AI model produces a copy of it in no time; any text you type can then be read aloud in that voice. In other words, anyone can easily use someone else's voice.

How Do AI Voice Scams Work?
When you speak on a smartphone, the other person recognizes you by your voice. If you hear the voice of someone you know on the phone, there is no reason to doubt them. Scammers exploit this: they may pretend to be your friend, a relative or a police officer and try various tricks. For example, if you receive a call in a friend's voice asking you to transfer money, you probably won't think twice before sending it. Scammers lure, pressure and threaten, for instance by mentioning a sudden accident, hoping that one trick or another will work.

How to avoid AI cloning scams?
You should always be careful with matters such as transferring money. Keeping the points below in mind will help you avoid such scams.

1. Always pay attention to the phone number
Even if scammers clone someone's voice, they will be calling from their own phone number. So if you receive a call from an unknown number, or from a number with a foreign country code, be vigilant. Before acting, ask yourself why this acquaintance would be calling from a new number.
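The country-code check described above can be sketched in a few lines of code. This is only an illustrative example: the numbers, the expected-code list and the function name are made up for the sketch, and a real phone app would use a proper number-parsing library instead of simple string prefixes.

```python
# Illustrative sketch: flag incoming numbers whose country code differs
# from the codes you normally expect. All numbers here are examples.

EXPECTED_COUNTRY_CODES = {"+91"}  # e.g. India; adjust for your region


def is_suspicious(number: str) -> bool:
    """Return True if the number does not start with an expected country code."""
    number = number.strip().replace(" ", "")
    return not any(number.startswith(code) for code in EXPECTED_COUNTRY_CODES)


print(is_suspicious("+91 98765 43210"))  # expected code -> False
print(is_suspicious("+44 7700 900123"))  # foreign code  -> True
```

Of course, a suspicious prefix alone proves nothing; it is only a prompt to verify the caller through another channel before sending money.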

2. Verify the caller's identity before sending money
If money is involved, verify the caller's identity without haste. Call the acquaintance yourself on the number you already have for them, or message them on WhatsApp. You can also confirm with a family member or friend whether they actually need the money. If possible, offer to hand over the money in cash rather than sending it online.

3. Be alert if the voice sounds fake
No matter how well AI copies a voice, its tone and manner of speaking differ slightly from a real human's. If anything feels strange during the call, or the voice seems off, disconnect the call immediately. You already know how the person normally sounds and speaks, so trust that familiarity.

4. When in doubt, force a video call
If you suspect something, ask the caller to switch to a video call. Video calls are easy these days thanks to widespread internet access and the many available platforms. Make it clear that you want to talk over video. A scammer who has only cloned a voice cannot keep up the pretence on a video call. Stay aware of such scams and warn your friends and relatives too, so that they can avoid these dangers.
