Scammers are now making people their victims through AI voice cloning; disconnect such calls immediately, or they can cause you harm.
Tech News Desk – Artificial Intelligence (AI) tools have made many tasks easier, but their misuse has also begun. Every day new scams come to light in which users are cheated with the help of AI. Bharti Airtel Chairman Sunil Mittal also recently said that scammers are using AI to speak in other people's voices and ask for money transfers. Here is why this new type of scam is dangerous and how to avoid it.
What is AI voice cloning?
AI voice cloning is a technology that can create an exact copy of someone's voice. A small sample of a person's voice is used as an audio clip, and after training on it, the AI model can speak in that voice. A cloned voice can sound so real that it is hard to tell apart from the original. Such tools were originally designed to provide personalized experiences in areas like text-to-speech. In creative work especially, cloning lets you reuse your own voice without recording it on a mic again and again. However, scammers are now using the same technology with bad intentions.
How can you clone someone's voice?
Advanced AI-powered tools have made it very easy to clone someone's voice. A quick Google search turns up many tools and websites that can do it. Apart from some free tools, selected platforms create high-quality voice clones for as little as $5 (about Rs 420). All you have to do is upload a roughly 30-second clip of a voice, and the AI model does the rest in no time: you get a copy of that voice, and any text you type can be spoken in it. The sketch below shows just how little input such a pipeline needs.
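For illustration only, here is a minimal sketch of such a cloning pipeline using the open-source Coqui TTS library and its XTTS v2 model; the article names no specific tool, so this library choice and all file names are assumptions. The legitimate use shown is the one the article mentions: cloning your own recorded voice for narration.

```python
# Minimal sketch, assuming the open-source Coqui TTS package is installed
# (pip install TTS). File names are hypothetical; clone only your OWN voice.
from TTS.api import TTS

# Load a multilingual model that supports voice cloning from a short clip.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Condition on a ~30-second reference recording and speak the given text.
tts.tts_to_file(
    text="This is my own cloned voice reading my script.",
    speaker_wav="my_own_voice_30s.wav",  # hypothetical reference clip
    language="en",
    file_path="narration.wav",
)
```

The point is not the specific library but the amount of input required: a single short clip is enough to condition the model, which is exactly what makes the scam so easy to run.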
How do AI voice-based scams work?
When you talk on a smartphone, your voice is what identifies you to the other person. If you hear a known person's voice on the phone, there is usually no question of doubting them. Scammers may impersonate your friends, relatives, or even a police officer and try various tricks to trap you. For example, if you get a call in a friend's voice asking you to transfer some money, you probably won't think twice before sending it. Scammers alternate between tempting and threatening, often mentioning a sudden accident, in the hope that one trick or another will work.
How can you avoid AI cloning scams?
You always need to be cautious in matters like transferring money. Keep the points below in mind and you will be able to avoid such scams.
1. Always pay attention to the phone number
Even if scammers clone someone's voice, they will still be calling from their own phone number. So if you get a call from an unknown number, or one with a country code from abroad, be alert. Ask yourself why this acquaintance would suddenly be calling from a new number; a simple screening habit like the sketch below can help.
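As a toy illustration of this advice, here is a minimal sketch that flags calls from numbers that are not saved contacts or that carry a foreign country code; all numbers and names in it are hypothetical.

```python
# Minimal sketch: flag incoming numbers that are not saved contacts or that
# carry a foreign country code. All numbers and names here are hypothetical.
KNOWN_CONTACTS = {"+919876543210": "Ravi", "+919812345678": "Mom"}
HOME_COUNTRY_CODE = "+91"  # India, matching the article's context

def screen_call(number: str) -> str:
    if number in KNOWN_CONTACTS:
        return f"Saved contact: {KNOWN_CONTACTS[number]}"
    if not number.startswith(HOME_COUNTRY_CODE):
        return "Caution: international number; be extra alert"
    return "Unknown number: verify identity before discussing money"

print(screen_call("+447700900123"))  # prints the international warning
```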
2. Verify identity before sending money
If money is involved, verify the caller's identity without haste. Call the acquaintance back yourself on the number you already have saved, or message them on WhatsApp. If you want, you can also confirm with a family member or friend whether the person really needs the money. You can also ask the caller to collect the money in cash instead of online.
3. Be alert if the voice sounds fake
No matter how well AI copies a voice, its tone and manner of speaking differ slightly from a real human's. If something feels strange during the call, or the voice seems fake, disconnect immediately. This is easiest with people whose voice and way of talking you already know well.
4. If in doubt, insist on a video call
If you suspect someone, ask them to switch to a video call. With internet access and many calling platforms widely available, video calling is now very easy. Say clearly that you want to talk on a video call; scammers who have only copied a voice cannot keep up the act on video. Stay alert to such scams yourself, and warn your friends and family about them too, so that they can avoid these dangers.