AI Voice Clone: Woman cheats neighbor of Rs 6 lakh using a cloned voice; police arrest her
AI Voice Clone: A woman named Karishma Kaur threatened her neighbor by repeatedly calling her in a cloned voice and took Rs 6.6 lakh from her in several installments. The two women had never met and did not know each other.
The police have arrested a woman who cheated her neighbor of Rs 6 lakh by changing her voice with AI. The arrested woman was identified as Karishma. She called her neighbor while posing as a man and extracted money from her through threats.
The police stated in their report that Karishma Kaur threatened the neighbor through repeated calls in a cloned voice and received Rs 6.6 lakh in several installments. The two women had never met, nor did they know one another.
Karishma told the police that she urgently needed money. She therefore cloned a man's voice using AI and called her neighbor to threaten her and demand payment. It is not yet clear whose voice she cloned or what relation, if any, that man has to the victim.
You can identify AI voice clone fraud in these four ways:
- A sudden call from someone very close: If you get a call from a person very close to you, and especially from a new number, be very cautious. Also make a note of when the call came.
- Emergency: Fraudsters clone a voice using AI-based software and call the victim with that cloned voice on the pretext of an emergency. For example, they might say someone close to you is in trouble or lying in a hospital. The scenario is designed to make the demand for money seem urgent.
- Speaking style: AI can copy a person's voice but often not their manner of speaking. Listen to such calls carefully and judge whether the call sounds robotic or has genuinely been made by a human.
- Demand for money: If someone calls you and asks for money, be alert. Likewise, never share your bank account details with a caller, even if the voice resembles someone you know.