Be Aware of AI Voice-Cloning Scams | How They Work and How to Avoid Them
Indian authorities have recently warned citizens about emerging phone scams that use AI to imitate people’s voices. Scammers clone the voices of victims’ relatives and friends, make the call sound like a desperate plea for help, and then pressure victims into sending money through online transactions.
Fraudsters first gather personal information about potential targets through social media sites, matrimonial profiles, or data leaks. They then use AI deepfake techniques to clone the target’s voice, or the voice of a family member captured in recordings of previous calls.