It is a new form of fraud. Once a scammer finds an audio clip of someone's voice online, they can use artificial intelligence to mimic it and make the voice say whatever they want. The police urge you to always ask for information that only your loved one would know, or to try to reach the person through another channel.