
AI Used to Gather Sound Metadata to Clone Voices
Don’t speak a word when you get a silent call. That is the advice from the Malaysian Communications and Multimedia Commission (MCMC) for avoiding the latest AI-generated silent call scam.
The MCMC recently posted a two-minute video on social media warning Malaysians that AI technology is being used to gather sound metadata to clone voices.
The AI-cloned voice is then used to scam the target’s family by requesting help in a fake emergency, to trick a company staff member into transferring money, or to bypass the voice verification used by certain commercial institutions.
The silent call is a tactic scammers use to phish for victims. The scammer rings the target but deliberately keeps the line silent when the call is answered.
By answering, a person is deemed to have an active number and is placed on a target list for scam messages or calls impersonating banks or other authorities.
If the target speaks, their voice is recorded and later cloned with AI for impersonation purposes.
What to look out for:
- Repeated calls from unknown numbers where there is no response from the caller
- Unfamiliar numbers, including foreign phone numbers
- A lack of background noise during such calls
- Subsequent messages or calls impersonating banks or other authorities
What to do:
- Keep silent and do not attempt to engage with the caller.
- Hang up immediately.
- Avoid saying “yes” or confirming your name.
- Do not share personal data or click on links if you receive subsequent calls or messages.
(If the call is genuine and important, the caller will call back, send a message or leave a voice note.)
How to avoid becoming a victim:
- Block suspicious numbers.
- Report the number to the MCMC.
- Avoid sharing phone numbers on social media.
- Never call back unknown numbers.
- Install apps with scam-call protection or filtering.
References: MCMC; The Star (4 December 2025); Bernama

