In 2025, AI-generated voices are everywhere. From viral Drake covers on TikTok to realistic customer service bots, synthetic audio is no longer science fiction but part of everyday life. AI voice generators like ElevenLabs, OpenAI's Voice Engine, and Meta's Voicebox have made ultra-realistic cloning accessible to almost anyone. With just a 10-second clip, anyone can be made to sound like Beyoncé, your favorite streamer, or even you.
While the entertainment industry is still catching up on copyright questions, there are more imminent harms to consider. When your voice is cloned without consent, many things can go wrong, including deepfake audio scams that trick people out of money or sensitive information. Imagine receiving a call from someone who sounds exactly like your boss asking for confidential credentials, only for it not to be them on the other end. Disinformation that exploits trust in a familiar voice can prove explosive.
This raises a dangerous new dilemma: who controls your identity if your voice can be replicated? In most countries, there is little or no legislation covering audio likeness, leaving voices without meaningful legal protection and putting people's identities at risk rather than safeguarding them. Social media personalities and public figures have begun fighting this battle, but it remains largely invisible to the mainstream, and most ordinary people are unaware it is happening.