Rising scams use AI to mimic voices of loved ones in financial distress
- With this feature, Microsoft seems to be attempting to dodge any scandals by limiting what impostor celebrity voices can be prompted to say. Microsoft did not respond to Ars’ request for comment on how well safeguards currently work to prevent the celebrity voice emulator from generating offensive speech.
- Before sending funds, try to contact the person who seems to be asking for help through methods other than a voice call. AI voice-modeling tools have been used to improve text-to-speech generation, create new possibilities for speech editing, and expand movie magic by cloning famous voices like Darth Vader’s.
- But the power of easily producing convincing voice simulations has already caused scandals, and no one knows who’s to blame when the tech is misused. Earlier this year, there was backlash when some 4chan members made deepfake voices of celebrities making racist, offensive, or violent statements.
- Gizmodo pointed out that, like many companies eager to benefit from the widespread fascination with AI tools, Microsoft relies on its millions of users to beta test its “still-dysfunctional AI,” which can seemingly still be used to generate controversial speech by presenting it as parody.
- Time will tell how effective any early solutions are in mitigating risks. In 2021, the FTC released AI guidance, telling companies that products should “do more good than harm” and that companies should be prepared to hold themselves accountable for risks of using products.
- Will Maxson, an assistant director at the FTC’s division of marketing practices, told the Post that raising awareness of scams relying on AI voice simulators is likely consumers’ best defense currently.
AI models designed to closely simulate a person’s voice are making it easier for bad actors to mimic loved ones and scam vulnerable people out of thousands of dollars, The Washington Post reported.