Amazon Teaches Alexa To Imitate The Voice Of Dead Parents

A hot potato: Amazon is developing a capability that will allow its voice assistant Alexa to mimic any human voice after hearing it speak for less than a minute. Setting aside the inherent creepiness of the feature, many worry about its potential for abuse.

Rohit Prasad, who leads the Alexa team at Amazon, said the project’s goal is to “make the memories last” after “so many of us have lost someone we love” during the pandemic.

Alexa can be trained to mimic a voice from pre-recorded audio, meaning the source speaker doesn’t have to be present, or even alive. In a video segment shown at a conference this week, a child asked Alexa if Grandma could finish reading The Wizard of Oz. Sure enough, Alexa changed its voice to mimic the child’s grandmother and finished reading the story.
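Amazon hasn’t shared technical details of its system, but few-shot voice cloning of this kind is already possible with open-source tools. As a rough illustration only, and not Amazon’s implementation, here is a minimal sketch using the open-source Coqui TTS library and its XTTS v2 model, which can clone a voice from a few seconds of reference audio; the file names are hypothetical:

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Illustrative only; this is not Amazon's Alexa system.
# The .wav file names below are hypothetical.
from TTS.api import TTS

# Load the multilingual XTTS v2 model, which supports zero-shot voice
# cloning from a short reference recording.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize speech in the voice captured in the reference clip.
tts.tts_to_file(
    text="Once upon a time, in the land of Oz...",
    speaker_wav="grandma_sample.wav",  # a few seconds of the target voice
    language="en",
    file_path="story_in_grandmas_voice.wav",
)
```

Under the hood, models like this encode the short reference clip into a speaker embedding that conditions the synthesizer, which is what allows such a small sample to steer the output voice.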

Prasad said during the presentation that Alexa now receives billions of requests per week from hundreds of millions of Alexa-enabled devices, in 17 languages, in more than 70 countries around the world.

The potential for abuse appears high. The tool could, for example, be used to create convincing deepfakes for disinformation or political propaganda campaigns. Fraudsters could also leverage the capability for financial gain, as in 2020, when scammers used a cloned voice to trick a bank manager into transferring $35 million to fund an acquisition that didn’t exist.

What are your thoughts on the subject? Is Amazon taking the concept of voice cloning a bit too far, or are you intrigued by the idea of having a “conversation” with someone from the grave?

Image credit: Jan Antonin Kolar
