Amazon’s Alexa could soon mimic the voice of your loved ones.

The implementation will undoubtedly raise new privacy issues and ethical questions around consent.

Even if a family member is deceased, Amazon’s Alexa may soon be able to mimic their voice. The feature, which was presented at Amazon’s re:MARS conference in Las Vegas, is under development and would allow the virtual assistant to imitate a specific person’s voice based on a recording that lasts less than a minute.

Rohit Prasad, senior vice president and head scientist for Alexa, said at the event on Wednesday that the goal of the feature is to build greater trust in users’ interactions with Alexa by adding more “human attributes of empathy and affect”.

“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”

In a video Amazon presented during the event, a small child asks, “Alexa, can Grandma finish reading me the Wizard of Oz?” After acknowledging the request, Alexa switches to a different voice that imitates the child’s grandmother, and the assistant continues reading the book in that voice.

To create the feature, Prasad said the company had to learn how to produce a “high-quality voice” from a much shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature. The rollout is bound to spark more privacy concerns and ethical questions about consent.
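Amazon has not explained how its system works, but open-source text-to-speech tools give a rough sense of what cloning a voice from a short clip looks like in practice. The sketch below uses the third-party Coqui TTS library and its XTTS v2 model, not Amazon’s technology; the reference recording and output file names are placeholders for illustration only.

```python
# Hypothetical sketch of few-shot voice cloning with the open-source
# Coqui TTS library (XTTS v2). This is NOT Amazon's system; it only
# illustrates the general idea of conditioning a text-to-speech model
# on a short reference recording instead of hours of studio audio.
#
# Install first:  pip install TTS
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "grandma_sample.wav" is an assumed path to a short, clean recording
# of the target speaker (on the order of seconds, not hours).
tts.tts_to_file(
    text="Dorothy lived in the midst of the great Kansas prairies...",
    speaker_wav="grandma_sample.wav",      # short reference recording (placeholder)
    language="en",
    file_path="cloned_voice_output.wav",   # synthesized speech in the cloned voice
)
```

The key point the example illustrates is that the speaker’s identity comes from a brief reference clip supplied at synthesis time, rather than from a model trained on hours of that person’s studio recordings.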
