Amazon’s Alexa might soon be able to imitate the voice of a family member, even one who has died. The feature, previewed at Amazon’s re:MARS conference in Las Vegas, is still in development and would let the virtual assistant mimic a specific person’s voice based on a recording of less than a minute.
Rohit Prasad, Alexa’s senior vice president and head scientist, said at the event on Wednesday that the goal of the new feature was to help users trust Alexa more by giving it more “human attributes” like empathy and affect.
“These traits are even more important now that so many of us have lost loved ones because of the pandemic,” Prasad said. “AI can’t take away the pain of loss, but it can keep their memories alive.” In a video Amazon played at the event, a young child asks Alexa, “Can Grandma finish reading me The Wizard of Oz?”
Amazon shows off Alexa feature that mimics the voices of your dead relatives https://t.co/VrRkPTkWnc pic.twitter.com/feUHysAI9t
— The Verge (@verge) June 23, 2022
Alexa says “OK,” then switches to another voice that sounds like the child’s grandmother and continues reading the story in that voice. Prasad said the company had to figure out how to produce a “high-quality voice” from a brief recording rather than hours of sessions in a studio.
That capability was essential to building the feature. Amazon offered no further details about it, though the feature is certain to raise privacy concerns and ethical questions about consent. Earlier this week, Amazon competitor Microsoft said it would scale back its synthetic voice services and set stricter rules to “ensure the active participation of the speaker” whose voice is being recreated.
Microsoft said on Tuesday that it is limiting which customers can use the service, while continuing to highlight approved uses, such as an interactive Bugs Bunny character at AT&T stores, as examples of what the technology can do.
In a blog post, Natasha Crampton, who heads Microsoft’s AI ethics division, said, “This technology has exciting potential for education, accessibility, and entertainment, but it’s easy to see how it could also be used to impersonate speakers in a bad way and trick listeners.”