Amazon is working out how to make its Alexa voice assistant deepfake the voice of anyone, dead or alive, from just a short recording. The company demonstrated the feature Wednesday at its Re:Mars conference in Las Vegas, leaning on the emotional trauma of the ongoing pandemic and grief to sell the idea.
Amazon’s Re:Mars focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with tech experts and industry leaders taking the stage. During Wednesday’s keynote, Rohit Prasad, senior vice president and lead scientist for Alexa AI at Amazon, showed off a feature being developed for the voice assistant.
In the demo, a child asks Alexa, “Can Grandma finish reading me The Wizard of Oz?” Alexa responds, “Okay,” in her signature robotic voice. But then the child’s grandmother’s voice comes out of the speaker to read the L. Frank Baum story.
You can watch the demo below:
Prasad said only that Amazon is “working” on the Alexa capability and did not specify what work remains or when it will be available.
He did, however, offer some sparse technical details.
“This required invention where we had to learn to produce high-quality voice with less than a minute of recording versus hours of recording in a studio,” he said. “The way we made it happen is by framing the problem as a voice-conversion task and not a speech-generation task.”
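Prasad’s framing can be illustrated with a toy sketch. Everything below — the function names, the “speaker embedding” summary, and the toy math — is a hypothetical illustration, not Amazon’s actual system. It only contrasts the two problem formulations he names: text-to-audio speech generation, which needs a full model of the target speaker, versus audio-to-audio voice conversion, which reuses an existing synthetic voice and re-maps only its timbre from a short clip.

```python
import numpy as np

# Toy sketch (illustrative only): why voice *conversion* needs far less
# target audio than speech *generation*.

def speech_generation(text: str, speaker_model: np.ndarray) -> np.ndarray:
    """Text -> audio. Requires a full per-speaker model, which in practice
    means hours of studio recordings to train."""
    rng = np.random.default_rng(len(text))
    return rng.standard_normal(16000) * speaker_model.mean()

def extract_speaker_embedding(recording: np.ndarray) -> np.ndarray:
    """Summarize a short clip (under a minute) into a small timbre vector.
    Here the 'embedding' is just the clip's mean and standard deviation."""
    return np.array([recording.mean(), recording.std()])

def voice_conversion(source_audio: np.ndarray,
                     target_embedding: np.ndarray) -> np.ndarray:
    """Audio -> audio: keep the source's content, shift its 'timbre'
    toward the target embedding (toy normalize-and-rescale stand-in)."""
    normalized = (source_audio - source_audio.mean()) / source_audio.std()
    return normalized * target_embedding[1] + target_embedding[0]

# The stock synthetic voice reads the line; a short clip supplies the timbre.
base_voice = np.ones(2)
alexa_audio = speech_generation("Can Grandma finish the story?", base_voice)
grandma_clip = np.random.default_rng(0).standard_normal(16000 * 50)  # ~50 s
converted = voice_conversion(alexa_audio,
                             extract_speaker_embedding(grandma_clip))
print(converted.shape)  # (16000,) -- same length as the source audio
```

The point of the sketch is the asymmetry: `speech_generation` needs a whole `speaker_model`, while `voice_conversion` needs only a compact embedding extractable from under a minute of audio.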
Of course, deepfakes have earned a controversial reputation. Still, there have been some attempts to use the technique as a tool rather than as a means for creepiness.
Audio deepfakes in particular, as The Verge has noted, have been used in media production to help out when, say, a podcaster flubs a line or when a project’s star dies suddenly, as happened with the Anthony Bourdain documentary Roadrunner.
The publication also noted examples of people using AI to create chatbots that communicate as if they were a lost loved one.
Alexa also wouldn’t be the first consumer product to use deepfake audio to stand in for a family member who can’t be there in person. The Takara Tomy smart speaker, as Gizmodo reported, uses AI to read children’s bedtime stories in a parent’s voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. That differs notably from Amazon’s demo, however: the product’s owner chooses to supply their own voice, rather than the product using the voice of someone who may not be able to give permission.
Aside from concerns about deepfakes being used for scams, fraud, and other nefarious activity, there are already some troubling things about how Amazon is building this feature, which doesn’t even have a release date yet.
Before showing the demo, Prasad talked about Alexa giving users a “companionship relationship.”
“In this companionship role, human attributes of empathy and affect are key for building trust,” the executive said. “These attributes have become even more important in these times of the ongoing pandemic, when so many of us have lost someone we love. While AI can’t eliminate that pain of loss, it can definitely make their memories last.”
Prasad said the feature “enables lasting personal relationships.”
It is true that countless people are in dire need of human “empathy and affect” amid the emotional distress triggered by the COVID-19 pandemic. But Amazon’s AI voice assistant is not the place to meet those human needs. Nor can Alexa enable “lasting personal relationships” with people who are no longer with us.
It’s not hard to believe there are good intentions behind this in-development feature, and hearing the voice of someone you miss can be a great comfort. We could even see ourselves having fun with a feature like this, in theory. Getting Alexa to make a friend sound like they said something silly is harmless. And as discussed above, other companies are leveraging deepfake tech in ways similar to what Amazon demoed.
But framing an in-development Alexa capability as a way to revive a connection with late family members is a giant, unrealistic, problematic leap. Meanwhile, it feels gratuitous to tug at the heartstrings by invoking pandemic-related grief and loneliness. There are some places Amazon doesn’t belong, and grief counseling is one of them.