At Amazon's re:MARS conference, Rohit Prasad, the senior vice-president and head scientist of the company's Alexa voice assistant, showed off a new capability. Amazon has given no timetable for when, or whether, the feature will be made available to the public.

Amazon framed the capability as a way to remember lost loved ones. In a demonstration video, a child asked, "Alexa, can Grandma finish reading me The Wizard of Oz?" and the assistant continued the story in the grandmother's synthesized voice. He said the company was looking for ways to make the technology more personal, and that it could "definitely make the memories last." The feature can generate a synthetic voiceprint after being trained on as little as a minute of audio of the person it is meant to replicate.

Deepfake audio tools, which use text-to-speech technology to create synthetic voices, could fuel a wave of new scams. In one case, fraudsters used voice-cloning software to impersonate a company director and trick a bank manager in the United Arab Emirates into transferring $35 million. For now, deepfake audio crimes remain rare, and the tools available to scammers are relatively primitive.