Science & Technology



Amazon's AI rewrites 'millions' of Alexa user commands to reduce defects by 30%

[2019.11.07, Thu 17:05] The AI underlying assistants like Alexa gets better in part through manual data transcription and annotation, which takes outsized time and effort. In a paper detailing their work, Amazon researchers say that the automated self-learning system they deployed reduced errors across "millions" of Alexa customers. Errors arise from automatic speech recognition (ASR), where an utterance like "Play Imagine Dragons" could be misinterpreted as "Play maj and dragons." Natural language understanding (NLU) errors include examples like "Don't play this song again, skip," which Alexa would understand only if it were phrased "Thumbs down this song." And then there are comprehension issues, like "Play Bazzi Angel" rather than "Play Beautiful by Bazzi."

Tackling these challenges required developing a "query rewriting" technique that reformulates voice commands so they convey the same meaning. At a high level, Alexa comprises three components: an ASR system, an NLU system with a built-in dialog manager, and a text-to-speech (TTS) system. The ASR system decodes a user's speech into plain text, which the NLU module interprets before passing it, along with the corresponding action to execute, to the TTS system. The TTS system generates the appropriate response as speech back to the user, closing the interaction loop.

According to the research team, the engine ingests anonymized Alexa log data from "millions" of customers on a daily basis to learn from users' reformulations, updating its database so that existing rewrites remain viable. The engine also incorporates explicit feedback, which the researchers describe as corrective or reinforcing signals from direct user engagement, principally events where users opt to interrupt Alexa with an interjection.
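The three-component pipeline lends itself to a short illustration. Below is a minimal Python sketch of the ASR-to-NLU-to-TTS loop described above, assuming hypothetical asr_decode, nlu_interpret, and tts_synthesize functions; it is an illustration of the flow, not Amazon's implementation.

    # Hypothetical sketch of the ASR -> NLU -> TTS loop described above.
    # Function names and stubbed outputs are illustrative only; they are
    # not Amazon's actual APIs.

    def asr_decode(audio):
        """ASR: decode the user's speech audio into plain text."""
        return "play imagine dragons"  # stand-in for a real transcription

    def nlu_interpret(text):
        """NLU + dialog manager: map text to an action and a response."""
        action = ("PlayMusic", {"artist": "Imagine Dragons"})
        response = "Playing Imagine Dragons"
        return action, response

    def tts_synthesize(response):
        """TTS: render the response text as speech audio."""
        return response.encode()  # stand-in for synthesized audio

    def handle_utterance(audio):
        text = asr_decode(audio)                 # speech -> text
        action, response = nlu_interpret(text)   # text -> action + reply
        # ...here the action would be dispatched to the relevant skill...
        return tts_synthesize(response)          # reply text -> speech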
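The paper's engine is far more sophisticated, but a toy sketch can show the feedback loop the excerpt describes: rewrites are mined from users' own reformulations in the logs, and explicit feedback such as an interruption retires a rewrite. All class and method names below are hypothetical.

    from collections import defaultdict

    class RewriteEngine:
        """Illustrative self-learning rewrite table. It pairs a defective
        transcription with the reformulation a user issued right after it,
        promotes a rewrite once it has enough support, and retires rewrites
        that draw corrective feedback (interruptions)."""

        def __init__(self, min_support=3):
            self.counts = defaultdict(int)  # (defect, rewrite) -> observations
            self.rewrites = {}              # defect -> best-known rewrite
            self.min_support = min_support

        def learn_from_logs(self, session):
            # session: (utterance, succeeded) pairs in chronological order.
            # A successful request right after a failed one is treated as
            # the user's own reformulation of the failed request.
            prev_failed = None
            for utterance, succeeded in session:
                if prev_failed is not None and succeeded:
                    pair = (prev_failed, utterance)
                    self.counts[pair] += 1
                    if self.counts[pair] >= self.min_support:
                        self.rewrites[prev_failed] = utterance
                prev_failed = None if succeeded else utterance

        def record_interruption(self, defect):
            # Explicit corrective feedback: the user cut Alexa off, so the
            # stored rewrite is no longer considered viable.
            self.rewrites.pop(defect, None)

        def rewrite(self, text):
            return self.rewrites.get(text, text)

    engine = RewriteEngine()
    engine.learn_from_logs([("play maj and dragons", False),
                            ("play imagine dragons", True)] * 3)
    print(engine.rewrite("play maj and dragons"))  # play imagine dragons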
Read on VentureBeat.com >



(c) 2019 Geo Glance