Amazon opens up its crowdsourced Alexa Answers program to anyone

Be the brains behind Alexa

Amazon is expanding its crowdsourced Alexa Answers program — which lets Alexa users add answers to questions that Alexa doesn’t know — to anyone today, after an early, invitation-only beta launched in December.

Users will be able to filter questions by categories like “most frequently asked questions,” “newest questions,” or general topic areas, like science or geography, and then submit an answer (assuming they know it). Amazon is also looking to gamify the system: users will earn points when Alexa uses their answer, and they can compete on leaderboards to contribute the most helpful responses. They’ll also be able to track how often Alexa uses their answers.

The main concern is fact-checking. Amazon doesn’t seem to have any formal system to confirm that the answers being submitted are correct. Alexa users will be able to give a thumbs-up or thumbs-down when they hear a user-submitted response, and there’s an Amazon-style star rating system on the Alexa Answers website, but both rely entirely on customers doing the rating on their own. Users can also flag answers that they think are incorrect, but beyond that, the entire setup seems to rely on the honor system.

Additionally, if more than one answer is submitted, Amazon says that “Alexa may rotate between answers until she gains enough feedback to determine which answer is the most useful,” which seems like a poor substitute for actually determining which one is correct.

Still, plugging the holes in the knowledge of smart assistants is a key part of making them smarter, and getting your customers to volunteer their time to do it for free (as opposed to, say, secretly having paid human contractors to listen to customers’ recordings) isn’t the worst way to accomplish that task.