At re:MARS, Amazon Alexa evolves from ambient to generalizable intelligence

Amazon.com Inc.’s senior vice president and Alexa head scientist Rohit Prasad today spoke about the increasingly prominent role his company’s digital assistant plays in people’s lives, citing this as evidence of the rise of “ambient intelligence.”

Prasad defines ambient intelligence as artificial intelligence that has become so pervasive that it appears all around you, responding to requests as and when required, but otherwise fading into the background when it’s not needed. It’s AI that is embedded everywhere in the environment, and can be both reactive, responding to someone’s needs, and proactive, anticipating them. It also relies on a wide range of sensing capabilities, including vision, sound, ultrasound, mechanical sensors and more. And it takes actions, for example finding information or playing your favorite song, or buying products online.

Digital assistants such as Alexa clearly fit that definition. Millions of people use Alexa on a daily basis, with Amazon claiming billions of customer interactions every single week.

Prasad said this makes Alexa the quickest and easiest route to so-called “generalizable intelligence,” the stuff of science fiction. He explained that Alexa is made up of more than 30 individual machine learning systems, each of which can process different sensory signals. The real-time orchestration of these systems makes Alexa one of the most complex AI applications in the world.

“Still, our customers demand even more from Alexa as their personal assistant, advisor, and companion,” Prasad said. “To continue to meet customer expectations, Alexa can’t just be a collection of special-purpose AI modules. Instead, it needs to be able to learn on its own and to generalize what it learns to new contexts. That’s why the ambient-intelligence path leads to generalizable intelligence.”

According to Prasad, generalizable intelligence does not refer to some kind of super-powerful, all-knowing AI that can accomplish anything. He said his definition of the term is much more pragmatic, and that GI agents should have three key attributes: the ability to accomplish multiple tasks, to adapt rapidly to ever-changing environments, and to learn new concepts and actions with minimal external input from humans.

“For inspiration for such intelligence, we don’t need to look far,” Prasad said. “We humans are still the best example of generalization and the standard for AI to aspire to.”

Prasad believes that AI is already well on the way toward being able to generalize. He pointed to “foundational Transformer-based large language models trained with self-supervision,” such as the Alexa Teacher Model, as a good example. That model, he said, is already able to capture knowledge spanning language understanding, speech recognition, dialogue prediction and visual-scene understanding.

The other key element of AI generalization is being able to learn with little or no human input, Prasad said. Once again, Alexa can do this already thanks to its self-learning mechanism, which automatically corrects tens of millions of defects each week.

“Customers can teach Alexa new behaviors, and Alexa can automatically generalize them across contexts, learning that terms used to describe light settings can also be applied to speaker settings,” Prasad explained.

Looking ahead, Prasad said the next step toward AI generalization is to make Alexa more knowledgeable through “conversational explorations,” so it can better answer people’s questions. The idea is that Alexa can save someone the hassle of pulling out their phone to search for something. So if a user asks Alexa a question about a product they’re interested in, it will respond with specific information to help them make a decision, such as an excerpt of a product review.

“If that initial response gives you enough information to make a decision, great,” Prasad said. “But if it doesn’t — if, for instance, you ask for other options — that’s information that Alexa can use to sharpen its answer to your question or provide helpful suggestions.”

Customers will soon be able to see this in action for themselves: conversational explorations, announced at Amazon’s re:MARS event, will be coming to Alexa soon.

Image: Amazon
