express.co.uk - September 13

RISE OF THE ROBOTS: Machines could get ‘EMOTIONS’ through Alexa-style technology

By Rebecca Flood

PUBLISHED: 07:43, Wed, Sep 13, 2017 | UPDATED: 07:44, Wed, Sep 13, 2017

A robot, Baxter, was created by Rethink Robotics

As the field of Artificial Intelligence advances and considerable progress is made in giving robots life-like looks, work remains to make them think and act like men and women. 

Movements have been refined and blinking has been factored in, but little tangible function lies beneath the pretty facades. 

Mapping the full range of human personality, emotion and meaning is still an obstacle scientists have yet to conquer. 

One tool being used to overcome this is similar to Amazon's voice software Alexa: a system dubbed ‘ComText’, short for ‘commands in context’. 

The robot, Baxter, was created by Rethink Robotics, the company founded by former MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) director Rodney Brooks.

Researchers trained Baxter in ComText, teaching it to understand a range of commands within a contextual framework, bridging an important gap. 

Using ComText, Baxter was able to execute the correct command 90 per cent of the time. 
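To make the idea concrete, here is a minimal, hypothetical sketch of how a ‘commands in context’ interpreter could work: the robot stores facts it has just been told or observed, then uses them to work out what an otherwise ambiguous phrase like "my tool" refers to. The class names, facts and tiny command grammar below are illustrative assumptions, not the CSAIL team's actual code.

```python
# Hypothetical sketch of a "commands in context" (ComText-style) interpreter.
# All names, facts and the tiny command grammar are assumptions for
# illustration, not the CSAIL implementation.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class ContextMemory:
    """Facts the robot has been told or has recently observed."""
    facts: Dict[str, str] = field(default_factory=dict)  # phrase -> object id

    def remember(self, phrase: str, object_id: str) -> None:
        self.facts[phrase] = object_id

    def resolve(self, phrase: str) -> Optional[str]:
        return self.facts.get(phrase)


def execute(command: str, memory: ContextMemory) -> str:
    """Resolve a natural-language reference against stored context."""
    prefix = "pick up "
    if command.startswith(prefix):
        phrase = command[len(prefix):]
        target = memory.resolve(phrase)
        return f"Grasping {target}" if target else f"I don't know which object '{phrase}' is"
    return "Command not understood"


memory = ContextMemory()
memory.remember("my tool", "wrench_03")    # told earlier: "The tool I put down is my tool"
print(execute("pick up my tool", memory))  # -> Grasping wrench_03
```

The point is simply that the robot links a phrase to a specific object it has already encountered, rather than treating every command in isolation.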

CSAIL postdoc Rohan Paul, who worked on the project, said: “Where humans understand the world as a collection of objects and people and abstract concepts, machines view it as pixels, point-clouds, and 3-D maps generated from sensors.

“This semantic gap means that, for robots to understand what we want them to do, they need a much richer representation of what we do and say."

And one of the co-leaders, research scientist Andrei Barbu, added: “The main contribution is this idea that robots should have different kinds of memory, just like people.

"We have the first mathematical formulation to address this issue, and we're exploring how these two types of memory play and work off of each other."

In the future, the researchers hope to expand the system to handle multi-step commands, infer intent, and let users describe objects only by their properties. 

Luke Zettlemoyer, an associate professor of computer science at the University of Washington, said: "This work is a nice step towards building robots that can interact much more naturally with people.

"In particular, it will help robots better understand the names that are used to identify objects in the world, and interpret instructions that use those names to better do what users ask."

Skype is another tool being used to teach robots about facial expression and meaning. 

Facebook’s AI lab is mining the information contained in conversations, using an algorithm to train robots. 

Baxter the robot, trained by researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL)

Dividing the human face into 68 key points and tracking blinks, nods and mouth movements, the researchers asked the robot to identify the appropriate response. 
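To give a sense of what working with 68 facial key points can look like, the toy sketch below takes one frame's landmarks, derives two simple signals (eye openness and mouth openness), and picks a canned response. The landmark indexing follows the widely used 68-point convention popularised by tools such as dlib; the thresholds and the rule-based response are assumptions for illustration, not the study's actual method.

```python
# Toy sketch: turning 68 facial landmarks into a response.
# Index layout follows the common 68-point convention: points 36-41 outline
# one eye, 48-67 outline the mouth. Thresholds and responses are made up.
import numpy as np


def eye_aspect_ratio(landmarks: np.ndarray) -> float:
    """Eye height relative to width; small values suggest a blink."""
    eye = landmarks[36:42]
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)


def mouth_openness(landmarks: np.ndarray) -> float:
    """Gap between the inner lips relative to mouth width."""
    gap = np.linalg.norm(landmarks[62] - landmarks[66])    # inner upper vs lower lip
    width = np.linalg.norm(landmarks[48] - landmarks[54])  # mouth corners
    return gap / width


def choose_response(landmarks: np.ndarray) -> str:
    """Map simple landmark-derived signals to a canned reaction."""
    if eye_aspect_ratio(landmarks) < 0.2:
        return "blink back"
    if mouth_openness(landmarks) > 0.3:
        return "smile"
    return "nod"


# 68 random (x, y) points stand in for landmarks detected on a real face.
landmarks = np.random.rand(68, 2) * 100
print(choose_response(landmarks))
```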

The findings will be presented at the International Conference on Intelligent Robots and Systems in Vancouver, Canada, later this month.
