Intelligent Robot Technology Software Project

Intelligent Communication RT Modules
for Providing Informational Services
in Public Environments

ATR
Eager Corporation
OMRON Corporation
Mitsubishi Heavy Industries

Research goals

Environmental Situational Awareness Module

Develop a module that detects the situation within 5m in front of the robot, such as the number and positions of nearby people.
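
As a minimal illustration only (not the project's actual implementation), the following Python sketch counts detected people within 5m in front of the robot; the coordinate convention, the 120-degree field of view, and all function names are assumptions made for this example.

    import math
    from typing import List, Tuple

    def people_in_front(positions: List[Tuple[float, float]],
                        max_range_m: float = 5.0,
                        fov_deg: float = 120.0) -> int:
        """Count detections within range and inside the frontal field of view.

        positions: (x, y) in metres in the robot frame, with +x pointing forward.
        """
        half_fov = math.radians(fov_deg) / 2.0
        count = 0
        for x, y in positions:
            distance = math.hypot(x, y)
            bearing = math.atan2(y, x)   # angle off the robot's forward axis
            if distance <= max_range_m and abs(bearing) <= half_fov:
                count += 1
        return count

    # Example: one person in front, one behind the robot, one beyond 5 m.
    print(people_in_front([(1.5, 0.3), (-2.0, 0.0), (6.0, 1.0)]))  # -> 1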

Speech Recognition Module

Develop a robust automatic speech recognition module that works even in noisy environments, assuming a distance of 1m between the robot and its conversation partner.
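
One common ingredient of noise-robust recognition is a noise-suppressing front end. The sketch below illustrates simple spectral subtraction over magnitude spectra; it is an assumption-laden example rather than the module's actual method, and the recognizer itself is left abstract.

    import numpy as np

    def spectral_subtraction(frames: np.ndarray,
                             noise_frames: int = 10,
                             floor: float = 0.01) -> np.ndarray:
        """frames: magnitude spectra, shape (num_frames, num_bins).

        The first `noise_frames` frames are assumed to contain noise only.
        """
        noise_estimate = frames[:noise_frames].mean(axis=0)  # average noise spectrum
        cleaned = frames - noise_estimate                     # subtract the estimate
        return np.maximum(cleaned, floor * frames)            # spectral floor avoids negative values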

Speech Synthesis Module

Develop a speech synthesis module that produces high-quality speech that is easy to listen to, even for children, elderly people, and bystanders in the environment.

Facial Motion Estimation Module

Develop a module that estimates a person's state and intentions from facial and eye information, for example by measuring whether the mouth is open or closed.
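
As an illustration of measuring the mouth opening and closing state, the sketch below computes a mouth-openness ratio from four assumed lip landmarks; the landmark convention and the 0.35 threshold are hypothetical, not values from the project.

    from typing import Tuple

    Point = Tuple[float, float]   # (x, y) in image pixels

    def mouth_open_ratio(upper_lip: Point, lower_lip: Point,
                         left_corner: Point, right_corner: Point) -> float:
        """Ratio of lip separation to mouth width; larger means more open."""
        height = abs(lower_lip[1] - upper_lip[1])
        width = abs(right_corner[0] - left_corner[0])
        return height / width if width > 0 else 0.0

    def is_mouth_open(ratio: float, threshold: float = 0.35) -> bool:
        return ratio >= threshold

    # Example landmarks for a slightly open mouth.
    print(is_mouth_open(mouth_open_ratio((50, 60), (50, 75), (35, 68), (75, 68))))  # -> True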

Communicative Intention Recognition Module

Develop a module that recognizes the categories of intention and emotion conveyed by the attitude and speaking style of dialogue partners, focusing on non-lexical speech.
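
The toy classifier below only illustrates the idea of mapping prosodic features of non-lexical speech to coarse intention and emotion categories; the feature set, category labels, and thresholds are assumptions, not the project's method.

    def classify_intention(mean_pitch_hz: float,
                           pitch_slope_hz_per_s: float,
                           energy_db: float) -> str:
        """Map coarse prosodic features of a non-lexical utterance to a category."""
        if pitch_slope_hz_per_s > 20.0:          # rising intonation
            return "inquiring / uncertain"
        if energy_db > -10.0 and mean_pitch_hz > 250.0:
            return "excited / positive"
        if energy_db < -30.0:
            return "hesitant / disengaged"
        return "neutral"

    print(classify_intention(mean_pitch_hz=280.0, pitch_slope_hz_per_s=5.0, energy_db=-6.0))
    # -> "excited / positive"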

Motion Generation Module

Develop three functional modules that generate robot gestures appropriate to the situation: "Dynamic Generation", "Automatic Generation", and "Behavioral Synthesis".

Interactive Control Module

Develop a dialogue content management module that switches the flow of conversation according to the partner's responses and the surrounding situation, and selects appropriate dialogue and interaction behavior based on the partner's attributes and the current conditions.
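
A dialogue flow that switches on the partner's response can be sketched as a small state-transition table; the states, response labels, and utterances below are purely illustrative and are not taken from the project.

    from typing import Dict, Tuple

    # (current state, classified partner response) -> (next state, robot utterance)
    FLOW: Dict[Tuple[str, str], Tuple[str, str]] = {
        ("greeting", "interested"):     ("offer_info", "Would you like directions or event information?"),
        ("greeting", "not_interested"): ("farewell",   "Have a nice day!"),
        ("offer_info", "directions"):   ("give_route", "The exit is on your right."),
        ("offer_info", "events"):       ("give_event", "A concert starts at 3 pm in the main hall."),
    }

    def next_turn(state: str, response: str) -> Tuple[str, str]:
        # Fall back to a clarification request when the response does not match the flow.
        return FLOW.get((state, response), (state, "Sorry, could you say that again?"))

    print(next_turn("greeting", "interested"))
    # -> ('offer_info', 'Would you like directions or event information?')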

Personal Identification Module

Develop a module that identifies individuals using face recognition technology so that information and services can be tailored to each person's interests and needs.
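
As a sketch of the idea (the face recognizer itself is left abstract), the code below matches a face embedding against registered users by cosine similarity and looks up that person's stored interests; all identifiers and the 0.6 threshold are assumptions for this example.

    from typing import Dict, Optional
    import numpy as np

    REGISTERED: Dict[str, np.ndarray] = {}   # user id -> stored face embedding
    PROFILES: Dict[str, dict] = {}            # user id -> interests and preferences

    def identify(embedding: np.ndarray, threshold: float = 0.6) -> Optional[str]:
        """Return the best-matching registered user, or None if no match is close enough."""
        best_id, best_score = None, -1.0
        for user_id, reference in REGISTERED.items():
            score = float(np.dot(embedding, reference) /
                          (np.linalg.norm(embedding) * np.linalg.norm(reference)))  # cosine similarity
            if score > best_score:
                best_id, best_score = user_id, score
        return best_id if best_score >= threshold else None

    def profile_for(embedding: np.ndarray) -> dict:
        user_id = identify(embedding)
        return PROFILES.get(user_id, {"interests": []}) if user_id else {"interests": []}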

Dialogue History Management Module

Develop a module that accumulates information from the dialogue history and reflects it in subsequent dialogues.
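
A minimal sketch of accumulating dialogue history per person and reflecting it in a later conversation, with assumed record fields and example wording:

    from collections import defaultdict
    from typing import Dict, List

    history: Dict[str, List[dict]] = defaultdict(list)   # user id -> past turns

    def record_turn(user_id: str, topic: str, utterance: str) -> None:
        history[user_id].append({"topic": topic, "utterance": utterance})

    def opening_line(user_id: str) -> str:
        """Reflect the accumulated history in the next dialogue's opening."""
        past = history[user_id]
        if not past:
            return "Hello! May I help you?"
        last_topic = past[-1]["topic"]
        return f"Welcome back. Last time we talked about {last_topic}; would you like an update?"

    record_turn("visitor_42", "museum exhibits", "Where is the robot exhibit?")
    print(opening_line("visitor_42"))
    # -> "Welcome back. Last time we talked about museum exhibits; would you like an update?"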

This research was conducted as part of the "Next-Generation Robot Intelligence Technology Project",
sponsored by the Ministry of Economy, Trade and Industry (METI) in FY 2007 and by the New Energy and Industrial Technology Development Organization (NEDO) in FY 2008.

Copyright (C) ATR Intelligent Robotics and Communication Laboratories
Contact: irc-contact@atr.jp