 
  
LaRA and Robocup at Home 
 
 

RH-Y and other robots

(In general: click on images to view them at a larger size.) 
 
(For a detailed description of many other aspects, please refer to the Team Description Paper and/or to the slides, both of which can be downloaded from the Home page.) 
 
In 2011, RH-Y and OP-Y have been upgraded with new hardware and cognitive resources. In particular, a thermal camera, a multipurpose color lighting device, new capabilities for compliant guiding, and more robust communication have been added to the previously existing resources, which include (not visible on this picture) a TOF 3D camera system, a 5-DOF arm, an ultra-parallel, fine-grained implementation of cognitive agents, and a quantitative, cognitic approach.  
 
In particular, a demo for an integrated laundry application has also been developed according to the 2011 rulebook specifications. 
 
 
In 2010, RH-Y and OP-Y have been complemented by a humanoid, the standard platform NAO, in its academic edition. This has led in particular to the notions of human-machine robotic mediation, and of the robot group RG-Y. Some other features are less apparent, including a novel saturation-based, weighted distance estimation between colors, involving a finely varying mix of hue and/or intensity differences. Other major components include a 3D distance correlation procedure in the Cartesian plane, for SLAM, and a 2D similarity estimation, making use of the weighted color differences just defined, for visual recognition of humans (facial and other features) and objects. 
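As an illustration of such a saturation-weighted color distance, the sketch below mixes the hue and intensity differences according to saturation: saturated colors are compared mostly by hue, gray-ish colors mostly by intensity. The exact weighting function used on the robots is not published, so the linear mix, the `color_distance` name, and the 0..1 RGB convention are assumptions.

```python
import colorsys

def color_distance(rgb1, rgb2):
    """Saturation-weighted distance between two RGB colors (components in 0..1).

    Illustrative sketch only: the mix between hue and intensity differences
    varies with the mean saturation, as described in the text.
    """
    h1, s1, v1 = colorsys.rgb_to_hsv(*rgb1)
    h2, s2, v2 = colorsys.rgb_to_hsv(*rgb2)
    # Hue is circular: take the shorter arc (0..0.5) and normalize to 0..1.
    dh = abs(h1 - h2)
    dh = min(dh, 1.0 - dh) * 2.0
    dv = abs(v1 - v2)
    # Saturated pair -> hue difference dominates; desaturated -> intensity.
    w = (s1 + s2) / 2.0
    return w * dh + (1.0 - w) * dv
```

For instance, pure red vs. pure green (both fully saturated) is judged entirely by hue, while two grays of equal hue are judged entirely by intensity.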
 
 
RH5-Y has been given a new power drive system, with more motor torque (new motors, gears, and amplifiers) and better wheel fixtures (on the right in the picture). 
 
 
 
 
 
RH4-Y is the 2009 version of our RH-Y robot family. Improvements include the integration of a 3D-image ranger, coordination of two arms, and an alternative, omnidirectional platform. 
General considerations (social relevance and use) 
Manipulation is a basic ability expected for home assistance.  
One arm is necessary; a second arm opens a wealth of further possibilities. 
A special point is raising things from floor level, which is sometimes particularly difficult for humans. Another is having a certain reach in height. 
Here, the robot has been designed for the home: removable plastic trays, easily cleanable, help store goods temporarily and are easy for humans to handle further. In addition, the arms are asymmetric and complementary in their possibilities. 
 
 
 
Robocup-at-Home Community of Robots! (Graz, July 09, photo HEIG-VD.LaRA/PFG) 
 
This year we have been invited to a new, industrially sponsored demonstration league, the Hockey Challenge; a strong feature is its industrial-grade, standard platform. It is generally used in educational contexts, and for some of us there is a meaningful overlap with scientific research in services for the ageing society. A poster about it can be downloaded: click here (pdf, 2.5 Mb). 
 
 
Example of match and goal at Festo Hockey Challenge! (Graz, July 09, photo HEIG-VD.LaRA /PFG) 
 
 
RH3-Y is the 2008 version of our RH-Y robot family. Improvements include the consolidation of several components: the shoulder joint, gripper control, obstacle avoidance and mapping capabilities, and collaborative development methods (re. Subversion: e.g., very frequent changes made by team members in Suzhou were systematically propagated to our server in Yverdon thanks to the good infrastructure available at both ends of the line). More generally, the cognitic solutions for the Robocup 2007 At-Home test tasks, i.e. basic home assistance capabilities, have been consolidated. This year's improvements notably concern a velocity control phase at the coordination level, and the process of integrating wheel increments for very accurate perception of the effective path in 3-DOF Cartesian space (x, y, alpha). In addition, the HRI takes advantage of the same laser scanner for new driving possibilities: backward guidance and sleep/start commands.  
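The integration of wheel increments into an (x, y, alpha) pose can be sketched with the standard differential-drive midpoint model; RH3-Y's actual procedure and parameters are not published, so the wheel radius, tick count, and wheel base below are illustrative values only.

```python
import math

# Assumed robot parameters (illustrative, not RH3-Y's actual values):
WHEEL_RADIUS = 0.05    # m
TICKS_PER_REV = 2000   # encoder ticks per wheel revolution
WHEEL_BASE = 0.40      # m, distance between the two drive wheels

def integrate_odometry(pose, d_left_ticks, d_right_ticks):
    """One odometry step: update (x, y, alpha) from wheel encoder increments.

    Standard differential-drive midpoint integration: the heading is assumed
    to change linearly during the step, so translation uses the mid-heading.
    """
    x, y, alpha = pose
    per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    dl = d_left_ticks * per_tick
    dr = d_right_ticks * per_tick
    ds = (dl + dr) / 2.0              # distance travelled by the robot center
    dalpha = (dr - dl) / WHEEL_BASE   # change of heading
    x += ds * math.cos(alpha + dalpha / 2.0)
    y += ds * math.sin(alpha + dalpha / 2.0)
    return (x, y, alpha + dalpha)
```

With equal increments on both wheels the robot translates straight ahead; with opposite increments it rotates in place, which is a quick sanity check for the model.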
 
A step further has been made in map management: the Piaget environment now allows easy acquisition of maps, both statically and during ego or guided motions. This operation allows localisation and simple modeling of the environment. Nevertheless, our map editor remains useful for complementary manual editing (forbidden areas, mobility classes of objects, etc.). 
The picture above shows obstacles detected in the simulation map, i.e. derived indirectly from the a-priori map itself. The picture on the right shows the Suzhou home, with 3 simulated persons for the "Who is who" test in simulation mode (for training). In the middle, the map shows two high obstacles detected by RH3-Y in situ: one is a plant, the other some competition participants. 
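A minimal sketch of static map acquisition from one laser scan is shown below, using a simple sparse set of occupied cells rather than Piaget's actual map format; the cell size, function name, and scan conventions are all assumptions for illustration.

```python
import math

CELL = 0.1  # assumed grid resolution, m per cell

def mark_scan(grid, pose, ranges, angle_min, angle_step, max_range=5.0):
    """Insert one laser scan into a sparse occupancy map.

    grid:   set of (i, j) occupied cells
    pose:   (x, y, alpha) of the scanner in world coordinates
    ranges: list of range readings, one per beam
    """
    x, y, alpha = pose
    for k, r in enumerate(ranges):
        if r <= 0.0 or r >= max_range:
            continue  # no echo within range: nothing to mark for this beam
        theta = alpha + angle_min + k * angle_step
        ox = x + r * math.cos(theta)  # world position of the echo
        oy = y + r * math.sin(theta)
        grid.add((math.floor(ox / CELL), math.floor(oy / CELL)))
    return grid
```

A real implementation would also clear the free cells along each beam (e.g. by ray tracing) and accumulate evidence over several scans; this sketch only marks the hit points.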
 
Another area has been addressed in connection with the new "Introduce" test of Robocup-at-Home (RAH), which considers the expressive ability of robots; the long-term goal here is the acceptability of robots by potential users, as well as improved non-verbal communication with them (speech synthesis is a field that is currently quite mature, so new channels may be considered). The picture beside may give an idea of our approach: icons can be displayed at a good size, and the possible expressions may continuously vary as a fine function of a 2D basis: valence and arousal. 
More generally, however, robots can express their emotions in many other ways: blinking lights, shouting (re. RH2-Y in Atlanta), moving (dancing?), waving an arm? 
Those who know RH-Y also know the discreet voice of its motors, or how the right wheel tends to lag behind when the batteries run low. More information about the expression of emotions can be found in the following file to be downloaded: Click here (.ppt, ca 2Mb), or Click here (.pdf, ca 3Mb). Notice that the commentaries mostly contain the text to be spoken by RH3-Y, along with other expressive items. 
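To make the valence/arousal basis concrete, the sketch below reduces a continuous (valence, arousal) state to a displayed icon by nearest-prototype lookup; the prototype names and their placements on the 2D plane are illustrative assumptions, not the icons actually used on RH3-Y.

```python
import math

# Prototype expressions placed in the valence/arousal plane, both axes in
# -1..1 (illustrative placements, roughly following the usual circumplex).
PROTOTYPES = {
    "happy":   ( 0.8,  0.5),
    "excited": ( 0.5,  0.9),
    "angry":   (-0.7,  0.8),
    "sad":     (-0.8, -0.5),
    "calm":    ( 0.6, -0.6),
    "neutral": ( 0.0,  0.0),
}

def nearest_expression(valence, arousal):
    """Pick the prototype icon closest to the current (valence, arousal) state."""
    return min(PROTOTYPES,
               key=lambda name: math.dist(PROTOTYPES[name], (valence, arousal)))
```

A finer variant would interpolate display parameters (e.g. mouth curvature from valence, eye openness from arousal) instead of snapping to the nearest icon, which matches the "continuously varying" behaviour described above.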
 
 
 
 
A new, omnidirectional platform is in development. 
 
 
 
 
 
 
 
 
The RH2-Y robot resulted from various improvements to RH1-Y: collision avoidance and spatial estimation (use of a planar laser scanner), space representation (implementation of a fast-access map with virtual sensors), novel handling capabilities (basic arm and hand), as well as better integration of speech synthesis and voice recognition capabilities. 
 
Currently, voice recognition is done at a low level with standard resources (re. MS SAPI 4.4), while at a higher level Piaget allows for powerful dialogue management. 
 
Nevertheless, as well documented by our Japanese RAH friend Komei Sugiura, vocal interaction between humans and robots is often very challenging:  
"The following is a list of noise levels measured in Suzhou competitions: 
 
60-70dBA: "ordinary" for RoboCup@home competitions 
65-75dBA: explanation and translation via loud speakers 
75-85dBA: announcements. 
 
An utterance is attenuated to about 60-70dBA when a microphone is one-meter away, while the noise would be also 60-70dBA, which means that the SNR is nearly zero. In this environment, it is almost impossible for even a state-of-the-art speech recognition system to recognize the utterance.  
 
Estimation of difficulty level of speaker-INDEPENDENT speech recognition in the noise condition of RoboCup@Home competitions: 
 
5cm-20cm: "easy" 
50cm: difficult without noise reduction 
100cm: almost impossible." 
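These figures can be checked against a rough free-field model, in which the level of a point source falls by 20·log10(d) dB relative to its level at 1 m; real competition halls (reverberation, directivity, crowd movement) deviate from this, so the sketch is only indicative.

```python
import math

def snr_db(speech_dba, noise_dba):
    """Signal-to-noise ratio in dB: simply the level difference."""
    return speech_dba - noise_dba

def level_at_distance(level_at_1m_dba, distance_m):
    """Free-field point-source model: level relative to the 1 m reference.

    Simplification: -20*log10(d) dB, i.e. -6 dB per doubling of distance.
    """
    return level_at_1m_dba - 20.0 * math.log10(distance_m)
```

For example, an utterance at 65 dBA measured 1 m away in 65 dBA ambient noise gives an SNR of 0 dB, as quoted above, while moving the microphone to 10 cm raises the speech level by 20 dB under this model, which is consistent with the "easy" rating for close-talking distances.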
 
 
The RH1-Y robot inherited a lot of the functionalities of our previous autonomous robots.  
 
RH1-Y architecture is as follows: 
 
- a notebook runs the supervision software, 
 
- a Beckhoff API manages low-level input and output signals, 
 
- a TCP/IP AXIS camera provides low-level vision to our robot, 
 
- a two-axis motor controller permits the robot to move, 
 
- an Ethernet switch provides communication between those elements. 
 
 
Real-time interaction and supervision is programmed in a proprietary multiagent environment, named Piaget. Key features include real-time, embedded system support, extremely fine-grained multi-threading, VAL-style instructions for high-level specifications of compound transforms and trajectories, as well as thorough capabilities for visual engineering, general simulation, and debugging. 
 
Novel additional contributions for Robocup-at-Home application include vision primitives for guide recognition and tracking, multi-sensory data fusion, the integration of vocal communication (analysis and synthesis), wireless remote control ("chat" mode), as well as map and trajectory management. 
 
 
The robot includes two blue containers, which are meant to be used for moving objects at home: mobile telephone, tissue papers, food, games, newspapers, medicines, etc. The top unit is very easily removed, stacked, washed, etc. 
 
>> more details - ARY robots  
 
<< more general - LaRA and Robocup-at-Home  

 

(c) LaRA - Made with the help of Populus.org.
Last modified on 8.07.2011