LaRA and Robocup at Home 



RH-Y Pictures & Videos

Here you will find further information, beyond what is presented above, relating to our RH-Y robot. 
A 3 min. clip made in 2013 shows an overview of our developments for cooperating robots :  
"HEIG-VD Group of Cooperating Robots for Assistance at Home (RG-Y, RH-Y, OP-Y, …), with cognitic capabilities ensured by our Piaget environment (2007-2012); includes sequences taken while participating in world-level competitions". 
Click to play the video on YouTube 
download the clip (mpg, 116 Mb, v013.04.26). 
Some additional pictures and videos follow for the year 2011.  
In particular, compliant guiding, thermal vision, and a laundry application are illustrated by videos. 
For a video about compliance, Click here (ca. 15 Mb; in this case, a small acceleration has been selected for reaching speed targets induced by forces and torques).  
For the "laundry" demo, two similar videos have been made; to download the clip of the demo, Click here (.mov, ca. 20 Mb).  
Extracted scenes 
Cloth detection process (SbWCD: Saturation-Based Weighted Color Difference correlation):  
Presence of white cloth:  
Cloth grasping and color classification (process similar to above):  
Delivery to appropriate basket:  
Visual analysis for status of laundry baskets (full or not): 
If a basket is full, navigation to the (simulated) family room to warn the homeowners vocally (the robot is muted in the video to allow for commentary): 
Scene acquisition in color mode: 
Similar scene acquisition in thermal mode, followed by a location estimation by one of several processes standard in Piaget, based on window, blob or pixel analysis: 
An illustration of the visual location process in Piaget follows. Here, white pixels are extracted in 1-of-9 color mode in a window, followed by a position estimation based on the last pixel detected (cyan square), the global centroid (magenta square), or the centroid of the largest blob (green square). Other parameters are visible, in particular for calibration purposes: 
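The three position estimators mentioned above can be sketched as follows. This is a hypothetical re-implementation in Python/NumPy, not the actual Piaget code; the function and variable names are ours.

```python
import numpy as np
from collections import deque

def locate_white(mask):
    """Estimate a target position in a binary mask (1 = white pixel) by three
    methods analogous to those in the text: last pixel detected, global
    centroid, and centroid of the largest 4-connected blob (sketch only)."""
    ys, xs = np.nonzero(mask)          # white-pixel coordinates, raster order
    if len(xs) == 0:
        return None                    # no white pixel found
    # 1. last pixel detected in raster-scan order (cyan square in the figure)
    last_pixel = (int(xs[-1]), int(ys[-1]))
    # 2. global centroid of all white pixels (magenta square)
    centroid = (float(xs.mean()), float(ys.mean()))
    # 3. centroid of the largest 4-connected blob (green square),
    #    found with a simple breadth-first flood fill
    visited = np.zeros_like(mask, dtype=bool)
    best = []
    for sy, sx in zip(ys, xs):
        if visited[sy, sx]:
            continue
        blob, queue = [], deque([(sy, sx)])
        visited[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            blob.append((x, y))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not visited[ny, nx]):
                    visited[ny, nx] = True
                    queue.append((ny, nx))
        if len(blob) > len(best):
            best = blob
    bx = sum(p[0] for p in best) / len(best)
    by = sum(p[1] for p in best) / len(best)
    return {"last_pixel": last_pixel, "centroid": centroid,
            "largest_blob": (bx, by)}
```

As the figure suggests, the last-pixel estimate is cheapest but noise-sensitive, the global centroid is robust but biased by stray pixels, and the largest-blob centroid is usually the best compromise.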
For the "laundry" demo, a variation relies on NAO as a mediator to deliver messages to humans. In particular, when a basket of clothes to be washed is full, the information is transmitted over Ethernet-TCP/IP to NAO, who turns toward the homeowner and reports the situation; to download the alternate end of the demo, Click here (.mov, ca. 3 Mb). Notice that an improvement here, with respect to the 2010 demo in Singapore, is a "robust" communication scheme. 
For robust communication, from Piaget's point of view, the concept is to keep sending a message repeatedly until it is acknowledged in return, periodically attempting to reopen the appropriate communication channel. Here is a view of the dialogue screen of the program developed for implementing and testing this functionality. On the right is a complementary, minimal test program for NAO, which acknowledges received commands, moving the head and speaking a given text upon request, as can meaningfully be done in the context of the above "laundry" demo. 
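The repeat-until-acknowledged scheme can be sketched as follows. This is a minimal conceptual sketch, not the Piaget implementation: `open_channel` stands for whatever (re)opens the TCP connection to NAO, and all names and values are ours.

```python
import time

def send_until_acked(open_channel, message, retry_period=1.0, max_tries=5):
    """Keep sending `message` until the peer answers b"ACK", reopening the
    channel whenever a send or receive fails (conceptual sketch only)."""
    channel = None
    for _ in range(max_tries):
        try:
            if channel is None:
                channel = open_channel()       # (re)open the channel
            channel.send(message)              # transmit the message
            if channel.recv() == b"ACK":       # peer acknowledged: done
                return True
        except OSError:                        # broken pipe, refused, timeout...
            channel = None                     # drop and reopen next round
        time.sleep(retry_period)               # wait before retrying
    return False                               # give up after max_tries
```

The key point, as in the demo, is that a transient network failure only delays delivery: the sender keeps retrying, and the receiver's explicit acknowledgement is the sole stop condition.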
For the "laundry" demo, as for typically all our Robocup@Home applications, the start command can be given in a variety of ways: push button, tactile screen, mouse click, distance threshold, or vocal signal. Here the vocal signal is illustrated, along with the "large controls" panel of RH-Y: Click here (.mov, ca. 3.7 Mb).  
In 2010, our team, RH5-Y, participated in Robocup, in Singapore, in the Robocup-at-Home league, as in all previous years since the league was founded (2006).  
This year, two new elements appeared which are particularly meaningful.  
In the first case, our team demonstrated how mediation by a humanoid robot can help humans simply hold a dialogue with other machines: "Daniel*", sitting on a sofa in the living room, answered positively to a proposal made to him by "Nono-Y", our Nao-type humanoid, which, driving our omnidirectional platform OP-Y, consequently fetched its robot mate RH-Y, so that the latter could bring Daniel a drink and some snacks (in principle, robust media for machine communication include wifi or, even better, as demonstrated in another video below, ranger signals).  
Fig. Nono-Y, our Nao-type humanoid, on the lower right, ensures the mediation between the human and the other machines (the OP-Y platform on which Nono sits, and the RH-Y robot, which has brought the drink and snacks). 
Fig. In this picture, Nono-Y is wearing an external microphone, in order to broadcast its spoken words over the general sound infrastructure aimed at the public. 
To download the video of the humanoid mediation in the RAH-2010 Open Challenge, Click here (.mov, 79.9 Mb). 
To play it on YouTube: 
In the second case, collectively for our league, part of the competition took place in a "public" setting: a department store located in a mall. In particular, our RH-Y robot, guided by the natural motion of a human, navigated toward the appropriate shelves in order to learn where to fetch, later on autonomously, specific objects selected by the referees. 
Fig. The RH-Y robot follows a member of our team, learning in a first phase how to navigate toward the shelves where several objects specified by the referees are lying; in the next phase, the robot, on its own, goes and fetches a specific object and brings it back to the exit location. 
Other videos of RH5-Y in RAH 2010, Singapore competition:  
- "Registration and Inspection" of the RH5-Y robots (RH-Y, OP-Y, NAO-type humanoid), click here to download (.mov, 47.9 Mb). Success, except for an emergency button judged to be lacking in the case of NAO. In parallel, the "Poster Session" took place but is not visible in the video. 
- "Follow me", click here to download (.MOV, 46.1 Mb). RH-Y follows the "professional walker". The test is successful until checkpoint 1; at that point, a referee crosses the path, and the robot detects this and announces that it will stop for 3 seconds for safety reasons. The walker, however, goes on, and by the time the 3 seconds are over, the walker is too far away, i.e. farther than a safety distance defined in RH-Y, and therefore the robot stands still. 
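The stop behaviour described in the "Follow me" test can be sketched as a simple speed law: track the walker at a target distance, but stand still once the walker is beyond the safety distance. All numeric values here are hypothetical, not the actual RH-Y parameters.

```python
def follow_speed(walker_dist, target=1.0, safety_max=2.5, gain=0.8):
    """Forward-speed command (m/s) for person-following (illustrative values).

    walker_dist : measured distance to the walker, in meters.
    target      : desired following distance.
    safety_max  : beyond this distance the walker is considered lost,
                  so the robot stands still, as at checkpoint 1.
    gain        : proportional gain converting distance error to speed.
    """
    if walker_dist > safety_max:
        return 0.0                                # walker too far: stop
    return max(0.0, gain * (walker_dist - target))  # never drive backward
```

With such a law, the 3-second safety pause naturally leads to a standstill: while the robot waits, the walker drifts past `safety_max`, and the speed command stays at zero until the walker comes back within range.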
A short video made in January 2010, with clips of some of our robot abilities relating to Robocup-at-Home (replicate shown motions; follow a person; recognize gesture commands and move laterally; move over stairs and uneven ground - concept), can be watched on YouTube as follows: 
(A full version of the elements of the clip is available, most of them also below, along with many other tasks and abilities; to download the above clip, click here (15.8 Mb).) 
In 2009, RH4-Y participated in Robocup, in Graz, Austria. Unfortunately, some of our Graz performances were not directly recorded by ourselves (and some existing videos made by others are not easily accessible), but we (PFG) took many pictures, which can be viewed at the following URL:
"Introduce" was the first test, where, correctly, RH4-Y autonomously entered the "home", briefly presented itself and our team, and finally left through another door.  
"FollowMe" was another test, in which OP2-Y is taught a new home environment just by being guided and controlled by gestures (one of the RAH executives, Tijn, here plays the "professional walker"). The video can be downloaded: click here (.wmv, 30 Mb).  
"OpenChallenge" gives teams the opportunity to show special features. Here, during preparation in Graz, OP2-Y follows RH4-Y, which turns around Max. In the official presentation, one robot controlled the other by gestures, and OP-Y demonstrated its ability to move laterally. Two-arm coordination was part of the concept (and appears in a video below) but was not shown within the official 7-minute time slot. 
"WalkAndTalk" is a natural phase following "FollowMe"; the concept is now, in addition to learning paths and topologies, to associate, through a natural vocal dialogue, a name with each location visited. In this way, it later becomes possible to let the robot navigate on its own to the declared locations. The video can be downloaded: click here (.wmv, 112 Mb). 
While in the previous two videos the initial guiding and talking part is demonstrated in Graz conditions, using OP2-Y, the next video additionally includes the subsequent autonomous navigation part, in our Yverdon context, and relies on our RH4-Y platform (our Piaget environment supports both platforms equally well). The video can be downloaded: click here (.wmv, 66 Mb). 
Our "BarTender" demo started reasonably well, with a person from the public specifying which drink or mix of drinks she wanted. RH4-Y could handle the dialogue and managed to bring the right bottles near the cup; then some error in fluid dynamics ;-) led it to slightly miss the cup, pouring some liquid on the table! Elements of improvement might include a 6th degree of freedom in the arm kinematics, rotation at lower speed about a point in space closer to the upper center of the cup, and at least a few preliminary trials in real conditions (Graz tables, carpet, fluid viscosity, etc.). 
Basic abilities encouraged to be demonstrated in Robocup-at-Home include opening doors. Even though this could be done successfully 5 times during the preparation phase, at the official moment a rare mechanical failure (a wheel loose on its axle) jeopardized the efforts made by RH4-Y in calibrating itself with respect to the home infrastructure, so that it failed to align itself with the handle. A video can be downloaded: click here (.wmv, 17 Mb). Possible improvements include a reliable fixing of the wheel on its axle (e.g. as in OP2-Y) and, better still, as in OP2-Y, a mobile platform capable of lateral motions (omnidirectional base), thus reducing the need for rotations and maneuvering in general. 
Cooperating robots and domestic help seem to raise public interest. Three examples of audiences follow: Robocup, Graz, July 2009; Festival de robotique, EPFL, Lausanne, 16 May 2009; and Swiss Eurobot, La Marive, Yverdon-les-Bains, 8-9 May 2009:  
(Photo Orkomedix). 
(Photo EPFL). 
(Photo PFG). 
Progress has been made in the direction of the basic abilities and of the specific tests defined in the Robocup-at-Home rulebook. 
As one of the improvements in basic abilities with respect to past editions, an omnidirectional platform with a good ability to overcome obstacles in the frontal direction has been developed (OP2-Y) (Click here to download video; 39 Mb, .mpg). 
Videos illustrate some of our preparation steps for tests defined for the competition in Graz. 
- "Introduce", and 
- "Follow me" are the first two tests (Click here to download video; 41 Mb, .wmv). 
- "Fetch and Carry" is test number ... (Click here to download video; ca. 27 Mb, .wmv). 
- "Lost and Found" is test number ... (Click here to download video; .wmv, ca. 36 Mb). 
- "Bartender demo". An application prepared for the Bartender demo (Click here to download video; ca. 87 Mb, .wmv). 
Last year, RH3-Y participated in Robocup 2008, in Suzhou, China.  
FastFollow is one of the tests performed by RH3-Y in Suzhou. In competition, it in particular traveled through a home faster than its opponent! In opposite directions! There was no crash (speed limits are not yet enforced at home ;-) ). On this topic, the key property remains TaSOC (time and space object continuity); improvements this year notably concerned a velocity-control phase at the coordination level and the process of integrating wheel increments for very accurate perception of the effective path in 3-DOF Cartesian space (x, y, alpha). In addition, the HRI takes advantage of the same laser scanner for new driving possibilities: backward guidance and sleep/start commands.  
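The integration of wheel increments into a 3-DOF pose (x, y, alpha) can be sketched with a generic differential-drive model. This is an illustrative sketch only; RH3-Y's actual kinematics and the Piaget implementation may differ.

```python
import math

def integrate_increments(pose, d_left, d_right, wheel_base):
    """Update a pose (x, y, alpha) from one pair of wheel increments.

    pose       : current (x, y, alpha), with alpha in radians.
    d_left/right : distances traveled by the left/right wheels (m).
    wheel_base : distance between the two wheels (m).
    """
    x, y, alpha = pose
    d_center = (d_left + d_right) / 2.0        # advance of the robot center
    d_alpha = (d_right - d_left) / wheel_base  # change of heading
    # midpoint rule: advance along the average heading of this step,
    # which is more accurate than using the initial heading alone
    x += d_center * math.cos(alpha + d_alpha / 2.0)
    y += d_center * math.sin(alpha + d_alpha / 2.0)
    return (x, y, alpha + d_alpha)
```

Summing such small steps at each encoder sampling period yields the accurate perception of the effective path mentioned above; the midpoint correction matters precisely when the increments combine translation and rotation.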
The following video shows the official competition in Suzhou, between "RH3-Y" and "Erasers". 
"To follow a human" should be a basic ability of any domestic robot, for several purposes. For example, the following picture shows Aziz demonstrating RH3-Y's ability to follow him through the venue, carrying on its own the food and other goods for our team. 
Actually, following is not enough; sometimes the human may lead RH3-Y into a difficult situation, from which backward motions are very useful. Similarly, we may like to park our domestic robot in a confined area. The following video shows a test recorded in the training area of the Suzhou competition venue, after the official competitions. Click to download (.wmv, ca. 3 MB) 
Version with the robot upright (.avi, XviD codec, 5.8 Mb): Click to download  
The RH3-Y robot carries objects and follows a guide at HEIG-VD for the "Salon de la mobilité 2030", in Yverdon-les-Bains, Switzerland, 29-30 Aug. 2008 (.wmv, 5.3 Mb): Click to download  
The following video shows a preparation test in our lab. Click to download (.wmv, ca. 16 MB) 
This video illustrates the "Fetch & Carry" ability of RH3-Y, another basic feature expected from domestic robots in the context of Robocup-at-Home. Click to download (.wmv, ca. 11 MB) 
The information below mostly describes RH2-Y, which took part in Atlanta Competition (Robocup 2007), and RH1-Y, which did similarly in Bremen (Robocup 2006). 
In fact, the development is incremental, and improvements are typically still made against the same rules (test tasks) even after the competitions, since it takes many months for new rules to be elaborated. 

A very good impression of the possibilities of RH2-Y can be gained from watching the official test "CopyCat", phase 2, which took place in Atlanta. The idea is for the robot to replicate human motions. Click to download (.wmv, 14 MB) 




(Four newspaper articles written in French about the participation of RH2-Y in Robocup-at-Home 2007, as well as a radio program, are available on our site in French: Click to go.)
Robocup in general, and more specifically the 2006 world competition, are described, in German, in a very interesting 30-minute program on the German television channel ZDF (the Robocup-at-Home part is shown in the last 4 minutes). 
Here are some of our team and robot pictures extracted from the ZDF program. (Click on the pictures to enlarge them.) 
Our team (right) and the Carnegie Mellon team (left). The kitchen can be guessed in the background, left. 
Our robot, RH1-Y, facing a leader with an orange coat... 
The RH1-Y screen shows that the orange coat has been detected and redrawn, for checking, in green. 
View of the other half of the Bremen "home", with the RWTH Aachen robot. 
Some other pictures and videos of RH-Y... 
Our RH-Y robot (mostly in its versions 3, 4, and 5) is described above; here we give additional past pictures and videos relating to some internal tests and also to its possible uses. Below you find information relating to the previous versions: RH2-Y, which was in competition in Atlanta (Robocup 2007), and RH1-Y, which was similarly present in Bremen (Robocup 2006). 
RH2Y .... 



Here are some detailed tests, made in our lab, either before or after the competition: 
- "Follow & Guide": follows, stops at an obstacle, follows and guides back... with vision, laser and other sensors, and vocal communication. Download file (ca. 44 Mb) (.wmv, tested with Windows Media Player)  
Go, locate, pick and bring back an object, 3rd demo: 
Download file (ca. 20 Mb) (.wmv, tested with QuickTime) 
Go & pick an object, 2nd demo: 
Download file (ca. 19 Mb) (.wmv, tested with QuickTime) 
Go & pick an object, 1st demo: 
Download file (ca. 38 Mb) (.MOV, tested with QuickTime) 
"Manipulate": go, fetch an object, and carry it for at least 3 meters. 
Download file (ca. 18 Mb) (.wmv, tested with QuickTime) 
"Navigate": move autonomously to a spoken place, avoiding possible obstacles on the way. 
Download file (ca. 22 Mb) (.wmv, tested with QuickTime) 
"CopyCat": replicate the motions made by humans (teaching by showing). 
Download file (ca. 17 Mb) (.wmv, tested with QuickTime) 
"Who is who?": learn to know people and later recognize the ones you know (uses vision and vocal dialogue). 
1st demo: learn 4 persons and later recognize them and detect a stranger (here with a final decision error). 
Download file (ca. 29 Mb) (.wmv, tested with QuickTime) 
2nd demo: test done in simulation. 
Download file (ca. 23 Mb) (.wmv, tested with QuickTime) 
3rd demo: recognition phase only. 
Download file (ca. 11.5 Mb) (.wmv, tested with QuickTime) 
4th demo: in principle similar to the 1st demo, here with a minimal representation of humans, a linear scanning motion for the robot, vocal dialogue, and no error. 
Download file (ca. 13.5 Mb) (.wmv, tested with QuickTime) 
"Lost&Found": go and look for a specified object. 
1st demo: an obstacle is on the way. 
Download file (ca. 13 Mb) (.wmv, tested with QuickTime) 
2nd demo: the obstacle is not so visible. 
Download file (ca. 11 Mb) (.wmv, tested with QuickTime) 
3rd demo: visit all search areas. 
Download file (ca. 26 Mb) (.wmv, tested with QuickTime) 
Concept of path management in official Robocup-at-home "home" 
Basic tests for following a human: 1. Download file (8.3Mb) (.wmv, tested with Quicktime), vision and other sensors, 
2. Download file (9.6Mb) (.wmv, tested with Quicktime), includes lateral obstacle avoidance,  
3. Download file (6Mb) (.wmv, tested with Quicktime), based on distance control with ultrasonic sensors along trajectory,  
4. Download file (15Mb) (.wmv, tested with Quicktime), visual tracking, 
Basic tests for vocal interaction: 1. Download file (6Mb) (.wmv, tested with Quicktime),  
2. Download file (1Mb) (.wmv, tested with Quicktime),  
Pictures and videos are also available for some of our past robots, for example Dude (2005): 
>> more pictures and videos: ARY robots, description and examples 
<< more general - LaRA and Robocup-at-Home  


(c) LaRA
Last modified on 27.04.2013