One of the most intriguing visions in robotics is that of a robotic housemaid capable of helping us with everyday household tasks. Researchers at Albert-Ludwigs-Universität Freiburg in Germany have now started to develop a robotic application for cleaning up untidy rooms using the PR2.
Imagine you just had a dinner party with your friends. Everybody has left, and you would head to bed were it not for the mess left behind; dirty plates and half-empty glasses litter the table. As you begin to clear the dishes, you remember a new personal robot application that recently appeared in the App Store. A few moments pass and you've found and downloaded the TidyUpRobot Application. Shortly thereafter, your personal robot enters the room, analyzing the dinner table using its laser scanner and cameras, and begins bringing the glasses and plates to the dishwasher, and the leftover food to the fridge. Science Fiction? Albert-Ludwigs-Universität Freiburg will use their PR2 to continue their work towards this goal.
Over the next two years, the PR2 team from the University of Freiburg will work on both the theoretical and practical problems of enabling a household robot to reliably and autonomously clear objects from a table and return them to where they belong. Such an accomplishment will require progress in a number of robotics research areas, including navigation, perception, and manipulation. Initially, the robot must obtain a map of its environment so that it can navigate from room to room. The robot must be able to recognize important items, such as the trashcan and dishwasher, and remember their locations. Additionally, the robot needs to learn how to grasp a wide variety of objects, as different objects require varying grasp positions and handling. The University of Freiburg aims to piece together these and other open robotics challenges and develop robotic capabilities that will make our lives more efficient and productive.
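The "recognize important items and remember their locations" step can be pictured as a small semantic map that attaches named items to positions in the robot's map frame. The sketch below is purely illustrative; the class and method names are assumptions, not the Freiburg team's actual software.

```python
import math
from dataclasses import dataclass, field

@dataclass
class SemanticMap:
    """Toy semantic map: remembers where named items were seen
    as (x, y) coordinates in the robot's map frame."""
    landmarks: dict = field(default_factory=dict)

    def remember(self, name, x, y):
        """Store (or update) the location of a recognized item."""
        self.landmarks[name] = (x, y)

    def locate(self, name):
        """Return the stored (x, y) of an item, or None if unknown."""
        return self.landmarks.get(name)

    def nearest(self, x, y):
        """Name of the stored item closest to position (x, y)."""
        return min(self.landmarks,
                   key=lambda n: math.dist((x, y), self.landmarks[n]))

m = SemanticMap()
m.remember("dishwasher", 4.2, 1.0)
m.remember("trashcan", 0.5, 3.3)
m.locate("trashcan")   # → (0.5, 3.3)
m.nearest(4.0, 1.1)    # → "dishwasher"
```

A real system would of course anchor such entries in a metric map built by SLAM and keep uncertainty for each location; the dictionary here only shows the bookkeeping idea.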
Flexible Skill Acquisition and Intuitive Robot Tasking for Mobile Manipulation in the Real World (First-MM) is a project funded by the European Commission within FP7. The goal of First-MM is to build the basis for a new generation of autonomous mobile manipulation robots that can be flexibly instructed, even by non-expert users, to perform complex manipulation and transportation tasks. The project will develop a novel robot programming environment that allows such users to specify complex manipulation tasks in real-world environments. In addition to a task-specification language, the environment includes concepts for probabilistic inference and for learning manipulation skills from demonstration and from experience. The project builds upon and extends recent results in robot programming, navigation, manipulation, perception, learning by instruction, and statistical relational learning.
The goal of INDIGO is to develop technology that will advance human-robot interaction. This will be achieved both by enabling robots to perceive natural human behaviour and by making them act in ways that are familiar to humans. Beyond recognizing and producing human speech and facial expressions, the robots will be able to navigate and maneuver in a socially compatible way. This comprises following and guidance behaviours as well as adapting to and imitating human motion patterns, e.g. to avoid collisions. The final system will be deployed on the premises of a museum, where it will work as a robotic tour guide.
The project aims to integrate leading-edge technology in the field of service robotics and to develop an open, extensible system architecture. The project is funded by the German ministry of research.
The main goal of the EU project CoSy is to advance the science of cognitive systems through a multi-disciplinary investigation of the requirements, design options, and trade-offs for human-like, autonomous, integrated, physical (e.g., robot) systems. This includes requirements for architectures, for forms of representation, for perceptual mechanisms, and for learning, planning, reasoning, motivation, action, and communication.
Localization is one of the basic skills of a mobile robot, and much progress has been made in this field over the past years. In particular, the RoboCup competitions have focused robotics research on a unified challenge problem, and the yearly rule adaptations aim at bringing robotic soccer ever closer to the real world. Until now, however, most approaches still rely on special sensors (like the laser range scanners preferably used in the RoboCup rescue leagues) or artificial environments (like the color-tagged soccer fields used in the RoboCup soccer leagues). In this thesis, a novel approach is presented that provides compass information purely from the visual appearance of a room. A robot using such a visual compass can quickly learn the appearance of previously unknown environments. From the degree of resemblance, the robot can additionally recognize, qualitatively, how far it has moved from the original training spot; this can be used for visual homing. The visual compass algorithm is efficient and scalable and can therefore run in real time on almost any contemporary robotic platform; accordingly, it has been implemented on the popular Sony entertainment robot Aibo. Extensive experiments have validated that the approach works in a wide variety of environments and that a visual compass can supply a mobile robot in natural environments with accurate heading estimates. Finally, experiments show that a robot using multiple compasses is able to estimate its translational pose.
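The core idea of a visual compass, recovering heading from the appearance of the surroundings, can be illustrated with a toy sketch: store per-sector histograms of a panoramic view at the training spot, then find the circular shift that best re-aligns the current view's histograms with the stored ones. This is only a minimal illustration of the principle, not the thesis implementation (which uses a richer appearance model and probabilistic filtering over time); all names and the gray-level histogram representation are assumptions.

```python
import numpy as np

def column_histograms(image, n_sectors=32, n_bins=32):
    """Split a panoramic gray-level image into vertical sectors and
    return one normalized intensity histogram per sector."""
    h, w = image.shape
    sector_w = w // n_sectors
    hists = []
    for s in range(n_sectors):
        sector = image[:, s * sector_w:(s + 1) * sector_w]
        hist, _ = np.histogram(sector, bins=n_bins, range=(0, 256))
        hists.append(hist / hist.sum())
    return np.array(hists)            # shape: (n_sectors, n_bins)

def estimate_heading(reference, current, n_sectors=32):
    """Heading (degrees) whose circular shift best re-aligns the
    current sector histograms with the reference, plus the score."""
    best_shift, best_score = 0, -np.inf
    for shift in range(n_sectors):
        aligned = np.roll(current, -shift, axis=0)
        score = -np.abs(aligned - reference).sum()   # L1 similarity
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift * (360.0 / n_sectors), best_score
```

For example, if the current panoramic image is the training image rotated by five sectors, the best-scoring shift is five, and the returned heading is five times the angular width of a sector. The match score itself degrades as the robot moves away from the training spot, which is the resemblance cue the thesis exploits for visual homing.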
The University of Amsterdam organized the 4-Legged League at the Dutch Open this year. In the Open Challenge, the UvA students Mark de Greef and Dave van Soest finished 3rd with their demonstration of Automatic Color Calibration. In the soccer competition we beat a German team three times, reached the semi-finals, and finished 4th.
For RoboCup 2006, the Universities of Amsterdam, Delft, and Utrecht concentrated on the Technical Challenges. For the Open Challenge, the UvA student Jürgen Sturm demonstrated Panoramic Localization. Combined with the improved penalty shooting demonstrated at the New Goal challenge, this earned the Dutch Aibo Team the 3rd prize in the Technical Challenges. In the soccer competition the quarter-finals were reached, which meant direct qualification for RoboCup 2007 in Atlanta.
last modified on 2010/08/17 15:27