A major challenge in robotics and artificial intelligence
lies in creating robots that are able to cooperate with people
in human-populated environments, e.g. for domestic assistance
or elderly care. Such robots need skills that allow
them to interact with the world and the humans living
and working therein. In this paper we investigate the
question of spatial understanding of human-made environments.
The functionalities of our system comprise
perception of the world, natural language, learning, and
reasoning. For this purpose we integrate state-of-the-art
components from different disciplines in AI, robotics
and cognitive systems into a mobile robot system. The
work focuses on the description of the principles we
used for the integration, including cross-modal integration,
ontology-based mediation, and multiple levels of
abstraction of perception. Finally, we present experiments
with the integrated "CoSy Explorer" system and
list some of the major lessons that were learned from its
design, implementation, and evaluation.
Paper: [pdf: 570k]
Poster: [pdf: 3.2M]
Bibtex
@InProceedings{zender2007aaai,
  title     = {An Integrated Robotic System for Spatial Understanding and Situated Interaction in Indoor Environments},
  author    = {Hendrik Zender and Patric Jensfelt and Oscar Martinez Mozos and Geert-Jan M. Kruijff and Wolfram Burgard},
  booktitle = {Proceedings of the Conference on Artificial Intelligence},
  address   = {Vancouver, British Columbia, Canada},
  year      = {2007},
  url       = {http://www.informatik.uni-freiburg.de/~omartine/publications/zender2007aaai.pdf},
}
CoSy Explorer 2006-11-14 Stockholm Full Run
This video shows a full run of the integrated "CoSy Explorer" mobile robot system, as demonstrated at the CoSy project meeting in Stockholm, Sweden, in November 2006.