


AVES/DyMoSiHR: Acoustic and Visual Environmental Simulation/Dynamic Modeling and Simulation of Humans in Robot Interaction

Project Topic

The project aims at simulating the propagation of sound in the immediate vicinity of an autonomous mobile robot situated in an office environment. This simulation is used to evaluate the robot's control strategies for identifying sound events and reacting to them. In the project's extension, the developed techniques will be applied to human speech and to moving humans as sound sources.

Project Description

The main objective of this project is to develop new methods for simulating the optical and acoustic properties of an indoor scene, so that these data can be used to evaluate the control algorithms of autonomous mobile robots.

A robot orients itself inside a room using the information provided by its sensor systems; besides distance sensors, optical and acoustic sensors supply these data. This defines the core task shared by the research groups involved in this project: in order to interact with its environment and execute its tasks in a context-sensitive manner, the robot must be able to interpret the information provided by its sensors. However, suitable environments and stimuli for testing these capabilities are not always available. To test a robot's control algorithms, this project therefore aims at providing a realistic simulation of the acoustic and visual properties of indoor environments. For this purpose, the project uses technologies developed by the collaborating research groups "Image Understanding and Pattern Recognition", "Robot Systems", and "Computer Graphics", as well as DFKI's research lab "Intelligent Visualization and Simulation".

In particular, the work builds upon the audio-visual Virtual-Reality presentation system developed in cooperation between the University of Kaiserslautern's Computer Graphics group, the Fraunhofer Institute for Industrial Mathematics (ITWM), and DFKI's research lab "Intelligent Visualization and Simulation" in the context of the research project "Acoustic Simulated Reality".

While the first part of the project focuses on static sound sources emitting one characteristic signal, the project's extension aims at applying the techniques developed there to humans in office environments. This entails modeling and simulating moving sound sources as well as the dynamic aspects of speech. The techniques developed here are a central prerequisite for enabling robots to interact with humans. As a platform for integrating and evaluating these techniques, the humanoid robot head ROMAN is available at the Robotics Laboratory of the Department of Computer Science.

Project Members

Project Chair

Participating Research Groups

  • Computer Graphics Group, Department of Computer Science
  • Robotics Laboratory, Department of Computer Science
  • Research Department "Intelligent Visualization and Simulation", German Research Center for Artificial Intelligence (DFKI)
  • Research Department "Image Understanding and Pattern Recognition", German Research Center for Artificial Intelligence (DFKI)

Scientific Personnel

  • Dipl.-Inform. Eduard Deines (Computer Graphics Group)
  • Dipl.-Inform. Norbert Schmitz (Robotics Laboratory)
  • Dipl.-Technoinform. Jens Wettach (Robotics Laboratory)

External Cooperation

Project Events and Achievements

  • Project start date: December 1st, 2005
  • Project end date: December 31st, 2007

Presentations

  • April 12th, 2006: Presentation of the project by Peter Dannenmann at the Research Seminar of the Image and Signal Processing Group, Department of Computer Science, University of Leipzig (slides available in German only)

Project Publications

Simulation and Visualization of Indoor Acoustics for Robot Control

Norbert Schmitz, Jens Wettach, Eduard Deines, Peter Dannenmann, Martin Bertram. In: Ninth IASTED International Conference on Computer Graphics and Imaging. International Association of Science and Technology for Development. Accepted for publication, February 2007

Autonomous Behavior-Based Exploration of Office Environments

Daniel Schmidt, Tobias Luksch, Jens Wettach, Karsten Berns. In: 3rd International Conference on Informatics in Control, Automation, and Robotics. August, 2006

r14 - 14 Mar 2007 - PeterDannenmann

Copyright © University of Kaiserslautern, 2009. All material on this website is the property of the respective authors.
Questions or comments? Contact DASMOD webmaster.