


This package provides a perception-algorithm-independent simulation of 6-D object localization for 3-D object search by a mobile robot: based on the poses of the searched objects relative to the robot's current viewing frustums, it estimates the detectability of the objects.


The system loads objects specified in an object constellation file and recognizes them based on the current camera poses (of both the left and the right camera). For a single object, the given pose is first transformed from the world frame into the camera frames. For each camera frame, the transformed pose is rated based on its position in the camera frustum, using the distance to the camera and the angles between the object and the camera's visual axis (azimuth and elevation). If the rating exceeds a specified threshold, the pose is considered valid and the object is marked as found and returned.
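The rating described above can be sketched as follows. The function names, the linear fall-off scheme, and the parameter names (`fcp`/`ncp` for far/near clipping distance, `fov_h`/`fov_v` for the frustum half-angles in degrees) are illustrative assumptions, not the node's actual implementation:

```python
import math

def rate_pose(position, fcp, ncp, fov_h, fov_v):
    """Rate a position given in a camera frame (z along the visual axis).

    Returns a value in [0, 1]; 0 means the object lies outside the frustum.
    """
    x, y, z = position
    if z <= 0.0:  # behind the camera
        return 0.0
    azimuth = math.degrees(math.atan2(x, z))
    elevation = math.degrees(math.atan2(y, z))

    def linear(value, limit):
        # 1.0 at the ideal value, falling off linearly to 0.0 at the limit
        return max(0.0, 1.0 - abs(value) / limit)

    return min(
        linear(z - (ncp + fcp) / 2.0, (fcp - ncp) / 2.0),  # distance rating
        linear(azimuth, fov_h),                            # azimuth rating
        linear(elevation, fov_v),                          # elevation rating
    )

# The object counts as "found" when the rating exceeds a threshold:
# found = rate_pose(p_cam, fcp=1.8, ncp=0.2, fov_h=30.0, fov_v=20.0) > 0.5
```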

Optionally, random errors that occur in real-world recognition scenarios can be simulated in three ways:

  1. To simulate frame-specific recognition errors, correctly found objects can be randomly dropped every couple of frames
  2. To simulate limited recognition accuracy, random noise is added to the position of each found object pose, slightly altering it in all dimensions
  3. Analogously to the position noise, errors in the orientation can be generated for each dimension
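The three error modes can be combined into one sketch. Function name, parameter names, and the use of Gaussian noise are assumptions for illustration; the node's actual noise model may differ:

```python
import random

def add_pose_noise(position, orientation_rpy,
                   pos_sigma=0.005, rot_sigma=0.01, drop_prob=0.1):
    """Simulate recognition errors on a detected pose:

    1. drop the detection entirely with probability drop_prob,
    2. add Gaussian noise to each position component,
    3. add Gaussian noise to each orientation angle.

    Returns None when the detection is dropped, else the noisy pose.
    """
    if random.random() < drop_prob:
        return None
    noisy_pos = tuple(c + random.gauss(0.0, pos_sigma) for c in position)
    noisy_rpy = tuple(a + random.gauss(0.0, rot_sigma) for a in orientation_rpy)
    return noisy_pos, noisy_rpy
```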

The object constellation file uses an XML format and needs to have a specific layout. An example of a valid constellation file:

    <Object type="Cup" id="011021054100" mesh="package://asr_object_database/rsc/databases/segmentable_objects/Cup/object.dae" angles="quaternion">-1.2902,0.729374,0.755761,0.721985,-0.665815,-0.13442,0.131754 </Object>
    <Object type="CoffeeBox" id="0" mesh="package://asr_object_database/rsc/databases/textured_objects/CoffeeBox/CoffeeBox.dae" angles="euler">1.5,-0.5,1.5,90,0,0 </Object>
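Reading such a file can be sketched with the standard library's XML parser. The assumptions here: the `Object` elements are wrapped in a single root element, and the comma-separated values list the position first (3 values), followed by the orientation (4 values for `quaternion`, 3 for `euler`):

```python
import xml.etree.ElementTree as ET

CONSTELLATION = """<Objects>
  <Object type="Cup" id="011021054100"
          mesh="package://asr_object_database/rsc/databases/segmentable_objects/Cup/object.dae"
          angles="quaternion">-1.2902,0.729374,0.755761,0.721985,-0.665815,-0.13442,0.131754</Object>
</Objects>"""

def parse_constellation(xml_text):
    """Parse a constellation file into a list of object descriptions."""
    objects = []
    for node in ET.fromstring(xml_text):
        values = [float(v) for v in node.text.strip().split(",")]
        objects.append({
            "type": node.get("type"),
            "id": node.get("id"),
            "mesh": node.get("mesh"),
            "angles_format": node.get("angles"),  # "quaternion" or "euler"
            "position": values[:3],
            "orientation": values[3:],
        })
    return objects
```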


Needed packages

If the feature is not turned off, there also needs to be a valid transformation between the camera frames and the frames of the objects in the configuration file (by default /map). For more information on the transformation, see asr_kinematic_chain_dome, asr_kinematic_chain_mild and tf.
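For illustration, a heavily simplified version of this world-to-camera transform (translation plus a yaw-only rotation; the real node looks up the full 6-D transform via tf):

```python
import math

def world_to_camera(point_w, cam_pos, cam_yaw):
    """Transform a world-frame point into a camera frame whose pose in the
    world is given by a translation cam_pos and a yaw angle (radians).

    This is a 2-D-simplified sketch: p_cam = R(-yaw) * (p_w - cam_pos).
    """
    dx = point_w[0] - cam_pos[0]
    dy = point_w[1] - cam_pos[1]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    return (c * dx - s * dy,
            s * dx + c * dy,
            point_w[2] - cam_pos[2])
```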

Needed software

Needed hardware

No hardware is needed for this package to work, but by default there has to be a camera frame, which can either be simulated or provided by a real camera setup.

Start system

First, set the configuration parameters depending on your usage scenario (see configuration). After you have chosen the object constellation you want to use and set the topics correctly, start the process:

roslaunch fake_object_recognition fake_object_recognition.launch

Now call one of the offered services to add or remove objects from the list of detectable entities.

ROS Nodes

Subscribed Topics

Apart from the frames mentioned above, there are no input topics for this package.

Published Topics


This package offers two sets of parameters you can adjust: static ones, which you set in the params.yaml file located in the param directory of the package, and dynamic ones, which you can set either in the launch file or at runtime via dynamic_reconfigure.

Static parameters:

Dynamic parameters:

Needed Services

No external services are called by this package.

Provided Services

2024-07-20 12:40