
Only released in EOL distros.

Introduction

Given a bagfile of calibration_msgs/RobotMeasurement messages collected using the pr2_calibration_executive package (or similar), pr2_calibration_estimation uses this data, along with several configuration files, to estimate various parameters of the PR2.

Technical Explanation: This is a multi-step nonlinear optimizer that uses scipy's Levenberg-Marquardt implementation, scipy.optimize.leastsq, to estimate various parameters of the PR2 in a bundle-adjustment-like framework.
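
The PR2-specific models are described below, but the core of each step is a least-squares fit. The following is a minimal illustration of how scipy.optimize.leastsq is driven; the residual function and data here are purely illustrative and are not the package's actual measurement model:

  import numpy as np
  from scipy.optimize import leastsq

  # leastsq drives a residual vector (measured features minus features
  # predicted from the current parameter estimate) toward zero.
  def residuals(params, measurements):
      predicted = 2.0 * params          # stand-in measurement model
      return measurements - predicted

  measurements = np.array([1.0, 2.0, 3.0])
  x0 = np.zeros(3)
  x_opt, ier = leastsq(residuals, x0, args=(measurements,))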

Stability

This package is still unstable, and does not have a released API.

Concepts

This stack is still very experimental, and many of its internal APIs have not been standardized (or even reviewed). However, many people are interested in porting this unstable code to other robot platforms, so the information below is provided to help port this stack to other robots despite its unstable status.

Launching the Estimator

The main goal of the pr2_calibration_estimation package is to provide a reconfigurable nonlinear optimizer for estimating calibration parameters for some defined robot. For the PR2, the estimator is launched using pr2_calibration_launch/estimate_params/config_pr2_beta/estimate_pr2_beta.launch. This launch file loads the estimation configuration onto the parameter server and then launches the estimator (multistep_cov_estimator.py).

Estimator Command Line Usage

multistep_cov_estimator.py [bagfile] [target_dir]
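
For example (paths are illustrative):

multistep_cov_estimator.py cal_measurements.bag /tmp/pr2_calibration

where [bagfile] is the bag of calibration_msgs/RobotMeasurement samples and [target_dir] is the directory where the estimator writes its output.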

Estimation Configuration (Parameter Server)

The estimator looks for its configuration in the /calibration_config namespace. The PR2's configuration is loaded onto the parameter server by pr2_calibration_launch/estimate_params/head_then_arms_params.launch.
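
As a quick sanity check, the loaded configuration can be inspected from a small ROS node; this sketch just prints the top-level sections under /calibration_config (the node name is illustrative):

  import rospy

  # Read the estimator's configuration from the parameter server and list
  # the top-level sections that were loaded.
  rospy.init_node('print_calibration_config')
  config = rospy.get_param('/calibration_config')
  print(sorted(config.keys()))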

The Calibration Bagfile

The estimator relies on calibration samples in the specified bagfile in order to run the calibration. These messages must be of type calibration_msgs/RobotMeasurement and be recorded on the /robot_measurement topic. See the calibration_msgs/RobotMeasurement message definition for a description of its fields.
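
To see what is in a calibration bag, the samples can be iterated with the rosbag Python API; a small sketch (the bag filename is illustrative):

  import rosbag

  # Count the calibration samples stored in the bag.
  bag = rosbag.Bag('cal_measurements.bag')
  n_samples = 0
  for topic, msg, t in bag.read_messages():
      if topic.lstrip('/') == 'robot_measurement':
          n_samples += 1
  bag.close()
  print("Found %d calibration samples" % n_samples)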

Defining the Robot's Primitives

The calibration estimator relies on a yaml file that stores the configuration of the robot's various kinematic and sensor primitives that we may want to estimate. For the PR2, this file is pr2_calibration_launch/estimate_params/config_pr2_beta/system.yaml. The currently supported primitives are dh_chains, tilting_lasers, transforms, and rectified cameras. Although not strictly a "robot primitive", checkerboard intrinsics are also defined in this file.

DH Chains

A DH Chain primitive is used to represent a kinematic linkage using Denavit-Hartenberg parameters. Each DH Chain primitive must be included as a dictionary element under the dh_chains section of the config file. The parameters defined below are used by pr2_calibration_estimation/dh_chain.py.

Example:

  right_arm_chain:
   dh:
   - [   0,  -pi/2, .1, 0      ]    # r_shoulder_pan_joint
   - [ pi/2,  pi/2,  0, 0      ]    # r_shoulder_lift_joint
   - [   0,  -pi/2,  0, .4     ]    # r_upper_arm_joint
   - [   0,   pi/2,  0, 0      ]    # r_elbow_flex_joint
   - [   0,  -pi/2,  0, .321   ]    # r_forearm_roll_joint
   - [   0,   pi/2,  0, 0      ]    # r_wrist_flex_joint
   - [   0,    0,    0, 0      ]    # r_wrist_roll_joint
   cov:
     joint_angles: [0.001, 0.001, 0.001, 0.001, 0.001, 0.001, 0.001]
   gearing: [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]

Note that we have not defined where this kinematic chain is rooted on the robot. This is done later, when defining sensors.
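
A rough sketch of how a chain's DH rows and gearing terms can be turned into a forward-kinematics transform is shown below. It assumes each row is ordered [theta, alpha, a, d] and that the standard DH convention Rz(theta) * Tz(d) * Tx(a) * Rx(alpha) applies; treat it as an illustration of the idea rather than a copy of dh_chain.py:

  import numpy as np

  def dh_transform(theta, alpha, a, d):
      # Homogeneous transform for one DH row, standard convention:
      # Rz(theta) * Tz(d) * Tx(a) * Rx(alpha).
      ct, st = np.cos(theta), np.sin(theta)
      ca, sa = np.cos(alpha), np.sin(alpha)
      return np.array([[ct, -st * ca,  st * sa, a * ct],
                       [st,  ct * ca, -ct * sa, a * st],
                       [ 0,       sa,       ca,      d],
                       [ 0,        0,        0,      1]])

  def chain_fk(dh_rows, joint_angles, gearing):
      # Multiply the per-joint transforms; the reported joint angle is scaled
      # by its gearing term and added to the row's theta offset.
      T = np.eye(4)
      for (theta, alpha, a, d), q, g in zip(dh_rows, joint_angles, gearing):
          T = T.dot(dh_transform(theta + g * q, alpha, a, d))
      return T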

Transforms

A transform primitive is a full 6-DOF transformation in the robot system. As defined here, a transform is not yet attached to any particular robot frame; later, when we define sensors, these transforms can be attached before and after DH chains.

Examples:

  head_plate_frame_joint:    [   .0232, .0, .0645, 0, 0, 0]
  double_stereo_frame_joint: [ 0, 0, .003, 0, 0, 0]
  wide_stereo_frame_joint:   [ 0, .03, .0305, 0, 0, 0  ]
  narrow_stereo_frame_joint: [ 0, .06, .0305, 0, 0, 0  ]
  wide_stereo_optical_frame_joint:    [0, 0, 0, -1.20919, 1.20919, -1.20919]
  narrow_stereo_optical_frame_joint:  [0, 0, 0, -1.20919, 1.20919, -1.20919]

Each transform is defined by a list of 6 elements: the first three are the translation (x, y, z, in meters), and the last three encode the rotation as a rotation vector (axis-angle, in radians).
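
A small sketch of turning such a 6-element list into a 4x4 homogeneous transform, assuming the rotation-vector (axis-angle) interpretation described above:

  import numpy as np

  def transform_from_params(params):
      # params = [tx, ty, tz, rx, ry, rz]; the rotation part is assumed to be
      # a rotation vector (axis * angle), converted with Rodrigues' formula.
      t = np.asarray(params[:3], dtype=float)
      r = np.asarray(params[3:], dtype=float)
      angle = np.linalg.norm(r)
      if angle < 1e-12:
          R = np.eye(3)
      else:
          k = r / angle
          K = np.array([[    0, -k[2],  k[1]],
                        [ k[2],     0, -k[0]],
                        [-k[1],  k[0],     0]])
          R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K.dot(K)
      T = np.eye(4)
      T[:3, :3] = R
      T[:3, 3] = t
      return T

  # The optical-frame rotation above is roughly 120 degrees about (-1, 1, -1).
  print(transform_from_params([0, 0, 0, -1.20919, 1.20919, -1.20919]).round(3))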

Tilting Lasers

The tilting laser primitive is used to represent a laser rangefinder (such as the Hokuyo UTM-30LX) mounted on an actuated tilt stage.

Example:

  tilt_laser:
    before_joint: [ .09893, 0,  .227, 0, 0, 0]
    after_joint:  [  0, 0,  .03,  0, 0, 0]
    gearing: 1.0
    cov:
      bearing: 0.0005
      range: 0.005
      tilt: 0.0005

(figure: tilting_laser.png)
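
The measurement model these parameters feed into can be sketched roughly as follows. This is an illustrative reconstruction, not the package's code: it assumes the tilt stage rotates about the y axis, that the gearing term scales the reported tilt angle, and that before_joint and after_joint are 6-element transforms as described above (e.g. built with the transform sketch earlier):

  import numpy as np

  def rot_y(angle):
      c, s = np.cos(angle), np.sin(angle)
      return np.array([[ c, 0, s, 0],
                       [ 0, 1, 0, 0],
                       [-s, 0, c, 0],
                       [ 0, 0, 0, 1]])

  def laser_point(before_T, after_T, gearing, tilt, bearing, rng):
      # Fixed transform to the tilt stage, the (geared) tilt rotation, a fixed
      # transform to the laser, then the planar range/bearing point measured
      # in the laser's scan plane.
      p_laser = np.array([rng * np.cos(bearing), rng * np.sin(bearing), 0.0, 1.0])
      return (before_T.dot(rot_y(gearing * tilt)).dot(after_T).dot(p_laser))[:3]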

Rectified Cameras

The rectified camera primitive is used to represent a monocular camera whose images have already been rectified. We assume that the distortion correction is perfect, so we only estimate parameters associated with the pinhole camera model.

Example:

  wide_left_rect:
    baseline_shift: 0.0
    f_shift: 0.0
    cx_shift: 0.0
    cy_shift: 0.0
    cov: {u: 0.125, v: 0.125}
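
One plausible reading of these parameters (an assumption, not taken verbatim from the package) is that they perturb the entries of the camera's 3x4 projection matrix before points are projected; a sketch:

  import numpy as np

  def apply_shifts(P, baseline_shift, f_shift, cx_shift, cy_shift):
      # Perturb a 3x4 pinhole projection matrix with the four calibration
      # offsets defined above (assumed interpretation).
      P = P.copy()
      P[0, 0] += f_shift         # fx
      P[1, 1] += f_shift         # fy
      P[0, 2] += cx_shift        # principal point x
      P[1, 2] += cy_shift        # principal point y
      P[0, 3] += baseline_shift  # Tx term (stereo baseline contribution)
      return P

  def project(P, point_xyz):
      # Project a 3D point (camera frame) to pixel coordinates (u, v).
      p = P.dot(np.append(point_xyz, 1.0))
      return p[:2] / p[2]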

Checkerboards

Although the checkerboards themselves aren't really a part of the robot, they still have parameters that need to be defined and could be estimated. Thus this file becomes a convenient location to define them.

Examples:

  small_cb_4x5:
    corners_x: 4
    corners_y: 5
    spacing_x: .025
    spacing_y: .025
  large_cb_7x6:
    corners_x: 7
    corners_y: 6
    spacing_x: .108
    spacing_y: .108

corners_x is the number of interior corners along the x direction of the target, and spacing_x is the distance (in meters) between adjacent corners along x; corners_y and spacing_y have the same semantics along y.
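
A sketch of the corner grid these parameters describe, with each corner expressed in the checkerboard's own plane (the function name is illustrative):

  import numpy as np

  def checkerboard_corners(corners_x, corners_y, spacing_x, spacing_y):
      # One 3D point per interior corner, in the plane of the target (z = 0).
      pts = [(i * spacing_x, j * spacing_y, 0.0)
             for j in range(corners_y)
             for i in range(corners_x)]
      return np.array(pts)

  # The small 4x5 board above has 20 corners spanning 0.075 m x 0.100 m.
  print(checkerboard_corners(4, 5, 0.025, 0.025).shape)   # (20, 3)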

Defining the Robot's Sensors

Once we have a set of primitives defined for the robot, we need to define the set of sensors attached to it. The sensors simply string various primitives together into actual measurement models. There are currently three types of sensors: camera_chains, tilting_lasers, and chains.

The sensors used in the PR2 estimation are defined in pr2_calibration_launch/estimate_params/config_pr2_beta/sensors.yaml. This is a yaml file broken down by sensor type; for each sensor type, there is a list of sensors.

Camera Chain

This sensor is used to define a camera that is attached to an actuated linkage. The standard example of this is a camera on a pan-tilt platform:

camera_chains:
  - camera_id: forearm_right_rect
    sensor_id: forearm_right_rect
    chain:
      before_chain: [r_shoulder_pan_joint]
      chain_id:     right_arm_chain
      after_chain:  [r_forearm_roll_adj, r_forearm_cam_frame_joint,
                     r_forearm_cam_optical_frame_joint]
      dh_link_num:  4

This sensor will only be used with a sample (calibration_msgs/RobotMeasurement) with the following requirements:

Chains

A chain sensor is a kinematic chain that holds a checkerboard; the chain's reported joint angles, combined with its kinematic model, provide a measurement of where the checkerboard is.

Example:

chains:
  - sensor_id:    left_arm_chain
    before_chain: [l_shoulder_pan_joint]
    chain_id:     left_arm_chain
    after_chain:  [left_arm_tip_adj, left_arm_cb]
    dh_link_num:  6

This sensor will only be used with a sample (calibration_msgs/RobotMeasurement) with the following requirements:

Tilting Lasers

All the information we need about a tilting laser is already in the robot primitive definitions. Thus, all we need to do is specify the name of the tilting laser primitive, along with a unique sensor_id.

Example:

  tilting_lasers:
    - sensor_id: tilt_laser
      laser_id:  tilt_laser

This sensor will only be used with a sample (calibration_msgs/RobotMeasurement) with the following requirements:

Defining Which Parameters to Estimate

For each estimation step, we need to specify which parameters to estimate. This is done using a yaml file that looks extremely similar to the primitives definition file, except that instead of holding values for all the parameters, it is filled with 0s and 1s. All parameters marked with a 1 will be estimated. These files are defined for the PR2 as pr2_calibration_launch/estimate_params/config_pr2_beta/free_*.yaml.
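
The effect of such a mask can be sketched as follows: only the entries flagged with a 1 become part of the optimizer's state vector, while the rest keep their values from the primitives file (the numbers below are illustrative):

  import numpy as np

  # A 0/1 "free params" mask selects which entries of the full parameter
  # vector are exposed to the optimizer.
  full_params = np.array([0.0, -np.pi / 2, 0.1, 0.0])   # e.g. one DH row
  free_mask   = np.array([1, 0, 0, 1], dtype=bool)      # estimate theta and d only

  x0 = full_params[free_mask]          # initial guess handed to the optimizer

  def expand(x):
      # Re-insert the estimated values into the full parameter vector.
      p = full_params.copy()
      p[free_mask] = x
      return p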

How the estimation works

The core of the package takes all of the configuration files described above and generates a Jacobian based on the measurements in the previously collected calibration bagfile (cal_measurements.bag). The following diagram gives an overview of what this Jacobian looks like:

(figure: jacobian.png, block structure of the Jacobian)

Rows

There is one block-row for each calibration_msgs/RobotMeasurement message on the robot_measurement topic; each of these messages is called a sample. Each sample contains multiple sensor measurements, and each sensor gets its own sub-block-row. Each sensor contributes multiple rows, where the number of rows is determined by the number of detected feature points and the type of measurement. For instance, a tilting laser contributes 3 rows per feature point (Cartesian x, y, z) and a camera contributes 2 rows per feature point (image u, v).

Columns

The columns are separated into two sections: system parameters and checkerboard poses.
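
A minimal sketch of the state-vector layout this implies, assuming each sample's checkerboard pose is parameterized by 6 numbers (translation plus rotation vector):

  import numpy as np

  def pack_state(free_system_params, cb_poses):
      # Free system parameters first, then one 6-dof checkerboard pose per sample.
      return np.concatenate([np.asarray(free_system_params)] +
                            [np.asarray(p) for p in cb_poses])

  def unpack_state(x, n_sys, n_samples):
      sys_params = x[:n_sys]
      cb_poses = x[n_sys:].reshape(n_samples, 6)
      return sys_params, cb_poses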

