Version: 14.1

Local Navigation Systems

This guide walks you through the process of calibrating local navigation systems (LNS) using MetriCal. Local navigation systems are represented by odometry messages and provide precise positioning in indoor or GPS-denied environments.

Camera Needed

LNS calibration assumes the presence of a camera in the dataset. MetriCal will fail to calibrate the extrinsics of LNS without a camera present.

Purchase Targets, or Make Your Own

You can now purchase calibration targets directly from our online store. This store sells the targets necessary to run through this calibration guide. If you are a company or facility that needs a more sophisticated target setup for automation or production line purposes, contact us at info@tangramvision.com.

That being said, you can always make your own! Find examples for AprilGrid, Markerboard, and Lidar targets in the MetriCal Premade Targets repository on GitLab.

Practical Example

We've captured an example of a good local navigation system dataset that you can use to test out MetriCal. If it's your first time performing a local navigation system calibration using MetriCal, it might be worth running through this dataset once just so that you can get a sense of what good data capture looks like.

Running the Example Dataset

First, download the dataset here and unzip it somewhere on your computer. Then, copy the following bash script into the dataset directory:

warning

This script assumes that you have either installed the apt version of MetriCal, or that you are using the docker version with an alias set to metrical. For more information, please review the MetriCal installation instructions.
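If you're using the Docker distribution, an alias along these lines will let the script below call metrical directly. This is only a rough sketch: the image name, tag, and mount point are placeholders, so defer to the installation instructions for the exact invocation on your system.

# Rough sketch of a Docker alias for MetriCal. The image name/tag and
# mount point below are placeholders; see the installation instructions
# for the exact command for your setup.
alias metrical='docker run --rm -it --volume "$(pwd)":/data --workdir /data tangramvision/metrical:latest'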

#!/bin/bash -i
DATA=./demo_camera_lns.mcap
INIT_PLEX=init_plex.json
OBJ=lns_object_space.json
OUTPUT=results.json
REPORT=results.html

# Initialize the plex: map odometry topics to the LNS model and camera
# topics to the EUCM camera model
metrical init \
  -y \
  -m "*odometry*:lns" \
  -m "*camera*:eucm" \
  $DATA $INIT_PLEX

# Run the calibration, writing the report and results to disk
metrical calibrate \
  --report-path $REPORT \
  -o $OUTPUT $DATA $INIT_PLEX $OBJ

The metrical init command uses the -m flag to describe the system being calibrated. Here it indicates that any topic containing the word odometry is a local navigation system topic, and any topic containing camera is a camera topic using the EUCM (Enhanced Unified Camera Model). The * glob character lets a single -m flag capture multiple topics, or just saves extra typing when telling MetriCal which topics to use. The configuration generated by metrical init is saved to init_plex.json (the $INIT_PLEX variable in the script), which is then used during the calibration step to configure the system. You can learn more about plexes here.
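If your topic names don't lend themselves to globbing, you can also spell them out explicitly. For example (the topic names below are hypothetical; substitute the ones from your own dataset):

# Same init call, but with explicit (hypothetical) topic names instead of globs
metrical init \
  -y \
  -m /robot/odom:lns \
  -m /robot/camera_front:eucm \
  $DATA $INIT_PLEX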

Note that the --render flag is not included in this example script, but you can add it to the metrical calibrate command if you want to watch the detection phase of the calibration in real time. Rendering can have a significant impact on performance, but it is invaluable for debugging data quality issues.

warning

MetriCal depends on Rerun for all of its rendering. As such, you'll need a specific version of Rerun installed on your machine to use the --render flag. Please ensure that you've followed the visualization configuration instructions before running this script.
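With Rerun installed, enabling rendering is just a matter of adding the flag to the calibrate invocation from the script above:

# Same calibrate call as in the script, with live rendering enabled
metrical calibrate \
  --render \
  --report-path $REPORT \
  -o $OUTPUT $DATA $INIT_PLEX $OBJ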

When the script finishes, you'll be left with three artifacts:

  • init_plex.json - the initialized plex, as described in the prior section.
  • results.html - a human-readable summary of the calibration run. Everything in the report is also logged to your console in real time during the calibration. You can learn more about interpreting the report here.
  • results.json - a file containing the final calibration and various other metrics. You can learn more about results JSON files here and about manipulating your results using shape commands here.

And that's it! Local navigation system calibration is a bit simpler than other sensor calibrations, so you should be able to run through this example quickly. Hopefully this trial run has given you a better understanding of how to capture your own local navigation system calibration dataset.

Data Capture Guidelines

Maximize View of Object Spaces for Duration of Capture

Similar to IMU calibration, LNS calibration works by comparing the world pose (or world extrinsic) of the camera between frames to the interpolated motion of the local navigation system.

Because of how this problem is posed, the best way to produce consistent, precise LNS ↔ camera calibrations is to maximize the visibility of one or more targets in the object space from one of the cameras being calibrated alongside the LNS. Put another way: avoid capturing stretches of data where the LNS is recording but no object space or target is visible from any camera. Doing so can lead to misleading bias estimates.

Excite All Axes During Capture

LNS calibration, like every other modality, is an entirely data-driven process. In particular, the underlying data needs to demonstrate observable translational and rotational motion in order for MetriCal to understand the motion path that the LNS has followed.

This is what is meant by "exciting" an LNS: accelerations and rotational velocities must stand out in the data (i.e. be distinguishable from the underlying measurement noise) so that they can be separated from e.g. the biases. This means that when capturing data to calibrate between an LNS and one or more cameras, it is important to move the LNS in all 6 degrees of freedom. This motion can be repetitive, so long as a sufficient magnitude of motion has been achieved.

success

We suggest alternating between periods of "excitement" or motion with the LNS and holding still so that the camera(s) can accurately and precisely measure the given object space.

If you find yourself still having trouble getting a sufficient number of observations to produce reliable calibrations, we suggest bumping up the threshold for our motion filter heuristic when calling metrical calibrate. This is controlled by the --camera-motion-threshold flag. A value between 3.0 and 5.0 can sometimes improve the quality of the calibration significantly.
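As a sketch, using the same variables as the example script above (the 4.0 value here is just an illustration within the suggested range):

# Relax the camera motion filter to admit more observations
metrical calibrate \
  --camera-motion-threshold 4.0 \
  --report-path $REPORT \
  -o $OUTPUT $DATA $INIT_PLEX $OBJ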

Reduce Motion Blur in Camera Images

This advice holds for both multi-camera and LNS ↔ camera calibrations. It is often advisable to reduce the effects of motion in the images to produce more crisp, detailed images to calibrate against. Some ways to do this are to:

  1. Always use a global shutter camera
  2. Reduce the overall exposure time of the camera
  3. Avoid over-exciting LNS motion; don't be afraid to slow down a little if you find that the object space is rarely (or never) being detected.

danger

MetriCal currently does not perform well with LNS ↔ camera calibrations if the camera is a rolling shutter camera. We advise calibrating with a global shutter camera whenever possible.

Troubleshooting

If you encounter errors during calibration, please refer to our Errors documentation.