IMU Data Capture
Unlike other modalities supported by MetriCal, IMU calibrations do not require any specific object space or target, since the sensors involved (accelerometers and gyroscopes) measure forces and rotational velocities directly.
What IMUs do require, however, is that they be calibrated alongside one or more cameras. The advice herein assumes that your dataset will be configured to perform a camera ↔ IMU calibration.
IMU ↔ Camera Calibration
Maximize View of Object Spaces for Duration of Capture
IMU calibrations are structured such that MetriCal jointly solves for the first-order gyroscope and accelerometer biases as well as the relative IMU-from-camera extrinsic. It does this by comparing the camera's world pose (or world extrinsic) between frames to the expected motion measured by the IMU's accelerometer and gyroscope.
Because of how this problem is posed, the best way to produce consistent, precise IMU ↔ camera calibrations is to maximize the visibility of one or more targets in the object space from at least one of the cameras being calibrated alongside the IMU. Put differently: avoid capturing stretches of data where the IMU is recording but no object space or target can be seen from any camera. Such stretches can lead to misleading bias estimates.
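As a rough sketch of this idea (the notation below is ours, not necessarily MetriCal's internal formulation), the IMU measurements can be modeled with constant first-order biases:

$$
\tilde{\boldsymbol{\omega}}(t) = \boldsymbol{\omega}(t) + \mathbf{b}_g + \mathbf{n}_g,
\qquad
\tilde{\mathbf{a}}(t) = \mathbf{R}_{WI}(t)^{\top}\bigl(\mathbf{a}_W(t) - \mathbf{g}_W\bigr) + \mathbf{b}_a + \mathbf{n}_a
$$

where $\boldsymbol{\omega}(t)$ and $\mathbf{a}_W(t)$ are the rotational velocity and world-frame acceleration implied by the camera's world poses composed with the IMU-from-camera extrinsic, $\mathbf{R}_{WI}(t)$ is the IMU's orientation in the world, $\mathbf{g}_W$ is gravity, $\mathbf{b}_g$ and $\mathbf{b}_a$ are the biases, and $\mathbf{n}_g$ and $\mathbf{n}_a$ are measurement noise. The optimization adjusts the extrinsic and biases until the motion predicted from the camera poses agrees with the IMU's measurements; without camera observations, there is nothing to anchor that comparison.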
Excite All IMU Axes During Capture
IMU calibration is no different from any other modality in that it is an entirely data-driven process. In particular, the underlying data needs to exhibit observable translational and rotational motion in order for MetriCal to recover the motion path that the IMU has followed.
This is what is meant by "exciting" an IMU: accelerations and rotational velocities must stand out from the underlying measurement noise so that they can be separated from other quantities such as the biases. In practice, this means that when capturing data to calibrate between an IMU and one or more cameras, it is important to move the IMU translationally along all accelerometer axes and rotationally about all gyroscope axes. The motion can be repetitive, so long as it reaches a sufficient magnitude. It is difficult to say exactly what that magnitude is, as it depends on the IMU being calibrated (e.g. whether it is a MEMS IMU, how large it is, and what its noise characteristics are).
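As a rough rule of thumb (the exact numbers depend on your sensor, so treat this as a sanity check rather than a requirement), an axis can be considered excited when the spread of its measurements over the capture is well above the sensor's noise floor:

$$
\operatorname{std}\bigl(\tilde{a}_i\bigr) \gg \sigma_a,
\qquad
\operatorname{std}\bigl(\tilde{\omega}_i\bigr) \gg \sigma_g,
\qquad i \in \{x, y, z\}
$$

where $\sigma_a$ and $\sigma_g$ stand in for whatever accelerometer and gyroscope noise characterization you have for your IMU (e.g. datasheet noise figures).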
We suggest alternating between periods of excitation (motion) and periods of holding still, so that the camera(s) can accurately and precisely measure the object space.
If you find yourself still having trouble getting enough observations to produce a reliable calibration, we suggest raising the threshold of our motion filter heuristic when calling `metrical calibrate`. This is controlled by the `-t` or `--stillness-threshold` flag. A value between 3.0 and 5.0 can sometimes improve the quality of the calibration significantly.
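For example, a hypothetical invocation might look like the following; the non-flag arguments are placeholders for whatever inputs you normally pass to `metrical calibrate`:

```shell
# Raise the stillness threshold to keep more of the motion data.
# A value between 3.0 and 5.0 is a reasonable starting range.
metrical calibrate --stillness-threshold 4.0 <your usual calibrate arguments>
```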
Reduce Motion Blur in Camera Images
This advice holds for both multi-camera and IMU ↔ camera calibrations. It is often advisable to reduce the effects of motion in the images so that you have crisper, more detailed images to calibrate against. Some ways to do this are to:
- Always use a global shutter camera
- Reduce the overall exposure time of the camera
- Avoid over-exciting the IMU, and don't be afraid to slow down a little if you find that the object space is rarely (or never) being detected.
MetriCal currently does not perform well on IMU ↔ camera calibrations when the camera has a rolling shutter. We advise calibrating with a global shutter camera whenever possible.