The start
Today I had a talk with Toby. I got some (seemingly) simple instructions for now:
- **Read out an IMU with a programming language or to file.** I got a MinIMU-9 v2 (L3GD20 and LSM303DLHC carrier) from Pololu, which can be hooked up to an Arduino.
- **Track a camera’s distance with respect to a marker.** Toby suggested using ArUco, which should make this easy. I can use my laptop’s webcam or a Logitech C920. (A detection sketch follows this list.)
- **Perform a simple kind of fusion between the tracked position and the IMU data.** Show that the IMU can improve the tracker’s pose estimate, for instance by pre- or postprocessing ArUco’s output. (A strawman for this also appears below.)
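
To make the second item concrete, here is a minimal sketch of what the marker tracking could look like. I am assuming OpenCV’s aruco contrib module rather than the standalone ArUco library Toby mentioned (their APIs differ), and the intrinsics and marker size below are placeholders, not a real calibration.

```cpp
// Minimal marker pose tracking with OpenCV's aruco contrib module.
// cameraMatrix, distCoeffs and the 5 cm marker size are placeholders.
#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cap(0);  // laptop webcam or the C920
    if (!cap.isOpened()) return 1;

    // Placeholder intrinsics -- replace with a real calibration.
    cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
        600, 0, 320,
        0, 600, 240,
        0, 0, 1);
    cv::Mat distCoeffs = cv::Mat::zeros(5, 1, CV_64F);

    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    cv::Mat frame;
    while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);
        if (!ids.empty()) {
            std::vector<cv::Vec3d> rvecs, tvecs;
            cv::aruco::estimatePoseSingleMarkers(
                corners, 0.05 /* marker side, m */,
                cameraMatrix, distCoeffs, rvecs, tvecs);
            // tvecs[0] is the marker position in the camera frame,
            // so its norm is the camera-marker distance.
            std::cout << "distance: " << cv::norm(tvecs[0]) << " m\n";
        }
        cv::imshow("aruco", frame);
        if (cv::waitKey(1) == 27) break;  // Esc quits
    }
    return 0;
}
```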
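
For the third item I don’t know yet what form the fusion will take, but as a strawman for the “postprocessing” variant: a complementary filter could integrate the gyro between camera frames and pull the estimate back whenever ArUco delivers an absolute pose. Everything here (a single angle, the gain `alpha`) is an assumption for illustration, not a design decision.

```cpp
// Strawman postprocessing fusion: a complementary filter on a single
// orientation angle. predict() runs at the IMU rate; correct() runs
// whenever ArUco produces an absolute angle. alpha near 1 trusts the
// smooth but drifting gyro; (1 - alpha) corrects with the noisy but
// absolute marker pose.
struct ComplementaryFilter {
    double angle = 0.0;   // fused estimate, rad
    double alpha = 0.98;  // gyro weight; an assumed tuning value

    void predict(double gyroRate, double dt) {  // IMU rate
        angle += gyroRate * dt;
    }
    void correct(double arucoAngle) {           // camera rate
        angle = alpha * angle + (1.0 - alpha) * arucoAngle;
    }
};
```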
There is also some theoretical stuff to do:
- **Read up on Kalman filters.** The online lectures by Cyrill Stachniss were recommended; I should also do the homework. (The standard equations are summarized after this list.)
- **In Caarls’ PhD thesis, read up on specific Kalman filters and “continuous-time processes”.** Because the IMU produces new data more often than the cameras do, integrating the two rates needs investigation.
- **Collect papers on IMU–(stereo)camera fusion.** Once all of the above is done, search for papers on SLAM/PTAM methods that introduce some way of fusing these two sensors.
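
For my own reference, the textbook discrete-time Kalman filter in standard notation (nothing here is specific to our setup yet). The split also hints at how the rate mismatch might be handled: the predict step can run at the IMU rate, while the update step runs only when a camera pose arrives.

```latex
\begin{align*}
\text{Predict (every IMU sample):}\quad
  \hat{x}_{k|k-1} &= F_k\,\hat{x}_{k-1|k-1} + B_k u_k \\
  P_{k|k-1}       &= F_k P_{k-1|k-1} F_k^\top + Q_k \\[4pt]
\text{Update (when a camera pose $z_k$ arrives):}\quad
  K_k             &= P_{k|k-1} H_k^\top \left(H_k P_{k|k-1} H_k^\top + R_k\right)^{-1} \\
  \hat{x}_{k|k}   &= \hat{x}_{k|k-1} + K_k\left(z_k - H_k\,\hat{x}_{k|k-1}\right) \\
  P_{k|k}         &= \left(I - K_k H_k\right) P_{k|k-1}
\end{align*}
```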
I started with the first practical step. I installed the `arduino` package for Ubuntu and followed the instructions that come with the related Pololu software. To see whether the provided drift correction works properly, I taped the IMU down to the table and let it run for some time. Results with and without drift correction will follow; the core of my readout sketch is below.
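
This is roughly what the readout looks like, assuming Pololu’s L3G and LSM303 Arduino libraries (the ones their MinIMU-9 pages point at); the baud rate, print format, and 10 ms delay are my own choices.

```cpp
// Dump raw MinIMU-9 v2 readings over serial, using Pololu's
// L3G (L3GD20) and LSM303 (LSM303DLHC) Arduino libraries.
#include <Wire.h>
#include <L3G.h>
#include <LSM303.h>

L3G gyro;
LSM303 compass;

void setup() {
  Serial.begin(115200);  // arbitrary choice of baud rate
  Wire.begin();
  gyro.init();
  gyro.enableDefault();
  compass.init();
  compass.enableDefault();
}

void loop() {
  gyro.read();
  compass.read();
  // Raw sensor counts; converting to deg/s and g depends on the
  // configured full-scale ranges.
  Serial.print(gyro.g.x);    Serial.print(' ');
  Serial.print(gyro.g.y);    Serial.print(' ');
  Serial.print(gyro.g.z);    Serial.print(' ');
  Serial.print(compass.a.x); Serial.print(' ');
  Serial.print(compass.a.y); Serial.print(' ');
  Serial.println(compass.a.z);
  delay(10);  // ~100 Hz logging
}
```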
References
- Jurjen Caarls. *Pose estimation for mobile devices and augmented reality*. PhD thesis, Delft University of Technology, 2009.
```bibtex
@phdthesis{caarls2009pose,
  title  = {Pose estimation for mobile devices and augmented reality},
  author = {Caarls, Jurjen},
  year   = {2009},
  school = {Delft University of Technology}
}
```