Posts by topic
-
Arduino (1)
ROS Pololu driver
I will write a ROS wrapper for the Pololu MinIMU-9 v2 (L3GD20 and LSM303DLHC Carrier). More info inside.
-
ROS (2)
Building the ROS message synchronizer I need
To do this, I had to learn about the ROS `message_filters` and `rosbag` APIs.
ROS Pololu driver
I will write a ROS wrapper for the Pololu MinIMU-9 v2 (L3GD20 and LSM303DLHC Carrier). More info inside.
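The pairing problem the synchronizer post above tackles can be illustrated without ROS. Below is a toy, ROS-free sketch of the idea behind `message_filters`' approximate-time policy: match messages from two sorted streams whose timestamps differ by at most a slop. All stream names and values here are made up for illustration.

```python
def pair_streams(stream_a, stream_b, slop):
    """Greedily pair (timestamp, payload) tuples from two sorted
    streams when their timestamps differ by at most `slop` seconds.
    A toy stand-in for an approximate-time synchronization policy."""
    pairs = []
    i = j = 0
    while i < len(stream_a) and j < len(stream_b):
        ta, _ = stream_a[i]
        tb, _ = stream_b[j]
        if abs(ta - tb) <= slop:
            pairs.append((stream_a[i], stream_b[j]))
            i += 1
            j += 1
        elif ta < tb:
            i += 1  # this a-message is too old to ever match: drop it
        else:
            j += 1  # this b-message is too old to ever match: drop it
    return pairs

# Hypothetical 100 Hz IMU stream and ~40 Hz camera stream.
imu = [(0.00, "imu0"), (0.01, "imu1"), (0.02, "imu2")]
cam = [(0.011, "cam0"), (0.025, "cam1")]
pairs = pair_streams(imu, cam, slop=0.006)
```

The real `message_filters.ApproximateTimeSynchronizer` is adaptive and N-way; this greedy two-stream version only shows the drop-or-pair decision.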
-
RTOS (4)
Second thoughts on RTOS for my project
Earlier I tried to work with Wind River Linux on the UDOO. Not sure if it’s going to work, but RTAI seems like a reliable option.
Compiling Wind River Linux, day 2
Today I’m trying to compile wrlinux with `--dl-layers`. Again, errors occur.
Setting up Wind River Linux for UDOO x86 Advanced Plus
I have downloaded Wind River Linux 10.19.BASE, and generated an image for the UDOO. Results are recorded in `pkok/udoo-wrlinux`.
Using an RTOS for data acquisition?
During the latest thesis coaching session, Toto suggested using an RTOS to synchronize the external devices (and thus their clock signals).
-
augmented reality (4)
Calibration between the IMU and camera is essential, but there are multiple methods to achieve this. A few references are listed.
Experiment design
After the talk with all supervisors, one of the biggest questions is how the system evaluation will be performed. A small discussion on AR system evaluation seems appropriate.
Preparing for meeting
As a preparation for the first meeting with all supervisors, Toby and I discussed several options of combinations of papers to base my thesis on, and made a (semi-)concrete proposal.
Thesis intro
For my Master’s thesis I will investigate marker-less tracking of a head-mounted display with a particle filter for augmented reality applications. Sensor fusion will be applied to an inertial measurement unit and a stereocamera.
-
disability (1)
Thesis outline and dataset requirements
After speaking with Arnoud, I’m restarting again. Focus for today: thesis outline, and getting the requirements for the dataset on paper.
-
sensor/realsense (3)
Synchronization statistics of Structure Core and RealSense D435i
Both sensors report to be “synchronized”, but do not report which data streams are all synchronized. This is an initial check to see how well-synchronized both sensors are.
Second attempt with RealSense D435i
No problems* with the D435i anymore.
First attempt with RealSense D435i
Installed the drivers for the RealSense D435i, and `realsense-viewer` works directly.
-
sensor/structure (2)
Synchronization statistics of Structure Core and RealSense D435i
Both sensors report to be “synchronized”, but do not report which data streams are all synchronized. This is an initial check to see how well-synchronized both sensors are.
First attempt with Structure Core
Installed the drivers for the Structure Core, and the complete device works directly.
-
sensors (1)
Xsens sensors
Today I have an Xsens MTi-28A53G35 and MTx-28A53G25 on my desk in Delft, cables included! I am trying to figure out how to read data from them, and what their model numbers exactly mean.
-
thesis (68)
Comparing ORBSLAM3 results of multiple runs graphically
Tables of mean, RMSE, median and standard deviations might not be the most useful analysis tool when the error distribution is not known. Graphs of the distribution shapes might be more informative.
Multiple ORBSLAM3 runs of `mono_euroc` on EuRoC MH01 with same params
There seems to be some variation occurring in running ORBSLAM3 with similar arguments. Let’s see what the noise is.
Analyzing ORBSLAM3's published results against the public repo's defaults
Recently, ORB SLAM3 came out. I will focus on an ablation study of this system. But first, reproduce the published results.
Inspecting differences in performance between `raulmur/ORBSLAM2` and `jingpang/LearnVIORB`
The canonical implementation of ORB-SLAM2 by Raúl Mur-Artal seems to perform better than the (non-visual-inertial) binaries generated by the code of Jing Wang. I’m looking into why, so Wang’s code can be improved, and perhaps some bugs for the stereo case may be squashed.
Building the ROS message synchronizer I need
To do this, I had to learn about the ROS `message_filters` and `rosbag` APIs.
Using Jing Wang's code as basis for VI-ORBSLAM2
I’ve made some decisions in approach, listing them here with motivation:
VI-ORBSLAM(2) implementations
I’ve been looking for already existing implementations of a visual-inertial ORB-SLAM or ORB-SLAM2. So far, I haven’t found anything yet.
ORB-SLAM2 keeps resetting in SLAMBench3
Yesterday I reported that only after many frames ORB-SLAM2 started tracking in SLAMBench. Today I performed some tests.
Getting SLAMBench to work
Today I’m trying to get SLAMBench 3 to work. It needed some modifications.
Second thoughts on RTOS for my project
Earlier I tried to work with Wind River Linux on the UDOO. Not sure if it’s going to work, but RTAI seems like a reliable option.
Compiling Wind River Linux, day 2
Today I’m trying to compile wrlinux with `--dl-layers`. Again, errors occur.
Setting up Wind River Linux for UDOO x86 Advanced Plus
I have downloaded Wind River Linux 10.19.BASE, and generated an image for the UDOO. Results are recorded in `pkok/udoo-wrlinux`.
Using an RTOS for data acquisition?
During the latest thesis coaching session, Toto suggested using an RTOS to synchronize the external devices (and thus their clock signals).
Benchmark of serializers
As we have previously seen, throughput is essential. One major factor in throughput is how data is serialized. With the experiments in this post, I motivate my choice for capnproto in this project.
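The benchmark itself lives in the post above and isn't reproduced here, but the kind of comparison it describes can be sketched with stdlib serializers; Cap'n Proto itself is not shown, and the IMU-like record layout is made up for illustration.

```python
import json
import struct

# A made-up IMU-like record: timestamp plus six float channels.
record = (1634567890.123456, 0.015625, -9.80665, 0.123456789012345,
          0.0, 3.14159265358979, 42.0)

as_json = json.dumps(record).encode()    # text format, variable size
as_binary = struct.pack("<7d", *record)  # fixed layout: 7 doubles = 56 bytes

# Both representations must round-trip to the original values.
assert tuple(json.loads(as_json)) == record
assert struct.unpack("<7d", as_binary) == record
```

A fixed binary layout also lets a reader seek to record N without parsing, which matters for the throughput argument the post makes.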
Thesis outline and dataset requirements
After speaking with Arnoud, I’m restarting again. Focus for today: thesis outline, and getting the requirements for the dataset on paper.
Synchronization of different clocks
How do I synchronize signals with timestamps from 3 different clocks?
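One common answer to the question above (not necessarily the one the post settles on) is to model each clock as a linear function of a reference clock and estimate offset and drift from paired timestamps. A stdlib-only sketch, with fabricated sample data:

```python
def fit_clock(local_ts, ref_ts):
    """Least-squares fit ref ≈ a * local + b, modelling constant
    drift (slope a) and offset (intercept b) between two clocks."""
    n = len(local_ts)
    mx = sum(local_ts) / n
    my = sum(ref_ts) / n
    sxx = sum((x - mx) ** 2 for x in local_ts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(local_ts, ref_ts))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Fake data: a local clock running 0.1% fast with a 5 s offset.
local = [0.0, 1.0, 2.0, 3.0, 4.0]
ref = [1.001 * t + 5.0 for t in local]
a, b = fit_clock(local, ref)
```

With three clocks, fitting each against one chosen reference reduces the problem to two such pairwise fits.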
Designing the data collector
To store the collected data without hiccups, a small system needs to be designed.
Synchronization statistics of Structure Core and RealSense D435i
Both sensors report to be “synchronized”, but do not report which data streams are all synchronized. This is an initial check to see how well-synchronized both sensors are.
First attempt with Structure Core
Installed the drivers for the Structure Core, and the complete device works directly.
Second attempt with RealSense D435i
No problems* with the D435i anymore.
First attempt with RealSense D435i
Installed the drivers for the RealSense D435i, and `realsense-viewer` works directly.
ORBSLAM datasets review
Together with Arnoud, I decided on a clearer, more up-to-date topic for my thesis. Also, an investigation on available datasets for this topic.
Reviewing my implementation design choices.
Improved the motion blur module by implementing a modified version of Zheng et al.’s method.
Today, I am rebooting this thing after some difficult times. I investigate what I have done, and what needs to be developed/written.
Moved SymPy to the global Python install, to install GAlgebra, to fake Quaternions.
Installed a `pyenv` for symbolic math, worked for the first time with `sympy`.
- Installed OpenCV 3.1 with Python 3 support
- Installed OpenCV Contrib, commit `afcb7bb`
- Created Aruco board with `./example_aruco_create_board -h=6 -w=4 -l=28 -s=4 -d=4 $OUTPUT_FILE`
Recently installed ROS Jade (only version in Ubuntu 15.04), which requires OGRE 1.9. Gazebo requires OGRE 1.8.
`gazebo` doesn’t show a GUI anymore, `gz` nags about `libogre....1.8.0.so`
`listener` makes log recordings of `gazebo::msgs::IMU`, `gazebo::msgs::ImageStamped` and `gazebo::msgs::PosesStamped`. However, my `reader` has problems reading it. Error message at message round 2:
Idea for syncing sensors in Gazebo: write a plugin for the “easy to compute” sensor (IMU), which only publishes its data when the “heavy to compute” sensor (camera) publishes data.
Decided to focus on a simulated implementation for my thesis. Main reason: could not obtain extrinsic camera-IMU calibration.
Camera’s exposure could be set by hand to 400 µs, and pattern can be tracked fairly well now. Still, Kalibr doesn’t work. Still in contact with its development team.
Now that the camera is triggered by the IMU and I implemented the first model of Bleser et al., I really need to obtain the extrinsic multi-sensor calibration. Otherwise I can’t test that first model.
Derivatives of some functions I need to use:
Yesterday’s problem is solved (hopefully). Also, nice overview on coordinate representation.
Struggling with implementation of the Extended Kalman Filters of Bleser et al.
Not to taunt faith, but all technical difficulties are resolved! Also: graph of data flow and their representations.
The AVT driver works with SyncIn1 in the timestamp branch!
Connection schemas for my Arduino.
I want to use the SyncIn and SyncOut connectors of the MTi-G-700 and Prosilica GE680C. To do so, I probably need a CA-MP-MTi cable. This post contains a connector schema for the connector head.
The polling on the Prosilica does not work yet. Images of irregular data size are sent, even in „regular”/streaming mode.
Polling, or why ROS's timestamps still might work
Previously I made a case against sensors which don’t provide an own timestamp for sensor fusion. By using a device you can poll, you still can use your main device (laptop)’s timestamps.
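The polling argument above can be made concrete: bracket each poll with host-clock readings, take the midpoint as the observation time, and the half-interval as an error bound. A sketch with a fake clock and fake sensor (all names hypothetical):

```python
def timed_poll(clock, poll):
    """Return (value, estimated_time, uncertainty) for one poll,
    bracketing the request with two host-clock readings."""
    t0 = clock()
    value = poll()
    t1 = clock()
    # The observation happened somewhere in [t0, t1].
    return value, (t0 + t1) / 2.0, (t1 - t0) / 2.0

# Fake host clock that advances 2 ms between readings, fake sensor.
ticks = iter([10.000, 10.002])
value, t_est, err = timed_poll(lambda: next(ticks), lambda: 42)
```

The uncertainty is now bounded by the round-trip time, instead of the unbounded scheduling delay of a push-based driver stamping on arrival.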
Why ROS's timestamps are not enough
The timestamps of observations are important for sensor fusion. Sensor fusion finds a relation between observations of multiple sensors with respect to the time of observation. The necessary precision of these timestamps is related to the highest update frequency. When timestamps are made on a non-dedicated unit, timestamps have a variable offset from the time of observation. An experimental analysis is given and possible approaches for solving this problem are presented.
Big differences in Kalibr calibration output after running on different (but similar) datasets.
Delft’s Xsens is giving some errors, while UvA’s is working fine, after configuring. I began to suspect the error when comparing both orientation filters.
New mission: replace the camera’s orientation with information of the IMU. More detailed problem description:
I’ve tried working with ArUco in ROS, but for some reason I don’t get a correct position back. Good thing is, I found Sahloul’s `ar_sys`!
Markers
Please don’t remove the markers in the lab
Recalibrated camera to be sure everything is alright. Made a new dataset, with which Kalibr can calibrate!
Running yet another dataset, after trying to set the camera’s focus (was not stabilized) and doing some configuration of the IMU.
New dataset with AprilGrid, more rotation. Still no success.
Converting the images only to grayscale did not do the trick. The NumPy array had shape `(n, m, 1)`, so it was a 3D array. Reshaping the array to `(n, m)` resolved the problem.
The module errors of Kalibr are gone, but Kalibr can’t recognize the chessboard on RGB video. Maybe images should be grayscale?
Understanding of ROS has increased! Some ROS tools I learnt to use, and notes on them.
I can work around the issues that `kalibr_calibrate_cameras` raises. Also, I should learn more about ROS.
I’ve tried to fix the previous problem (not fixed), but I don’t understand what is going wrong.
Kalibr and ROS need some getting used to. As a first test, I want to calibrate two cameras without an IMU.
Xsens replied; I probably did use the right values. They also added that for these purposes, MTi sensors are more frequently used.
I started working with Kalibr today. Main conclusion: compilation takes quite some time.
Calibration between the IMU and camera is essential, but there are multiple methods to achieve this. A few references are listed.
Test environment
FINALLY I got some basic tracking working under OpenGL! EDIT: Sorry, a lie.
Xsens sensors
Today I have an Xsens MTi-28A53G35 and MTx-28A53G25 on my desk in Delft, cables included! I am trying to figure out how to read data from them, and what their model numbers exactly mean.
Experiment design
After the talk with all supervisors, one of the biggest questions is how the system evaluation will be performed. A small discussion on AR system evaluation seems appropriate.
Preparing for meeting
As a preparation for the first meeting with all supervisors, Toby and I discussed several options of combinations of papers to base my thesis on, and made a (semi-)concrete proposal.
Progress report
A small report on what I have done in a little less than a week’s time.
The start
Today I had a talk with Toby. I got some (seemingly) simple instructions for now:
Thesis intro
For my Master’s thesis I will investigate marker-less tracking of a head-mounted display with a particle filter for augmented reality applications. Sensor fusion will be applied to an inertial measurement unit and a stereocamera.