May 7, 2018

How to evaluate SLAM accuracy

Simultaneous Localisation and Mapping (SLAM) is the problem of estimating the position and orientation (localisation) of a device with respect to its surroundings while simultaneously building a map of that environment. Over the years, various SLAM algorithms have been developed to solve this problem (at least approximately); they typically use either a camera combined with a depth sensor, or two cameras together (stereo vision), embedded on the device.
At Kudan, we have been developing SLAM algorithms that can be deployed on a variety of products using different sensors, such as drones, robots, vehicles, and mobile devices. Continually evaluating the performance of our SLAM system and discovering its limitations is an important part of our development process, ensuring we keep pushing the boundaries of what is possible. However, to evaluate a SLAM system accurately we first need accurate ground truth for the pose and trajectory of the camera. This may seem simple to obtain, but in practice it is difficult to acquire accurately, as we explain in this article. Here are a few of the techniques we use to evaluate SLAM systems at Kudan –
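Before describing each technique, it helps to be concrete about what "evaluating against ground truth" means. A common metric is the absolute trajectory error: the root-mean-square difference between estimated and ground-truth camera positions sampled at the same timestamps. The sketch below is a minimal illustration of that idea; the function name and array layout are our own illustrative choices rather than part of any particular SLAM toolkit, and it assumes the two trajectories are already expressed in the same coordinate frame (in practice a rigid or similarity alignment step is usually applied first).

```python
import numpy as np

def absolute_trajectory_error(gt_positions, est_positions):
    """Root-mean-square translational error between two aligned trajectories.

    Both inputs are (N, 3) arrays of camera positions sampled at the same
    timestamps and expressed in the same coordinate frame.
    """
    diffs = gt_positions - est_positions          # per-frame position error
    return np.sqrt(np.mean(np.sum(diffs ** 2, axis=1)))

# Toy example: a straight-line ground truth and a slightly noisy estimate.
gt = np.stack([np.linspace(0, 10, 100), np.zeros(100), np.zeros(100)], axis=1)
est = gt + np.random.normal(scale=0.02, size=gt.shape)
print(f"ATE RMSE: {absolute_trajectory_error(gt, est):.4f} m")
```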

CG Rendering –
This involves generating a synthetic scene in rendering software to test our SLAM system virtually. Using this method, we know the position, orientation, and trajectory of the camera with absolute certainty, since we specify them in the first place, so we can calculate the accuracy of the SLAM system inside the simulation with precision. However, because the scene is computer generated it does not account for real-world noise such as camera lens artefacts and inaccurate sensor readings, so it is not an accurate indicator of real-world performance. Instead, this method allows us to quickly test our algorithm against various virtual scenarios and identify fundamental issues before we introduce real-world noise.
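As a rough illustration of how such a comparison can be scored frame by frame, the sketch below computes the translation and rotation error between a ground-truth pose and an estimated pose, both represented as 4x4 homogeneous matrices. The representation and function name are illustrative choices, not tied to any specific rendering tool or SLAM API.

```python
import numpy as np

def pose_error(T_gt, T_est):
    """Translation (m) and rotation (deg) error between two 4x4 camera poses.

    In a synthetic scene T_gt is exact, since the camera trajectory is
    scripted by the renderer rather than measured.
    """
    T_rel = np.linalg.inv(T_gt) @ T_est
    trans_err = np.linalg.norm(T_rel[:3, 3])
    # Angle of the relative rotation, clipped to guard against numeric drift.
    cos_angle = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_err = np.degrees(np.arccos(cos_angle))
    return trans_err, rot_err
```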

Black Room –
As implied by the name, this method involves using a physical rig inside a dark room to reduce noise from unaccounted-for light sources and reflections. The setup requires the experimenter to perform a variety of tests and physical movements, such as sliding the camera along a straight rail or rotating it on a turntable to simulate cyclic motion. Since we know the dimensions of the rig and the time taken to move the camera, we can extract the position and orientation of the camera at different points in time. However, because the experimenter relies on experience to move the camera and the movement is not externally tracked, there is room for error, which makes this ground truth an approximation at best. This type of setup is mostly useful for conducting small-scope tests and evaluating the viability of the SLAM system outside of a simulation.
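The sketch below shows the kind of simple geometry involved, assuming a constant-speed slide along the rail and a constant-rate turntable. The rig dimensions are placeholder values, and the constant-motion assumption is precisely where the approximation error mentioned above creeps in.

```python
import numpy as np

# Illustrative rig parameters; real values would come from measuring the rig.
RAIL_LENGTH_M = 1.5          # straight rail length
SLIDE_DURATION_S = 10.0      # time taken to slide end to end
TURNTABLE_RPM = 2.0          # turntable speed for the rotation test

def rail_position(t):
    """Approximate camera position on the rail, assuming constant speed."""
    speed = RAIL_LENGTH_M / SLIDE_DURATION_S
    return np.array([speed * t, 0.0, 0.0])

def turntable_yaw_deg(t):
    """Approximate camera yaw on the turntable, assuming constant rotation."""
    return (360.0 * TURNTABLE_RPM / 60.0 * t) % 360.0
```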

HTC Vive setup –
This setup uses an HTC Vive tracker attached to a camera rig which can be moved around within the HTC Vive 'play area'. We have developed custom code that allows us to continuously track the orientation and trajectory of the camera using the HTC Vive tracker and log its position with much greater accuracy than the Black Room method. It is important to note that although this method allows us to evaluate the SLAM system outside of a simulation with greater accuracy, it is still performed in a controlled lab environment and does not account for many other sources of real-world noise that can affect the SLAM system, especially in outdoor environments.
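A generic sketch of the logging side is shown below. Here get_tracker_pose is a hypothetical placeholder for whatever SDK call returns the tracker's current position in the play-area frame (it is not a real OpenVR/SteamVR function name); the resampling step then interpolates the logged positions onto the camera's frame timestamps so the two trajectories can be compared.

```python
import time
import numpy as np

def log_tracker_trajectory(get_tracker_pose, duration_s=30.0, rate_hz=90.0):
    """Record timestamped tracker positions for later comparison with SLAM output.

    `get_tracker_pose` is a stand-in for the call that returns the tracker's
    current (x, y, z) position in the play-area frame.
    """
    timestamps, positions = [], []
    period = 1.0 / rate_hz
    start = time.time()
    while time.time() - start < duration_s:
        timestamps.append(time.time() - start)
        positions.append(get_tracker_pose())
        time.sleep(period)
    return np.array(timestamps), np.array(positions)

def resample_to(query_times, timestamps, positions):
    """Linearly interpolate logged positions onto the camera frame timestamps."""
    return np.stack(
        [np.interp(query_times, timestamps, positions[:, i]) for i in range(3)],
        axis=1,
    )
```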

Outdoor tests –
Following the HTC Vive test rig, we are investigating other methods of testing our SLAM system that account for more realistic real-world scenarios. These include testing our system on a drone and mounting a camera rig on a car to collect outdoor data. However, obtaining the camera's trajectory accurately in outdoor conditions is difficult, as GPS is typically only accurate to within about 5 metres, and this accuracy degrades further near buildings and other obstacles.
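For outdoor runs, GPS fixes first have to be converted into a local metric frame before they can be compared with a SLAM trajectory. The sketch below uses a simple equirectangular projection around a reference point; this is only one possible approach, and its approximation error over short distances is small compared with the roughly 5 metre accuracy of the fixes themselves.

```python
import numpy as np

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def gps_to_local_xy(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Project GPS fixes to a local metric frame around a reference point.

    Equirectangular approximation: adequate over a few kilometres, which is
    already far coarser than the ~5 m accuracy of the GPS fixes themselves.
    """
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    lat0, lon0 = np.radians(lat0_deg), np.radians(lon0_deg)
    x = EARTH_RADIUS_M * (lon - lon0) * np.cos(lat0)   # east
    y = EARTH_RADIUS_M * (lat - lat0)                  # north
    return x, y
```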
Don’t forget to visit our website to find out more about our work at https://www.kudan.eu
