== DLR Spatial Cognition Data Set ==
A data set for comparing and benchmarking data association algorithms. The package contains two data sets. The first has artificial landmarks, white or black circles on the ground; the position of each landmark is given as a relative 2D coordinate in the robot's frame. The second data set is more difficult: the landmarks are natural vertical lines in the image, and their position is given only as an angle with respect to the robot. The data sets are preprocessed and provide geometric data as measurements, so no computer vision processing is necessary.

=== Videos ===
||{{attachment:circscreen.png|alt Circles| width=400px}} || {{attachment:linescreen.png|alt Lines| width=400px}}||
|| [[attachment:circles.avi| Download Video]] || [[attachment:lines.avi| Download Video]] ||

=== Description ===
The data set can be used by researchers to test and compare algorithms related to feature-based SLAM and, in particular, data association. It covers a long path, mostly in an indoor office environment, with '''artificial and natural landmarks'''. It provides odometry and geometric results from computer-vision-based landmark detection, as well as raw images and camera calibration data. The artificial landmarks are white discs on the floor whose position relative to the robot is provided as the measurement. The natural landmarks are vertical lines, for which only the bearing relative to the robot is provided; here data association is remarkably difficult. A '''technical report''' is also available, which fully documents the experimental setup, the model, and the measurement functions. The focus of this data set is on data association, and to allow comparison of different approaches we provide '''full ground truth by annotating all features'''.

=== Experimental Setup ===
==== Map / Environment ====
The data set evolved from the data used for the PhD thesis of [[en/UdoFrese|Udo Frese]] on efficient SLAM. It was recorded at the DLR (Deutsches Zentrum für Luft- und Raumfahrt) Institute of Robotics and Mechatronics building with a manually controlled mobile robot. The building covers a '''region of 60 m x 45 m''', and the robot path consists of three '''large loops''' within the building (plus a small outdoor path) with a total '''length of 505 meters'''. On the way the robot visits '''29 rooms'''. Below is an architectural map with the robot's path overlaid:

[[attachment:map2.png|{{attachment:map2.png|alt Map| width=400px}}]]

==== Robot Hardware ====
The robot is equipped with four wheels, each with one motor for steering and another for driving. Since the wheel axis and the steering axis intersect, the robot can move omni-directionally but is not holonomic.

{{attachment:robot1.png|alt Map| width=400px}}{{attachment:robot2.png|alt Map| height=400px}}

=== Downloads ===
A more detailed description of the whole scene, the data format, useful equations, etc. can be found in the technical report provided with this data set.

 * [[attachment:dlr_set_techreport.pdf| Technical Report]] for the DLR Spatial Cognition Data Set
 * Current version of the data set package (updated February 2010): [[attachment:DLR-Spatial_Cognition.tar.gz| Data Set with images (919MB)]] [[attachment:DLR-Spatial_Cognition_noimages.tar.gz| Data Set without images (12MB)]]
 * The data set is also available at Radish: http://cres.usc.edu/radishrepository/view-one.php?name=DLR-Spatial_Cognition
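
=== Measurement Model (Sketch) ===
For orientation, here is a minimal sketch of the two kinds of measurements described above, written as standard 2D SLAM measurement equations. The function names, the pose convention (x, y, theta), and the frame definitions are illustrative assumptions only; the exact model and measurement functions used by this data set are defined in the technical report.

{{{#!python
import numpy as np

def disc_measurement(robot_pose, landmark_xy):
    """Predicted measurement for an artificial disc landmark:
    the landmark's 2D position expressed in the robot frame."""
    x, y, theta = robot_pose
    c, s = np.cos(theta), np.sin(theta)
    dx, dy = landmark_xy[0] - x, landmark_xy[1] - y
    # Rotate the world-frame offset into the robot frame.
    return np.array([ c * dx + s * dy,
                     -s * dx + c * dy])

def line_measurement(robot_pose, landmark_xy):
    """Predicted measurement for a natural vertical-line landmark:
    only the bearing of the landmark relative to the robot heading."""
    x, y, theta = robot_pose
    bearing = np.arctan2(landmark_xy[1] - y, landmark_xy[0] - x) - theta
    # Wrap the angle to [-pi, pi].
    return np.arctan2(np.sin(bearing), np.cos(bearing))
}}}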