A collaborative master's student project supervised by the INS and ifp that integrates a mobile mapping platform with multiple sensors for generating georeferenced point clouds.
As part of the master's program Geodäsie und Geoinformatik (Geodesy and Geoinformatics), a new project was offered this semester in cooperation with the Institute of Navigation (INS). The goal was to combine a remote-controlled mobile platform with a GNSS antenna, a ZEB Horizon laser scanner, and several other sensors into a system able to scan its surroundings; the resulting point cloud is then georeferenced and colorized in a post-processing phase. Mobile mapping platforms provide an easy way to capture as-built data for structures and deformations in surrounding areas. Here, georeferencing plays an important role in contextualizing the information gained by the laser scanner and in monitoring long-term changes over multiple scans.
A prerequisite for integrating all sensor components was to design a mount that provides a stable frame while remaining easy to disassemble for further development. While some parts were created using a 3D printer, the main base plate was laser cut for improved accuracy. During testing, an additional cover plate was introduced for the GNSS antenna to shield the sensitive signal from electromagnetic fields generated by the motor in the laser scanner and to improve the antenna pattern. Additionally, a GoPro was mounted on the antenna stand to capture 360° images of the surroundings during data acquisition; this information is later used to colorize the point cloud. To combine the data of all independent system parts, it is especially important to establish the lever arms between the laser scanner, GNSS antenna, and camera. Ultimately, these parameters were measured digitally using the CAD model in Fusion 360 and validated with analog measurements on the vehicle.
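Once a lever arm is known, applying it is a simple frame transformation: the body-frame offset is rotated by the vehicle's attitude and added to the antenna position. The following sketch illustrates this with a hypothetical lever-arm vector; the actual values came from the Fusion 360 model and are not reproduced here.

```python
import numpy as np

# Hypothetical body-frame lever arm (metres) from the GNSS antenna
# phase centre to the scanner origin -- for illustration only.
LEVER_ARM_SCANNER = np.array([0.10, 0.00, -0.25])

def sensor_position_world(antenna_pos_world, R_body_to_world, lever_arm_body):
    """World-frame position of a sensor mounted at a known body-frame
    offset (lever arm) from the GNSS antenna."""
    return antenna_pos_world + R_body_to_world @ lever_arm_body
```

With an identity attitude (vehicle level and aligned with the world axes), the sensor position is simply the antenna position plus the lever arm.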
On the mount, a Raspberry Pi serves as the central processing unit, connecting all sensors as well as the motor controls. It also links the mapping platform to outside components such as the remote control and a computer. To combine all these parts, it ran ArduPilot's autopilot software along with Mission Planner as the ground control station.
The main programming effort of this project revolved around a georeferencing algorithm, along with colorization and classification of the point cloud. First, the time offset between the scan data and the GNSS data was calculated using a cross-correlation approach. With that offset known, the trajectories of both signals can be matched and the scan data transformed onto the georeferenced GNSS trajectory. This procedure was split into two parts: a coarse trajectory match as an approximate solution, and an accurate trajectory match that iteratively applies a least-squares nine-parameter transformation. The resulting orientation and position of the vehicle and camera are then used to assign RGB colors to all scan points by matching the point cloud with the images taken successively during the run. Finally, a classification algorithm was created that distinguishes ground points and detects moving objects such as pedestrians or vehicles, which are filtered out of the final cloud.
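The time-offset step can be sketched as a cross-correlation of two signals derived from the scanner and GNSS data (e.g. speed profiles), resampled to a common sampling interval. This is a minimal illustration of the idea, not the project's actual code; function and variable names are assumptions.

```python
import numpy as np

def estimate_time_offset(scan_signal, gnss_signal, dt):
    """Estimate the delay (seconds) of the scan signal relative to the
    GNSS signal by locating the peak of their cross-correlation.
    Both signals must already be resampled to the same interval dt."""
    a = scan_signal - np.mean(scan_signal)  # remove DC bias
    b = gnss_signal - np.mean(gnss_signal)
    corr = np.correlate(a, b, mode="full")
    lag = np.argmax(corr) - (len(b) - 1)    # peak position -> lag in samples
    return lag * dt
```

A quick synthetic check: delaying a signal by 30 samples at dt = 0.1 s should yield an estimated offset of roughly 3.0 s.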
The results obtained from the post-processing were visualized in Unity and on a Cesium-based webpage. As a game engine, Unity offers high-performance offline rendering of the full dataset, including the ability to move freely through the 3D point cloud. Cesium, on the other hand, is designed for the online display of large geospatial datasets by dividing them into smaller 3D tiles for faster loading times. The web viewer is openly accessible:
The Git project page can be found here: