Multi-sensor splicing of 2D lidars

0. Introduction

To guarantee 360° environmental coverage with lidar, we often need to splice the data from multiple sensors. If we simply read each lidar's output as-is, the two scans will be misaligned where they overlap, as shown in the figure below, so we need to calibrate the extrinsics between the lidars.

<arg name="device_ip1" default="" />
<arg name="device_ip2" default="" />

<arg name="msop_port1" default="2368" />
<arg name="difop_port1" default="2369" />
<arg name="msop_port2" default="2370" />
<arg name="difop_port2" default="2371" />
<arg name="return_mode" default="1" />
<arg name="time_synchronization" default="true" />
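These arguments are typically consumed by two instances of the vendor's driver node in the same launch file. A sketch of that wiring follows; the `lslidar_driver` package/node names and the parameter names are placeholders, so substitute whatever your lidar's actual driver expects:

```xml
<node pkg="lslidar_driver" type="lslidar_driver_node" name="lidar_front" output="screen">
  <param name="device_ip" value="$(arg device_ip1)" />
  <param name="msop_port" value="$(arg msop_port1)" />
  <param name="difop_port" value="$(arg difop_port1)" />
  <param name="frame_id" value="laser_front" />
</node>
<node pkg="lslidar_driver" type="lslidar_driver_node" name="lidar_back" output="screen">
  <param name="device_ip" value="$(arg device_ip2)" />
  <param name="msop_port" value="$(arg msop_port2)" />
  <param name="difop_port" value="$(arg difop_port2)" />
  <param name="frame_id" value="laser_back" />
</node>
```

The key point is that each driver instance gets its own ports and publishes in its own `frame_id`; the calibration below estimates the transform between those two frames.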

1. ICP and NDT

NDT can be viewed as a grid-based (rasterized) variant of ICP:


  • Rasterization suppresses the influence of noise.

  • The NDT algorithm is easy to accelerate on a GPU.


  • For structured point clouds, NDT's assumption that the points in each grid cell follow a Gaussian distribution does not hold, and the results are poor; point-to-line ICP (2D) or point-to-plane ICP works better in that case.

  • For unstructured large-scale point clouds, NDT is faster, and its robustness to the initial value depends on the grid size: the larger the grid, the worse the accuracy but the better the robustness to the initial guess; conversely, a smaller grid depends more on the initial value, analogous to ICP's minimum matching distance (`min_distance`).

  • To make point cloud matching less sensitive to the initial value, consider combining CSM with ICP: CSM (Correlative Scan Matching) determines a coarse initial estimate, which ICP then refines.

The detailed mathematical derivation of both methods can be found in the linked article. In practice, we can simply use the implementations in the PCL library.
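In everyday use we call PCL's `pcl::IterativeClosestPoint` or `pcl::NormalDistributionsTransform` directly, but to make the idea concrete, here is a minimal sketch of the closed-form 2D rigid alignment that one ICP iteration performs once correspondences are fixed. It uses only the standard library; all names are illustrative, not from any library:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Least-squares 2D rigid transform (theta, tx, ty) mapping src onto dst,
// assuming src[i] corresponds to dst[i]. Full ICP alternates this step with
// nearest-neighbor correspondence search until convergence.
void AlignRigid2D(const std::vector<Pt> &src, const std::vector<Pt> &dst,
                  double &theta, double &tx, double &ty) {
  const double n = static_cast<double>(src.size());
  double sx = 0, sy = 0, dx = 0, dy = 0;
  for (std::size_t i = 0; i < src.size(); ++i) {
    sx += src[i].x; sy += src[i].y;
    dx += dst[i].x; dy += dst[i].y;
  }
  sx /= n; sy /= n; dx /= n; dy /= n;  // centroids of the two clouds

  // Cross-covariance of the centered clouds; in 2D the optimal rotation
  // reduces to a single atan2 instead of an SVD.
  double a = 0, b = 0;
  for (std::size_t i = 0; i < src.size(); ++i) {
    const double px = src[i].x - sx, py = src[i].y - sy;
    const double qx = dst[i].x - dx, qy = dst[i].y - dy;
    a += px * qx + py * qy;
    b += px * qy - py * qx;
  }
  theta = std::atan2(b, a);
  tx = dx - (std::cos(theta) * sx - std::sin(theta) * sy);
  ty = dy - (std::sin(theta) * sx + std::cos(theta) * sy);
}
```

With exact correspondences this recovers the transform in one shot; with noisy scans and estimated correspondences it is repeated until the alignment error stops decreasing, which is exactly the loop PCL runs internally.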

2. Multi lidar calibration

AdamShan is a very good blogger whose articles cover multi-lidar calibration in great detail; his code is also on GitHub, for both multi-lidar calibration and lidar-camera calibration. No code from there is reproduced here (this paragraph exists to make later self-review and searching easier). Indoor robots also face the multi-sensor calibration problem (the same issue arises when matching extrinsics between maps in a multi-robot formation). Below is a matching scheme based on 2D lidar.


#include <iostream>
#include <vector>

#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <geometry_msgs/TwistStamped.h>
#include <geometry_msgs/Transform.h>
#include <geometry_msgs/TransformStamped.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>

#include <tf2/utils.h>
#include <tf2_ros/transform_listener.h>

#include <pcl_ros/point_cloud.h>

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>
#include <pcl/io/ply_io.h>
#include <pcl/io/vtk_lib_io.h>
#include <pcl/visualization/cloud_viewer.h>
#include <pcl/visualization/pcl_visualizer.h>

class MultiLidarCalibration
{
  public:
    MultiLidarCalibration(ros::NodeHandle &n);

    // Function processing
    void Run();

  private:
    // Subscribed laser topics
    std::string source_lidar_topic_str_;
    std::string target_lidar_topic_str_;

    // Lidar coordinate system
    std::string source_lidar_frame_str_;
    std::string target_lidar_frame_str_;

    // icp matching score
    float icp_score_;

    // Coordinates of main_laser_link in the base_link frame
    float main_to_base_transform_x_;
    float main_to_base_transform_y_;
    float main_to_base_transform_row_;
    float main_to_base_transform_yaw_;

    // Matrix assignment at the first run
    bool is_first_run_;

    // The transform between the two lasers, obtained via tf2
    Eigen::Matrix4f transform_martix_;
    // Main laser to base_link TF
    Eigen::Matrix4f front_to_base_link_;

    ros::NodeHandle nh_;
    // Correct laser output, type pointcloud2
    ros::Publisher final_point_cloud_pub_;
    // Laser time synchronization and data cache
    typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::LaserScan, sensor_msgs::LaserScan> SyncPolicyT;
    message_filters::Subscriber<sensor_msgs::LaserScan> *scan_front_subscriber_, *scan_back_subscriber_;
    message_filters::Synchronizer<SyncPolicyT> *scan_synchronizer_;

    // Laser data in pcl format
    pcl::PointCloud<pcl::PointXYZ>::Ptr main_scan_pointcloud_;
    // Calibration radar data
    pcl::PointCloud<pcl::PointXYZ>::Ptr sub_scan_pointcloud_;
    // Laser data to be calibrated after tf2 conversion
    pcl::PointCloud<pcl::PointXYZ>::Ptr sub_scan_pointcloud_init_transformed_;

    /**
     * @brief ICP registration
     * @param final_registration_scan_ the cloud to be calibrated, transformed into the target coordinate frame by ICP
     */
    pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp_;
    pcl::PointCloud<pcl::PointXYZ>::Ptr final_registration_scan_;

    // Initial coordinate transformation between two laser coordinate systems
    void GetFrontLasertoBackLaserTf();

    // Subscribe to two laser data of main radar and sub radar
    void ScanCallBack(const sensor_msgs::LaserScan::ConstPtr &in_main_scan_msg, const sensor_msgs::LaserScan::ConstPtr &in_sub_scan_msg);

    // Polar to Cartesian: sensor_msgs::LaserScan to pcl::PointCloud<pcl::PointXYZ> (many methods exist; not limited to this one)
    pcl::PointCloud<pcl::PointXYZ> ConvertScantoPointCloud(const sensor_msgs::LaserScan::ConstPtr &scan_msg);

    // The calibrated laser point cloud is published, and the published data format is sensor_msgs::PointCloud2
    void PublishCloud(pcl::PointCloud<pcl::PointXYZ>::Ptr &in_cloud_to_publish_ptr);

    // icp matching of laser point cloud of main radar and sub radar
    bool ScanRegistration();

    // Publish calibration results
    void PrintResult();

    // visualization
    void View();
};

The remaining implementation is omitted here; for the full code, please refer to Gu Yueju.
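The conversion performed by `ConvertScantoPointCloud` is just the polar-to-Cartesian mapping x = r·cos(φ), y = r·sin(φ) applied per beam. A ROS/PCL-free sketch of that mapping, with plain structs standing in for `sensor_msgs::LaserScan` and `pcl::PointXYZ` (the validity checks are assumptions about how the driver encodes misses):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal stand-ins for sensor_msgs::LaserScan and a 2D point type.
struct Scan {
  double angle_min;        // angle of ranges[0], in radians
  double angle_increment;  // angular step between consecutive beams
  double range_min, range_max;
  std::vector<double> ranges;
};
struct PointXY { double x, y; };

// Beam i lies at angle phi = angle_min + i * angle_increment; a return at
// range r maps to (r*cos(phi), r*sin(phi)). Returns outside the
// [range_min, range_max] window, or NaN/inf values, are skipped, since
// drivers commonly encode missed beams that way.
std::vector<PointXY> ScanToPoints(const Scan &scan) {
  std::vector<PointXY> cloud;
  for (std::size_t i = 0; i < scan.ranges.size(); ++i) {
    const double r = scan.ranges[i];
    if (!std::isfinite(r) || r < scan.range_min || r > scan.range_max)
      continue;
    const double phi = scan.angle_min + i * scan.angle_increment;
    cloud.push_back({r * std::cos(phi), r * std::sin(phi)});
  }
  return cloud;
}
```

Both radars' scans are converted this way before ICP; the sub radar's cloud is additionally pre-transformed by the initial tf2 extrinsic so that ICP only has to refine a small residual error.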

Tags: Embedded system ROS

Posted on Sat, 30 Oct 2021 21:44:36 -0400 by gingerboy101