
I am getting a point cloud from a lidar on an autonomous driving robot, but it's too much data to process.

I already implemented a passthrough filter.

It gave a very good result, but I was wondering whether there are other filters or methods I could dig into.

I'm not looking for anything specific, just a direction or some advice, because I'm pretty new to the PCL library and it seems huge.

Here is my filter now:

    pcl::PointCloud<PointXYZIR>::Ptr cloudInput;
    cloudInput.reset(new pcl::PointCloud<PointXYZIR> (cloud_in));

    pcl::PointCloud<PointXYZIR>::Ptr cloudFiltered;
    cloudFiltered.reset(new pcl::PointCloud<PointXYZIR>);

    // Create the filtering object: downsample the dataset using a leaf size
    pcl::VoxelGrid<PointXYZIR> avg;
    avg.setInputCloud(cloudInput);
    avg.setLeafSize(0.25f, 0.25f, 0.25f);
    avg.filter(*cloudFiltered);

    //Filter object
    pcl::PassThrough<PointXYZIR> filter;
    filter.setInputCloud(cloudFiltered);

    filter.setFilterFieldName("x");
    filter.setFilterLimits(-100, 100);
    filter.filter(*cloudFiltered);

    filter.setFilterFieldName("y");
    filter.setFilterLimits(-100, 100);
    filter.filter(*cloudFiltered);

    cloud_out = *cloudFiltered;
brad
tony497

2 Answers


Voxel grid downsampling should maintain a pretty good point distribution while reducing the number of points. You can set how small the voxels are along each axis to keep as much or as little resolution as you want. Each voxel replaces all the points that fall inside it with a single point averaged from them. http://pointclouds.org/documentation/tutorials/voxel_grid.php#voxelgrid

Sneaky Polar Bear

I did find a solution, but there is no general one: I think this problem is very specific to what kind of point cloud you get and what you want to do with it.

The passthrough filter is a very efficient way to downsample without taking too much risk of losing interesting data.

http://pointclouds.org/documentation/tutorials/passthrough.php

Then I tested StatisticalOutlierRemoval, which is efficient but wasn't relevant in my case.

http://pointclouds.org/documentation/tutorials/statistical_outlier.php

Now I downsample the point cloud with a voxel grid (leaf size), then build a kd-tree to filter the points by radius. It takes about the same amount of computation as the passthrough filter, but it makes more sense for my project to do it this way.

    // Create the filtering object: downsample the dataset using a leaf size
    pcl::VoxelGrid<PointXYZIR> avg;
    avg.setInputCloud(cloudInput);
    avg.setLeafSize(0.25f, 0.25f, 0.25f);
    avg.filter(*cloudFiltered);

    // Search point: here simply the first point of the downsampled cloud
    PointXYZIR searchPoint = cloudFiltered->at(0);

    // Results from radiusSearch()
    std::vector<int> pointIdxRadiusSearch;
    std::vector<float> pointRadiusSquaredDistance;

    // kd-tree
    pcl::KdTreeFLANN<PointXYZIR> kdtree;
    kdtree.setInputCloud(cloudFiltered);
    kdtree.setSortedResults(true);

    if (kdtree.radiusSearch(searchPoint, 100, pointIdxRadiusSearch, pointRadiusSquaredDistance) > 0)
    {
        // Keep only the points found within the search radius
        for (size_t j = 0; j < pointIdxRadiusSearch.size(); ++j)
        {
            cloud_out.push_back(cloudFiltered->points[pointIdxRadiusSearch[j]]);
        }
    }
tony497