Tracking object in real time
This tutorial explains 6D object tracking and shows example code (tracking_sample.cpp) that uses the pcl::tracking libraries. With this example you can watch the tracked segment follow the target object even while you move the object or the sensor device. First you initialize the tracker and pass it the target object's point cloud, so that the tracker knows what to track. You therefore need to prepare a segmented model of the target as a PCD file before starting this tutorial. Once the model is set, the tracker starts tracking the object.
The following figure shows what it looks like when tracking works successfully.

fig1: The blue model tracks the cup successfully with red particles.
Details
The pcl_tracking library contains data structures and mechanisms for 3D tracking based on a particle filter algorithm. It lets you implement 6D pose (position and rotation) tracking that is optimized to run in real time.
At each loop, the tracking program proceeds through the following steps (see fig2; a minimal code sketch of the same cycle follows the figure).
- (At t = t - 1) First, using each particle's previous position and rotation, predict its position and rotation at the next frame.
- Next, calculate the weights of the predicted particles with the chosen likelihood function (you can select which likelihood function to use).
- Finally, use the evaluate function, which compares the real point cloud data from the depth sensor with the predicted particles, and resample the particles.

fig2: The process of the tracking particle filter
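The same predict / weight / resample cycle, stripped of all PCL specifics, can be written down in a few lines. The following is a self-contained sketch of one tracker iteration; the Particle struct, the fixed process noise, and the dummy likelihood are illustrative stand-ins, not the pcl::tracking implementation:

#include <cmath>
#include <cstddef>
#include <functional>
#include <random>
#include <vector>

//Illustrative 6D particle: position, rotation and an importance weight.
struct Particle { float x, y, z, roll, pitch, yaw, weight; };

//One iteration of the predict -> weight -> resample cycle described above.
void
step (std::vector<Particle>& particles,
      const std::function<float (const Particle&)>& likelihood,
      std::mt19937& rng)
{
  std::normal_distribution<float> noise (0.0f, 0.015f);

  //1. Prediction: perturb every pose hypothesis with process noise.
  for (auto& p : particles)
  {
    p.x += noise (rng); p.y += noise (rng); p.z += noise (rng);
    p.roll += noise (rng); p.pitch += noise (rng); p.yaw += noise (rng);
  }

  //2. Weighting: score each hypothesis against the measurement.
  float sum = 0.0f;
  for (auto& p : particles) { p.weight = likelihood (p); sum += p.weight; }
  if (sum <= 0.0f) return;            //nothing matched; keep the old set
  for (auto& p : particles) p.weight /= sum;

  //3. Resampling: draw a new particle set proportional to the weights.
  std::vector<float> weights;
  weights.reserve (particles.size ());
  for (const auto& p : particles) weights.push_back (p.weight);
  std::discrete_distribution<std::size_t> pick (weights.begin (), weights.end ());
  std::vector<Particle> resampled (particles.size ());
  for (auto& p : resampled) p = particles[pick (rng)];
  particles.swap (resampled);
}

int
main ()
{
  std::mt19937 rng (42);
  std::vector<Particle> particles (100, Particle{0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f});

  //Dummy likelihood favouring hypotheses near z = 1 m; in pcl::tracking this role
  //is played by the point cloud coherences (distance, color, normals).
  auto likelihood = [] (const Particle& p)
  { return std::exp (-(p.z - 1.0f) * (p.z - 1.0f) / 0.01f); };

  for (int i = 0; i < 10; ++i)
    step (particles, likelihood, rng);
  return 0;
}

In the real tracker the likelihood is supplied by the coherence objects configured later in this tutorial, and the KLD-adaptive variant additionally changes the number of particles between iterations.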
The code
Create a file named tracking_sample.cpp, paste the following code into it with your editor, and save it.
tracking_sample.cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/io/openni_grabber.h>
#include <pcl/console/parse.h>
#include <pcl/common/time.h>
#include <pcl/common/centroid.h>

#include <pcl/visualization/cloud_viewer.h>
#include <pcl/visualization/pcl_visualizer.h>
#include <pcl/io/pcd_io.h>

#include <pcl/filters/passthrough.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/approximate_voxel_grid.h>

#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>

#include <pcl/search/pcl_search.h>
#include <pcl/common/transforms.h>

#include <pcl/tracking/tracking.h>
#include <pcl/tracking/particle_filter.h>
#include <pcl/tracking/kld_adaptive_particle_filter_omp.h>
#include <pcl/tracking/particle_filter_omp.h>
#include <pcl/tracking/coherence.h>
#include <pcl/tracking/distance_coherence.h>
#include <pcl/tracking/hsv_color_coherence.h>
#include <pcl/tracking/approx_nearest_pair_point_cloud_coherence.h>
#include <pcl/tracking/nearest_pair_point_cloud_coherence.h>

#include <boost/format.hpp>

#include <mutex>
#include <thread>

using namespace pcl::tracking;
using namespace std::chrono_literals;

typedef pcl::PointXYZRGBA RefPointType;
typedef ParticleXYZRPY ParticleT;
typedef pcl::PointCloud<pcl::PointXYZRGBA> Cloud;
typedef Cloud::Ptr CloudPtr;
typedef Cloud::ConstPtr CloudConstPtr;
typedef ParticleFilterTracker<RefPointType, ParticleT> ParticleFilter;

CloudPtr cloud_pass_;
CloudPtr cloud_pass_downsampled_;
CloudPtr target_cloud;

std::mutex mtx_;
ParticleFilter::Ptr tracker_;
bool new_cloud_;
double downsampling_grid_size_;
int counter;


//Filter along a specified dimension
void filterPassThrough (const CloudConstPtr &cloud, Cloud &result)
{
  pcl::PassThrough<pcl::PointXYZRGBA> pass;
  pass.setFilterFieldName ("z");
  pass.setFilterLimits (0.0, 10.0);
  pass.setKeepOrganized (false);
  pass.setInputCloud (cloud);
  pass.filter (result);
}


void gridSampleApprox (const CloudConstPtr &cloud, Cloud &result, double leaf_size)
{
  pcl::ApproximateVoxelGrid<pcl::PointXYZRGBA> grid;
  grid.setLeafSize (static_cast<float> (leaf_size), static_cast<float> (leaf_size), static_cast<float> (leaf_size));
  grid.setInputCloud (cloud);
  grid.filter (result);
}


//Draw the current particles
bool
drawParticles (pcl::visualization::PCLVisualizer& viz)
{
  ParticleFilter::PointCloudStatePtr particles = tracker_->getParticles ();
  if (particles && new_cloud_)
  {
    //Set pointCloud with particle's points
    pcl::PointCloud<pcl::PointXYZ>::Ptr particle_cloud (new pcl::PointCloud<pcl::PointXYZ> ());
    for (std::size_t i = 0; i < particles->points.size (); i++)
    {
      pcl::PointXYZ point;

      point.x = particles->points[i].x;
      point.y = particles->points[i].y;
      point.z = particles->points[i].z;
      particle_cloud->points.push_back (point);
    }

    //Draw red particles
    {
      pcl::visualization::PointCloudColorHandlerCustom<pcl::PointXYZ> red_color (particle_cloud, 250, 99, 71);

      if (!viz.updatePointCloud (particle_cloud, red_color, "particle cloud"))
        viz.addPointCloud (particle_cloud, red_color, "particle cloud");
    }
    return true;
  }
  else
  {
    return false;
  }
}

//Draw model reference point cloud
void
drawResult (pcl::visualization::PCLVisualizer& viz)
{
  ParticleXYZRPY result = tracker_->getResult ();
  Eigen::Affine3f transformation = tracker_->toEigenMatrix (result);

  //move close to camera a little for better visualization
  transformation.translation () += Eigen::Vector3f (0.0f, 0.0f, -0.005f);
  CloudPtr result_cloud (new Cloud ());
  pcl::transformPointCloud<RefPointType> (*(tracker_->getReferenceCloud ()), *result_cloud, transformation);

  //Draw blue model reference point cloud
  {
    pcl::visualization::PointCloudColorHandlerCustom<RefPointType> blue_color (result_cloud, 0, 0, 255);

    if (!viz.updatePointCloud (result_cloud, blue_color, "resultcloud"))
      viz.addPointCloud (result_cloud, blue_color, "resultcloud");
  }
}

//visualization's callback function
void
viz_cb (pcl::visualization::PCLVisualizer& viz)
{
  std::lock_guard<std::mutex> lock (mtx_);

  if (!cloud_pass_)
  {
    std::this_thread::sleep_for(1s);
    return;
  }

  //Draw downsampled point cloud from sensor
  if (new_cloud_ && cloud_pass_downsampled_)
  {
    CloudPtr cloud_pass;
    cloud_pass = cloud_pass_downsampled_;

    if (!viz.updatePointCloud (cloud_pass, "cloudpass"))
    {
      viz.addPointCloud (cloud_pass, "cloudpass");
      viz.resetCameraViewpoint ("cloudpass");
    }
    bool ret = drawParticles (viz);
    if (ret)
      drawResult (viz);
  }
  new_cloud_ = false;
}

//OpenNI Grabber's cloud Callback function
void
cloud_cb (const CloudConstPtr &cloud)
{
  std::lock_guard<std::mutex> lock (mtx_);
  cloud_pass_.reset (new Cloud);
  cloud_pass_downsampled_.reset (new Cloud);
  filterPassThrough (cloud, *cloud_pass_);
  gridSampleApprox (cloud_pass_, *cloud_pass_downsampled_, downsampling_grid_size_);

  if(counter < 10){
    counter++;
  }else{
    //Track the object
    tracker_->setInputCloud (cloud_pass_downsampled_);
    tracker_->compute ();
    new_cloud_ = true;
  }
}

int
main (int argc, char** argv)
{
  if (argc < 3)
  {
    PCL_WARN("Please set device_id pcd_filename(e.g. $ %s '#1' sample.pcd)\n", argv[0]);
    exit (1);
  }

  //read pcd file
  target_cloud.reset(new Cloud());
  if(pcl::io::loadPCDFile (argv[2], *target_cloud) == -1){
    std::cout << "pcd file not found" << std::endl;
    exit(-1);
  }

  std::string device_id = std::string (argv[1]);

  counter = 0;

  //Set parameters
  new_cloud_ = false;
  downsampling_grid_size_ = 0.002;

  std::vector<double> default_step_covariance = std::vector<double> (6, 0.015 * 0.015);
  default_step_covariance[3] *= 40.0;
  default_step_covariance[4] *= 40.0;
  default_step_covariance[5] *= 40.0;

  std::vector<double> initial_noise_covariance = std::vector<double> (6, 0.00001);
  std::vector<double> default_initial_mean = std::vector<double> (6, 0.0);

  KLDAdaptiveParticleFilterOMPTracker<RefPointType, ParticleT>::Ptr tracker
    (new KLDAdaptiveParticleFilterOMPTracker<RefPointType, ParticleT> (8));

  ParticleT bin_size;
  bin_size.x = 0.1f;
  bin_size.y = 0.1f;
  bin_size.z = 0.1f;
  bin_size.roll = 0.1f;
  bin_size.pitch = 0.1f;
  bin_size.yaw = 0.1f;


  //Set all parameters for KLDAdaptiveParticleFilterOMPTracker
  tracker->setMaximumParticleNum (1000);
  tracker->setDelta (0.99);
  tracker->setEpsilon (0.2);
  tracker->setBinSize (bin_size);

  //Set all parameters for ParticleFilter
  tracker_ = tracker;
  tracker_->setTrans (Eigen::Affine3f::Identity ());
  tracker_->setStepNoiseCovariance (default_step_covariance);
  tracker_->setInitialNoiseCovariance (initial_noise_covariance);
  tracker_->setInitialNoiseMean (default_initial_mean);
  tracker_->setIterationNum (1);
  tracker_->setParticleNum (600);
  tracker_->setResampleLikelihoodThr(0.00);
  tracker_->setUseNormal (false);


  //Setup coherence object for tracking
  ApproxNearestPairPointCloudCoherence<RefPointType>::Ptr coherence
    (new ApproxNearestPairPointCloudCoherence<RefPointType>);

  DistanceCoherence<RefPointType>::Ptr distance_coherence
    (new DistanceCoherence<RefPointType>);
  coherence->addPointCoherence (distance_coherence);

  pcl::search::Octree<RefPointType>::Ptr search (new pcl::search::Octree<RefPointType> (0.01));
  coherence->setSearchMethod (search);
  coherence->setMaximumDistance (0.01);

  tracker_->setCloudCoherence (coherence);

  //prepare the model of tracker's target
  Eigen::Vector4f c;
  Eigen::Affine3f trans = Eigen::Affine3f::Identity ();
  CloudPtr transed_ref (new Cloud);
  CloudPtr transed_ref_downsampled (new Cloud);

  pcl::compute3DCentroid<RefPointType> (*target_cloud, c);
  trans.translation ().matrix () = Eigen::Vector3f (c[0], c[1], c[2]);
  pcl::transformPointCloud<RefPointType> (*target_cloud, *transed_ref, trans.inverse());
  gridSampleApprox (transed_ref, *transed_ref_downsampled, downsampling_grid_size_);

  //set reference model and trans
  tracker_->setReferenceCloud (transed_ref_downsampled);
  tracker_->setTrans (trans);

  //Setup OpenNIGrabber and viewer
  pcl::visualization::CloudViewer* viewer_ = new pcl::visualization::CloudViewer("PCL OpenNI Tracking Viewer");
  pcl::Grabber* interface = new pcl::OpenNIGrabber (device_id);
  std::function<void (const CloudConstPtr&)> f = cloud_cb;
  interface->registerCallback (f);

  viewer_->runOnVisualizationThread (viz_cb, "viz_cb");

  //Start viewer and object tracking
  interface->start();
  while (!viewer_->wasStopped ())
    std::this_thread::sleep_for(1s);
  interface->stop();
}
The explanation
Now, let’s break down the code piece by piece.
//Set all parameters for KLDAdaptiveParticleFilterOMPTracker
tracker->setMaximumParticleNum (1000);
tracker->setDelta (0.99);
tracker->setEpsilon (0.2);
tracker->setBinSize (bin_size);
//Set all parameters for ParticleFilter
tracker_ = tracker;
tracker_->setTrans (Eigen::Affine3f::Identity ());
tracker_->setStepNoiseCovariance (default_step_covariance);
tracker_->setInitialNoiseCovariance (initial_noise_covariance);
tracker_->setInitialNoiseMean (default_initial_mean);
tracker_->setIterationNum (1);
tracker_->setParticleNum (600);
tracker_->setResampleLikelihoodThr(0.00);
First, in the main function, these lines set the parameters of the tracker.
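These setters come from the ParticleFilterTracker base class, except for setMaximumParticleNum, setDelta, setEpsilon, and setBinSize, which are specific to the KLD-adaptive variant: it adapts the number of particles at run time, up to setMaximumParticleNum, using the delta, epsilon, and bin-size settings. The particle_filter_omp.h header is already included, so if you prefer a fixed-size particle set you could instead instantiate the plain OpenMP tracker. The following is only a sketch, assuming the same base-class setters; it is not part of the tutorial code:

//Sketch: a fixed-size (non-adaptive) particle filter using 8 OpenMP threads.
//The KLD-specific setters (setMaximumParticleNum, setDelta, setEpsilon,
//setBinSize) are simply not needed in this variant.
ParticleFilterOMPTracker<RefPointType, ParticleT>::Ptr simple_tracker
  (new ParticleFilterOMPTracker<RefPointType, ParticleT> (8));

simple_tracker->setTrans (Eigen::Affine3f::Identity ());
simple_tracker->setStepNoiseCovariance (default_step_covariance);
simple_tracker->setInitialNoiseCovariance (initial_noise_covariance);
simple_tracker->setInitialNoiseMean (default_initial_mean);
simple_tracker->setIterationNum (1);
simple_tracker->setParticleNum (600);
simple_tracker->setResampleLikelihoodThr (0.00);
simple_tracker->setUseNormal (false);

tracker_ = simple_tracker;  //ParticleFilter::Ptr can hold either tracker variant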
//Setup coherence object for tracking
ApproxNearestPairPointCloudCoherence<RefPointType>::Ptr coherence
(new ApproxNearestPairPointCloudCoherence<RefPointType>);
DistanceCoherence<RefPointType>::Ptr distance_coherence
(new DistanceCoherence<RefPointType>);
coherence->addPointCoherence (distance_coherence);
pcl::search::Octree<RefPointType>::Ptr search (new pcl::search::Octree<RefPointType> (0.01));
coherence->setSearchMethod (search);
coherence->setMaximumDistance (0.01);
Here we set the likelihood function that the tracker uses to calculate the particle weights. You can add more likelihood functions as you like; besides the distance likelihood used here, normal and color likelihoods are available by default. To add another likelihood function, all you have to do is create an instance of the corresponding coherence class and add it to the coherence variable with the addPointCoherence function, for example as shown below.
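For example, the hsv_color_coherence.h header is already included in the sample, so a color-based likelihood could be added next to the distance one like this (a sketch; the 0.1 weight is only an illustrative value):

//Sketch: add an HSV color likelihood alongside the distance likelihood.
HSVColorCoherence<RefPointType>::Ptr color_coherence
  (new HSVColorCoherence<RefPointType>);
color_coherence->setWeight (0.1);  //illustrative relative weight
coherence->addPointCoherence (color_coherence);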
//prepare the model of tracker's target
Eigen::Vector4f c;
Eigen::Affine3f trans = Eigen::Affine3f::Identity ();
CloudPtr transed_ref (new Cloud);
CloudPtr transed_ref_downsampled (new Cloud);
pcl::compute3DCentroid<RefPointType> (*target_cloud, c);
trans.translation ().matrix () = Eigen::Vector3f (c[0], c[1], c[2]);
pcl::transformPointCloud<RefPointType> (*target_cloud, *transed_ref, trans.inverse());
gridSampleApprox (transed_ref, *transed_ref_downsampled, downsampling_grid_size_);
//set reference model and trans
tracker_->setReferenceCloud (transed_ref_downsampled);
In this part, we set the point cloud loaded from the PCD file as the reference model of the tracker, and we also set the model's transform. The model is translated so that its centroid sits at the origin, and trans stores that offset so the tracker can relate the centered model to the camera frame.
if(counter < 10){
counter++;
}else{
//Track the object
tracker_->setInputCloud (cloud_pass_downsampled_);
tracker_->compute ();
new_cloud_ = true;
Until the counter variable reaches 10, we ignore the input point cloud, because the first few frames often contain noise. Once the counter reaches 10, at each loop we set the downsampled input point cloud on the tracker, and the tracker computes the particles' movement.
ParticleFilter::PointCloudStatePtr particles = tracker_->getParticles ();
In the drawParticles function, you can get the particles' positions by calling getParticles ().
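getParticles () returns the current hypotheses as a point cloud of ParticleT, so besides drawing them you could, for example, log how many particles the KLD-adaptive tracker is currently using and where their mean position lies. This is an illustrative fragment, not part of the sample:

//Sketch: quick diagnostic of the current particle set.
ParticleFilter::PointCloudStatePtr particles = tracker_->getParticles ();
if (particles && !particles->points.empty ())
{
  Eigen::Vector3f mean (0.0f, 0.0f, 0.0f);
  for (const auto& p : particles->points)
    mean += Eigen::Vector3f (p.x, p.y, p.z);
  mean /= static_cast<float> (particles->points.size ());
  std::cout << particles->points.size () << " particles, mean position: "
            << mean.transpose () << std::endl;
}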
ParticleXYZRPY result = tracker_->getResult ();
In the drawResult function, you can get the tracked model's position and rotation.
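The result is a ParticleXYZRPY whose x, y, z, roll, pitch, and yaw fields can be read directly, or converted into an Eigen transform with toEigenMatrix, as drawResult does. A small illustrative sketch of logging the estimated pose:

//Sketch: log the estimated 6D pose of the tracked object.
ParticleXYZRPY result = tracker_->getResult ();
std::cout << "position: " << result.x << " " << result.y << " " << result.z
          << "  rotation (roll, pitch, yaw): "
          << result.roll << " " << result.pitch << " " << result.yaw << std::endl;
//As in drawResult, tracker_->toEigenMatrix (result) gives the same pose as an
//Eigen::Affine3f that can be applied to the reference cloud.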
Compiling and running the program
Create a CMakeLists.txt file and add the following lines into it.
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)

project(openni_tracking)

find_package(PCL 1.7 REQUIRED)

include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})

add_executable (tracking_sample tracking_sample.cpp)
target_link_libraries (tracking_sample ${PCL_LIBRARIES})
After you have saved CMakeLists.txt, build the sample and then prepare for running.
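A typical out-of-source CMake build, run from the directory containing CMakeLists.txt and tracking_sample.cpp, looks like this:
$ mkdir build && cd build
$ cmake ..
$ make
Once tracking_sample builds, prepare the scene and the target model: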
Put the target object on a plane where there is nothing else.
Put the sensor device about 1 meter away from the target.
Don't move the target or the device until you launch the tracking program.
Extract only the target's point cloud with your own code (see the Plane model segmentation tutorial) and save it as tracking_target.pcd; one possible sketch follows this list.
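One possible way to produce that file, roughly combining the plane segmentation and cluster extraction tutorials, is sketched below. The input file name and the thresholds are illustrative, and any pipeline that leaves only the target's points will do:

//Sketch: load one captured frame (scene.pcd is an illustrative file name),
//remove the dominant plane with RANSAC and save the largest remaining
//Euclidean cluster as the tracking target.
#include <pcl/ModelCoefficients.h>
#include <pcl/point_types.h>
#include <pcl/common/io.h>
#include <pcl/io/pcd_io.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/extract_clusters.h>
#include <pcl/segmentation/sac_segmentation.h>

int
main ()
{
  pcl::PointCloud<pcl::PointXYZRGBA>::Ptr scene (new pcl::PointCloud<pcl::PointXYZRGBA>);
  if (pcl::io::loadPCDFile ("scene.pcd", *scene) == -1)
    return -1;

  //Fit the supporting plane with RANSAC
  pcl::ModelCoefficients::Ptr coefficients (new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers (new pcl::PointIndices);
  pcl::SACSegmentation<pcl::PointXYZRGBA> seg;
  seg.setModelType (pcl::SACMODEL_PLANE);
  seg.setMethodType (pcl::SAC_RANSAC);
  seg.setDistanceThreshold (0.01);
  seg.setInputCloud (scene);
  seg.segment (*inliers, *coefficients);

  //Remove the plane points, keeping everything standing on it
  pcl::PointCloud<pcl::PointXYZRGBA>::Ptr objects (new pcl::PointCloud<pcl::PointXYZRGBA>);
  pcl::ExtractIndices<pcl::PointXYZRGBA> extract;
  extract.setInputCloud (scene);
  extract.setIndices (inliers);
  extract.setNegative (true);
  extract.filter (*objects);

  //Cluster the remaining points and keep the largest cluster
  pcl::search::KdTree<pcl::PointXYZRGBA>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZRGBA>);
  std::vector<pcl::PointIndices> clusters;
  pcl::EuclideanClusterExtraction<pcl::PointXYZRGBA> ec;
  ec.setClusterTolerance (0.02);
  ec.setMinClusterSize (100);
  ec.setSearchMethod (tree);
  ec.setInputCloud (objects);
  ec.extract (clusters);
  if (clusters.empty ())
    return -1;

  const pcl::PointIndices* largest = &clusters[0];
  for (const auto& c : clusters)
    if (c.indices.size () > largest->indices.size ())
      largest = &c;

  pcl::PointCloud<pcl::PointXYZRGBA> target;
  pcl::copyPointCloud (*objects, *largest, target);
  pcl::io::savePCDFileBinary ("tracking_target.pcd", target);
  return 0;
}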
After you have created the model point cloud and built the executable, you can launch tracking_sample. Pass the device_id as the first argument and the name of the PCD file you created in the last preparation step as the second.
$ ./tracking_sample "#1" tracking_target.pcd
After a few seconds, tracking starts working and you can move the tracked object around. As you can see in the following pictures, the blue point cloud is the reference model and the red one is the particle cloud.


More Advanced
If you want more flexible and useful tracking code that starts tracking without preparing a segmented model beforehand, refer to https://github.com/PointCloudLibrary/pcl/blob/master/apps/src/openni_tracking.cpp. It shows better and more legible code. The figures above show the windows you get when you run that code.