People tracking by cooperative fusion of RADAR and camera sensors
Book Contribution - Book Chapter; Conference Contribution
Accurate 3D tracking of objects from a monocular camera poses challenges due to the loss of depth during projection. Although ranging by RADAR has proven effective in highway environments, people tracking remains beyond the capability of single-sensor systems. In this paper, we propose a cooperative RADAR-camera fusion method for people tracking on the ground plane. Using the average person height, a joint detection likelihood is calculated by back-projecting detections from the camera onto the RADAR range-azimuth data. Peaks in the joint likelihood, representing candidate targets, are fed into a particle filter tracker. Depending on the association outcome, particles are updated using the associated detections (Tracking by Detection) or by sampling the raw likelihood itself (Tracking Before Detection). Utilizing the raw likelihood data has the advantage that lost targets continue to be tracked even when the camera or RADAR signal falls below the detection threshold. We show that in single-target, uncluttered environments, the proposed method consistently outperforms camera-only tracking. Experiments in a real-world urban environment also confirm that the cooperative fusion tracker produces significantly better estimates, even in difficult and ambiguous situations.
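The fusion step the abstract describes can be sketched as follows: a camera bounding box is back-projected onto the ground plane via a pinhole model and the assumed average person height, converted to range-azimuth coordinates, and multiplied with the RADAR likelihood map; candidate targets are then peaks in the product. This is a minimal illustrative sketch only, not the authors' implementation; the focal length, camera center, Gaussian spreads, and all function names are assumptions introduced here.

```python
import numpy as np

AVG_PERSON_HEIGHT = 1.7   # metres, assumed average person height
FOCAL_LENGTH = 800.0      # pixels, assumed camera focal length
CX = 320.0                # pixels, assumed principal point (image column)

def back_project(u, v_bottom, v_top):
    """Estimate ground-plane (range, azimuth) of a person from a camera
    bounding box, using its apparent pixel height and a pinhole model."""
    pixel_height = v_bottom - v_top
    depth = FOCAL_LENGTH * AVG_PERSON_HEIGHT / pixel_height  # similar triangles
    azimuth = np.arctan2(u - CX, FOCAL_LENGTH)               # bearing from image column
    rng = depth / np.cos(azimuth)                            # slant range on ground plane
    return rng, azimuth

def joint_likelihood(radar_map, ranges, azimuths, cam_detections,
                     sigma_r=0.5, sigma_a=0.05):
    """Multiply the RADAR range-azimuth map with a Gaussian likelihood
    placed at each back-projected camera detection (cooperative fusion)."""
    R, A = np.meshgrid(ranges, azimuths, indexing="ij")
    cam_map = np.zeros_like(radar_map)
    for (u, v_bottom, v_top) in cam_detections:
        r0, a0 = back_project(u, v_bottom, v_top)
        cam_map += np.exp(-0.5 * (((R - r0) / sigma_r) ** 2
                                  + ((A - a0) / sigma_a) ** 2))
    return radar_map * cam_map

def candidate_target(joint_map, ranges, azimuths):
    """Return the (range, azimuth) of the strongest joint-likelihood peak."""
    i, j = np.unravel_index(np.argmax(joint_map), joint_map.shape)
    return ranges[i], azimuths[j]
```

The multiplicative combination is what makes the fusion cooperative in this sketch: a detection is only strong where both the RADAR map and the back-projected camera evidence agree, while the raw joint map remains available for sampling when no detection exceeds the threshold.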
Book: IEEE Intelligent Transportation Systems Conference - ITSC 2019
Number of pages: 1