Synthetic Aperture Tracking: Tracking through Occlusions

 

Neel Joshi (UCSD), Shai Avidan (MERL), Wojciech Matusik (Adobe Systems Inc.), David Kriegman (UCSD)

ICCV 2007

 

Tracking through occlusion. Frames from a single camera in a video sequence of a person moving behind a tree (first and second images). Using our method, we can track the person successfully (third and fourth images). We track across the entire 360-frame sequence, which includes 200 consecutive frames where the person is heavily occluded.

 

Abstract

 

Occlusion is a significant challenge for many tracking algorithms. Most current methods can track through transient occlusion, but cannot handle significant, extended occlusion during which the object's trajectory may change substantially. We present a method to track a 2D object through significant occlusion using multiple nearby cameras (e.g., a camera array). When an occluder and object are at different depths, different parts of the object are visible or occluded in each view due to parallax. By aggregating across these views, the method can track even when any individual camera observes very little of the target object. The methods are straightforward to implement and build upon established single-camera tracking algorithms. They do not require explicit modeling or reconstruction of the scene and enable tracking in complex, dynamic scenes with moving cameras. Analysis of accuracy and robustness shows that these methods succeed even when upwards of 70% of the object is occluded in every camera view. To the best of our knowledge, this system is the first capable of tracking in the presence of such significant occlusion.
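To make the aggregation step concrete, below is a minimal sketch of shift-and-average synthetic aperture refocusing, the core idea the tracker builds on. The function name, the planar-parallax shift model (shift proportional to camera baseline divided by depth), and the rectified-array assumption are illustrative choices for this sketch, not the paper's exact formulation.

```python
import numpy as np

def synthetic_aperture_image(views, baselines, depth):
    """Shift-and-average refocusing (illustrative sketch).

    views     -- list of grayscale images (H x W numpy arrays) from a
                 planar camera array, assumed rectified.
    baselines -- per-camera (bx, by) offsets from the reference camera.
    depth     -- focal depth of the tracked object (assumed model:
                 parallax shift is proportional to baseline / depth).

    Pixels at the focal depth align and reinforce across views, while
    occluders at other depths shift differently in each view and blur
    out in the average.
    """
    acc = np.zeros_like(views[0], dtype=np.float64)
    for img, (bx, by) in zip(views, baselines):
        dx = int(round(bx / depth))
        dy = int(round(by / depth))
        acc += np.roll(img.astype(np.float64), shift=(dy, dx), axis=(0, 1))
    return acc / len(views)

# Hypothetical usage: refocus at the object's current depth estimate,
# then run any standard single-camera tracker on the result.
# sa = synthetic_aperture_image(views, baselines, depth=object_depth)
```

Because the occluder is spread across many misaligned copies while the object stays aligned, a conventional appearance-based tracker applied to the synthetic aperture image sees a mostly deoccluded target.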

 

Paper

Adobe Acrobat PDF (4.29 MB)

Video



ICCV 2007 Talk

Zip File of Slides and Videos (66.2 MB)

Datasets

UCSD/MERL Light Field Archive

   

Copyright 2007 by Neel Joshi, UCSD, MERL, and IEEE

Use of images, videos, and slides for non-commercial, academic, or news-related purposes is allowed with proper attribution. Contact Neel Joshi with inquiries about other uses.