Object tracking has attracted considerable recent attention because of the demand for it in everyday applications. Handling occlusions, especially in cluttered environments, introduces new challenges to the tracking problem. Depth maps provide the cues needed to retrieve occluded objects after they reappear, recombine split groups of objects, compensate for drastic appearance changes, and reduce the effect of appearance artifacts. This page gathers resources on RGBD tracking, and also outlines my research directions and explains my code repository on GitHub.
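To make the occlusion cue concrete, here is a minimal sketch of one common way a depth map can signal that a tracked target has been occluded: if a large fraction of pixels inside the target's bounding box are markedly closer to the camera than the target's last known depth, another object has likely moved in front of it. The function names, the depth margin, and the ratio threshold below are illustrative assumptions, not taken from any specific tracker.

```python
def occlusion_ratio(depth_patch, target_depth, margin=0.3):
    """Fraction of valid pixels closer than (target_depth - margin) metres.

    depth_patch  -- 2D list of depth readings (metres) inside the target box
    target_depth -- the target's last known depth (metres)
    margin       -- assumed slack to ignore the target's own surface relief
    """
    # Depth sensors often report 0 for invalid/missing readings; drop those.
    pixels = [d for row in depth_patch for d in row if d > 0]
    if not pixels:
        return 0.0
    closer = sum(1 for d in pixels if d < target_depth - margin)
    return closer / len(pixels)

def is_occluded(depth_patch, target_depth, margin=0.3, ratio_thresh=0.5):
    """Declare occlusion when most of the box is covered by a nearer surface."""
    return occlusion_ratio(depth_patch, target_depth, margin) >= ratio_thresh
```

For example, with a target last seen at 2.0 m and an occluder at 1.0 m covering six of nine pixels in the box, the ratio is about 0.67, above the assumed 0.5 threshold, so the tracker would suspend appearance updates and wait for the target to reappear.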
Research Community
Nov 11, 2013
Websites:
People:
- Wongun Choi (Univ. of Michigan Ann Arbor)
- Zdenek Kalal (Univ. of Surrey)
- Ming-Hsuan Yang (UC Merced)
Labs:
- Probabilistic Tracking (Seoul National Univ - Kwon & Lee)
- Computational Vision and Geometry Lab (Stanford - Savarese)
Videos:
- BMVC 2012 - Detection and Tracking of Occluded People
- Gaussian Processes for Monocular 3D People Tracking
- CVPR 2010 - Visual Tracking Decomposition
- NIPS 2009 - Understanding Visual Scenes
Datasets:
- Princeton RGBD Tracking Benchmark
- 50 Recent Challenging Movie Sequences for Visual Tracking (2013)
- YACVID: A list of frequently used computer vision datasets
- PETS Series: 2000~2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013, 2014
- ViSOR 2008
Conferences:
- AVSS (Advanced Video and Signal-based Surveillance)
- PETS (Performance Evaluation of Tracking and Surveillance)
- ICCV (International Conference on Computer Vision)
- ECCV (European Conference on Computer Vision)