Multi-Scale Feature Density Approximation for Object Representation and Tracking

C.Y. Liu and N.H.C. Yung (PRC)


Keywords: Object tracking, feature density estimation, Gaussian Mixture Model, multi-scale features.


This paper proposes a scale-consistent feature density estimation method based on the Gaussian Mixture Model (GMM) for robustly tracking video objects under scale variation and partial occlusion. Scale consistency is achieved in both feature extraction and feature density estimation. First, an image is partitioned into patches in scale space, such that each patch matches the scale of the local image pattern and is just large enough to contain the most basic image pattern; scale-consistent image features are then extracted from these patches. Second, to keep the feature density estimate invariant to changes in the image partition caused by the object's changing scale, an observational credible probability is defined for each patch and, according to the patch size, controls the contribution of that patch's feature to the density estimate. Third, the likelihood function defined by both the extracted features and their observational credible probabilities is maximized in the GMM parameter estimation. Moreover, partial occlusion of patches that carry repeated features does not affect the object's global appearance model. Experimental results show that the method effectively tracks objects undergoing scale variation and partial occlusion in image sequences.
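The weighted likelihood maximization described above can be sketched as an EM procedure in which every sufficient statistic is scaled by a per-sample credibility weight. The sketch below is a minimal illustration, not the authors' implementation: the 1-D feature setting, the function name `weighted_gmm_em`, the percentile-based initialization, and the uniform example weights are all assumptions made for brevity.

```python
import numpy as np

def weighted_gmm_em(X, w, K, n_iter=50):
    """Fit a K-component 1-D GMM to features X, where each sample's
    contribution is scaled by its credibility weight w (a stand-in for
    the paper's observational credible probability)."""
    X = np.asarray(X, dtype=float)
    w = np.asarray(w, dtype=float)
    # Deterministic spread initialization of the component means (an
    # illustrative choice, not taken from the paper).
    mu = np.percentile(X, np.linspace(10, 90, K))
    var = np.full(K, np.var(X) + 1e-6)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities under the current parameters.
        dens = np.exp(-0.5 * (X[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: every statistic is scaled by w, so low-credibility
        # patches influence the density estimate less.
        wr = w[:, None] * resp
        Nk = wr.sum(axis=0)
        mu = (wr * X[:, None]).sum(axis=0) / Nk
        var = (wr * (X[:, None] - mu) ** 2).sum(axis=0) / Nk + 1e-6
        pi = Nk / w.sum()
    return pi, mu, var
```

With uniform weights this reduces to ordinary EM for a GMM; down-weighting a sample shrinks its effect on the mixture weights, means, and variances alike.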
