This paper addresses the problem of object tracking in video sequences. The use of a structural similarity measure for tracking is proposed. The measure reflects the distance between two images by comparing their structural and spatial characteristics and has been shown to be robust to illumination and contrast changes. As a result, the tracking process remains robust under changes in the environment, whereas the previously used Bhattacharyya distance is not robust to such changes. Additionally, a tracker based on the Bhattacharyya distance requires histograms to be calculated in order to form the likelihood function of the measurements; with the new measure no histograms are needed. A particle filter (PF) is implemented in which this measure is used to compute the distance between the reference region and the current frame. The algorithm's performance has been tested and evaluated on real-world video sequences, and it has been shown to outperform methods based on colour and edge histograms.
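As a rough illustration of the idea in the abstract, the sketch below computes the standard structural similarity (SSIM) index between two grey-level patches directly from pixel statistics (no histograms), and maps the resulting distance to a particle weight. This is a minimal sketch under stated assumptions: the exponential likelihood and its scale parameter `lam` are illustrative choices, not necessarily the paper's exact weighting, and `likelihood` is a hypothetical helper name.

```python
import numpy as np

def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Structural similarity between two equally sized grey-level patches.

    Combines luminance, contrast and structure comparisons; lies in [-1, 1],
    equal to 1 for identical patches.  c1 and c2 are the usual small
    stabilising constants for 8-bit images.
    """
    x = np.asarray(x, dtype=np.float64).ravel()
    y = np.asarray(y, dtype=np.float64).ravel()
    mx, my = x.mean(), y.mean()          # mean intensities (luminance)
    vx, vy = x.var(), y.var()            # variances (contrast)
    cov = ((x - mx) * (y - my)).mean()   # covariance (structure)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def likelihood(ref_patch, candidate_patch, lam=20.0):
    """Hypothetical particle weight: the distance d = 1 - SSIM between the
    reference patch and a candidate region is pushed through an exponential,
    so more similar candidates receive larger weights in the PF update."""
    d = 1.0 - ssim(ref_patch, candidate_patch)
    return np.exp(-lam * d)
```

Because SSIM factors out mean intensity and contrast, a brightness-shifted copy of the reference scores far higher than an unrelated patch, which is the robustness property the abstract relies on.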
Sponsorship: The authors are grateful for the financial support of the UK MOD Data and Information Fusion Defence Technology Centre, through projects 2.1 'Image and video sensor fusion' and 2.2 'Communication optimisation for distributed sensor systems'.
This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Bristol's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to firstname.lastname@example.org.
By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
Name of Conference: 9th International Conference on Information Fusion
Venue of Conference: Florence, Italy
Keywords: object tracking, video sequences, particle filtering, similarity measure