Human gait recognition with matrix representation
Xu, D. and Yan, S. and Tao, D. and Zhang, L. and Li, X. and Zhang, H.J. (2006) Human gait recognition with matrix representation. IEEE Transactions on Circuits and Systems for Video Technology 16 (7), pp. 896-903. ISSN 1051-8215.
Text: Binder1.pdf (345kB)
Abstract
Human gait is an important biometric feature. It can be perceived from a great distance and has recently attracted increasing attention in video-surveillance applications such as closed-circuit television. In this paper, we explore gait recognition based on a matrix representation. First, binary silhouettes over one gait cycle are averaged, so that each gait video sequence, containing a number of gait cycles, is represented by a series of gray-level averaged images. Then, a matrix-based unsupervised algorithm, coupled subspace analysis (CSA), is employed as a preprocessing step to remove noise and retain the most representative information. Finally, a supervised algorithm, discriminant analysis with tensor representation (DATER), is applied to further improve classification performance. This matrix-based scheme achieves much better gait recognition performance than state-of-the-art algorithms on the standard USF HumanID gait database.
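The first step of the pipeline described above, averaging binary silhouettes over one gait cycle, is simple to express in code. The sketch below is not the authors' implementation: it is a minimal NumPy illustration with hypothetical function and variable names, assuming that silhouette extraction and cycle segmentation have already been performed. The CSA and DATER stages from the paper are not reproduced here.

```python
# Minimal sketch (not the authors' code) of the silhouette-averaging step:
# each gait cycle of binary silhouettes is collapsed into one gray-level image.
import numpy as np

def averaged_silhouette(silhouettes: np.ndarray) -> np.ndarray:
    """Average binary silhouettes of shape (frames, height, width) over one cycle.

    Returns a gray-level image in [0, 1]; each pixel value is the fraction of
    frames in which that pixel belongs to the walker's silhouette.
    """
    if silhouettes.ndim != 3:
        raise ValueError("expected an array of shape (frames, height, width)")
    return silhouettes.astype(np.float64).mean(axis=0)

def sequence_to_matrices(cycles: list[np.ndarray]) -> np.ndarray:
    """Represent a gait video (a list of cycles) as a stack of averaged images,
    i.e. the series of gray-level matrices on which CSA and DATER would operate."""
    return np.stack([averaged_silhouette(c) for c in cycles], axis=0)

# Example with synthetic data: 2 cycles of 30 binary frames, 128x88 pixels.
rng = np.random.default_rng(0)
cycles = [rng.integers(0, 2, size=(30, 128, 88)) for _ in range(2)]
matrices = sequence_to_matrices(cycles)   # shape (2, 128, 88), values in [0, 1]
```

Each averaged image is a matrix rather than a vector; keeping this 2-D structure is what allows the subsequent matrix/tensor-based steps (CSA and DATER) to work on the images directly, without vectorization.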
Metadata
| Item Type: | Article |
| --- | --- |
| Additional Information: | This is an exact copy of a paper published in IEEE Transactions on Circuits and Systems for Video Technology (ISSN 1051-8215). This material is posted here with permission of the IEEE. Copyright © 2006 IEEE. |
| Keyword(s) / Subject(s): | coupled subspace analysis (CSA), dimensionality reduction, discriminant analysis with tensor representation (DATER), human gait recognition, object representation |
| School: | Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences |
| Depositing User: | Sandra Plummer |
| Date Deposited: | 30 Jan 2007 |
| Last Modified: | 09 Aug 2023 12:29 |
| URI: | https://eprints.bbk.ac.uk/id/eprint/451 |