Abstract
Unconstrained human activity recognition with a radar network is considered. A hybrid classifier combining convolutional neural networks (CNNs) and recurrent neural networks (RNNs) for spatial-temporal pattern extraction is proposed. The 2-D CNNs (2D-CNNs) are first applied to the radar data to perform spatial feature extraction on the input spectrograms. Subsequently, gated recurrent units (GRUs) with bidirectional implementations are used to capture the long- and short-term temporal dependencies in the feature maps generated by the 2D-CNNs. Three NN-based data fusion methods were explored and compared to utilize the rich information provided by the different radar nodes. The performance of the proposed classifier was validated rigorously using the K-fold cross-validation (CV) and leave-one-person-out (L1PO) methods. Unlike competing research, a dataset of continuous human activities, with seamless inter-activity transitions that can occur at any time and unconstrained moving trajectories of the participants, has been collected and used for evaluation purposes. A classification accuracy of about 90.8% is achieved for nine-class human activity recognition (HAR) by the proposed classifier with the halfway fusion method.
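The pipeline described in the abstract (per-node 2D-CNN spatial features, halfway fusion by concatenating the node feature vectors, a bidirectional GRU over time, and a nine-class softmax) can be sketched with NumPy. All sizes, the number of radar nodes, the shared single kernel, and the single-layer GRU are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Illustrative sketch only: layer sizes, node count, and random weights are
# assumptions, not the configuration used in the paper.
rng = np.random.default_rng(0)

def conv2d_valid(x, k):
    """Naive 'valid' 2-D cross-correlation: the spatial feature extraction step."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def gru_step(x, h, W, U):
    """One GRU update; W and U stack the update (z), reset (r), and candidate gates."""
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    Wz, Wr, Wh = W
    Uz, Ur, Uh = U
    z = sig(Wz @ x + Uz @ h)                 # update gate
    r = sig(Wr @ x + Ur @ h)                 # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_cand

# Toy data: 3 radar nodes, 5 time frames, 8x8 spectrogram patches per frame.
n_nodes, T, patch = 3, 5, 8
spectrograms = rng.standard_normal((n_nodes, T, patch, patch))

kernel = rng.standard_normal((3, 3)) * 0.1   # shared 2D-CNN kernel (assumption)
feat_dim = (patch - 2) ** 2                  # 6*6 = 36 features per node
fused_dim = n_nodes * feat_dim               # halfway fusion: concatenate nodes
hidden = 16

W = rng.standard_normal((3, hidden, fused_dim)) * 0.05
U = rng.standard_normal((3, hidden, hidden)) * 0.05

# Per frame: CNN features per node, concatenated across nodes (halfway fusion).
frames = []
for t in range(T):
    feats = [conv2d_valid(spectrograms[n, t], kernel).ravel() for n in range(n_nodes)]
    frames.append(np.concatenate(feats))

# Bidirectional GRU: one pass forward in time, one backward.
h_fwd = np.zeros(hidden)
for f in frames:
    h_fwd = gru_step(f, h_fwd, W, U)
h_bwd = np.zeros(hidden)
for f in reversed(frames):
    h_bwd = gru_step(f, h_bwd, W, U)

# Linear classifier over the concatenated final states, 9 activity classes.
Wout = rng.standard_normal((9, 2 * hidden)) * 0.1
logits = Wout @ np.concatenate([h_fwd, h_bwd])
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

The halfway fusion shown here merges the nodes at the feature level, before the recurrent stage; the two alternatives compared in the paper (fusing raw inputs earlier or class scores later) would move the `np.concatenate` to a different point in the pipeline.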
Original language | English |
---|---|
Article number | 5115215 |
Pages (from-to) | 1-15 |
Number of pages | 15 |
Journal | IEEE Transactions on Geoscience and Remote Sensing |
Volume | 60 |
DOIs | |
Publication status | Published - 2022 |
Bibliographical note
Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project, https://www.openaccess.nl/en/you-share-we-take-care. Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
Keywords
- Deep learning (DL)
- distributed radar
- human activity recognition (HAR)
- micro-Doppler signatures
- radar sensor network (RSN)