
dc.contributor.author: Muaaz, Muhammad
dc.contributor.author: Waqar, Sahil
dc.contributor.author: Pätzold, Matthias Uwe
dc.date.accessioned: 2024-04-26T12:04:59Z
dc.date.available: 2024-04-26T12:04:59Z
dc.date.created: 2023-06-26T16:38:15Z
dc.date.issued: 2023
dc.identifier.citation: Muaaz, M., Waqar, S. & Pätzold, M. U. (2023). Orientation-Independent Human Activity Recognition Using Complementary Radio Frequency Sensing. Sensors, 23 (13), Article 5810. [en_US]
dc.identifier.issn: 1424-8220
dc.identifier.uri: https://hdl.handle.net/11250/3128230
dc.description.abstract: Radio frequency (RF) sensing offers an unobtrusive, user-friendly, and privacy-preserving method for detecting accidental falls and recognizing human activities. Contemporary RF-based human activity recognition (HAR) systems generally employ a single monostatic radar to recognize human activities. However, a single monostatic radar cannot detect the motion of a target, e.g., a moving person, orthogonal to the boresight axis of the radar. Owing to this inherent physical limitation, a single monostatic radar fails to efficiently recognize orientation-independent human activities. In this work, we present a complementary RF sensing approach that overcomes the limitation of existing single monostatic radar-based HAR systems to robustly recognize orientation-independent human activities and falls. Our approach used a distributed millimeter-wave (mmWave) multiple-input multiple-output (MIMO) radar system that was set up as two separate monostatic radars placed orthogonal to each other in an indoor environment. These two radars illuminated the moving person from two different aspect angles and consequently produced two time-variant micro-Doppler signatures. We first computed the mean Doppler shifts (MDSs) from the micro-Doppler signatures and then extracted statistical and time- and frequency-domain features. We adopted feature-level fusion techniques to fuse the extracted features and a support vector machine (SVM) to classify orientation-independent human activities. To evaluate our approach, we used an orientation-independent human activity dataset, which was collected from six volunteers. The dataset consisted of more than 1350 activity trials of five different activities that were performed in different orientations. The proposed complementary RF sensing approach achieved an overall classification accuracy ranging from 98.31 to 98.54%. It overcame the inherent limitations of a conventional single monostatic radar-based HAR system and outperformed it by 6%. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: MDPI [en_US]
dc.rights: Navngivelse 4.0 Internasjonal (Attribution 4.0 International)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/deed.no
dc.title: Orientation-Independent Human Activity Recognition Using Complementary Radio Frequency Sensing [en_US]
dc.title.alternative: Orientation-Independent Human Activity Recognition Using Complementary Radio Frequency Sensing [en_US]
dc.type: Peer reviewed [en_US]
dc.type: Journal article [en_US]
dc.description.version: publishedVersion [en_US]
dc.rights.holder: © 2023 The Author(s) [en_US]
dc.subject.nsi: VDP::Teknologi: 500::Informasjons- og kommunikasjonsteknologi: 550 [en_US]
dc.source.volume: 23 [en_US]
dc.source.journal: Sensors [en_US]
dc.source.issue: 13 [en_US]
dc.identifier.doi: https://doi.org/10.3390/s23135810
dc.identifier.cristin: 2158223
dc.relation.project: Universitetet i Agder: Wisenet [en_US]
dc.relation.project: Norges forskningsråd: 300638 [en_US]
dc.source.articlenumber: 5810 [en_US]
cristin.qualitycode: 1
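
The abstract above describes a processing pipeline: features are extracted from the mean Doppler shifts of two orthogonally placed monostatic radars, fused at the feature level, and classified with a support vector machine. The following is a minimal Python sketch of only the fusion and classification stages, using scikit-learn; the synthetic feature arrays, shapes, and variable names are hypothetical placeholders, not the authors' implementation or data.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(seed=0)

    # Hypothetical per-trial feature vectors; in the paper these are statistical,
    # time-, and frequency-domain features computed from the mean Doppler shift
    # (MDS) of each of the two orthogonally placed monostatic radars.
    n_trials, n_features = 1350, 12
    features_radar_1 = rng.normal(size=(n_trials, n_features))
    features_radar_2 = rng.normal(size=(n_trials, n_features))
    labels = rng.integers(0, 5, size=n_trials)  # five activity classes

    # Feature-level fusion: concatenate the two radars' feature vectors per trial.
    fused_features = np.hstack([features_radar_1, features_radar_2])

    X_train, X_test, y_train, y_test = train_test_split(
        fused_features, labels, test_size=0.2, random_state=0)

    # SVM classifier trained on the fused feature vectors.
    classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    classifier.fit(X_train, y_train)
    print("Hold-out accuracy:", classifier.score(X_test, y_test))

With random placeholder features the printed accuracy is meaningless; the sketch only illustrates how feature-level fusion reduces to concatenating the per-radar feature vectors before a single classifier is trained on the combined representation.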



Attribution 4.0 International (Navngivelse 4.0 Internasjonal)
Except where otherwise noted, this item's license is described as Attribution 4.0 International.