BIROn - Birkbeck Institutional Research Online

    Robust deep identification using ECG and multimodal biometrics for Industrial Internet of Things

    Al Alkeem, E. and Yeun, C.Y. and Yun, J. and Yoo, Paul D. and Chae, M. and Rahman, A. and Asyhari, A.T. (2021) Robust deep identification using ECG and multimodal biometrics for Industrial Internet of Things. Ad Hoc Networks 121 (102581), ISSN 1570-8705.

    Alkeem et al_v3.pdf - Author's Accepted Manuscript (2MB)
    Available under License Creative Commons Attribution Non-commercial No Derivatives.

    Abstract

    The use of electrocardiogram (ECG) data for personal identification in the Industrial Internet of Things can achieve near-perfect accuracy under ideal conditions. However, real-life ECG data are often exposed to various types of noise and interference. A more reliable identification method can be achieved by employing additional features from other biometric sources. This work therefore proposes a novel, robust and reliable identification technique grounded in multimodal biometrics, which uses deep learning to combine fingerprint, ECG and facial image data and is particularly useful for identification and gender classification. The multimodal approach allows the model to handle a range of input domains without requiring independent training on each modality, while inter-domain correlation can improve the model's generalization capability on these tasks. In multitask learning, the loss from one task helps to regularize the others, leading to better overall performance. The proposed approach merges the multimodal embeddings using feature-level and score-level fusion. To the best of our knowledge, the key concepts presented herein constitute pioneering work combining multimodality, multitask learning and different fusion methods. The proposed model achieves better generalization on the benchmark dataset used, with feature-level fusion outperforming the other fusion methods. The model is further validated on noisy and incomplete data with missing modalities, and analyses of the experimental results are provided.
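
    As a rough, illustrative sketch of the feature-level fusion and multitask learning described above (not the authors' actual architecture), the PyTorch snippet below concatenates per-modality embeddings and trains identification and gender-classification heads on the shared fused representation. All input dimensions, encoder depths, the subject count and the MultimodalMultitaskNet class itself are placeholder assumptions.

    # Minimal sketch (placeholder sizes, not the paper's configuration):
    # feature-level fusion of ECG, fingerprint and face embeddings, with
    # two task heads sharing the fused representation.
    import torch
    import torch.nn as nn

    class MultimodalMultitaskNet(nn.Module):
        def __init__(self, ecg_dim=187, fp_dim=512, face_dim=512,
                     embed_dim=128, num_subjects=90):
            super().__init__()
            # One small encoder per modality (assumed MLPs here; the paper
            # relies on deep networks per modality).
            self.ecg_enc = nn.Sequential(nn.Linear(ecg_dim, 256), nn.ReLU(),
                                         nn.Linear(256, embed_dim))
            self.fp_enc = nn.Sequential(nn.Linear(fp_dim, 256), nn.ReLU(),
                                        nn.Linear(256, embed_dim))
            self.face_enc = nn.Sequential(nn.Linear(face_dim, 256), nn.ReLU(),
                                          nn.Linear(256, embed_dim))
            # Feature-level fusion: concatenate the per-modality embeddings.
            fused_dim = 3 * embed_dim
            self.id_head = nn.Linear(fused_dim, num_subjects)  # identification
            self.gender_head = nn.Linear(fused_dim, 2)         # gender

        def forward(self, ecg, fp, face):
            fused = torch.cat([self.ecg_enc(ecg), self.fp_enc(fp),
                               self.face_enc(face)], dim=1)
            return self.id_head(fused), self.gender_head(fused)

    # Multitask loss: the two cross-entropy terms share the fused features,
    # so each task acts as a regularizer for the other.
    model = MultimodalMultitaskNet()
    ecg, fp, face = torch.randn(8, 187), torch.randn(8, 512), torch.randn(8, 512)
    id_logits, gender_logits = model(ecg, fp, face)
    id_labels = torch.randint(0, 90, (8,))
    gender_labels = torch.randint(0, 2, (8,))
    loss = (nn.functional.cross_entropy(id_logits, id_labels)
            + nn.functional.cross_entropy(gender_logits, gender_labels))
    loss.backward()

    Score-level fusion would instead combine per-modality classifier outputs (e.g. by averaging their probabilities); the abstract reports that feature-level fusion performed best.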

    Metadata

    Item Type: Article
    Keyword(s) / Subject(s): Personal identification, multimodal biometrics, deep learning, gender classification, electrocardiogram, fingerprint, face recognition, feature-level fusion
    School: Birkbeck Faculties and Schools > Faculty of Science > School of Computing and Mathematical Sciences
    Depositing User: Paul Yoo
    Date Deposited: 06 Oct 2021 15:41
    Last Modified: 09 Aug 2023 12:51
    URI: https://eprints.bbk.ac.uk/id/eprint/45428

    Statistics

    Downloads (6 month trend): 216
    Hits (6 month trend): 209

    Additional statistics are available via IRStats2.
