An Elderly Fall Detection Method Based on Federated Learning and Extreme Learning Machine (Fed-ELM)

Bibliographic Details
Main Authors: Zhigang Yu, Jiahui Liu, Mingchuan Yang, Yanmin Cheng, Jie Hu, Xinchi Li
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9984667/
Description
Summary: The lack of fall data for the elderly is a challenging problem in the fall detection community. To date, most studies have trained and tested fall detection algorithms on falls and activities of daily living simulated by young people. However, movement patterns differ between young and elderly individuals because of bone aging, which degrades algorithm performance in the elderly population. To address this issue, this paper proposes a fall detection algorithm combining Federated Learning and an Extreme Learning Machine (Fed-ELM). First, an online extreme learning machine uses a small amount of misclassified user data to update its parameters, improving performance for individual users. Then, Federated Learning shares model information among different users without compromising user privacy, which improves the generalizability of the fall detection algorithm. Experiments analyze the performance of the proposed algorithm across different age groups. For young people, the accuracy, sensitivity, and specificity reach 96.96%, 94.50%, and 99.29%, respectively, and the accuracy for each individual exceeds 94%. For elderly individuals, the accuracy, sensitivity, and specificity reach 99.07%, 96.00%, and 98.33%, respectively, and the accuracy for each individual exceeds 96%.
ISSN:2169-3536
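
For readers who want a concrete picture of the pipeline the summary describes, below is a minimal sketch in Python/NumPy. It is not the authors' implementation: the tanh activation, the network sizes, the OS-ELM-style recursive update on new (e.g. misclassified) samples, and the FedAvg-style averaging of output weights are all illustrative assumptions.

```python
# Minimal sketch of the Fed-ELM idea: an online ELM per user, plus a
# federated step that averages only the trainable output weights.
import numpy as np

class OnlineELM:
    """Single-hidden-layer extreme learning machine with an online update."""

    def __init__(self, W, b, n_outputs, reg=1e-3):
        # The random input weights W and biases b are fixed (never trained)
        # and shared by every client, so averaged output weights refer to
        # the same hidden features on each device (an assumption here).
        self.W, self.b = W, b
        self.beta = np.zeros((W.shape[1], n_outputs))  # trainable output weights
        self.reg = reg
        self.P = np.eye(W.shape[1]) / reg              # inverse correlation matrix

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit_initial(self, X, T):
        # Closed-form regularized least squares on the initial batch.
        H = self._hidden(X)
        self.P = np.linalg.inv(H.T @ H + self.reg * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ T

    def update(self, X, T):
        # OS-ELM-style recursive least-squares update on a small chunk of
        # new data, e.g. samples the deployed model misclassified.
        H = self._hidden(X)
        G = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P -= self.P @ H.T @ G @ H @ self.P
        self.beta += self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta

def federated_round(clients):
    # FedAvg-style aggregation: only output weights are exchanged and
    # averaged, so raw accelerometer data stays on each user's device.
    beta = np.mean([c.beta for c in clients], axis=0)
    for c in clients:
        c.beta = beta.copy()

# Toy usage: two clients with shared random features and private data.
rng = np.random.default_rng(0)
W, b = rng.standard_normal((6, 40)), rng.standard_normal(40)
clients = [OnlineELM(W, b, n_outputs=2) for _ in range(2)]
for c in clients:
    X = rng.standard_normal((100, 6))        # private sensor feature windows
    T = np.eye(2)[rng.integers(0, 2, 100)]   # one-hot fall / ADL labels
    c.fit_initial(X, T)
federated_round(clients)
clients[0].update(X[:5], T[:5])              # personalize on hard samples
```

Averaging only the output weights is coherent here because every client keeps the same fixed random hidden layer; under that assumption, users exchange model parameters rather than raw sensor data, matching the privacy claim in the summary.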