TY - GEN
T1 - Machine Assisted Video Tagging of Elderly Activities in K-Log Centre
AU - Lee, Chanwoong
AU - Choi, Hyorim
AU - Muralidharan, Shapna
AU - Ko, Heedong
AU - Yoo, Byounghyun
AU - Kim, Gerard Jounghyun
N1 - Funding Information:
This work was supported by the Korea Institute of Science and Technology (KIST) under the Institutional Program (Grant No. 2E30270).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/9/14
Y1 - 2020/9/14
N2 - In a rapidly aging society such as South Korea, the growing number of Alzheimer's Disease (AD) patients is a significant public health problem, and specialized healthcare centers are in high demand. Healthcare providers generally rely on caregivers (CGs) to monitor and assist elderly persons with AD in their daily activities. The K-Log Centre is a healthcare provider located in Korea that helps AD patients meet their daily needs with the assistance of CGs in the center. The CGs in the K-Log Centre must attend to the patients' unique demands and everyday essentials for long-term care. Moreover, the CGs also describe and record day-to-day activities in an Activities of Daily Living (ADL) log, which documents various events in detail. These logging duties can overburden the CGs, degrading the quality of elderly care and forcing the center to hire additional CGs to maintain that quality, creating a negative feedback cycle. In this paper, we analyze this pressing issue in the K-Log Centre and propose a method to facilitate machine-assisted human tagging of videos for logging elderly activities using Human Activity Recognition (HAR). To enable this scenario, we use a You Only Look Once (YOLO-v3)-based deep learning method for object detection and apply it to HAR, creating multi-modal, machine-assisted human tagging of videos. The proposed algorithm performs HAR with a precision of 98.4%. After designing the HAR model, we evaluated it on a live video feed from the K-Log Centre. The model achieved an accuracy of 81.4% on live data, reducing the logging workload of the CGs.
AB - In a rapidly aging society such as South Korea, the growing number of Alzheimer's Disease (AD) patients is a significant public health problem, and specialized healthcare centers are in high demand. Healthcare providers generally rely on caregivers (CGs) to monitor and assist elderly persons with AD in their daily activities. The K-Log Centre is a healthcare provider located in Korea that helps AD patients meet their daily needs with the assistance of CGs in the center. The CGs in the K-Log Centre must attend to the patients' unique demands and everyday essentials for long-term care. Moreover, the CGs also describe and record day-to-day activities in an Activities of Daily Living (ADL) log, which documents various events in detail. These logging duties can overburden the CGs, degrading the quality of elderly care and forcing the center to hire additional CGs to maintain that quality, creating a negative feedback cycle. In this paper, we analyze this pressing issue in the K-Log Centre and propose a method to facilitate machine-assisted human tagging of videos for logging elderly activities using Human Activity Recognition (HAR). To enable this scenario, we use a You Only Look Once (YOLO-v3)-based deep learning method for object detection and apply it to HAR, creating multi-modal, machine-assisted human tagging of videos. The proposed algorithm performs HAR with a precision of 98.4%. After designing the HAR model, we evaluated it on a live video feed from the K-Log Centre. The model achieved an accuracy of 81.4% on live data, reducing the logging workload of the CGs.
KW - Activities of Daily Living (ADL)
KW - Human Activity Recognition (HAR)
KW - K-Log Centre
KW - Machine-assisted human tagging
KW - Multi-modal
KW - You Only Look Once (YOLO-v3)
UR - http://www.scopus.com/inward/record.url?scp=85096126295&partnerID=8YFLogxK
U2 - 10.1109/MFI49285.2020.9235269
DO - 10.1109/MFI49285.2020.9235269
M3 - Conference contribution
AN - SCOPUS:85096126295
T3 - IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems
SP - 237
EP - 242
BT - 2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, MFI 2020
Y2 - 14 September 2020 through 16 September 2020
ER -