Worker Activity Classification Using Multimodal Data Fusion from Wearable Sensors
Accurate and automated classification of workers’ activities is critical for safety and performance monitoring, especially in highly hazardous working conditions. Previous studies have explored automated worker activity classification using wearable sensors with a single type of data (e.g., acceleration) in controlled lab environments. To further improve the accuracy of worker activity classification with wearable sensors, we collected multimodal data from workers conducting highway maintenance activities, such as crack sealing and pothole patching, at an Indiana Department of Transportation (INDOT) facility. Several activities were identified through field videos, including crack sealing, transferring material, and walking. Two datasets were developed from the collected data: one containing acceleration data only, and the other fusing acceleration data with multimodal data including heart rate, electrodermal activity (EDA), and skin temperature. K-nearest neighbors (KNN) models were built to classify workers’ activities for each dataset. Results showed that the accuracies for detecting crack sealing, transferring material, and walking without data fusion were 1.0, 1.0, and 0.71, respectively. With data fusion, the accuracies for the three activities became 1.0, 0.93, and 0.93, and the overall classification accuracy increased from 0.9069 to 0.9535.
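The comparison described in the abstract — a KNN classifier trained on acceleration features alone versus on acceleration fused with physiological features — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: the feature dimensions, neighbor count, and random data are all assumptions for demonstration.

```python
# Hypothetical sketch of the fusion comparison: KNN accuracy on
# acceleration-only features vs. acceleration fused with physiological
# signals (heart rate, EDA, skin temperature). Data here is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
# 0 = crack sealing, 1 = transferring material, 2 = walking
labels = rng.integers(0, 3, n)
# 3-axis acceleration features (synthetic, class-dependent means)
accel = rng.normal(loc=labels[:, None], scale=1.0, size=(n, 3))
# Physiological features: heart rate, EDA, skin temperature (synthetic)
physio = rng.normal(loc=labels[:, None] * 0.5, scale=1.0, size=(n, 3))

def knn_accuracy(X, y, k=5):
    """Train a standardized KNN model and return held-out accuracy."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
    model.fit(X_tr, y_tr)
    return model.score(X_te, y_te)

acc_only = knn_accuracy(accel, labels)
# Feature-level fusion: concatenate acceleration and physiological features
acc_fused = knn_accuracy(np.hstack([accel, physio]), labels)
print(f"acceleration only: {acc_only:.2f}, fused: {acc_fused:.2f}")
```

The fusion step here is simple feature concatenation before classification; on the synthetic data the exact accuracies will differ from the paper's reported values.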
Lecture Notes in Civil Engineering
Skatulla, Sebastian (editor) / Beushausen, Hans (editor) / Tian, Chi (author) / Chen, Yunfeng (author) / Feng, Yiheng (author) / Zhang, Jiansong (author)
International Conference on Computing in Civil and Building Engineering ; 2022 ; Cape Town, South Africa
Advances in Information Technology in Civil and Building Engineering ; Chapter: 12 ; 153-160
2023-09-30
8 pages
Article/Chapter (Book)
Electronic Resource
English
Similar titles:
Multimodal Data Fusion for Big Events | British Library Online Contents | 2016