Towards recognising collaborative activities using multiple on-body sensors
This paper describes the initial stages of new work on recognising collaborative activities involving two or more people. In the experiment described, a physically demanding construction task is completed by a team of four volunteers. The task, to build a large video wall, requires communication, coordination, and physical collaboration between group members. Minimal outside assistance is provided to better reflect the ad-hoc and loosely structured nature of real-world construction tasks. On-body inertial measurement units (IMUs) record each subject's head and arm movements; a wearable eye-tracker records gaze and egocentric video; and audio is recorded from each person's head and dominant arm. A first look at the data reveals promising correlations, for example between the movement patterns of two people carrying a heavy object. It also reveals clues as to how complementary information from different sensor types, such as sound and vision, might further aid collaboration recognition.
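As a rough illustration of the kind of analysis the abstract hints at, correlating the IMU signals of two people carrying a heavy object, the sketch below computes a sliding-window Pearson correlation between two subjects' arm-acceleration magnitudes. The sampling rate, window length, signal layout, and synthetic data are assumptions for illustration only; they are not taken from the paper.

```python
# Illustrative sketch (not from the paper): sliding-window correlation between
# the arm-acceleration magnitudes of two subjects, as a crude indicator of
# physically coupled activity such as jointly carrying a heavy object.
# Sampling rate, window length, and signal layout are assumed values.
import numpy as np

FS = 50          # assumed IMU sampling rate in Hz
WINDOW_S = 2.0   # assumed window length in seconds

def accel_magnitude(acc_xyz: np.ndarray) -> np.ndarray:
    """Reduce an (N, 3) accelerometer stream to its per-sample magnitude."""
    return np.linalg.norm(acc_xyz, axis=1)

def windowed_correlation(sig_a: np.ndarray, sig_b: np.ndarray,
                         fs: int = FS, window_s: float = WINDOW_S) -> np.ndarray:
    """Pearson correlation between two equal-length signals over sliding windows."""
    win = int(fs * window_s)
    n = min(len(sig_a), len(sig_b))
    corrs = []
    for start in range(0, n - win + 1, win // 2):   # 50% window overlap
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        if a.std() == 0 or b.std() == 0:            # avoid division by zero
            corrs.append(0.0)
        else:
            corrs.append(float(np.corrcoef(a, b)[0, 1]))
    return np.array(corrs)

# Example with synthetic data standing in for two subjects' arm IMUs.
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / FS)
shared = np.sin(2 * np.pi * 1.5 * t)                # common carrying rhythm
subj1 = np.column_stack([shared + 0.1 * rng.standard_normal(t.size)] * 3)
subj2 = np.column_stack([shared + 0.1 * rng.standard_normal(t.size)] * 3)
corr = windowed_correlation(accel_magnitude(subj1), accel_magnitude(subj2))
print("mean windowed correlation:", corr.mean())
```

Windows where the correlation stays high would be candidate segments of physically coupled activity; a real pipeline would of course fuse this with the gaze, video, and audio channels mentioned in the abstract.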
Ward, JA (author) / Pirkl, G (author) / Hevesi, P (author) / Lukowicz, P (author)
12.09.2016
In: UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, pp. 221-224. ACM: New York, NY, USA (2016)
Paper
Electronic Resource
English
DDC: 690
Towards Recognising Individual Behaviours from Pervasive Mobile Datasets in Urban Spaces
DOAJ | 2019
Wiley | 2025
Recognising the structural characteristics of concrete
British Library Conference Proceedings | 1996
Recognising design quality in development control
British Library Online Contents | 1994
Recognising building patterns using matched filters and genetic search
Online Contents | 1998