Mutual physical state-aware object handover in full-contact collaborative human-robot construction work
Abstract Full-contact physical interactions inherent in typical construction workflows, such as material handovers, have yet to be adequately resolved or adopted in Human–Robot Collaboration (HRC) due to safety concerns. Replicating protective human behavior norms in robots can enable safe robot-to-human material handovers. To build such a human-adaptive model, we first present a comprehensive receiver grip state indicator that encompasses both gripping strength and grip gestures, measured with whole-hand tactile sensors. Second, we describe a Learning from Demonstration (LfD) model that replicates human grip state-reactive behavior norms for robots. The proposed method outperforms other robot-to-human object handover methods using only one-shot demonstrations of natural handovers. Additionally, the LfD-based programming interface is accessible to construction workers without programming expertise and can continuously collect data for a future large-scale LfD model covering a wide range of handover materials, users, and gestures to further enhance worker safety during close-proximity material handovers.
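The abstract describes a receiver grip state indicator that combines grip strength and grip gesture from whole-hand tactile sensing, with an LfD policy mapping that state to the robot's release behavior. The following minimal Python sketch is purely illustrative and not the paper's implementation: the sensor layout, the state summary, the threshold value, and names such as grip_state, should_release, and GRIP_FORCE_THRESHOLD are all assumptions introduced here to show how such a grip-state-reactive release decision could be structured.

```python
# Hypothetical sketch of a grip-state-reactive handover release step.
# Sensor layout, state summary, policy, and threshold are illustrative assumptions,
# not the paper's actual model or code.
import numpy as np

GRIP_FORCE_THRESHOLD = 15.0  # N, assumed minimum total receiver grip force before release


def grip_state(taxel_forces: np.ndarray) -> np.ndarray:
    """Summarize whole-hand tactile readings into a grip-state vector:
    total grip strength plus a normalized per-taxel force pattern (gesture proxy)."""
    total_force = taxel_forces.sum()
    pattern = taxel_forces / (total_force + 1e-6)  # contact distribution over the hand
    return np.concatenate(([total_force], pattern))


def should_release(state: np.ndarray, learned_policy=None) -> bool:
    """Decide whether the robot gripper may release the object.
    A learned (LfD) policy would replace the simple strength-only fallback rule."""
    if learned_policy is not None:
        return bool(learned_policy.predict(state[None, :])[0])
    return state[0] >= GRIP_FORCE_THRESHOLD


# Example control-loop step with synthetic tactile data (e.g., 18 taxel regions):
taxels = np.random.uniform(0.0, 2.0, size=18)
if should_release(grip_state(taxels)):
    print("Receiver grip judged secure: command gripper to open.")
```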
Highlights An accurate and fast human grip state-aware robot handover controller was built. Tactile haptic gloves capture the human receiver's grip strength and grip gesture simultaneously. The robot can adapt to the human's grip state within 1 ms using the enriched sensory information from the tactile gloves. Robots programmed by demonstration can also adapt to unseen grip settings. Secondary collisions from dropped objects are mitigated by extending the contact energy limitation approach of ISO/TS 15066:2016.
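The last highlight extends the contact-energy limitation approach of ISO/TS 15066 to secondary collisions from dropped handover objects. As a hedged illustration only (the paper's exact extension and the standard's body-region limit values are not reproduced here), the sketch below compares the kinetic energy a dropped object would carry at impact, E = m·g·h, against an assumed permissible energy limit E_max supplied by the caller.

```python
# Illustrative energy check for a secondary collision caused by a dropped handover object.
# The permissible limit e_max_j is a placeholder argument; actual limits would come from
# body-region-specific values in the standard or the paper's extension of it.
G = 9.81  # gravitational acceleration, m/s^2


def impact_energy(mass_kg: float, drop_height_m: float) -> float:
    """Kinetic energy (J) of the object at impact after a free fall: E = m * g * h."""
    return mass_kg * G * drop_height_m


def drop_is_tolerable(mass_kg: float, drop_height_m: float, e_max_j: float) -> bool:
    """True if the energy transferred by the dropped object stays below the assumed limit."""
    return impact_energy(mass_kg, drop_height_m) <= e_max_j


# Example: a 1.5 kg component released 0.4 m above the receiver's hand,
# checked against an assumed placeholder limit of 0.5 J.
print(drop_is_tolerable(1.5, 0.4, e_max_j=0.5))  # False -> lower the handover height
```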
Yu, Hongrui (author) / Kamat, Vineet R. (author) / Menassa, Carol C. (author) / McGee, Wes (author) / Guo, Yijie (author) / Lee, Honglak (author)
2023-03-08
Article (Journal)
Electronic Resource
English