Human-Aware Collaborative Robots in the Wild: Coping with Uncertainty in Activity Recognition
- PMID: 37050446
- PMCID: PMC10099038
- DOI: 10.3390/s23073388
Abstract
This study presents a novel approach to coping with the uncertainty of human behaviour during Human-Robot Collaboration (HRC) in dynamic and unstructured environments, such as agriculture, forestry, and construction. These challenging tasks, which often demand excessive time and labour and are hazardous for humans, offer ample room for improvement through collaboration with robots. However, integrating humans in the loop raises open challenges due to the uncertainty that comes with the ambiguous nature of human behaviour. Such uncertainty makes it difficult to represent high-level human behaviour from low-level sensory input data. The proposed Fuzzy State-Long Short-Term Memory (FS-LSTM) approach addresses this challenge by fuzzifying ambiguous sensory data and combining activity recognition with sequence modelling using state machines and the LSTM deep learning method. The evaluation compares a traditional LSTM with raw sensory inputs, a Fuzzy-LSTM with fuzzified inputs, and the proposed FS-LSTM. The results show that fuzzified inputs significantly improve accuracy compared to the traditional LSTM and that, while the fuzzy state machine approach yields results similar to the Fuzzy-LSTM, it offers the added benefits of ensuring feasible transitions between activities and improved computational efficiency.
Keywords: deep learning; finite state machine; fuzzy logic; human activity recognition and modelling; human-robot collaboration; long short-term memory.
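To make the two ingredients of FS-LSTM named in the abstract more concrete, the sketch below illustrates (1) fuzzifying a raw sensor reading into membership degrees and (2) restricting activity predictions to feasible state-machine transitions. It is a minimal illustration only: the membership breakpoints, activity labels, transition table, and function names are hypothetical placeholders, not the values or architecture used in the paper, and the LSTM itself is represented here only by a made-up probability vector.

```python
# Minimal sketch of fuzzified inputs plus a transition-feasibility mask.
# All numbers and labels below are hypothetical, for illustration only.

import numpy as np

ACTIVITIES = ["idle", "walking", "lifting", "carrying"]  # hypothetical label set

def triangular(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzify_speed(speed_mps):
    """Map a raw speed reading (m/s) to fuzzy degrees for slow/medium/fast."""
    return np.array([
        triangular(speed_mps, -0.5, 0.0, 0.8),   # slow
        triangular(speed_mps, 0.4, 1.0, 1.6),    # medium
        triangular(speed_mps, 1.2, 2.0, 3.0),    # fast
    ])

# Feasible activity transitions (rows: current activity, cols: next activity).
# E.g. jumping straight from "idle" to "carrying" is disallowed in this sketch.
FEASIBLE = np.array([
    [1, 1, 1, 0],   # from idle
    [1, 1, 1, 0],   # from walking
    [1, 1, 1, 1],   # from lifting
    [0, 1, 1, 1],   # from carrying
], dtype=float)

def constrained_prediction(class_probs, current_state):
    """Zero out infeasible transitions and renormalise the classifier output."""
    masked = class_probs * FEASIBLE[current_state]
    return masked / masked.sum()

# The fuzzified features would feed the sequence model; here we only show the
# transition mask applied to a made-up per-class probability vector.
features = fuzzify_speed(1.1)                    # fuzzy input vector for one sensor
probs = np.array([0.05, 0.30, 0.25, 0.40])       # stand-in for an LSTM softmax output
state = ACTIVITIES.index("idle")
print(features, constrained_prediction(probs, state))
```

The design intent mirrors the abstract's claim: fuzzification absorbs ambiguity in the raw signal, while the state machine guarantees that only feasible activity transitions are ever reported.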
Conflict of interest statement
The authors declare no conflict of interest.