Improved Mutual Understanding for Human-Robot Collaboration: Combining Human-Aware Motion Planning with Haptic Feedback Devices for Communicating Planned Trajectory
Language: English
Country: Switzerland
Media: electronic
Document type: Journal Article
Grant support
- CZ.02.1.01/0.0/0.0/17_049/0008425: Research Platform focused on Industry 4.0 and Robotics in Ostrava Agglomeration project
- SP2021/47: Specific research project financed by the state budget of the Czech Republic
PubMed: 34070528
PubMed Central: PMC8198032
DOI: 10.3390/s21113673
PII: s21113673
- Keywords
- bidirectional awareness, haptic feedback device, human-machine interface, human-robot collaboration, human-robot interaction, path planning
- MeSH
- Ergonomics
- Humans
- Motion
- Robotics *
- User-Computer Interface
- Feedback
- Check Tag
- Humans
- Publication type
- Journal Article
In a collaborative scenario, communication between humans and robots is fundamental to achieving efficiency and ergonomics in task execution. Much research has been devoted to enabling a robot system to understand and predict human behaviour, allowing the robot to adapt its motion and avoid collisions with human workers. When the production task has a high degree of variability, however, the robot's movements can be difficult to predict; the worker may feel anxious when the robot changes its trajectory and approaches, since the worker has no information about the robot's planned movement. Moreover, without information about the robot's movement, the human worker cannot effectively plan their own activity without forcing the robot to constantly replan its motion. We propose a novel approach to communicating the robot's intentions to a human worker. The improvement to the collaboration is achieved by introducing haptic feedback devices, whose task is to notify the human worker about the robot's currently planned trajectory and changes in its status. To verify the effectiveness of the developed human-machine interface in the conditions of a shared collaborative workspace, a user study was designed and conducted among 16 participants, whose objective was to accurately recognise the goal position of the robot during its movement. The data collected during the experiment included both objective and subjective parameters. Statistically significant results indicated that all participants improved their task completion time by over 45% and were generally more subjectively satisfied when completing the task while wearing the haptic feedback devices. The results also suggest the usefulness of the developed notification system, since it improved users' awareness of the robot's motion plan.
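To make the notification idea in the abstract concrete, the sketch below shows one plausible way a wearable could signal the robot's planned goal position: pick the vibration motor whose bearing best matches the direction from the worker to the goal. This is a minimal illustrative sketch, not the authors' implementation; the `HapticBand` class, its `pulse` method, the four-motor layout, and the `goal_to_motor` helper are all hypothetical assumptions.

```python
# Illustrative sketch only (hypothetical device API, not the paper's system):
# map a robot's planned goal position to a directional vibrotactile cue.
import math
from dataclasses import dataclass


@dataclass
class HapticBand:
    """Hypothetical wearable with n_motors vibration motors around the wrist."""
    n_motors: int = 4

    def pulse(self, motor: int, duration_s: float) -> None:
        # A real device would send this command over BLE or a serial link.
        print(f"pulse motor {motor} for {duration_s:.2f} s")


def goal_to_motor(goal_xy: tuple, worker_xy: tuple, n_motors: int) -> int:
    """Choose the motor whose angular sector contains the bearing from the
    worker to the robot's planned goal (in the workspace plane)."""
    dx = goal_xy[0] - worker_xy[0]
    dy = goal_xy[1] - worker_xy[1]
    bearing = math.atan2(dy, dx) % (2 * math.pi)  # 0..2*pi
    sector = 2 * math.pi / n_motors               # angular width per motor
    return int(bearing // sector)


band = HapticBand()
motor = goal_to_motor(goal_xy=(0.8, 0.3), worker_xy=(0.0, 0.0),
                      n_motors=band.n_motors)
band.pulse(motor, duration_s=0.5)  # notify the worker of the new planned goal
```

In the same spirit, a status change (e.g., the planner replanning around the worker, as studied in the user experiment) could be encoded as a distinct pulse pattern rather than a direction.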
Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic
See more in PubMed
Vysocky A., Novak P. Human-robot collaboration in industry. MM Sci. J. 2016;2016:903–906. doi: 10.17973/MMSJ.2016_06_201611. DOI
Wang L., Gao R., Váncza J., Krüger J., Wang X.V., Makris S., Chryssolouris G. Symbiotic Human-robot collaborative assembly. CIRP Ann. 2019;68:701–726. doi: 10.1016/j.cirp.2019.05.002. DOI
Villani V., Pini F., Leali F., Secchi C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics. 2018;55:248–266. doi: 10.1016/j.mechatronics.2018.02.009. DOI
Mohammadi Amin F., Rezayati M., van de Venn H.W., Karimpour H. A mixed-perception approach for safe human–robot collaboration in industrial automation. Sensors. 2020;20:6347. doi: 10.3390/s20216347. PubMed DOI PMC
Casalino A., Messeri C., Pozzi M., Zanchettin A.M., Rocco P., Prattichizzo D. Operator awareness in human–robot collaboration through wearable vibrotactile feedback. IEEE Robot. Autom. Lett. 2018;3:4289–4296. doi: 10.1109/LRA.2018.2865034. DOI
Bonci A., Cen Cheng P.D., Indri M., Nabissi G., Sibona F. Human-robot perception in industrial environments: A survey. Sensors. 2021;21:1571. doi: 10.3390/s21051571. PubMed DOI PMC
Tang K.-H., Ho C.-F., Mehlich J., Chen S.-T. Assessment of handover prediction models in estimation of cycle times for manual assembly tasks in a human–robot collaborative environment. Appl. Sci. 2020;10:556. doi: 10.3390/app10020556. DOI
Mainprice J., Berenson D. Human-robot collaborative manipulation planning using early prediction of human motion; Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems; Tokyo, Japan. 3–7 November 2013; pp. 299–306.
Hermann A., Mauch F., Fischnaller K., Klemm S., Roennau A., Dillmann R. Anticipate your surroundings: Predictive collision detection between dynamic obstacles and planned robot trajectories on the GPU; Proceedings of the 2015 European Conference on Mobile Robots (ECMR); Lincoln, UK. 2–4 September 2015; pp. 1–8.
Li G., Liu Z., Cai L., Yan J. Standing-posture recognition in human–robot collaboration based on deep learning and the Dempster–Shafer evidence theory. Sensors. 2020;20:1158. doi: 10.3390/s20041158. PubMed DOI PMC
Feleke A.G., Bi L., Fei W. EMG-based 3D hand motor intention prediction for information transfer from human to robot. Sensors. 2021;21:1316. doi: 10.3390/s21041316. PubMed DOI PMC
Scimmi L.S., Melchiorre M., Troise M., Mauro S., Pastorelli S. A practical and effective layout for a safe human-robot collaborative assembly task. Appl. Sci. 2021;11:1763. doi: 10.3390/app11041763. DOI
Mišeikis J., Glette K., Elle O.J., Torresen J. Multi 3D camera mapping for predictive and reflexive robot manipulator trajectory estimation; Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (SSCI); Athens, Greece. 6–9 December 2016; pp. 1–8.
Bolano G., Roennau A., Dillmann R. Transparent robot behavior by adding intuitive visual and acoustic feedback to motion replanning; Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); Nanjing, China. 27–31 August 2018; pp. 1075–1080.
Tsarouchi P., Makris S., Chryssolouris G. Human–Robot Interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016;29:916–931. doi: 10.1080/0951192X.2015.1130251. DOI
Lee W., Park C.H., Jang S., Cho H.-K. Design of effective robotic gaze-based social cueing for users in task-oriented situations: How to overcome in-attentional blindness? Appl. Sci. 2020;10:5413. doi: 10.3390/app10165413. DOI
Kalpagam Ganesan R., Rathore Y.K., Ross H.M., Ben Amor H. Better teaming through visual cues: how projecting imagery in a workspace can improve human-robot collaboration. IEEE Robot. Autom. Mag. 2018;25:59–71. doi: 10.1109/MRA.2018.2815655. DOI
Andersen R.S., Madsen O., Moeslund T.B., Amor H.B. Projecting robot intentions into human environments; Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); New York, NY, USA. 26–31 August 2016; pp. 294–301.
Bambušek D., Materna Z., Kapinus M., Beran V., Smrž P. Combining interactive spatial augmented reality with head-mounted display for end-user collaborative robot programming; Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN); New Delhi, India. 14–18 October 2019; pp. 1–8.
Fang H.C., Ong S.K., Nee A.Y.C. A novel augmented reality-based interface for robot path planning. Int. J. Interact. Des. Manuf. 2014;8:33–42. doi: 10.1007/s12008-013-0191-2. DOI
Hietanen A., Pieters R., Lanz M., Latokartano J., Kämäräinen J.-K. AR-based interaction for human-robot collaborative manufacturing. Robot. Comput. Integr. Manuf. 2020;63:101891. doi: 10.1016/j.rcim.2019.101891. DOI
St. Clair A., Matarić M. How robot verbal feedback can improve team performance in human-robot task collaborations; Proceedings of the 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI); Portland, OR, USA. 2–5 March 2015; pp. 213–220.
de Barros P.G., Lindeman R.W., Ward M.O. Enhancing robot teleoperator situation awareness and performance using vibro-tactile and graphical feedback; Proceedings of the 2011 IEEE Symposium on 3D User Interfaces (3DUI); Singapore. 19–20 March 2011; pp. 47–54.
Li H., Sarter N.B., Sebok A., Wickens C.D. The design and evaluation of visual and tactile warnings in support of space teleoperation. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2012 doi: 10.1177/1071181312561384. DOI
Sziebig G., Korondi P. Remote operation and assistance in human robot interactions with vibrotactile feedback; Proceedings of the 2017 IEEE 26th International Symposium on Industrial Electronics (ISIE); Edinburgh, UK. 19–21 June 2017; pp. 1753–1758.
Lasota P., Shah J. Analyzing the effects of human-aware motion planning on close-proximity human-robot collaboration. Hum. Factors J. Hum. Factors Ergon. Soc. 2015;57:21–33. doi: 10.1177/0018720814565188. PubMed DOI PMC
Unhelkar V.V., Lasota P.A., Tyroller Q., Buhai R.-D., Marceau L., Deml B., Shah J.A. Human-aware robotic assistant for collaborative assembly: Integrating human motion prediction with planning in time. IEEE Robot. Autom. Lett. 2018;3:2394–2401. doi: 10.1109/LRA.2018.2812906. DOI
Fu L., Duan J., Zou X., Lin G., Song S., Ji B., Yang Z. Banana detection based on color and texture features in the natural environment. Comput. Electron. Agric. 2019;167:105057. doi: 10.1016/j.compag.2019.105057. DOI
Song Z., Fu L., Wu J., Liu Z., Li R., Cui Y. Kiwifruit detection in field images using faster R-CNN with VGG16. IFAC Pap. OnLine. 2019;52:76–81. doi: 10.1016/j.ifacol.2019.12.500. DOI
Lim G.M., Jatesiktat P., Kuah C.W.K., Ang W.T. Hand and object segmentation from depth image using fully convolutional network; Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); Berlin, Germany. 23–27 July 2019; pp. 2082–2086. PubMed
Tang Y., Chen M., Wang C., Luo L., Li J., Lian G., Zou X. Recognition and localization methods for vision-based fruit picking robots: A review. Front. Plant Sci. 2020;11. doi: 10.3389/fpls.2020.00510. PubMed DOI PMC
Chen M., Tang Y., Zou X., Huang K., Li L., He Y. High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm. Opt. Lasers Eng. 2019;122:170–183. doi: 10.1016/j.optlaseng.2019.06.011. DOI
Sucan I.A., Chitta S. MoveIt! [(accessed on 4 March 2021)]; Available online: http://moveit.ros.org.
Pan J., Chitta S., Manocha D. FCL: A general purpose library for collision and proximity queries; Proceedings of the 2012 IEEE International Conference on Robotics and Automation; Saint Paul, MN, USA. 14–18 May 2012; pp. 3859–3866.
Ganesan R.K. Mediating Human-Robot Collaboration Through Mixed Reality Cues. [(accessed on 2 March 2021)]; Available online: https://www.semanticscholar.org/paper/Mediating-Human-Robot-Collaboration-through-Mixed-Ganesan/de797205f4359044639071fa8935cd23aa3fa5c9.
Practice Effect—APA Dictionary of Psychology. [(accessed on 4 March 2021)]; Available online: https://dictionary.apa.org/practice-effect.
Aggravi M., Salvietti G., Prattichizzo D. Haptic wrist guidance using vibrations for human-robot teams; Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); New York, NY, USA. 26–31 August 2016; pp. 113–118.
Scheggi S., Chinello F., Prattichizzo D. Vibrotactile haptic feedback for human-robot interaction in leader-follower tasks; Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments (ACM PETRA '12); Heraklion, Crete, Greece. 6–8 June 2012; pp. 1–4.
Scheggi S., Aggravi M., Morbidi F., Prattichizzo D. Cooperative human-robot haptic navigation; Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA); Hong Kong, China. 31 May–7 June 2014; pp. 2693–2698.