Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study

Sensors (Basel). 2023 Apr 23;23(9). [epub] 20230423

Language: English; Country: Switzerland; Medium: electronic

Document type: journal article

Persistent link   https://www.medvik.cz/link/pmid37177421

Grant support
CZ.02.1.01/0.0/0.0/17_049/0008425 Ministry of Education, Youth and Sports
SP2023/060 Ministry of Education, Youth and Sports

This article explores the possibilities of using hand gestures as a control interface for robotic systems in a collaborative workspace. The development of hand gesture control interfaces has become increasingly important in everyday life as well as in professional contexts such as manufacturing processes. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revisions of the robot path; unlike existing systems, it allows direct definition of the waypoints. We introduce a novel and intuitive approach to human-robot cooperation through the use of simple gestures. The proposed interface was developed and implemented as part of a robotic workspace, utilising three RGB-D sensors to monitor the operator's hand movements within the workspace. The system employs distributed data processing on multiple Jetson Nano units, with each unit processing data from a single camera. The MediaPipe solution is utilised to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with the developed gesture-based system in an experiment with 20 volunteers. The experiment verified the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path objectively faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. Overall, the experimental results indicate the usefulness of the developed system, as it can speed up the definition of the robot's path.
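
For illustration only (not code from the article), the following is a minimal Python sketch of the per-camera step the abstract describes: feeding an RGB stream into the MediaPipe Hands solution to obtain the 21 hand landmarks used for gesture recognition. The camera source, confidence thresholds, and the fingertip-to-waypoint comment are assumptions, not the authors' values.

```python
# Minimal per-camera sketch: MediaPipe Hands landmark detection on an RGB stream.
# Assumptions: cv2.VideoCapture(0) stands in for the unit's RGB-D colour stream;
# thresholds and the waypoint mapping are illustrative, not the authors' settings.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame_bgr = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # 21 normalised (x, y, z) landmarks of the first detected hand.
            landmarks = results.multi_hand_landmarks[0].landmark
            tip = landmarks[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            # In a full pipeline, these normalised image coordinates would be
            # combined with the aligned depth frame to yield a 3D waypoint, and
            # the landmark configuration classified into a control gesture.
            print(f"Index fingertip (normalised): x={tip.x:.3f}, y={tip.y:.3f}")

cap.release()
```

Running one such loop per Jetson Nano unit would mirror the distributed arrangement described above, with the per-camera results presumably fused in the common workspace frame.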

Show more in PubMed

Vysocky A., Novak P. Human-Robot Collaboration in Industry. MM Sci. J. 2016;2016:903–906. doi: 10.17973/MMSJ.2016_06_201611. DOI

Tang G., Asif S., Webb P. The Integration of Contactless Static Pose Recognition and Dynamic Hand Motion Tracking Control System for Industrial Human and Robot Collaboration. Ind. Robot Int. J. 2015;42:416–428. doi: 10.1108/IR-03-2015-0059. DOI

Safeea M., Neto P. Precise Positioning of Collaborative Robotic Manipulators Using Hand-Guiding. Int. J. Adv. Manuf. Technol. 2022;120:5497–5508. doi: 10.1007/s00170-022-09107-1. DOI

Kumar N., Lee S.C. Human-Machine Interface in Smart Factory: A Systematic Literature Review. Technol. Forecast. Soc. Chang. 2022;174:121284. doi: 10.1016/j.techfore.2021.121284. DOI

Ionescu T.B., Schlund S. Programming Cobots by Voice: A Human-Centered, Web-Based Approach. Procedia CIRP. 2021;97:123–129. doi: 10.1016/j.procir.2020.05.213. DOI

Poncela A., Gallardo-Estrella L. Command-Based Voice Teleoperation of a Mobile Robot via a Human-Robot Interface. Robotica. 2015;33:1–18. doi: 10.1017/S0263574714000010. DOI

Norberto Pires J. Robot-by-voice: Experiments on Commanding an Industrial Robot Using the Human Voice. Ind. Robot Int. J. 2005;32:505–511. doi: 10.1108/01439910510629244. DOI

Scalera L., Seriani S., Gallina P., Lentini M., Gasparetto A. Human–Robot Interaction through Eye Tracking for Artistic Drawing. Robotics. 2021;10:54. doi: 10.3390/robotics10020054. DOI

Grushko S., Vysocky A., Jha V., Pastor R., Prada E., Miková Ľ., Bobovský Z. Tuning Perception and Motion Planning Parameters for Moveit! Framework. MM Sci. J. 2020;2020:4154–4163. doi: 10.17973/MMSJ.2020_11_2020064. DOI

Grushko S., Vysocký A., Heczko D., Bobovský Z. Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human–Robot Collaboration. Sensors. 2021;21:5748. doi: 10.3390/s21175748. PubMed DOI PMC

Liu H., Xi Y., Song W., Um K., Cho K. Gesture-Based NUI Application for Real-Time Path Modification; Proceedings of the 2013 IEEE 11th International Conference on Dependable, Autonomic and Secure Computing; Chengdu, China. 21–22 December 2013; pp. 446–449. DOI

Zhang X., Zhang R., Chen L., Zhang X. Natural Gesture Control of a Delta Robot Using Leap Motion. J. Phys. Conf. Ser. 2019;1187:032042. doi: 10.1088/1742-6596/1187/3/032042. DOI

Takahashi S. Hand-Gesture-Recognition-Using-Mediapipe. [(accessed on 27 March 2023)]. Available online: https://github.com/Kazuhito00/hand-gesture-recognition-using-mediapipe.

Kamath V., Bhat S. Kinect Sensor Based Real-Time Robot Path Planning Using Hand Gesture and Clap Sound; Proceedings of the International Conference on Circuits, Communication, Control and Computing; Bangalore, India. 21–22 November 2014; pp. 129–134. DOI

Quintero C.P., Fomena R.T., Shademan A., Wolleb N., Dick T., Jagersand M. SEPO: Selecting by Pointing as an Intuitive Human-Robot Command Interface; Proceedings of the 2013 IEEE International Conference on Robotics and Automation; Karlsruhe, Germany. 6–10 May 2013; pp. 1166–1171. DOI

Vysocký A., Grushko S., Oščádal P., Kot T., Babjak J., Jánoš R., Sukop M., Bobovský Z. Analysis of Precision and Stability of Hand Tracking with Leap Motion Sensor. Sensors. 2020;20:4088. doi: 10.3390/s20154088. PubMed DOI PMC

Guna J., Jakus G., Pogačnik M., Tomažič S., Sodnik J. An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking. Sensors. 2014;14:3702–3720. doi: 10.3390/s140203702. PubMed DOI PMC

Jha V.K., Grushko S., Mlotek J., Kot T., Krys V., Oščádal P., Bobovskỳ Z. A Depth Image Quality Benchmark of Three Popular Low-Cost Depth Cameras. MM Sci. J. 2020;2020:4194–4200. doi: 10.17973/MMSJ.2020_12_2020057. DOI

Vysocky A., Grushko S., Spurny T., Pastor R., Kot T. Generating Synthetic Depth Image Dataset for Industrial Applications of Hand Localization. IEEE Access. 2022;10:99734–99744. doi: 10.1109/ACCESS.2022.3206948. DOI

Müezzinoğlu T., Karaköse M. An Intelligent Human–Unmanned Aerial Vehicle Interaction Approach in Real Time Based on Machine Learning Using Wearable Gloves. Sensors. 2021;21:1766. doi: 10.3390/s21051766. PubMed DOI PMC

Carneiro M.R., Rosa L.P., de Almeida A.T., Tavakoli M. Tailor-Made Smart Glove for Robot Teleoperation, Using Printed Stretchable Sensors; Proceedings of the 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft); Edinburgh, UK. 4–8 April 2022; pp. 722–728. DOI

Grushko S., Vysocký A., Oščádal P., Vocetka M., Novák P., Bobovský Z. Improved Mutual Understanding for Human-Robot Collaboration: Combining Human-Aware Motion Planning with Haptic Feedback Devices for Communicating Planned Trajectory. Sensors. 2021;21:3673. doi: 10.3390/s21113673. PubMed DOI PMC

Roda-Sanchez L., Olivares T., Garrido-Hidalgo C., de la Vara J.L., Fernández-Caballero A. Human-Robot Interaction in Industry 4.0 Based on an Internet of Things Real-Time Gesture Control System. Integr. Comput.-Aided Eng. 2021;28:159–175. doi: 10.3233/ICA-200637. DOI

He J., Joshi M.V., Chang J., Jiang N. Efficient Correction of Armband Rotation for Myoelectric-Based Gesture Control Interface. J. Neural Eng. 2020;17:036025. doi: 10.1088/1741-2552/ab8682. PubMed DOI

Caeiro-Rodríguez M., Otero-González I., Mikic-Fonte F.A., Llamas-Nistal M. A Systematic Review of Commercial Smart Gloves: Current Status and Applications. Sensors. 2021;21:2667. doi: 10.3390/s21082667. PubMed DOI PMC

Parizi F.S., Whitmire E., Patel S. AuraRing: Precise Electromagnetic Finger Tracking. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020;3:1–28. doi: 10.1145/3369831. DOI

Danielsson O., Holm M., Syberfeldt A. Augmented Reality Smart Glasses in Industrial Assembly: Current Status and Future Challenges. J. Ind. Inf. Integr. 2020;20:100175. doi: 10.1016/j.jii.2020.100175. DOI

Oudah M., Al-Naji A., Chahl J. Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. J. Imaging. 2020;6:73. doi: 10.3390/jimaging6080073. PubMed DOI PMC

Müezzinoğlu T., Karaköse M. Wearable Glove Based Approach for Human-UAV Interaction; Proceedings of the 2020 IEEE International Symposium on Systems Engineering (ISSE); Vienna, Austria. 12 October–12 November 2020; pp. 1–6. DOI

DelPreto J., Rus D. Plug-and-Play Gesture Control Using Muscle and Motion Sensors; Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction; Association for Computing Machinery, New York, NY, USA. 9 March 2020; pp. 439–448.

Pomykalski P., Woźniak M.P., Woźniak P.W., Grudzień K., Zhao S., Romanowski A. Considering Wake Gestures for Smart Assistant Use; Proceedings of the CHI ’20: CHI Conference on Human Factors in Computing Systems; Honolulu, HI, USA. 25–30 April 2020; DOI

Zhao H., Wang S., Zhou G., Zhang D. Ultigesture: A Wristband-Based Platform for Continuous Gesture Control in Healthcare. Smart Health. 2019;11:45–65. doi: 10.1016/j.smhl.2017.12.003. DOI

Oščádal P., Heczko D., Vysocký A., Mlotek J., Novák P., Virgala I., Sukop M., Bobovský Z. Improved Pose Estimation of Aruco Tags Using a Novel 3D Placement Strategy. Sensors. 2020;20:4825. doi: 10.3390/s20174825. PubMed DOI PMC

Oščádal P., Spurný T., Kot T., Grushko S., Suder J., Heczko D., Novák P., Bobovský Z. Distributed Camera Subsystem for Obstacle Detection. Sensors. 2022;22:4588. doi: 10.3390/s22124588. PubMed DOI PMC

Grushko S. Fork of Google’s MediaPipe (v0.8.9) for Jetson Nano (JetPack 4.6) CUDA (10.2) [(accessed on 27 March 2023)]. Available online: https://github.com/anion0278/mediapipe-jetson.

Hand Landmarks Detection Guide. [(accessed on 27 March 2023)]. Available online: https://google.github.io/mediapipe/solutions/hands.html.

How to Calculate Z-Score and Its Meaning. [(accessed on 1 March 2023)]. Available online: https://www.investopedia.com/terms/z/zscore.asp.

APA Dictionary of Psychology. [(accessed on 23 February 2023)]. Available online: https://dictionary.apa.org/

Hart S.G. NASA-Task Load Index (NASA-TLX); 20 Years Later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2006;50:904–908. doi: 10.1177/154193120605000909. DOI

Bustamante E.A., Spain R.D. Measurement Invariance of the NASA TLX. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2008;52:1522–1526. doi: 10.1177/154193120805201946. DOI

Brooke J. SUS: A Quick and Dirty Usability Scale. Usability Evaluation In Industry. 1st ed. CRC Press; Boca Raton, FL, USA: 1995.

Bolano G., Roennau A., Dillmann R. Transparent Robot Behavior by Adding Intuitive Visual and Acoustic Feedback to Motion Replanning; Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); Nanjing, China. 27–31 August 2018; pp. 1075–1080. DOI

Hietanen A., Pieters R., Lanz M., Latokartano J., Kämäräinen J.-K. AR-Based Interaction for Human-Robot Collaborative Manufacturing. Robot. Comput.-Integr. Manuf. 2020;63:101891. doi: 10.1016/j.rcim.2019.101891. DOI

Bobovský Z., Krys V., Mostýn V. Kinect v2 Infrared Images Correction. Int. J. Adv. Robot. Syst. 2018;15:1729881418755780. doi: 10.1177/1729881418755780. DOI
