Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human-Robot Collaboration
Language English
Country Switzerland
Media electronic
Document type Journal Article
Grant support
- CZ.02.1.01/0.0/0.0/17_049/0008425 (Research Platform focused on Industry 4.0 and Robotics in Ostrava Agglomeration project)
- SP2021/47 (State budget of the Czech Republic)
PubMed 34502639
PubMed Central PMC8434014
DOI 10.3390/s21175748
PII s21175748
- Keywords
- haptic feedback device, human–machine interface, human–robot collaboration, human–robot interaction, mutual awareness, spatial tactile feedback
- MeSH
- Touch
- Humans
- Robotics *
- Hand
- User-Computer Interface
- Feedback
- Check Tag
- Humans
- Publication type
- Journal Article
In this work, we extend our previously proposed approach to improving mutual perception during human-robot collaboration, in which the robot's motion intentions and status are communicated to a human worker through hand-worn haptic feedback devices. The improvement consists in introducing spatial tactile feedback, which gives the human worker more intuitive information about the spatial configuration of the robot's currently planned trajectory. The enhanced feedback devices communicate directional information by activating six tactors spatially organised to represent an orthogonal coordinate frame: the vibration is triggered on the side of the feedback device that is closest to the future path of the robot. To test the effectiveness of the improved human-machine interface, two user studies were conducted. The first study quantitatively evaluated how easily users could differentiate the activation of individual tactors of the notification devices. The second study assessed the overall usability of the enhanced notification mode for improving human awareness of the robot's planned trajectory. The results of the first experiment allowed us to identify the tactors whose vibration was most often confused by users. The results of the second experiment showed that the enhanced notification system allowed the participants to complete the task faster and, according to both objective and subjective data, generally improved user awareness of the robot's movement plan. Moreover, the majority of participants (82%) favoured the improved notification system over its previous non-directional version and over vision-based inspection.
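The directional cue described in the abstract lends itself to a simple selection rule. The following Python sketch is a hypothetical illustration, not the authors' implementation: it assumes six tactors aligned with the ±X, ±Y and ±Z axes of the device frame, activates the tactor whose axis points most directly toward the closest point of the planned trajectory, and scales intensity with proximity. The function, parameter names and the 0.6 m range are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of mapping a direction toward
# the robot's planned path onto six tactors arranged along an orthogonal frame.
import numpy as np

# Unit vectors of the six tactors in the hand-worn device frame: +X, -X, +Y, -Y, +Z, -Z.
TACTOR_AXES = np.array([
    [ 1, 0, 0], [-1, 0, 0],
    [ 0, 1, 0], [ 0, -1, 0],
    [ 0, 0, 1], [ 0, 0, -1],
], dtype=float)

def select_tactor(hand_pos, closest_path_point, max_range=0.6):
    """Return (tactor_index, intensity in [0, 1]) for the vibration cue.

    hand_pos, closest_path_point: 3D points expressed in the device frame.
    max_range: assumed distance (m) beyond which no cue is given.
    """
    direction = np.asarray(closest_path_point) - np.asarray(hand_pos)
    distance = np.linalg.norm(direction)
    if distance < 1e-6 or distance > max_range:
        return None, 0.0
    direction /= distance
    # Activate the tactor whose axis points most directly toward the planned path.
    tactor = int(np.argmax(TACTOR_AXES @ direction))
    # Scale intensity with proximity: closer path segments vibrate more strongly.
    intensity = 1.0 - distance / max_range
    return tactor, intensity

# Example: the planned path passes 0.3 m to the +Y side of the device,
# so the +Y tactor (index 2) vibrates at half intensity.
print(select_tactor(hand_pos=[0.0, 0.0, 0.0], closest_path_point=[0.0, 0.3, 0.0]))
```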
Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic
See more in PubMed
Vysocký A., Novák P. Human-Robot Collaboration in Industry. MM Sci. J. 2016;2016:903–906. doi: 10.17973/MMSJ.2016_06_201611. DOI
Bolano G., Roennau A., Dillmann R. Transparent Robot Behavior by Adding Intuitive Visual and Acoustic Feedback to Motion Replanning; Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); Nanjing, China. 27–31 August 2018; pp. 1075–1080.
Mohammadi Amin F., Rezayati M., van de Venn H.W., Karimpour H. A Mixed-Perception Approach for Safe Human–Robot Collaboration in Industrial Automation. Sensors. 2020;20:6347. doi: 10.3390/s20216347. PubMed DOI PMC
Casalino A., Messeri C., Pozzi M., Zanchettin A.M., Rocco P., Prattichizzo D. Operator Awareness in Human–Robot Collaboration Through Wearable Vibrotactile Feedback. IEEE Robot. Autom. Lett. 2018;3:4289–4296. doi: 10.1109/LRA.2018.2865034. DOI
Bonci A., Cen Cheng P.D., Indri M., Nabissi G., Sibona F. Human-Robot Perception in Industrial Environments: A Survey. Sensors. 2021;21:1571. doi: 10.3390/s21051571. PubMed DOI PMC
Lee W., Park C.H., Jang S., Cho H.-K. Design of Effective Robotic Gaze-Based Social Cueing for Users in Task-Oriented Situations: How to Overcome In-Attentional Blindness? Appl. Sci. 2020;10:5413. doi: 10.3390/app10165413. DOI
Kalpagam Ganesan R., Rathore Y.K., Ross H.M., Ben Amor H. Better Teaming Through Visual Cues: How Projecting Imagery in a Workspace Can Improve Human-Robot Collaboration. IEEE Robot. Autom. Mag. 2018;25:59–71. doi: 10.1109/MRA.2018.2815655. DOI
Andersen R.S., Madsen O., Moeslund T.B., Amor H.B. Projecting Robot Intentions into Human Environments; Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); New York, NY, USA. 26–31 August 2016; pp. 294–301.
Bambuŝek D., Materna Z., Kapinus M., Beran V., Smrž P. Combining Interactive Spatial Augmented Reality with Head-Mounted Display for End-User Collaborative Robot Programming; Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN); New Delhi, India. 14–18 October 2019; pp. 1–8.
Fang H.C., Ong S.K., Nee A.Y.C. A Novel Augmented Reality-Based Interface for Robot Path Planning. Int. J. Interact. Des. Manuf. 2014;8:33–42. doi: 10.1007/s12008-013-0191-2. DOI
Hietanen A., Pieters R., Lanz M., Latokartano J., Kämäräinen J.-K. AR-Based Interaction for Human-Robot Collaborative Manufacturing. Robot. Comput.-Integr. Manuf. 2020;63:101891. doi: 10.1016/j.rcim.2019.101891. DOI
Clair A.S., Matarić M. How Robot Verbal Feedback Can Improve Team Performance in Human-Robot Task Collaborations; Proceedings of the 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI); Portland, OR, USA. 2–5 March 2015; pp. 213–220.
Li H., Sarter N.B., Sebok A., Wickens C.D. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. SAGE Publications; Los Angeles, CA, USA: 2012. The Design and Evaluation of Visual and Tactile Warnings in Support of Space Teleoperation. DOI
Sziebig G., Korondi P. Remote Operation and Assistance in Human Robot Interactions with Vibrotactile Feedback; Proceedings of the 2017 IEEE 26th International Symposium on Industrial Electronics (ISIE); Edinburgh, UK. 19–21 June 2017; pp. 1753–1758.
de Barros P.G., Lindeman R.W., Ward M.O. Enhancing Robot Teleoperator Situation Awareness and Performance Using Vibro-Tactile and Graphical Feedback; Proceedings of the 2011 IEEE Symposium on 3D User Interfaces (3DUI); Singapore. 19–20 March 2011; pp. 47–54.
Mucha Ł., Lis K. Soft and Stiffness-controllable Robotics Solutions for Minimally Invasive Surgery: The STIFF-FLOP Approach. River Publishers; Aalborg, Denmark: 2018. Force Feedback Sleeve Using Pneumatic and Micro Vibration Actuators.
Zikmund P., Macik M., Dubnicky L., Horpatzska M. Comparison of Joystick Guidance Methods; Proceedings of the 2019 10th IEEE International Conference on Cognitive Infocommunications (CogInfoCom); Naples, Italy. 23–25 October 2019; pp. 265–270.
Stanley A.A., Kuchenbecker K.J. Design of Body-Grounded Tactile Actuators for Playback of Human Physical Contact; Proceedings of the 2011 IEEE World Haptics Conference; Istanbul, Turkey. 21–24 June 2011; pp. 563–568.
Scalera L., Seriani S., Gallina P., Di Luca M., Gasparetto A. An Experimental Setup to Test Dual-Joystick Directional Responses to Vibrotactile Stimuli. IEEE Trans. Haptics. 2018;11:378–387. doi: 10.1109/TOH.2018.2804391. PubMed DOI
Grushko S., Vysocký A., Oščádal P., Vocetka M., Novák P., Bobovský Z. Improved Mutual Understanding for Human-Robot Collaboration: Combining Human-Aware Motion Planning with Haptic Feedback Devices for Communicating Planned Trajectory. Sensors. 2021;21:3673. doi: 10.3390/s21113673. PubMed DOI PMC
Brock A., Kammoun S., Macé M., Jouffrais C. Using Wrist Vibrations to Guide Hand Movement and Whole Body Navigation. i-com. 2014;3:19–28. doi: 10.1515/icom.2014.0026. DOI
Zeng L., Miao M., Weber G. Interactive Audio-Haptic Map Explorer on a Tactile Display. Interact. Comput. 2015;27:413–429. doi: 10.1093/iwc/iwu006. DOI
Zhao K., Rayar F., Serrano M., Oriola B., Jouffrais C. Vibrotactile Cues for Conveying Directional Information during Blind Exploration of Digital Graphics; Proceedings of the 31st Conference on l’Interaction Homme-Machine; Grenoble, France. 10–13 December 2019; New York, NY, USA: ACM; 2019. p. 12.
Zhang C., Pearson J., Robinson S., Hopkins P., Jones M., Sahoo R., Holton M. Active PinScreen: Exploring Spatio-Temporal Tactile Feedback for Multi-Finger Interaction; Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services; Oldenburg, Germany. 5–8 October 2020; New York, NY, USA: ACM; 2020.
Barbosa C., Souza L., Bezerra de Lima E., Silva B., Santana O., Jr., Alberto G. Human Gesture Evaluation with Visual Detection and Haptic Feedback; Proceedings of the 24th Brazilian Symposium on Multimedia and the Web; Salvador, Brazil. 16–19 October 2018; New York, NY, USA: ACM; 2018. p. 282.
Jung J., Son S., Lee S., Kim Y., Lee G. ThroughHand: 2D Tactile Interaction to Simultaneously Recognize and Touch Multiple Objects; Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems; Yokohama, Japan. 8–13 May 2021; New York, NY, USA: ACM; 2021. p. 13.
Williams S.R., Okamura A.M. Body-Mounted Vibrotactile Stimuli: Simultaneous Display of Taps on the Fingertips and Forearm. IEEE Trans. Haptics. 2021;14:432–444. doi: 10.1109/TOH.2020.3042955. PubMed DOI
Aggravi M., Salvietti G., Prattichizzo D. Haptic Wrist Guidance Using Vibrations for Human-Robot Teams; Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN); New York, NY, USA. 26–31 August 2016; pp. 113–118.
Scheggi S., Chinello F., Prattichizzo D. Vibrotactile Haptic Feedback for Human-Robot Interaction in Leader-Follower Tasks; Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments; Crete, Greece. 6–8 June 2012; New York, NY, USA: ACM; 2012. p. 4.
Scheggi S., Aggravi M., Morbidi F., Prattichizzo D. Cooperative Human-Robot Haptic Navigation; Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA); Hong Kong, China. 31 May–7 June 2014.
Yatani K., Banovic N., Truong K. SpaceSense: Representing Geographical Information to Visually Impaired People Using Spatial Tactile Feedback; Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; Austin, TX, USA. 5–10 May 2012; New York, NY, USA: ACM; 2012. pp. 415–424.
Wang Y., Millet B., Smith J.L. Designing Wearable Vibrotactile Notifications for Information Communication. Int. J. Hum.-Comput. Stud. 2016;89:24–34. doi: 10.1016/j.ijhcs.2016.01.004. DOI
Alvina J., Zhao S., Perrault S.T., Azh M., Roumen T., Fjeld M. OmniVib: Towards Cross-Body Spatiotemporal Vibrotactile Notifications for Mobile Phones; Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems; Seoul, Korea. 18–23 April 2015; New York, NY, USA: Association for Computing Machinery; 2015. pp. 2487–2496.
Israr A., Poupyrev I. Tactile Brush: Drawing on Skin with a Tactile Grid Display; Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; Vancouver, BC, Canada. 7–12 May 2011; New York, NY, USA: ACM; 2011. DOI
Sucan I.A., Chitta S. MoveIt! [(accessed on 28 March 2021)]; Available online: http://moveit.ros.org.
Hornung A., Wurm K.M., Bennewitz M., Stachniss C., Burgard W. OctoMap: An Efficient Probabilistic 3D Mapping Framework Based on Octrees. Auton Robot. 2013;34:189–206. doi: 10.1007/s10514-012-9321-0. DOI
Pan J., Chitta S., Manocha D. FCL: A General Purpose Library for Collision and Proximity Queries; Proceedings of the 2012 IEEE International Conference on Robotics and Automation; Saint Paul, MN, USA. 14–18 May 2012; pp. 3859–3866.
Johansson R.S., Vallbo A.B. Tactile Sensibility in the Human Hand: Relative and Absolute Densities of Four Types of Mechanoreceptive Units in Glabrous Skin. J. Physiol. 1979;286:283–300. doi: 10.1113/jphysiol.1979.sp012619. PubMed DOI PMC
Fransson-Hall C., Kilbom Å. Sensitivity of the Hand to Surface Pressure. Appl. Ergon. 1993;24:181–189. doi: 10.1016/0003-6870(93)90006-U. PubMed DOI
Practice Effect—APA Dictionary of Psychology. [(accessed on 4 March 2021)]; Available online: https://dictionary.apa.org/practice-effect.
Scalera L., Giusti A., Cosmo V., Riedl M., Vidoni R., Matt D. Application of Dynamically Scaled Safety Zones Based on the ISO/TS 15066:2016 for Collaborative Robotics. Int. J. Mech. Control. 2020;21:41–50.