Document Type
Article
Publication Date
10-2024
Abstract
Conventional geometric metrology, or three-dimensional (3D) scanning, and reverse engineering rely heavily on operator experience. With an increasing need for automation, robot arms have been adopted for this task. However, due to the large variety of parts and designs, automated path planning may produce scanning solutions that overlook critical areas, potentially deteriorating the scan results. This article explores the integration of collaborative robots (cobots) with eye-tracking technology to improve the autonomous 3D scanning process. The primary objective of this study is to enhance the accuracy and efficiency of cobots in 3D scanning, particularly in capturing functionally critical areas and providing a detailed description of regions with complex geometric features. The study develops a framework in which the scanning path of the robot-driven scanner is partially guided by eye-tracking data, that is, calibrated gaze tracking, to improve the automated 3D scanning process. This framework offers an innovative integration of human gaze movement with automatic robot path planning, enabling a new mode of human-autonomy teaming. Case studies are presented to demonstrate and validate the proposed framework's ability to automatically improve the 3D point cloud collection process, specifically in areas that typically require manual human intervention to capture details.
Recommended Citation
Karunathilake, Sachithra, Md Shahriar Forhad, and Zhaohui Geng. "Gaze tracking embedded collaborative robots for automated metrology and reverse engineering." Manufacturing Letters 41 (2024): 1488-1498. https://doi.org/10.1016/j.mfglet.2024.09.175
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Publication Title
Manufacturing Letters
DOI
https://doi.org/10.1016/j.mfglet.2024.09.175
Comments
© 2024 The Authors. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)