Title: A high precision stereo-vision robotic system for large-sized object measurement and grasping in non-structured environment
Authors: Guoyang Wan; Guofeng Wang; Kaisheng Xing; Tinghao Yi; Yunsheng Fan
Addresses: Anhui Polytechnic University, Wuhu, China; Department of Marine Electrical Engineering, Dalian Maritime University, Linhai Road No. 1, Dalian, China; Anhui Institute of Information Technology, Yonghe Road No. 1, Wuhu, China; University of Science and Technology of China, No. 96, JinZhai Road, Baohe District, Hefei, China; Department of Marine Electrical Engineering, Dalian Maritime University, Linhai Road No. 1, Dalian, China
Abstract: Handling and loading of large-sized objects is a challenging task in industrial environments, especially when the object is metallic with a reflective surface. Because active stereo vision is poorly suited to measuring the pose of reflective metal objects, a high-precision pose measurement system based on passive stereo vision is proposed for the automatic measurement, handling and loading of large objects by industrial robots. The system adopts a coarse-to-fine stereo-vision positioning strategy, achieving high-precision positioning of the measured target while ensuring stability. For coarse positioning, an improved multi-model template matching method based on machine learning is proposed for robust recognition of multiple objects against complex backgrounds. For fine positioning, a RANSAC-based method combining ellipse fitting and multi-point plane fitting is proposed to accurately obtain the 6-DOF pose of the object. Experiments show that, compared with the classical CAD-views method, the proposed method achieves better positioning accuracy and recognition robustness.
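For illustration only (not the authors' code): the fine-positioning step described in the abstract relies on RANSAC-based fitting of geometric primitives to stereo-reconstructed 3D points. A minimal Python sketch of RANSAC plane fitting of that general kind is given below; the function name, iteration count and inlier threshold are hypothetical choices, not values from the paper.

    import numpy as np

    def fit_plane_ransac(points, n_iters=500, inlier_thresh=0.5, seed=None):
        """Fit a plane n.x + d = 0 to an (N, 3) point array with RANSAC.
        Returns (normal, d, inlier_mask)."""
        rng = np.random.default_rng(seed)
        best_inliers = None
        for _ in range(n_iters):
            # Sample three points and form a candidate plane.
            idx = rng.choice(len(points), size=3, replace=False)
            p0, p1, p2 = points[idx]
            n = np.cross(p1 - p0, p2 - p0)
            norm = np.linalg.norm(n)
            if norm < 1e-9:                # degenerate (collinear) sample
                continue
            n = n / norm
            d = -n @ p0
            dist = np.abs(points @ n + d)  # point-to-plane distances
            inliers = dist < inlier_thresh
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        # Refine on the inlier set with a least-squares (SVD) fit.
        inlier_pts = points[best_inliers]
        centroid = inlier_pts.mean(axis=0)
        _, _, vt = np.linalg.svd(inlier_pts - centroid)
        n = vt[-1]
        d = -n @ centroid
        return n, d, best_inliers

The fitted plane normal constrains the orientation of the measured surface, while a RANSAC ellipse fit on detected circular features would, in the same spirit, constrain the remaining degrees of freedom of the 6-DOF pose.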
Keywords: industrial robot; machine vision; stereo vision; 3D measurement; template matching; coordinate transform.
DOI: 10.1504/IJICA.2022.125662
International Journal of Innovative Computing and Applications, 2022 Vol.13 No.4, pp.210 - 220
Received: 25 Aug 2020
Accepted: 23 Nov 2020
Published online: 26 Sep 2022