Title: RGB-depth map formation from cili-padi plant imaging using stereo vision camera
Authors: Wira Hidayat Bin Mohd Saad; Muhammad Haziq Bin Abd Razak; Muhammad Noorazlan Shah Zainudin; Syafeeza Binti Ahmad Radzi; Muhd. Shah Jehan Bin Abd. Razak
Addresses: Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), 76100, Durian Tunggal, Melaka, Malaysia ' Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), 76100, Durian Tunggal, Melaka, Malaysia ' Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), 76100, Durian Tunggal, Melaka, Malaysia ' Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), 76100, Durian Tunggal, Melaka, Malaysia ' SolokFertigasi by MSJ Perwira Enterprise, 75460 Duyong, Melaka, Malaysia
Abstract: Stereo vision is one of the advancements in computer vision and pattern recognition, using a dual camera to mimic human binocular vision. This study focuses on the selection of parameters for RGB-depth (RGB-d) map image formation, specifically from stereo images captured of the cili-padi (bird's-eye chilli) plant. The process starts with calibrating the cameras against a checkerboard image to obtain the intrinsic and extrinsic camera parameters. The stereo images are then rectified to facilitate the disparity computation between the left and right images, and a point cloud is obtained by applying a triangulation function to the image disparity together with the camera parameter values. The RGB-d image is computed by normalising the depth of each plotted point into a greyscale value or any other suitable colourmap. A comparison of the disparity map transformation algorithms used to produce the RGB-d image shows that the semi-global matching (SGM) function provides the best RGB-d image formation.
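As an illustration of the pipeline the abstract describes (SGM disparity on a rectified pair, triangulation to a point cloud, depth normalisation into a colourmap), a minimal Python/OpenCV sketch is given below. The paper's own toolchain is not stated in the abstract; the file names, the SGBM matching parameters, and the focal length and baseline used to build the reprojection matrix Q are illustrative assumptions, not values from the study. OpenCV's StereoSGBM is its implementation of semi-global matching, so it stands in here for the SGM function the abstract compares.

    import cv2
    import numpy as np

    # Assumed inputs: an already rectified stereo pair from a prior
    # checkerboard calibration/rectification step (file names assumed).
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Semi-global matching; parameter values are illustrative only.
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,  # must be divisible by 16
        blockSize=5,
        P1=8 * 5 * 5,
        P2=32 * 5 * 5,
    )
    # compute() returns fixed-point disparities scaled by 16.
    disparity = sgbm.compute(left, right).astype(np.float32) / 16.0

    # Perspective-to-depth matrix Q is normally returned by
    # cv2.stereoRectify(); the focal length, baseline, and principal
    # point below are placeholder values, not from the study.
    f, baseline, cx, cy = 700.0, 0.06, 320.0, 240.0  # assumed
    Q = np.float32([[1, 0, 0, -cx],
                    [0, 1, 0, -cy],
                    [0, 0, 0, f],
                    [0, 0, 1.0 / baseline, 0]])

    # Triangulate: reproject the disparity map to a 3-D point cloud.
    points_3d = cv2.reprojectImageTo3D(disparity, Q)

    # Normalise the depth (Z) of each valid point into 8-bit values,
    # then apply a colourmap to form the RGB-d visualisation.
    depth = points_3d[:, :, 2]
    valid = (disparity > 0) & np.isfinite(depth)
    z = depth[valid]
    depth_u8 = np.zeros(depth.shape, dtype=np.uint8)
    depth_u8[valid] = (255 * (z - z.min()) / (z.max() - z.min())).astype(np.uint8)
    rgb_d = cv2.applyColorMap(depth_u8, cv2.COLORMAP_JET)  # or keep depth_u8 as greyscale
    cv2.imwrite("rgb_d.png", rgb_d)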
Keywords: cili-padi plant; depth map formation; RGB-depth map; stereo camera vision.
DOI: 10.1504/IJCVR.2023.132001
International Journal of Computational Vision and Robotics, 2023 Vol.13 No.4, pp.343 - 358
Received: 27 Jan 2021
Accepted: 08 Mar 2022
Published online: 06 Jul 2023