Retinal vessel segmentation method based on multi-scale dual-path convolutional neural network
Online publication date: Wed, 23-Oct-2024
by Tao Fang; Linling Fang
International Journal of Signal and Imaging Systems Engineering (IJSISE), Vol. 13, No. 2, 2024
Abstract: To better identify small blood vessels and low-contrast regions in retinal images, the paper presents a multi-scale dual-path convolutional neural network for retinal vessel segmentation. Gabor filters capture vessel characteristics at multiple scales, distinguishing thick from thin vessels. A dual-path network performs feature learning through convolution and sampling operations, enabling efficient end-to-end segmentation. The local vessel segmentation network uses an encoder-decoder structure that preserves spatial resolution and employs dilated convolution to improve the precision of thin vessel segmentation, with a skip connection added to further refine small-vessel boundaries. Results on the DRIVE and CHASE_DB1 datasets show that the method outperforms existing techniques in accuracy, sensitivity, and specificity, successfully segmenting small, low-contrast vessels that are often missed while preserving the integrity and connectivity of the vascular structure.
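The abstract describes a local segmentation network built around an encoder-decoder with dilated convolutions and a skip connection. The following is a minimal PyTorch sketch of that general idea only; the layer widths, the single skip path, and the class name LocalVesselNet are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: encoder-decoder segmentation head with dilated convolutions
# and one skip connection, in the spirit of the local vessel segmentation
# network described in the abstract. Sizes and structure are assumptions.
import torch
import torch.nn as nn

class LocalVesselNet(nn.Module):
    def __init__(self, in_ch=1, base_ch=16):
        super().__init__()
        # Encoder: plain convolution followed by downsampling.
        self.enc1 = nn.Sequential(
            nn.Conv2d(in_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.down = nn.MaxPool2d(2)
        # Dilated convolutions enlarge the receptive field without further
        # reducing spatial resolution, which helps preserve thin vessels.
        self.dilated = nn.Sequential(
            nn.Conv2d(base_ch, base_ch * 2, 3, padding=2, dilation=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_ch * 2, base_ch * 2, 3, padding=4, dilation=4),
            nn.ReLU(inplace=True))
        # Decoder: upsample back to the input resolution.
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec1 = nn.Sequential(
            nn.Conv2d(base_ch * 2 + base_ch, base_ch, 3, padding=1),
            nn.ReLU(inplace=True))
        self.head = nn.Conv2d(base_ch, 1, 1)  # per-pixel vessel probability

    def forward(self, x):
        e1 = self.enc1(x)                  # high-resolution features
        d = self.dilated(self.down(e1))    # contextual features at half resolution
        u = self.up(d)
        # Skip connection: concatenate high-resolution features so small,
        # low-contrast vessels are not lost during decoding.
        out = self.dec1(torch.cat([u, e1], dim=1))
        return torch.sigmoid(self.head(out))

if __name__ == "__main__":
    net = LocalVesselNet()
    patch = torch.randn(1, 1, 64, 64)      # a grayscale retinal image patch
    print(net(patch).shape)                # -> torch.Size([1, 1, 64, 64])
```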