An Efficient Edge Detection Approach to Provide Better Edge Connectivity for Image Analysis

Bibliographic Details
Main Authors: Mamta Mittal, Amit Verma, Iqbaldeep Kaur, Bhavneet Kaur, Meenakshi Sharma, Lalit Mohan Goyal, Sudipta Roy, Tai-Hoon Kim
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8667063/
Description
Summary: Edge detection is important for its reliability and security, delivering a better understanding of object recognition in computer vision applications such as pedestrian detection, face detection, and video surveillance. This paper introduces two fundamental limitations encountered in edge detection, edge connectivity and edge thickness, which have been addressed by various developments in the state of the art. The optimal selection of thresholds for effectual edge detection has constantly been a key challenge in computer vision. Therefore, a robust edge detection algorithm using a multiple-threshold approach (B-Edge) is proposed to address both limitations. The widely used Canny edge operator relies on the selection of two thresholds and still leaves gaps on the way to optimal results. To handle these shortcomings of the Canny operator, the proposed method selects simulated triple thresholds that target the prime issues of edge detection: image contrast, effective edge-pixel selection, error handling, and similarity to the ground truth. Qualitative and quantitative experimental evaluations demonstrate that the proposed edge detection method outperforms competing algorithms on these issues. The proposed approach yields improvements for both grayscale and color images.
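The record does not include B-Edge's algorithmic details, but the idea it describes, extending Canny's two-threshold hysteresis to three thresholds for better edge connectivity, can be sketched roughly as below. This is a minimal illustrative sketch: the function name triple_threshold_edges, the threshold semantics, and the two-pass connectivity rule are assumptions for illustration, not the published method.

```python
# Hypothetical sketch of triple-threshold hysteresis; NOT the authors'
# B-Edge algorithm, whose details this record omits.
import numpy as np
from scipy import ndimage

def triple_threshold_edges(gray, t_low, t_mid, t_high):
    """Classify edge pixels with three gradient-magnitude thresholds.

    Pixels >= t_high are strong edges. Pixels in [t_mid, t_high) are
    kept only if connected to a strong edge (classic hysteresis).
    Pixels in [t_low, t_mid) are kept only if connected to an edge
    accepted in the first pass, which favors longer connected contours
    without thickening isolated noise responses.
    """
    g = gray.astype(float)
    gx = ndimage.sobel(g, axis=1)   # horizontal derivative
    gy = ndimage.sobel(g, axis=0)   # vertical derivative
    mag = np.hypot(gx, gy)          # gradient magnitude

    strong = mag >= t_high
    mid = (mag >= t_mid) & ~strong
    weak = (mag >= t_low) & ~(strong | mid)

    # Pass 1: keep mid-strength pixels whose connected component
    # contains at least one strong pixel.
    labels, n = ndimage.label(strong | mid)
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labels[strong])] = True  # background label 0 never set
    edges = keep[labels]

    # Pass 2: weak pixels join only if they touch already-accepted
    # edges, extending edge connectivity one tier further.
    labels2, n2 = ndimage.label(edges | weak)
    keep2 = np.zeros(n2 + 1, dtype=bool)
    keep2[np.unique(labels2[edges])] = True
    return keep2[labels2]
```

In practice the three thresholds would be derived from image statistics (for example, percentiles of the gradient magnitude), since the paper's stated goal is optimal, image-adaptive threshold selection rather than fixed values.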
ISSN: 2169-3536