YOLOpeds: efficient real‐time single‐shot pedestrian detection for smart camera applications

Deep-learning-based pedestrian detectors can enhance the capabilities of smart camera systems in a wide spectrum of machine vision applications, including video surveillance, autonomous driving, robots and drones, smart factories, and health monitoring. However, such complex models do not scale easily and are not traditionally deployed in resource-constrained smart cameras for on-device processing, which offers significant advantages when real-time monitoring and privacy are vital. This work addresses the challenge of achieving a good trade-off between accuracy and speed for efficient deep-learning-based pedestrian detection in smart camera applications. Its contributions are: (1) a computationally efficient architecture based on separable convolutions that integrates dense connections across layers and multi-scale feature fusion to improve representational capacity while reducing the number of parameters and operations; (2) a more elaborate loss function for improved localization; and (3) an anchor-less approach to detection. The proposed approach, referred to as YOLOpeds, is evaluated on the PETS2009 surveillance dataset with 320 × 320 images. A real system implementation on the Jetson TX2 embedded platform is presented. YOLOpeds sustains real-time operation at over 30 frames per second with detection rates of around 86%, outperforming existing deep learning models.
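The abstract's first contribution pairs separable convolutions with dense connections and multi-scale feature fusion. Below is a minimal, hypothetical PyTorch sketch of a depthwise-separable convolution block with dense (concatenation) connections; the channel sizes, layer count, and normalization choices are illustrative assumptions and do not reproduce the published YOLOpeds network.

# Minimal sketch (assumption, not the published YOLOpeds code): a depthwise-
# separable convolution block with dense (concatenation) connections.
import torch
import torch.nn as nn

class SeparableConv(nn.Module):
    """3x3 depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

class DenseSeparableBlock(nn.Module):
    """Separable convolutions whose inputs concatenate all earlier feature maps,
    so later layers reuse earlier features (dense connections)."""
    def __init__(self, in_ch, growth, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(num_layers):
            self.layers.append(SeparableConv(ch, growth))
            ch += growth

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

if __name__ == "__main__":
    # 320 x 320 input, matching the evaluation resolution quoted in the abstract.
    dummy = torch.randn(1, 3, 320, 320)
    print(DenseSeparableBlock(3, growth=16)(dummy).shape)  # torch.Size([1, 51, 320, 320])

Compared with a standard 3x3 convolution, the depthwise-plus-pointwise factorization cuts parameters and multiply-adds roughly in proportion to the kernel area, which is what makes the accuracy/speed trade-off feasible on an embedded platform such as the Jetson TX2.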

Bibliographic Details
Main Author: Christos Kyrkou (KIOS Research and Innovation Center of Excellence, University of Cyprus, Cyprus)
Format: Article
Language: English
Published: Wiley, 2020-10-01
Series: IET Computer Vision, vol. 14, no. 7, pp. 417-425
ISSN: 1751-9632, 1751-9640
Subjects: machine vision applications; deep-learning-based pedestrian detection; YOLOpeds; single-shot pedestrian detection; deep learning-based object detectors; smart camera systems
Online Access: https://doi.org/10.1049/iet-cvi.2019.0897