Object detection in inland vessels using combined trained and pretrained models of YOLO8



Bibliographic Details
Main Authors: Ahmad A. Goudah, Maximilian Jarofka, Mohmed El-Habrouk, Dieter Schramm, Yasser G. Dessouky
Format: Article
Language: English
Published: Academy Publishing Center, 2023-11-01
Series: Advances in Computing and Engineering
Online Access: http://apc.aast.edu/ojs/index.php/ACE/article/view/669
Description
Summary: Abstract—One of the main challenges in computer vision is object detection, which entails both locating and identifying specific items in an image. The YOLO (You Only Look Once) algorithm, introduced in 2015, brought a fresh perspective by performing object detection in a single neural network. This spurred rapid progress in the field, producing considerably greater achievements than in the decade before. YOLO has since been improved through eight versions and is rated as one of the top object detection algorithms, thanks to its incorporation of many of the most cutting-edge concepts explored in computer vision research. The most recent version, YOLOv8, outperforms YOLOv7 and YOLOv5 in terms of accuracy and speed. This paper examines the most recent developments in computer vision that were incorporated into YOLOv5, YOLOv7, and YOLOv8 and their predecessors.

Index Terms—Object Detection, YOLO, Autonomous Vehicles, Inland Waterway Vessels, Bounding Boxes, Neural Network, CNN.

Received: 14 June 2023
Accepted: 11 September 2023
Published: 20 November 2023
ISSN: 2735-5977; 2735-5985