Autonomous Mobile Robot Implemented in LEGO EV3 Integrated with Raspberry Pi to Use Android-Based Vision Control Algorithms for Human-Machine Interaction

Robotic applications in educational settings are well established; however, deploying robots in other settings, e.g., mine detection, agricultural support, and Industry 4.0 tasks, remains challenging. The main challenge is to support robotic operation with autonomous decision-making based on sensor-based feature extraction. To address this challenge, a robot prototype was assembled from the mechanical parts of a LEGO MINDSTORMS EV3 kit and a Raspberry Pi, and controlled through 2D and 2-1/2D visual servoing algorithms. The design is supported by simulations of image-based, position-based, and hybrid visual servo controllers. In the practical implementation, navigation is guided by image-based visual servo control algorithms embedded in the Raspberry Pi, using a control criterion based on the evolution of the error between the target image and the sensed image. Images are captured by a camera mounted on the mobile robotic platform, which can be operated manually or automatically and is controlled by the Raspberry Pi. An Android application for viewing the images via video streaming on a smartphone is presented, together with a video of the implemented robot in operation. This kind of robot could perform reactive field tasks in the settings mentioned above, since the detection and control approaches enable self-contained guidance.
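The abstract describes navigation driven by an image-based visual servo (IBVS) criterion, i.e. the evolution of the error between a target image feature and the one currently sensed. Purely as an illustration of that idea (not the authors' implementation), the sketch below shows a textbook single-point IBVS step in Python, reduced to the forward and yaw motions a differential-drive platform can execute; the function name, gain, depth, and camera parameters are all hypothetical.

import numpy as np

def ibvs_step(s_current, s_target, depth_z, focal_px, gain=0.5):
    # s_current, s_target: (u, v) pixel coordinates of the sensed and the
    # desired image feature; depth_z: assumed feature depth in metres;
    # focal_px: focal length in pixels. All names and values are illustrative.

    # Normalised image coordinates of the sensed feature (principal point
    # assumed at the image origin for simplicity).
    x = s_current[0] / focal_px
    y = s_current[1] / focal_px

    # Error between the sensed and the target image features, i.e. the
    # "difference between a target and sensed image" criterion mentioned in
    # the abstract, expressed in normalised coordinates.
    error = (np.asarray(s_current, dtype=float)
             - np.asarray(s_target, dtype=float)) / focal_px

    # Point-feature interaction (image Jacobian) matrix, reduced to the two
    # motions a differential-drive platform can actuate: forward translation
    # along the optical axis (v_z) and yaw rotation (omega_y).
    L = np.array([
        [x / depth_z, -(1.0 + x**2)],
        [y / depth_z, -x * y],
    ])

    # Classic IBVS law: command = -gain * pinv(L) @ error
    v_forward, omega_yaw = -gain * np.linalg.pinv(L) @ error
    return v_forward, omega_yaw  # to be mapped to left/right wheel speeds

# Example call with made-up values:
# ibvs_step(s_current=(40.0, -10.0), s_target=(0.0, 0.0), depth_z=1.5, focal_px=600.0)

In a setup like the one described, such a camera-frame command would then be translated into wheel velocities for the LEGO platform; the details of that mapping are not specified in the record.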

Bibliographic Details
Main Authors: Hernando León Araujo, Jesús Gulfo Agudelo, Richard Crawford Vidal, Jorge Ardila Uribe, John Freddy Remolina, Claudia Serpa-Imbett, Ana Milena López, Diego Patiño Guevara
Format: Article
Language: English
Published: MDPI AG, 2022-03-01
Series: Machines
Subjects: visual servo control; Raspberry Pi; robotics; Lego Mindstorms
Online Access:https://www.mdpi.com/2075-1702/10/3/193
ISSN: 2075-1702
DOI: 10.3390/machines10030193 (Machines, vol. 10, no. 3, article 193, published 2022-03-01)
Author Affiliations: Hernando León Araujo, Jesús Gulfo Agudelo, Richard Crawford Vidal, Jorge Ardila Uribe, John Freddy Remolina, Claudia Serpa-Imbett, and Ana Milena López: ITEM/Grupo de Investigación en Tecnologías Emergentes, School of Engineering and Architecture, Universidad Pontificia Bolivariana Seccional Montería, Carrera 6 No. 97A-99, Montería 230001, Colombia. Diego Patiño Guevara: Electronics Department, Pontificia Universidad Javeriana, Carrera 7 No. 40-62 Edificio 42, Bogotá 110111, Colombia.