Knowledge-based multimodal information fusion for role recognition and situation assessment by using mobile robot

Decision-making is key for autonomous systems to achieve real intelligence and autonomy. This paper presents an integrated probabilistic decision framework that allows a robot to infer the roles that humans fulfill in specific missions. The framework also enables assessment of the situation and of the necessity of interacting with the person fulfilling the target role. The target role is the person who is distinctive in movement or holds a mission-critical object, where the object is pre-specified in the corresponding mission. The proposed framework associates prior knowledge with the spatial relationships between humans and objects as well as with their temporal changes. Distance-Based Inference (DBI) and Knowledge-Based Inference (KBI) support the recognition of human roles. DBI deduces the role from the relative distance between humans and the specified objects, while KBI focuses on human actions and object existence. The role is estimated using a weighted fusion scheme based on information entropy. The situation is assessed by analyzing the action of the person fulfilling the target role and that person's position relative to the mission-related entities, where an entity is something that has a particular function in the corresponding mission. This assessment determines the robot's decision on what actions to take. A series of experiments has shown that the proposed framework provides a reasonable assessment of the situation and outperforms other approaches in accuracy, efficiency, and robustness.

Bibliographic Details
Main Authors: Yang, Chule, Wang, Danwei, Zeng, Yijie, Yue, Yufeng, Siritanawan, Prarinya
Other Authors: School of Electrical and Electronic Engineering
Format: Journal Article
Language: English
Published: 2020
Subjects: Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics; Decision Making; Multimodal Information Fusion
Online Access:https://hdl.handle.net/10356/141400
_version_ 1811684345216237568
author Yang, Chule
Wang, Danwei
Zeng, Yijie
Yue, Yufeng
Siritanawan, Prarinya
author2 School of Electrical and Electronic Engineering
author_sort Yang, Chule
collection NTU
description Decision-making is key for autonomous systems to achieve real intelligence and autonomy. This paper presents an integrated probabilistic decision framework that allows a robot to infer the roles that humans fulfill in specific missions. The framework also enables assessment of the situation and of the necessity of interacting with the person fulfilling the target role. The target role is the person who is distinctive in movement or holds a mission-critical object, where the object is pre-specified in the corresponding mission. The proposed framework associates prior knowledge with the spatial relationships between humans and objects as well as with their temporal changes. Distance-Based Inference (DBI) and Knowledge-Based Inference (KBI) support the recognition of human roles. DBI deduces the role from the relative distance between humans and the specified objects, while KBI focuses on human actions and object existence. The role is estimated using a weighted fusion scheme based on information entropy. The situation is assessed by analyzing the action of the person fulfilling the target role and that person's position relative to the mission-related entities, where an entity is something that has a particular function in the corresponding mission. This assessment determines the robot's decision on what actions to take. A series of experiments has shown that the proposed framework provides a reasonable assessment of the situation and outperforms other approaches in accuracy, efficiency, and robustness.
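The abstract describes estimating the role via "a weighted fusion scheme based on information entropy," combining the DBI and KBI role distributions. The paper's exact formula is not reproduced in this record; the following is a minimal illustrative sketch of one common entropy-based weighting, in which the lower-entropy (more confident) source receives the larger weight. The function names `fuse` and `entropy` and the example distributions are assumptions for illustration, not the authors' implementation.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def fuse(p_dbi, p_kbi):
    """Entropy-weighted fusion of two role distributions (illustrative).

    Each source's weight is its normalized confidence, 1 - H(p)/H_max,
    so a more peaked (certain) distribution contributes more.
    """
    h_max = math.log2(len(p_dbi))          # entropy of a uniform distribution
    w_dbi = 1.0 - entropy(p_dbi) / h_max   # confidence of DBI
    w_kbi = 1.0 - entropy(p_kbi) / h_max   # confidence of KBI
    if w_dbi + w_kbi == 0:                 # both sources uninformative
        return [1.0 / len(p_dbi)] * len(p_dbi)
    fused = [w_dbi * a + w_kbi * b for a, b in zip(p_dbi, p_kbi)]
    s = sum(fused)                         # renormalize to a distribution
    return [x / s for x in fused]

# Example: DBI is fairly sure candidate 0 holds the target role; KBI is vague.
p = fuse([0.8, 0.1, 0.1], [0.4, 0.3, 0.3])
```

Because the DBI distribution is more peaked, its vote dominates the fused estimate, which is the intent of entropy-based weighting.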
first_indexed 2024-10-01T04:27:09Z
format Journal Article
id ntu-10356/141400
institution Nanyang Technological University
language English
last_indexed 2024-10-01T04:27:09Z
publishDate 2020
record_format dspace
spelling ntu-10356/141400 2020-06-08T05:29:59Z Knowledge-based multimodal information fusion for role recognition and situation assessment by using mobile robot Yang, Chule Wang, Danwei Zeng, Yijie Yue, Yufeng Siritanawan, Prarinya School of Electrical and Electronic Engineering ST Engineering-NTU Corporate Laboratory Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics Decision Making Multimodal Information Fusion
NRF (Natl Research Foundation, S’pore) Accepted version 2020-06-08T05:29:59Z 2020-06-08T05:29:59Z 2018 Journal Article Yang, C., Wang, D., Zeng, Y., Yue, Y., & Siritanawan, P. (2019). Knowledge-based multimodal information fusion for role recognition and situation assessment by using mobile robot. Information Fusion, 50, 126-138. doi:10.1016/j.inffus.2018.10.007 1566-2535 https://hdl.handle.net/10356/141400 10.1016/j.inffus.2018.10.007 50 126 138 en Information Fusion © 2018 Elsevier B.V. All rights reserved. This paper was published in Information Fusion and is made available with permission of Elsevier B.V. application/pdf
title Knowledge-based multimodal information fusion for role recognition and situation assessment by using mobile robot
topic Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Decision Making
Multimodal Information Fusion
url https://hdl.handle.net/10356/141400