A New Localization Objective for Accurate Fine-Grained Affordance Segmentation Under High-Scale Variations

Bibliographic Details
Main Authors: Mohammed Hassanin, Salman Khan, Murat Tahtali
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Online Access:https://ieeexplore.ieee.org/document/8930497/
Description
Summary: Fine-grained affordance segmentation of object parts can greatly benefit robotics and scene-understanding applications. In this work, we propose an instance-segmentation framework that accurately localizes the functionality and affordance of individual object parts. We build on the standard Mask R-CNN framework and propose two novel modifications to the localization objective that lead to improved part detection and affordance segmentation. Specifically, we identify two problems with the conventional IoU-based regression loss: (a) small boxes, which are especially relevant for fine-grained detection, have a higher risk of being ignored during optimization, and (b) the IoU is constant (zero) for all non-overlapping candidates, so it provides no supervision signal to drive the loss down. To address these limitations, we propose a novel Angular Intersection Over Larger (AIOL) measure. Our experiments show consistent improvements over baselines and state-of-the-art localization loss functions on the fine-grained affordance segmentation task.
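
To make limitation (b) concrete, the following minimal PyTorch sketch computes the conventional IoU-based regression loss that the abstract criticizes. The function name iou_loss, the (x1, y1, x2, y2) box convention, and the choice of PyTorch are illustrative assumptions; the sketch does not implement the proposed AIOL measure, whose exact formulation is not given in this record.

    import torch

    def iou_loss(pred, target, eps=1e-7):
        """Conventional IoU regression loss (1 - IoU) for axis-aligned boxes.

        pred and target are (N, 4) tensors of (x1, y1, x2, y2) corners.
        Illustrative sketch only, not the AIOL measure from the paper.
        """
        # Intersection rectangle (clamped to zero when boxes do not overlap)
        ix1 = torch.max(pred[:, 0], target[:, 0])
        iy1 = torch.max(pred[:, 1], target[:, 1])
        ix2 = torch.min(pred[:, 2], target[:, 2])
        iy2 = torch.min(pred[:, 3], target[:, 3])
        inter = (ix2 - ix1).clamp(min=0) * (iy2 - iy1).clamp(min=0)

        # Areas of predicted and ground-truth boxes, and their union
        area_pred = (pred[:, 2] - pred[:, 0]).clamp(min=0) * (pred[:, 3] - pred[:, 1]).clamp(min=0)
        area_tgt = (target[:, 2] - target[:, 0]).clamp(min=0) * (target[:, 3] - target[:, 1]).clamp(min=0)
        union = area_pred + area_tgt - inter

        iou = inter / (union + eps)
        return 1.0 - iou  # per-box loss in [0, 1]

For two disjoint boxes the clamped intersection is exactly zero, so the loss saturates at 1 and its gradient with respect to the predicted coordinates vanishes; this is why a plain IoU loss cannot pull a non-overlapping prediction toward its target, and why small boxes, whose bounded loss is easily dominated by larger ones, tend to be neglected during optimization.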
ISSN:2169-3536