Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation

Depth perception capability is one of the essential requirements for various autonomous driving platforms. However, accurate depth estimation in a real-world setting is still a challenging problem due to high computational costs. In this paper, we propose a lightweight depth completion network for depth perception in real-world environments. To effectively transfer a teacher’s knowledge useful for depth completion, we introduce local similarity-preserving knowledge distillation (LSPKD), which allows similarities between local neighbors to be transferred during the distillation. With our LSPKD, a lightweight student network is precisely guided by a heavy teacher network, regardless of the density of the ground-truth data. Experimental results demonstrate that our method is effective in reducing computational costs during both the training and inference stages while achieving superior performance over other lightweight networks.
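The LSPKD idea summarized in the abstract (transferring similarities between local neighbors from teacher to student) can be illustrated with a minimal sketch. This is not the paper's actual formulation: the record gives no equations, so the neighbor set (4-connected), the similarity measure (negative absolute difference), and the squared-error penalty below are all assumptions for illustration only.

```python
# Hypothetical sketch of a local similarity-preserving distillation loss.
# Teacher/student features are plain H x W scalar maps for simplicity.

def local_similarities(feat, i, j):
    """Similarities (here: negative absolute differences) between the value
    at (i, j) and its 4-connected neighbors that lie inside the map."""
    h, w = len(feat), len(feat[0])
    sims = []
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < h and 0 <= nj < w:
            sims.append(-abs(feat[i][j] - feat[ni][nj]))
    return sims

def lspkd_loss(teacher, student):
    """Mean squared difference between teacher and student local-similarity
    patterns, averaged over all positions and their valid neighbors."""
    total, count = 0.0, 0
    for i in range(len(teacher)):
        for j in range(len(teacher[0])):
            for ts, ss in zip(local_similarities(teacher, i, j),
                              local_similarities(student, i, j)):
                total += (ts - ss) ** 2
                count += 1
    return total / count

teacher = [[1.0, 2.0], [3.0, 4.0]]
student = [[1.0, 2.0], [3.0, 4.0]]  # identical local structure -> zero loss
print(lspkd_loss(teacher, student))  # 0.0
```

The point of such a loss is that it constrains only the *relative* structure among neighboring positions, so a compact student can match the teacher's local depth relationships without reproducing its feature values exactly.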

Bibliographic Details
Main Authors: Yongseop Jeong, Jinsun Park, Donghyeon Cho, Yoonjin Hwang, Seibum B. Choi, In So Kweon
Format: Article
Language: English
Published: MDPI AG 2022-09-01
Series: Sensors
Subjects: depth completion; local similarity; knowledge distillation; model compression; sensor fusion; multimodal learning
Online Access: https://www.mdpi.com/1424-8220/22/19/7388
Collection: DOAJ (Directory of Open Access Journals)
Record ID: doaj.art-916375e056364ba3acc3772ad43d4550
ISSN: 1424-8220
Citation: Yongseop Jeong, Jinsun Park, Donghyeon Cho, Yoonjin Hwang, Seibum B. Choi and In So Kweon, "Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation," Sensors, vol. 22, no. 19, art. 7388, MDPI AG, 2022-09-01. DOI: 10.3390/s22197388

Author affiliations:
Yongseop Jeong: The Robotics Program, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea
Jinsun Park: School of Computer Science and Engineering, Pusan National University, 2 Busandaehak-ro 63beon-gil, Geumjeong-gu, Busan 46241, Korea
Donghyeon Cho: Department of Electronics Engineering, Chungnam National University, 99 Daehak-ro, Yuseong-gu, Daejeon 34134, Korea
Yoonjin Hwang: Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea
Seibum B. Choi: Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea
In So Kweon: School of Electrical Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea

Keywords: depth completion; local similarity; knowledge distillation; model compression; sensor fusion; multimodal learning