Adaptive Residual Weighted K-Nearest Neighbor Fingerprint Positioning Algorithm Based on Visible Light Communication

Bibliographic Details
Main Authors: Shiwu Xu, Chih-Cheng Chen, Yi Wu, Xufang Wang, Fen Wei
Format: Article
Language: English
Published: MDPI AG 2020-08-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/20/16/4432
Description
Summary: The weighted K-nearest neighbor (WKNN) algorithm is a commonly used fingerprint positioning method, the difficulty of which lies in how to optimize the value of K to obtain the minimum positioning error. In this paper, we propose an adaptive residual weighted K-nearest neighbor (ARWKNN) fingerprint positioning algorithm based on visible light communication. First, the target is matched against the fingerprints according to the received signal strength indication (RSSI) vector. Second, K is determined dynamically according to the matched RSSI residuals. Simulation results show that the ARWKNN algorithm reduces the average positioning error compared with random forest (81.82%), extreme learning machine (83.93%), artificial neural network (86.06%), grid-independent least squares (60.15%), self-adaptive WKNN (SAWKNN, 43.84%), WKNN (47.81%), and KNN (73.36%). These results were obtained with the signal-to-noise ratio set to 20 dB and the Manhattan distance used in a two-dimensional (2-D) space. The ARWKNN algorithm based on the Clark distance and the minimum-maximum distance metrics produces the minimum average positioning error in 2-D and 3-D space, respectively. Compared with the SAWKNN, WKNN, and KNN algorithms, the ARWKNN algorithm achieves a significant reduction in the average positioning error while maintaining similar algorithm complexity.
ISSN: 1424-8220
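
The core idea described in the summary, matching fingerprints by RSSI distance and then choosing K adaptively from the matched residuals before a weighted-average position estimate, can be illustrated with a short sketch. The Python code below is a minimal, hypothetical rendition only: the residual threshold rule (alpha times the best residual), the parameters k_max and alpha, and all function names are illustrative assumptions rather than the authors' published formulation; the Manhattan and Clark metrics follow their standard definitions.

```python
# Minimal sketch of an adaptive residual WKNN (ARWKNN) step, assuming a
# fingerprint database of (RSSI vector, position) pairs. The adaptive-K rule
# and parameters here are illustrative, not the paper's exact method.
import numpy as np

def manhattan(a: np.ndarray, b: np.ndarray) -> float:
    """Manhattan (L1) distance between two RSSI vectors."""
    return float(np.sum(np.abs(a - b)))

def clark(a: np.ndarray, b: np.ndarray) -> float:
    """Clark distance: sqrt of summed squared relative differences.
    Assumes nonnegative RSSI values (e.g., a linear power scale)."""
    denom = a + b
    mask = denom != 0  # guard against zero denominators
    return float(np.sqrt(np.sum(((a - b)[mask] / denom[mask]) ** 2)))

def arwknn_estimate(rssi, fingerprints, positions, dist=manhattan,
                    k_max=8, alpha=1.5):
    """Estimate a position from one RSSI measurement.

    rssi:          measured RSSI vector at the target.
    fingerprints:  (N, M) array of stored reference RSSI vectors.
    positions:     (N, D) array of reference-point coordinates (D = 2 or 3).
    dist:          RSSI-space distance metric.
    k_max, alpha:  cap and residual threshold factor for the adaptive K
                   (hypothetical parameters, not from the paper).
    """
    residuals = np.array([dist(rssi, fp) for fp in fingerprints])
    order = np.argsort(residuals)
    # Adaptive K: keep neighbors whose matched RSSI residual stays within
    # alpha times the best residual, up to k_max neighbors.
    best = residuals[order[0]]
    k = 1
    while k < min(k_max, len(order)) and residuals[order[k]] <= alpha * best:
        k += 1
    chosen = order[:k]
    # Inverse-residual weights, as in standard WKNN.
    w = 1.0 / (residuals[chosen] + 1e-9)
    w /= w.sum()
    return w @ positions[chosen]

# Example: four reference points in a 2-D plane, three LED transmitters.
fps = np.array([[0.9, 0.1, 0.2], [0.8, 0.2, 0.3],
                [0.1, 0.9, 0.4], [0.2, 0.8, 0.5]])
pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(arwknn_estimate(np.array([0.85, 0.15, 0.25]), fps, pos))
```

Swapping `dist=clark` into the call reproduces the metric variant the summary reports as best in 2-D; the minimum-maximum distance mentioned for 3-D would slot in as another `dist` function in the same way.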