Biometric Template Protection for Dynamic Touch Gestures Based on Fuzzy Commitment Scheme and Deep Learning

Bibliographic Details
Main Authors: Asrar Bajaber, Lamiaa Elrefaei
Format: Article
Language: English
Published: MDPI AG 2022-01-01
Series: Mathematics
Subjects:
Online Access: https://www.mdpi.com/2227-7390/10/3/362
Description
Summary: Privacy plays an important role in biometric authentication systems. Touch authentication systems have been widely used since touch devices reached their current level of development. In this work, a fuzzy commitment scheme (FCS) based on deep learning (DL) is proposed to protect the touch-gesture template in a touch authentication system. A binary Bose–Chaudhuri–Hocquenghem (BCH) code is used with the FCS to deal with touch variations. The BCH code is described by the triplet (n, k, t), where n denotes the codeword length, k denotes the length of the key, and t denotes the error-correction capability. The system's performance is investigated for different key lengths k. A learning-based approach is applied to extract touch features from raw touch data, using a recurrent neural network (RNN) built on top of a convolutional neural network (CNN). The proposed system has been evaluated on two different touch datasets: the Touchalytics dataset and the BioIdent dataset. The best results were obtained with a key length k = 99 and n = 255: the false accept rate (FAR) was 0.00 and the false reject rate (FRR) was 0.5854 for the Touchalytics dataset, while the FAR was 0.00 and the FRR was 0.5399 for the BioIdent dataset. The FCS shows its effectiveness in dynamic authentication systems, achieving good results compared with other works.
ISSN:2227-7390
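
To make the scheme described in the summary concrete, the following is a minimal sketch of how a fuzzy commitment could bind a random key to a binarized touch template and later recover it from a noisy query. A simple repetition code stands in for the binary BCH(n, k, t) code used in the paper, and all function names, parameters, and bit lengths here are hypothetical illustrations rather than the authors' implementation.

# Minimal fuzzy commitment sketch (illustrative only; not the paper's code).
# A repetition code substitutes for the BCH(n, k, t) error-correcting code.
import hashlib
import secrets

REP = 5  # repetition factor; plays the role of the error-correction capability t

def ecc_encode(key_bits):
    # Repeat each key bit REP times to form the codeword.
    return [b for b in key_bits for _ in range(REP)]

def ecc_decode(code_bits):
    # Majority-vote each group of REP bits back to one key bit.
    return [int(sum(code_bits[i:i + REP]) > REP // 2)
            for i in range(0, len(code_bits), REP)]

def commit(key_bits, template_bits):
    # Bind a random key to the biometric template: store (helper data, hash of key).
    codeword = ecc_encode(key_bits)
    helper = [c ^ t for c, t in zip(codeword, template_bits)]
    digest = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, digest

def verify(helper, digest, query_bits):
    # Recover a candidate key from the noisy query and compare hashes.
    codeword = [h ^ q for h, q in zip(helper, query_bits)]
    recovered = ecc_decode(codeword)
    return hashlib.sha256(bytes(recovered)).hexdigest() == digest

# Toy usage: an 8-bit key, a 40-bit binarized touch template, and a noisy query.
key = [secrets.randbelow(2) for _ in range(8)]
template = [secrets.randbelow(2) for _ in range(8 * REP)]
helper, digest = commit(key, template)
query = template.copy()
query[3] ^= 1  # flip one bit to simulate touch variation
print(verify(helper, digest, query))  # expected: True

In the paper's setting, the template bits would come from the CNN/RNN feature extractor applied to raw touch data, and the BCH decoder would tolerate up to t bit errors between enrollment and query templates; here the repetition code and random bits only illustrate the commit/verify flow.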