Data protection with unlearnable examples

Full description

The pervasive success of deep learning across diverse fields hinges on the extensive use of large datasets, which often contain sensitive personal information collected without explicit consent. This practice has raised significant privacy concerns, prompting the development of unlearnable examples (UE) as a novel data protection strategy. Unlearnable examples modify data with subtle perturbations that, while imperceptible to humans, prevent machine learning models from effectively learning from them. Existing research has primarily focused on unimodal data, such as images, leaving a gap in the study of UE for multimodal data, which involves complex interactions between different data types such as video and audio. This project explores the extension of UE techniques to multimodal learning environments, addressing the unique challenges posed by these datasets. By devising and testing new UE strategies tailored for multimodal data and assessing their impact on model learning and data interpretability, the study aims to advance the field of data privacy in deep learning. Through a comprehensive survey of current UE technology, experimentation with multimodal datasets such as CREMA-D and Kinetics-Sounds, and rigorous analysis, the project seeks to enhance privacy protections in multimodal deep learning frameworks, offering insights and practical solutions for the creation of robust and transferable unlearnable examples.
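
For context, the sketch below illustrates the sample-wise error-minimizing noise commonly used in the unlearnable-examples literature for images. The record does not state that this project uses this exact formulation, so the function name, the surrogate model, and the hyperparameters are illustrative assumptions, not the author's method.

import torch
import torch.nn.functional as F

def error_minimizing_noise(model, x, y, eps=8 / 255, alpha=2 / 255, steps=20):
    """Craft bounded noise that minimizes a surrogate model's loss on (x, y),
    so the perturbed samples look 'already learned' and carry little training signal.
    Hypothetical sketch; hyperparameters are placeholders."""
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        with torch.no_grad():
            # Gradient descent on the input: the opposite of adversarial
            # examples, which ascend the loss.
            delta -= alpha * grad.sign()
            delta.clamp_(-eps, eps)                    # keep the noise imperceptible
            delta.copy_((x + delta).clamp(0, 1) - x)   # keep perturbed pixels in [0, 1]
    return delta.detach()

# Usage (surrogate_model is assumed): x_protected = x + error_minimizing_noise(surrogate_model, x, y)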

Bibliographic Details
Main Author: Ma, Xiaoyu
Other Authors: Alex Chichung Kot
School: School of Electrical and Electronic Engineering, Rapid-Rich Object Search (ROSE) Lab
Format: Final Year Project (FYP)
Degree: Bachelor's degree
Language: English
Published: Nanyang Technological University, 2024
Subjects: Computer and Information Science; Engineering; Deep learning; Unlearnable examples; Data protection
Citation: Ma, X. (2024). Data protection with unlearnable examples. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/177180
Online Access: https://hdl.handle.net/10356/177180