Efficient quantum circuits for machine learning activation functions including constant T-depth ReLU
In recent years, Quantum Machine Learning (QML) has increasingly captured the interest of researchers. Among the components of this domain, activation functions play a fundamental and indispensable role. Our research focuses on the development of quantum circuits for activation functions, for integration...
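For context on the function named in the title: ReLU (Rectified Linear Unit) is classically defined as max(0, x). The minimal sketch below shows that classical definition only; it is not the quantum circuit construction the article describes.

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: returns x for positive inputs, 0 otherwise."""
    return max(0.0, x)

# Example values
print(relu(3.0))   # 3.0
print(relu(-2.0))  # 0.0
```

The article's contribution concerns realizing such activation functions as quantum circuits, including a ReLU circuit with constant T-depth.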
Main Authors: Zi, Wei; Wang, Siyi; Kim, Hyunji; Sun, Xiaoming; Chattopadhyay, Anupam; Rebentrost, Patrick
Other Authors: School of Computer Science and Engineering
Format: Journal Article
Language: English
Published: 2025
Online Access: https://hdl.handle.net/10356/182156
Similar Items
- Towards fast computation of certified robustness for ReLU networks
  by: Weng, Tsui-Wei, et al.
  Published: (2021)
- Reductions of ReLU neural networks to linear neural networks and their applications
  by: Le, Thien
  Published: (2022)
- Fault-tolerant strategies for multi-rotor parcel delivery
  by: Tan, Jun Kiat
  Published: (2024)
- Small ReLU networks are powerful memorizers: A tight analysis of memorization capacity
  by: Yun, Chulhee, et al.
  Published: (2021)
- Voting algorithms for large scale fault-tolerant systems
  by: Karimi, Abbas
  Published: (2011)