Training neural networks with end-to-end optical backpropagation


Full details

Bibliographic details
Main authors: Spall, J, Guo, X, Lvovsky, A
Material type: Journal article
Language: English
Published: Society of Photo-Optical Instrumentation Engineers, 2025
Description
Summary: Optics is an exciting route for the next generation of computing hardware for machine learning, promising several orders of magnitude enhancement in both computational speed and energy efficiency. However, reaching the full capacity of an optical neural network requires that the computing be implemented optically not only for inference, but also for training. The primary algorithm for network training is backpropagation, in which the calculation is performed in the order opposite to the information flow for inference. While straightforward in a digital computer, optical implementation of backpropagation has remained elusive, particularly because of the conflicting requirements on the optical element that implements the nonlinear activation function. In this work, we address this challenge for the first time with a surprisingly simple scheme, employing saturable absorbers in the role of activation units. Our approach is adaptable to various analog platforms and materials, and demonstrates the possibility of constructing neural networks entirely reliant on analog optical processes for both training and inference tasks.
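To illustrate the structure the abstract describes, the sketch below runs backpropagation through a tiny two-layer network whose activation qualitatively mimics a saturable absorber: transmission that flattens as input intensity grows. This is a minimal digital sketch, not the authors' optical scheme; the specific activation f(z) = z / (1 + |z|), the layer sizes, and all numerical values are illustrative assumptions. It shows the key point from the abstract: the backward pass traverses the layers in the order opposite to inference, applying the activation's local derivative at each unit.

```python
import numpy as np

rng = np.random.default_rng(0)

def sat_act(z):
    # Illustrative saturable nonlinearity (assumption, not the paper's model):
    # response flattens at large |z|, loosely mimicking a saturable absorber's
    # intensity-dependent transmission.
    return z / (1.0 + np.abs(z))

def sat_act_grad(z):
    # Analytic derivative of the activation above, applied in the backward pass.
    return 1.0 / (1.0 + np.abs(z)) ** 2

# Tiny two-layer network: x -> W1 -> activation -> W2 -> scalar loss.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(1, 4))
x = rng.normal(size=(3,))
y = np.array([0.5])

def forward(W1, W2, x):
    z1 = W1 @ x          # linear layer (optically: a matrix-vector multiply)
    a1 = sat_act(z1)     # nonlinear activation (optically: saturable absorber)
    out = W2 @ a1
    loss = 0.5 * np.sum((out - y) ** 2)
    return z1, a1, out, loss

z1, a1, out, loss = forward(W1, W2, x)

# Backward pass: the error signal propagates opposite to the inference flow.
d_out = out - y                   # gradient of the loss w.r.t. the output
dW2 = np.outer(d_out, a1)
d_a1 = W2.T @ d_out               # error pushed back through the second layer
d_z1 = d_a1 * sat_act_grad(z1)    # elementwise gate at each activation unit
dW1 = np.outer(d_z1, x)

# Finite-difference check that the analytic gradient is correct.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
num = (forward(W1p, W2, x)[3] - loss) / eps
print(abs(num - dW1[0, 0]) < 1e-4)  # → True
```

In hardware, the difficulty the abstract points to is that the forward pass needs the nonlinearity f itself while the backward pass needs its derivative f' applied to a counter-propagating signal; the paper's contribution is showing that a saturable absorber can serve both roles.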