Summary: | Deep neural networks have become ubiquitous due to their ability to perform a wide range of tasks more accurately than manually engineered systems. This success has created substantial demand for ever more complex models processing ever larger amounts of data. However, traditional computing architectures have hit a processing-performance bottleneck caused by data movement. Considerable effort has therefore gone into custom hardware that accelerates deep neural network training and inference. Among these efforts, optical neural networks are a promising approach: they excel at linear operations but struggle to implement nonlinearities. Here, we propose our multiplicative analog frequency transform optical neural network (MAFT-ONN), which computes matrix products using frequency-encoded signals and implements the nonlinearity of each layer with a single Mach-Zehnder modulator. We experimentally demonstrate a 3-layer DNN for inference on MNIST digits, showing a scalable, fully analog end-to-end ONN. This architecture is also the first deep neural network hardware accelerator suited to direct inference on time-domain signals without digitization.