XOR-Net : an efficient computation pipeline for binary neural network inference on edge devices
Accelerating the inference of Convolutional Neural Networks (CNNs) on edge devices is essential due to the small memory size and limited computation capability of these devices. Network quantization methods such as XNOR-Net, Bi-Real-Net, and XNOR-Net++ reduce the memory usage of CNNs by binarizing the CN...
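The snippet above refers to binarized CNNs, whose key trick (used by XNOR-Net and its successors) is that a dot product over {-1, +1} vectors packed as bits reduces to XOR/XNOR plus a popcount, replacing multiply-accumulate with cheap bitwise operations. A minimal illustrative sketch (not code from the paper; the function name and bit-packing convention are assumptions for illustration):

```python
# Sketch of the XNOR/popcount identity behind binary network inference:
# with elements in {-1, +1} packed as bits (1 -> +1, 0 -> -1),
#   dot(a, b) = n - 2 * popcount(a XOR b)
# since XOR marks the positions where the signs disagree.

def binary_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two n-element {-1, +1} vectors packed as integers."""
    mask = (1 << n) - 1
    diff = (a_bits ^ b_bits) & mask        # 1 where the signs disagree
    return n - 2 * bin(diff).count("1")    # matches - mismatches

# Example (bits read LSB-first):
# a = 0b1011 -> [+1, +1, -1, +1],  b = 0b1101 -> [+1, -1, +1, +1]
# dot = 1 - 1 - 1 + 1 = 0
print(binary_dot(0b1011, 0b1101, 4))  # -> 0
```

Real pipelines apply this per machine word (e.g. 64 lanes per XOR and a hardware popcount instruction), which is where the memory and speed savings over full-precision convolution come from.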
Main Authors: Zhu, Shien; Duong, Luan H. K.; Liu, Weichen
Other Authors: School of Computer Science and Engineering
Format: Conference Paper
Language: English
Published: 2020
Subjects:
Online Access: https://hdl.handle.net/10356/145503
Similar Items
- TAB : unified and optimized ternary, binary and mixed-precision neural network inference on the edge
  by: Zhu, Shien, et al.
  Published: (2022)
- Live demonstration: man-in-the-middle attack on edge artificial intelligence
  by: Hu, Bowen, et al.
  Published: (2024)
- The effect of varying kilovoltage (kVp) and tube current (mAs) on the image quality and dose of CTA head phantom / Sity Noor Ayseah Dzulkafli
  by: Dzulkafli, Sity Noor Ayseah
  Published: (2015)
- EdgeNAS: discovering efficient neural architectures for edge systems
  by: Luo, Xiangzhong, et al.
  Published: (2023)
- Edge intelligence for smart grid: a survey on application potentials
  by: Gooi, Hoay Beng, et al.
  Published: (2024)