Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

Activation functions are essential for deep learning methods to learn and perform complex tasks such as image classification. The Rectified Linear Unit (ReLU) has been widely used and has become the default activation function across the deep learning community since 2012. Although ReLU has been popular, ho...
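
Based on the title alone, a minimal NumPy sketch of such a thresholded ReLU-Swish-like activation might look as follows. The exact functional form, the parameter name T, and the default threshold value of -0.20 are assumptions for illustration, since the full definition is not given in this record.

```python
import numpy as np

def flatten_t_swish(x, T=-0.20):
    # Swish-like branch for non-negative inputs: x * sigmoid(x),
    # with the argument clipped to keep exp() numerically safe.
    swish = x / (1.0 + np.exp(-np.clip(x, -60.0, 60.0)))
    # Negative inputs are "flattened" to the constant threshold T
    # (T = -0.20 is an assumed default, not taken from this record).
    return np.where(x >= 0, swish, T)

print(flatten_t_swish(np.array([-3.0, -0.5, 0.0, 0.5, 3.0])))
```

Flattening negative inputs to a small constant rather than to zero is what would distinguish a thresholded variant like this from plain ReLU.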

Full description

Bibliographic details
Main authors: Hock, Hung Chieng; Wahid, Noorhaniza; Ong, Pauline; Perla, Sai Raj Kishore
Format: Article
Language: English
Published: Program Studi Teknik Informatika, 2018
Online access: http://eprints.uthm.edu.my/5227/1/AJ%202020%20%28102%29.pdf