Deep Stochastic Logic Gate Networks
This paper introduces a novel regularization approach aimed at improving generalization performance by perturbing deterministic logical expressions. We incorporate logical inference into deep neural networks using logic gates and propose stochastic sampling to select appropriate logic gates from a predetermined set at each node, resembling sampling from a categorical distribution. While the Gumbel-softmax relaxation facilitates effective learning through sampling, the independence of the perturbation from the maximum-index operation ($\mathop{\mathrm{arg\,max}}$) makes it difficult to maintain consistent sampling and preserve the original categorical probability order. To address this issue, we introduce scaled noise in the Gumbel process, followed by normalization of the unnormalized probabilities. By leveraging randomness and introducing stochastic learning into deterministic logical transformations, we demonstrate enhanced classification accuracy. Extensive evaluations on publicly available datasets, including UCI (adult and breast cancer), MNIST, and CIFAR-10, establish the superiority of our method over softmax-based logical gate networks. Our contributions significantly advance the training of logic gate-based networks, inspiring further developments in deep logic gate network training.
| Main Author: | Youngsung Kim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2023-01-01 |
| Series: | IEEE Access |
| Subjects: | Logic gates networks; reparameterization; sampling; stochastic process |
| Online Access: | https://ieeexplore.ieee.org/document/10301592/ |
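The gate-selection scheme summarized in the abstract can be sketched in a few lines: each node holds a learned categorical distribution over a fixed set of candidate gates, and a relaxed sample is drawn via the Gumbel-softmax trick. The snippet below is a minimal illustration only; the `noise_scale` knob is an assumed stand-in for the paper's scaled-noise-plus-normalization scheme, and the function name and gate list are hypothetical, not the authors' code.

```python
import numpy as np

def sample_gate_relaxed(logits, tau=1.0, noise_scale=1.0, rng=None):
    """Relaxed one-hot sample over a node's candidate logic gates.

    Standard Gumbel-softmax perturbs the logits with Gumbel(0, 1) noise
    and applies a temperature-scaled softmax. A `noise_scale` below 1
    damps the perturbation so the sample is more likely to preserve the
    original categorical probability order -- a hedged stand-in for the
    paper's scaled-noise scheme, not the authors' exact formulation.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-9, 1.0 - 1e-9, size=np.shape(logits))
    g = -np.log(-np.log(u))                      # Gumbel(0, 1) noise
    y = (np.asarray(logits, float) + noise_scale * g) / tau
    y = y - y.max()                              # numerical stability
    e = np.exp(y)
    return e / e.sum()

# A single node choosing among four candidate gates (names illustrative).
gates = ["AND", "OR", "XOR", "NAND"]
logits = np.array([2.0, 0.5, 0.1, -1.0])         # learned preferences
probs = sample_gate_relaxed(logits, tau=0.5, noise_scale=0.1,
                            rng=np.random.default_rng(0))
chosen = gates[int(np.argmax(probs))]            # hard choice at inference
```

During training the relaxed probabilities keep the node differentiable; at inference the arg max picks a single discrete gate, which with a small `noise_scale` almost always agrees with the unperturbed categorical order.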
_version_ | 1827767699772538880 |
author | Youngsung Kim |
author_facet | Youngsung Kim |
author_sort | Youngsung Kim |
collection | DOAJ |
description | This paper introduces a novel regularization approach aimed at improving generalization performance by perturbing deterministic logical expressions. We incorporate logical inference into deep neural networks using logic gates and propose stochastic sampling to select appropriate logic gates from a predetermined set at each node, resembling sampling from a categorical distribution. While the Gumbel-softmax relaxation facilitates effective learning through sampling, the independence of the perturbation from the maximum-index operation ($\mathop{\mathrm{arg\,max}}$) makes it difficult to maintain consistent sampling and preserve the original categorical probability order. To address this issue, we introduce scaled noise in the Gumbel process, followed by normalization of the unnormalized probabilities. By leveraging randomness and introducing stochastic learning into deterministic logical transformations, we demonstrate enhanced classification accuracy. Extensive evaluations on publicly available datasets, including UCI (adult and breast cancer), MNIST, and CIFAR-10, establish the superiority of our method over softmax-based logical gate networks. Our contributions significantly advance the training of logic gate-based networks, inspiring further developments in deep logic gate network training. |
first_indexed | 2024-03-11T12:02:38Z |
format | Article |
id | doaj.art-6483e0dcfb1e402ca4bf9c76a0fccbf4 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-03-11T12:02:38Z |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-6483e0dcfb1e402ca4bf9c76a0fccbf4; indexed 2023-11-08T00:01:30Z; eng; IEEE; IEEE Access; ISSN 2169-3536; 2023-01-01; vol. 11, pp. 122488-122501; DOI 10.1109/ACCESS.2023.3328622; article 10301592; "Deep Stochastic Logic Gate Networks"; Youngsung Kim (https://orcid.org/0009-0001-7420-129X), Department of Artificial Intelligence, Inha University, Incheon, Republic of Korea; https://ieeexplore.ieee.org/document/10301592/; Logic gates networks; reparameterization; sampling; stochastic process |
spellingShingle | Youngsung Kim Deep Stochastic Logic Gate Networks IEEE Access Logic gates networks reparameterization sampling stochastic process |
title | Deep Stochastic Logic Gate Networks |
title_full | Deep Stochastic Logic Gate Networks |
title_fullStr | Deep Stochastic Logic Gate Networks |
title_full_unstemmed | Deep Stochastic Logic Gate Networks |
title_short | Deep Stochastic Logic Gate Networks |
title_sort | deep stochastic logic gate networks |
topic | Logic gates networks reparameterization sampling stochastic process |
url | https://ieeexplore.ieee.org/document/10301592/ |
work_keys_str_mv | AT youngsungkim deepstochasticlogicgatenetworks |