Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation

We focus on a distributed learning problem in a communication network, consisting of <inline-formula> <tex-math notation="LaTeX">$N$ </tex-math></inline-formula> distributed nodes and a central parameter server (PS). The PS is responsible for performing the computat...

Full description

Bibliographic Details
Main Authors: Tamir L. S. Gez, Kobi Cohen
Format: Article
Language:English
Published: IEEE 2023-01-01
Series:IEEE Access
Subjects:
Online Access:https://ieeexplore.ieee.org/document/10168898/
_version_ 1797688558781202432
author Tamir L. S. Gez
Kobi Cohen
author_facet Tamir L. S. Gez
Kobi Cohen
author_sort Tamir L. S. Gez
collection DOAJ
description We focus on a distributed learning problem in a communication network consisting of <inline-formula> <tex-math notation="LaTeX">$N$ </tex-math></inline-formula> distributed nodes and a central parameter server (PS). The PS performs the computation based on data received from the nodes, which are transmitted over a multiple access channel (MAC). The objective function for this problem is the sum of the local loss functions of the nodes. This problem has recently gained attention in distributed sensing systems as well as in federated learning (FL). However, current approaches to solving this problem rely on the assumption that the loss functions are continuously differentiable. In this paper, we first address the case where this assumption does not hold. We develop a novel algorithm, dubbed Sub-Gradient descent Multiple Access (SGMA), to solve the learning problem over a MAC. In SGMA, each node transmits an analog shaped waveform of its local subgradient over the MAC, and the PS receives a superposition of the noisy analog signals, resulting in a bandwidth-efficient over-the-air (OTA) computation used to update the learned model. We analyze the performance of SGMA and prove that its convergence rate approaches that of the centralized subgradient algorithm in large networks. Simulation results using real datasets demonstrate the effectiveness of SGMA.
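The scheme described in the abstract can be illustrated with a minimal simulation sketch. This is not the authors' exact algorithm: the data, fading model, precoding rule, and all parameter values below are illustrative assumptions. Each of the N nodes computes a subgradient of a non-differentiable local loss (hinge loss here), precodes it against its known fading coefficient, and "transmits"; the PS sees only the noisy superposition of the analog signals and uses it to update the shared model.

```python
# Illustrative sketch of over-the-air subgradient aggregation over a fading
# MAC (hypothetical setup, not the paper's exact scheme).
import numpy as np

rng = np.random.default_rng(0)
N, d, T = 20, 5, 200          # nodes, model dimension, iterations
eta, noise_std = 0.05, 0.01   # step size, receiver noise level

# Synthetic linearly separable data, split across the N nodes.
w_true = rng.normal(size=d)
X = rng.normal(size=(N, 30, d))
y = np.sign(X @ w_true)

def local_subgradient(w, Xi, yi):
    """Subgradient of the (non-differentiable) average hinge loss."""
    margins = yi * (Xi @ w)
    active = margins < 1.0                     # points violating the margin
    return -(yi[active, None] * Xi[active]).sum(axis=0) / len(yi)

w = np.zeros(d)
for t in range(T):
    # Each node inverts its (assumed known) fading gain, so the channel's
    # superposition delivers the plain sum of local subgradients.
    tx = np.zeros(d)
    for i in range(N):
        h = rng.rayleigh(scale=1.0)            # fading gain of node i
        g = local_subgradient(w, X[i], y[i])
        tx += h * (g / h)                      # precode, then channel scales
    rx = tx + noise_std * rng.normal(size=d)   # superposition + channel noise
    w -= eta * rx / N                          # PS averages and updates

train_acc = np.mean(np.sign(X @ w) == y)
```

With the noise level small relative to the aggregated subgradient, the PS's update tracks the centralized subgradient step, which is the intuition behind the convergence guarantee stated in the abstract.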
first_indexed 2024-03-12T01:32:39Z
format Article
id doaj.art-cef92d6237cf4ecc8423d642479a4e59
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-03-12T01:32:39Z
publishDate 2023-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-cef92d6237cf4ecc8423d642479a4e592023-09-11T23:01:20ZengIEEEIEEE Access2169-35362023-01-0111946239463510.1109/ACCESS.2023.329102310168898Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air ComputationTamir L. S. Gez0Kobi Cohen1https://orcid.org/0000-0003-0532-009XSchool of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Be'er Sheva, IsraelSchool of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Be'er Sheva, IsraelWe focus on a distributed learning problem in a communication network consisting of <inline-formula> <tex-math notation="LaTeX">$N$ </tex-math></inline-formula> distributed nodes and a central parameter server (PS). The PS performs the computation based on data received from the nodes, which are transmitted over a multiple access channel (MAC). The objective function for this problem is the sum of the local loss functions of the nodes. This problem has recently gained attention in distributed sensing systems as well as in federated learning (FL). However, current approaches to solving this problem rely on the assumption that the loss functions are continuously differentiable. In this paper, we first address the case where this assumption does not hold. We develop a novel algorithm, dubbed Sub-Gradient descent Multiple Access (SGMA), to solve the learning problem over a MAC. In SGMA, each node transmits an analog shaped waveform of its local subgradient over the MAC, and the PS receives a superposition of the noisy analog signals, resulting in a bandwidth-efficient over-the-air (OTA) computation used to update the learned model. We analyze the performance of SGMA and prove that its convergence rate approaches that of the centralized subgradient algorithm in large networks. Simulation results using real datasets demonstrate the effectiveness of SGMA.https://ieeexplore.ieee.org/document/10168898/Distributed learninggradient descent (GD)-type learningsubgradient methodsfederated learning (FL)multiple access channel (MAC)over-the-air (OTA) computation
spellingShingle Tamir L. S. Gez
Kobi Cohen
Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
IEEE Access
Distributed learning
gradient descent (GD)-type learning
subgradient methods
federated learning (FL)
multiple access channel (MAC)
over-the-air (OTA) computation
title Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
title_full Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
title_fullStr Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
title_full_unstemmed Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
title_short Subgradient Descent Learning Over Fading Multiple Access Channels With Over-the-Air Computation
title_sort subgradient descent learning over fading multiple access channels with over the air computation
topic Distributed learning
gradient descent (GD)-type learning
subgradient methods
federated learning (FL)
multiple access channel (MAC)
over-the-air (OTA) computation
url https://ieeexplore.ieee.org/document/10168898/
work_keys_str_mv AT tamirlsgez subgradientdescentlearningoverfadingmultipleaccesschannelswithovertheaircomputation
AT kobicohen subgradientdescentlearningoverfadingmultipleaccesschannelswithovertheaircomputation