Online Network Coding for Time-Division Duplexing

Bibliographic Details
Main Authors: Lucani, Daniel Enrique, Medard, Muriel, Stojanovic, Milica
Other Authors: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Institute of Electrical and Electronics Engineers 2010
Online Access: http://hdl.handle.net/1721.1/60309
https://orcid.org/0000-0003-4059-407X
Description
Summary: We study an online random linear network coding approach for time division duplexing (TDD) channels under Poisson arrivals. We model the system as a bulk-service queue with variable bulk size and with feedback, i.e., when a set of packets is serviced at a given time, those packets may be reintroduced to the queue to form part of the next service batch. We show that there is an optimal number of coded data packets that the sender should transmit back-to-back before stopping to wait for an acknowledgement from the receiver. This number depends on the latency, the probability of packet erasure, the degrees of freedom at the receiver, the size of the coding window, and the arrival rate of the Poisson process. Random network coding is performed across a moving window of packets that depends on the packets in the queue, design constraints on the window size, and the feedback sent from the receiver. We study the mean time from when a packet is generated at the source until it is “seen”, though not necessarily decoded, at the receiver. We also analyze the mean time between consecutive decoding events, where a decoding event is the decoding of all packets previously “seen” together with the packets involved in the current coding window. Inherently, a decoding event implies an in-order decoding of a batch of data packets. We present numerical results illustrating the trade-off between mean delay and mean time between decoding events.
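The sender/receiver mechanics the summary describes can be sketched in code. The following is an illustrative toy, not the authors' implementation: it uses GF(2) coefficients (the paper's scheme is general random linear network coding, typically over a larger field), the names `encode` and `Decoder` are hypothetical, and the fixed back-to-back batch size and erasure probability in the demo stand in for the optimal number the paper derives.

```python
import random

def encode(window, n_coded, rng):
    """Emit n_coded random linear combinations (over GF(2)) of the packets in
    the coding window.  Packets are modelled as non-negative ints; a
    coefficient vector is an int bitmask over the window positions."""
    k = len(window)
    coded = []
    for _ in range(n_coded):
        coeffs = rng.getrandbits(k) or 1        # skip the useless all-zero vector
        payload = 0
        for i in range(k):
            if coeffs >> i & 1:
                payload ^= window[i]
        coded.append((coeffs, payload))
    return coded

class Decoder:
    """Accumulates degrees of freedom by Gaussian elimination over GF(2)."""

    def __init__(self, k):
        self.k = k
        self.rows = {}                          # pivot bit -> (coeffs, payload)

    def receive(self, coeffs, payload):
        # Forward-reduce the incoming packet against stored rows,
        # highest pivot first (each stored row's top bit is its pivot).
        for pivot in sorted(self.rows, reverse=True):
            if coeffs >> pivot & 1:
                c, p = self.rows[pivot]
                coeffs ^= c
                payload ^= p
        if coeffs == 0:
            return False                        # linearly dependent: no new dof
        self.rows[coeffs.bit_length() - 1] = (coeffs, payload)
        return True                             # one more degree of freedom

    def dof(self):
        return len(self.rows)

    def decode(self):
        """Back-substitute once rank == k; returns the original window."""
        if len(self.rows) < self.k:
            return None
        rows = dict(self.rows)
        for pivot in sorted(rows):              # full rank => pivots are 0..k-1
            c, p = rows[pivot]
            for q in rows:
                if q != pivot and rows[q][0] >> pivot & 1:
                    rows[q] = (rows[q][0] ^ c, rows[q][1] ^ p)
        return [rows[i][1] for i in range(self.k)]

# Demo: the sender transmits batches of coded packets back-to-back over an
# erasure channel (Pe = 0.3 here, an arbitrary choice) and, after each batch,
# would wait for an acknowledgement reporting the receiver's rank.
rng = random.Random(1)
window = [rng.getrandbits(32) for _ in range(5)]    # 5 data packets in the window
dec = Decoder(len(window))
while dec.dof() < len(window):
    for coeffs, payload in encode(window, 7, rng):  # 7 back-to-back transmissions
        if rng.random() > 0.3:                      # packet survives the erasure
            dec.receive(coeffs, payload)
assert dec.decode() == window
```

A decoding event in the summary's sense corresponds here to `dof()` reaching the window size, at which point every packet in the window is recovered in order.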