Abstract
IoT wireless connectivity often presents a lossy, error-prone radio link layer between the wireless edge device and the wireless base station. Data transacted over the radio link commonly uses TCP, which provides programmatic error-mitigation mechanisms, e.g., detection of lost and out-of-order packets, combined with ACK/NACK signaling from the far end to trigger packet retransmission. In subscription-based wireless connectivity, such as cellular or satellite, the information payload is often metered and billed per amount of data passed over the radio link. A wireless edge device can therefore incur significant data-overage charges to transmit a given amount of application information, because TCP-driven retransmissions are also billable. The aim of this paper is to quantify the TCP retransmission rate caused by a lossy radio link as experienced by a typical IoT LTE wireless device. The authors also propose a functional layer between the application layer and the lower layers on the wireless edge device, referred to as the “RF Fidelity Layer”. Its purpose is to provide real-time situational knowledge of the radio link to the application software, which in turn can “tune” the information flow for maximum efficiency and minimum billable data overage due to TCP-driven retransmissions.