Abstract
This work studies the latency experienced by a TCP connection in an operational LTE RAN. After demonstrating the well-known downlink bufferbloat phenomenon, our experiments shed light on the less-studied RAN uplink jitter. We attribute this uplink jitter to the grant-based uplink access method. We reproduce these results in a lab environment based on the OpenAirInterface software RAN, demonstrating the importance of RAN configuration and the limitations of the current LTE standard. We conclude with a discussion of open issues in the 5G grant allocation process and current grant-free access methods.