Abstract
This work studies the latency experienced by a TCP connection in an operational LTE RAN. After confirming the well-known downlink bufferbloat phenomenon, our experiments shed light on the less studied RAN uplink jitter, which we attribute to the uplink grant-based access method. We reproduce these results in a lab environment based on the OpenAirInterface software RAN, demonstrating the importance of RAN configuration and the limitations of the current LTE standard. We conclude with open issues in the 5G grant allocation process and in current grant-free access methods.