Deep reinforcement learning-based CIO and energy control for LTE mobility load balancing
Author's Department
Electronics & Communications Engineering Department
Second Author's Department
Electronics & Communications Engineering Department
https://doi.org/10.1109/CCNC49032.2021.9369525
Document Type
Research Article
Publication Title
2021 IEEE 18th Annual Consumer Communications and Networking Conference, CCNC 2021
Publication Date
1-9-2021
DOI
10.1109/CCNC49032.2021.9369525
Abstract
Congestion has been one of the most common problems in cellular networks due to the huge increase in network load resulting from enhanced communication quality and a growing number of users. Since mobile users are not uniformly distributed across the network, the need for load balancing as a cellular network self-optimization technique has increased recently. The congestion problem can then be handled by evenly distributing the network load among the network resources. Much research has been dedicated to developing load balancing models for cellular networks. Most of these models rely on adjusting the Cell Individual Offset (CIO) parameters, which are designed for self-optimization in cellular networks. In this paper, a new deep reinforcement learning-based load balancing approach is proposed as a solution to the LTE downlink congestion problem. This approach does not rely solely on adapting the CIO parameters; rather, it has two degrees of control: the first is adjusting the CIO parameters, and the second is adjusting the eNodeBs' transmission power. The proposed model uses a Double Deep Q-Network (DDQN) to learn how to adjust these parameters so that a better load distribution is achieved across the overall network. Simulation results prove the effectiveness of the proposed approach, improving the overall network throughput by up to 21.4% and 6.5% compared to the baseline scheme and a scheme that only adapts CIOs, respectively.
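The abstract describes a DDQN agent with two control knobs: CIO offsets and eNodeB transmission power. The sketch below is a minimal, illustrative Double DQN agent in PyTorch under assumed state features, action discretization, reward signal, and hyperparameters; it is not the authors' implementation, and it omits the LTE simulator that would supply states (e.g. per-cell loads) and rewards (e.g. overall throughput).

# Minimal Double DQN (DDQN) sketch for joint CIO / transmit-power control.
# Illustrative only: state features, action discretization, and hyperparameters
# are assumptions, not the paper's implementation.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed discretization: each action selects a CIO step (dB) and a power level (dBm).
CIO_STEPS = [-6, -3, 0, 3, 6]
POWER_STEPS = [40, 43, 46]
N_ACTIONS = len(CIO_STEPS) * len(POWER_STEPS)
STATE_DIM = 12  # assumed number of per-cell load/throughput features


def decode_action(idx: int):
    """Map a flat action index back to a (CIO step, power level) pair."""
    return CIO_STEPS[idx // len(POWER_STEPS)], POWER_STEPS[idx % len(POWER_STEPS)]


class QNet(nn.Module):
    """Small fully connected Q-network mapping a state to one Q-value per action."""
    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, n_actions),
        )

    def forward(self, x):
        return self.net(x)


class DDQNAgent:
    def __init__(self, gamma=0.99, lr=1e-3, eps=0.1, buffer_size=10_000):
        self.online = QNet(STATE_DIM, N_ACTIONS)
        self.target = QNet(STATE_DIM, N_ACTIONS)
        self.target.load_state_dict(self.online.state_dict())
        self.opt = torch.optim.Adam(self.online.parameters(), lr=lr)
        self.buffer = deque(maxlen=buffer_size)
        self.gamma, self.eps = gamma, eps

    def remember(self, s, a, r, s2):
        """Store a transition; the reward would come from the network simulator."""
        self.buffer.append((s, a, r, s2))

    def act(self, state):
        """Epsilon-greedy choice of a flat (CIO step, power level) action index."""
        if random.random() < self.eps:
            return random.randrange(N_ACTIONS)
        with torch.no_grad():
            q = self.online(torch.as_tensor(state, dtype=torch.float32))
        return int(q.argmax())

    def learn(self, batch_size=64):
        """One Double-DQN update: the online net selects the next action, the target net evaluates it."""
        if len(self.buffer) < batch_size:
            return
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s2 = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
        a = a.long()
        q_sa = self.online(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            next_a = self.online(s2).argmax(dim=1, keepdim=True)                      # selection
            target = r + self.gamma * self.target(s2).gather(1, next_a).squeeze(1)    # evaluation
        loss = F.smooth_l1_loss(q_sa, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()

In use, the agent would observe the per-cell state each decision interval, apply decode_action to set the chosen CIO and power values in the simulator, and periodically copy the online weights into the target network; those integration details are assumptions here.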
Recommended Citation
APA Citation
Alsuhli, G., Ismail, H., Alansary, K., Rumman, M., ... (2021). Deep reinforcement learning-based CIO and energy control for LTE mobility load balancing. 2021 IEEE 18th Annual Consumer Communications and Networking Conference, CCNC 2021. https://doi.org/10.1109/CCNC49032.2021.9369525
https://fount.aucegypt.edu/faculty_journal_articles/2441
MLA Citation
Alsuhli, Ghada, et al. "Deep reinforcement learning-based CIO and energy control for LTE mobility load balancing." 2021 IEEE 18th Annual Consumer Communications and Networking Conference, CCNC 2021, 2021, https://fount.aucegypt.edu/faculty_journal_articles/2441