Recently, the demand for small cell base stations (SBSs) has surged to accommodate the explosive increase in mobile data traffic. In ultra-dense small cell networks (UDSCNs), because the spatial and temporal traffic distributions are highly nonuniform, efficient management of the energy consumption of SBSs is crucial. Therefore, we herein propose a multi-agent distributed Q-learning algorithm that maximizes energy efficiency (EE) while minimizing the number of outage users.
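To make the multi-agent setting concrete, the following is a minimal sketch of a per-SBS Q-learning agent, assuming each SBS independently learns a sleep/active policy from a discretized load state and a reward combining EE with an outage penalty. The state, action, and reward definitions here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class SBSAgent:
    """Hypothetical distributed Q-learning agent run locally at each SBS."""

    def __init__(self, n_states, n_actions=2, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = np.zeros((n_states, n_actions))  # Q-table: rows = load levels, cols = {sleep, active}
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select_action(self, state):
        # Epsilon-greedy exploration over {0: sleep, 1: active}
        if np.random.rand() < self.epsilon:
            return np.random.randint(self.q.shape[1])
        return int(np.argmax(self.q[state]))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning temporal-difference update
        td_target = reward + self.gamma * np.max(self.q[next_state])
        self.q[state, action] += self.alpha * (td_target - self.q[state, action])


def reward(throughput_bps, power_w, n_outage_users, outage_penalty=1.0):
    # Assumed reward shaping: energy efficiency (bits per joule)
    # minus a penalty proportional to the number of outage users.
    ee = throughput_bps / max(power_w, 1e-9)
    return ee - outage_penalty * n_outage_users
```

Because each SBS maintains only its own Q-table and observes only local state, the per-agent complexity stays constant as the network densifies, which is the intuition behind the complexity advantage over a centralized approach.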
Through intensive simulations, we demonstrate that the proposed algorithm outperforms conventional algorithms in terms of EE and the number of outage users. Although the proposed reinforcement learning algorithm has significantly lower computational complexity than the centralized approach, it is shown to converge to the optimal solution.