Abstract
Multi-access edge computing (MEC) and ultra-dense networking (UDN) are recognized as two promising paradigms for future mobile networks that can improve spectrum efficiency and the quality of computational experience (QoCE). In this paper, we study the task offloading problem in an MEC-enabled UDN architecture with the aim of minimizing task duration while satisfying the energy budget constraints. Designing an optimal task offloading algorithm is highly challenging because of the dynamics of the environment and the uncertainty of its parameters. We therefore propose an online task offloading algorithm based on a state-of-the-art deep reinforcement learning (DRL) technique: asynchronous advantage actor-critic (A3C). Notably, the proposed method requires neither instantaneous channel state information (CSI) nor prior knowledge of the computational capabilities of the base stations. Simulations show that our method learns a good offloading policy that achieves near-optimal task allocation while meeting the energy budget constraints of mobile devices in the UDN environment.
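The abstract names A3C as the underlying DRL technique but this record does not give the paper's state features, reward design, or network architecture. The following is a minimal, illustrative PyTorch sketch of an A3C-style advantage actor-critic update for a discrete offloading decision (local execution or one of several base stations); all dimensions, hyperparameters, and the toy reward (negative task latency) are assumptions for illustration, not details from the paper. True A3C runs several such workers asynchronously against a shared global network; a single synchronous update is shown here for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ActorCritic(nn.Module):
    """Shared-trunk actor-critic: the actor outputs a distribution over
    offloading targets, the critic estimates the state value V(s) used
    to form the advantage."""
    def __init__(self, state_dim: int, num_actions: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.policy_head = nn.Linear(hidden, num_actions)  # offloading choice
        self.value_head = nn.Linear(hidden, 1)             # state value V(s)

    def forward(self, state):
        h = self.trunk(state)
        return F.softmax(self.policy_head(h), dim=-1), self.value_head(h)

def worker_update(net, optimizer, states, actions, rewards,
                  gamma=0.99, entropy_beta=0.01):
    """One advantage actor-critic update from a rollout: n-step returns
    give the advantage A = R - V(s); the actor maximizes log pi(a|s) * A,
    the critic minimizes the squared return error, and an entropy bonus
    keeps the policy exploratory."""
    probs, values = net(states)
    values = values.squeeze(-1)
    # Discounted returns, accumulated backwards through the rollout.
    returns, R = [], 0.0
    for r in reversed(rewards.tolist()):
        R = r + gamma * R
        returns.append(R)
    returns = torch.tensor(list(reversed(returns)))
    advantages = returns - values.detach()
    dist = torch.distributions.Categorical(probs)
    policy_loss = -(dist.log_prob(actions) * advantages).mean()
    value_loss = F.mse_loss(values, returns)
    loss = policy_loss + 0.5 * value_loss - entropy_beta * dist.entropy().mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Toy usage with dummy data: 8 assumed state features (e.g. task size,
    # queue lengths), 5 actions (local + 4 candidate base stations).
    net = ActorCritic(state_dim=8, num_actions=5)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    states = torch.randn(16, 8)           # dummy observations
    actions = torch.randint(0, 5, (16,))  # dummy offloading decisions
    rewards = -torch.rand(16)             # assumed reward: negative latency
    worker_update(net, opt, states, actions, rewards)
```

An energy budget constraint of the kind the abstract describes could be folded into such a reward as a penalty term or handled with a constrained-RL formulation; the record does not say which the authors use.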
| Original language | English |
| --- | --- |
| Article number | 68 |
| Pages (from-to) | 1-17 |
| Number of pages | 17 |
| Journal | ACM Transactions on Internet Technology |
| Volume | 22 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Aug 2022 |