Multi-access edge computing (MEC) and ultra-dense networking (UDN) are recognized as two promising paradigms for future mobile networks, capable of improving both spectrum efficiency and the quality of computational experience (QoCE). In this paper, we study the task offloading problem in an MEC-enabled UDN architecture with the aim of minimizing task duration while satisfying energy budget constraints. Owing to the dynamics of the environment and parameter uncertainty, designing an optimal task offloading algorithm is highly challenging. We therefore propose an online task offloading algorithm based on a state-of-the-art deep reinforcement learning (DRL) technique: asynchronous advantage actor-critic (A3C). Notably, the proposed method requires neither instantaneous channel state information (CSI) nor prior knowledge of the computational capabilities of the base stations. Simulations show that our method learns a good offloading policy, achieving near-optimal task allocation while meeting the energy budget constraints of mobile devices in the UDN environment.