Indexed by:
Abstract:
In mobile edge computing (MEC) systems, unmanned aerial vehicles (UAVs) enable edge service providers (ESPs) to offer flexible resource provisioning with broader communication coverage, thereby improving the Quality of Service (QoS). However, dynamic system states and varying traffic patterns seriously hinder efficient cooperation among UAVs. Existing solutions commonly rely on prior system knowledge or complex neural network models, lacking adaptability and incurring excessive overheads. To address these critical challenges, we propose DisOff, a novel profit-aware cooperative offloading framework for UAV-enabled MEC with lightweight deep reinforcement learning (DRL). First, we design an improved DRL model with twin critic networks and a delayed update mechanism, which mitigates $Q$-value overestimation and high variance and thus approximates the optimal UAV cooperative offloading and resource allocation. Next, we develop a new multi-teacher distillation mechanism for the proposed DRL model, where the policies of multiple UAVs are integrated into one DRL agent, compressing the model size while maintaining superior performance. Using real-world user traffic datasets, extensive experiments are conducted to validate the effectiveness of DisOff. Compared to benchmark methods, DisOff enhances ESP profits while reducing the DRL model size and training costs.
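The two mechanisms summarized in the abstract (a twin-critic DRL update with delayed policy steps, and multi-teacher policy distillation into one compact agent) can be illustrated with a short sketch. The snippet below is not the authors' implementation; it is a minimal PyTorch illustration assuming a TD3-style agent, and every network size, function name (e.g. `td3_update`, `distill_student`), and hyperparameter is an assumption chosen for illustration only.

```python
# Minimal sketch (not the paper's code): twin critics with a clipped double-Q target,
# delayed actor updates, and a multi-teacher distillation loss that compresses
# several per-UAV teacher policies into one lightweight student policy.
import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM, ACTION_DIM = 16, 4   # assumed sizes of the MEC state / offloading action

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, out_dim))

actor = mlp(STATE_DIM, ACTION_DIM)            # offloading / resource-allocation policy
critic1 = mlp(STATE_DIM + ACTION_DIM, 1)      # twin critics curb Q-value overestimation
critic2 = mlp(STATE_DIM + ACTION_DIM, 1)
target_actor = mlp(STATE_DIM, ACTION_DIM)
target_c1 = mlp(STATE_DIM + ACTION_DIM, 1)
target_c2 = mlp(STATE_DIM + ACTION_DIM, 1)

opt_actor = torch.optim.Adam(actor.parameters(), lr=3e-4)
opt_critics = torch.optim.Adam(
    list(critic1.parameters()) + list(critic2.parameters()), lr=3e-4)

def td3_update(batch, step, gamma=0.99, policy_delay=2, noise_std=0.1):
    """One training step: s, a are [B, dim]; r, done are [B, 1]."""
    s, a, r, s_next, done = batch
    with torch.no_grad():
        a_next = torch.tanh(target_actor(s_next))
        noise = (noise_std * torch.randn_like(a_next)).clamp(-0.2, 0.2)   # target policy smoothing
        a_next = (a_next + noise).clamp(-1.0, 1.0)
        q_next = torch.min(target_c1(torch.cat([s_next, a_next], -1)),
                           target_c2(torch.cat([s_next, a_next], -1)))    # clipped double-Q
        y = r + gamma * (1 - done) * q_next
    q1 = critic1(torch.cat([s, a], -1))
    q2 = critic2(torch.cat([s, a], -1))
    critic_loss = F.mse_loss(q1, y) + F.mse_loss(q2, y)
    opt_critics.zero_grad(); critic_loss.backward(); opt_critics.step()
    if step % policy_delay == 0:                  # delayed actor update reduces variance
        actor_loss = -critic1(torch.cat([s, torch.tanh(actor(s))], -1)).mean()
        opt_actor.zero_grad(); actor_loss.backward(); opt_actor.step()

def distill_student(student, teachers, states, weights=None):
    """Multi-teacher distillation: the student imitates a weighted mix of the
    per-UAV teacher policies, compressing them into a single compact agent."""
    weights = weights or [1.0 / len(teachers)] * len(teachers)
    with torch.no_grad():
        target = sum(w * torch.tanh(t(states)) for w, t in zip(weights, teachers))
    return F.mse_loss(torch.tanh(student(states)), target)
```

Taking the minimum over the two target critics counteracts the optimistic bias of a single Q estimate, and updating the actor only every few critic steps lowers the variance of the policy gradient; the distillation loss then lets one small student mimic a weighted mixture of the per-UAV teachers, mirroring how the abstract describes compressing multiple UAV policies into one agent.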
Keyword:
Reprint's Address:
Email:
Version:
Source:
IEEE INTERNET OF THINGS JOURNAL
ISSN: 2327-4662
Year: 2024
Issue: 12
Volume: 11
Page: 21325-21336
Impact Factor: 8.200 (JCR@2023)
Cited Count:
WoS CC Cited Count: 16
SCOPUS Cited Count: 20
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count:
30-Day PV: 2
Affiliated Colleges: