Indexed by:
Abstract:
In mobile edge computing (MEC) systems, unmanned aerial vehicles (UAVs) enable edge service providers (ESPs) to offer flexible resource provisioning with broader communication coverage, thereby improving the Quality of Service (QoS). However, dynamic system states and diverse traffic patterns seriously hinder efficient cooperation among UAVs. Existing solutions commonly rely on prior system knowledge or complex neural network models, which limits adaptability and incurs excessive overhead. To address these critical challenges, we propose DisOff, a novel profit-aware cooperative offloading framework for UAV-enabled MEC based on lightweight deep reinforcement learning (DRL). First, we design an improved DRL algorithm with twin critic networks and a delayed update mechanism, which mitigates Q-value overestimation and high variance and thus approximates the optimal UAV cooperative offloading and resource allocation. Next, we develop a new multi-teacher distillation mechanism for the proposed DRL model, in which the policies of multiple UAVs are integrated into a single DRL agent, compressing the model size while maintaining superior performance. Using real-world user traffic datasets, we conduct extensive experiments to validate the effectiveness of the proposed DisOff. Compared with benchmark methods, DisOff improves ESP profits while reducing the DRL model size and training costs. © 2014 IEEE.
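The abstract names two mechanisms: a twin-critic DRL update with delayed actor updates to curb Q-value overestimation and variance, and a multi-teacher distillation step that merges several UAV policies into one agent. The sketch below is not the authors' implementation; it is a minimal, hedged illustration of those two ideas, with assumed state/action dimensions, network sizes, hyperparameters, and a hypothetical distillation loss.

```python
# Not the authors' code: a minimal sketch of (i) a TD3-style twin-critic target
# with delayed actor updates and (ii) a hypothetical multi-teacher distillation
# loss. Dimensions, network sizes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

state_dim, action_dim = 8, 2          # assumed per-UAV observation/action sizes

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))

actor = mlp(state_dim, action_dim)
target_actor = mlp(state_dim, action_dim)
target_c1 = mlp(state_dim + action_dim, 1)   # twin target critics: two independent
target_c2 = mlp(state_dim + action_dim, 1)   # Q-estimators of the same return

gamma, policy_delay = 0.99, 2         # actor updated once every `policy_delay` critic steps

def critic_target(reward, next_state, done):
    """Bellman target using the MINIMUM of the twin critics to curb overestimation."""
    with torch.no_grad():
        next_action = torch.tanh(target_actor(next_state))
        q_in = torch.cat([next_state, next_action], dim=-1)
        q_min = torch.min(target_c1(q_in), target_c2(q_in))
        return reward + gamma * (1.0 - done) * q_min

def multi_teacher_distill_loss(student_actor, teacher_actors, states):
    """Hypothetical distillation objective: the student imitates each teacher
    UAV's action, averaged over teachers, merging several policies into one agent."""
    student_a = torch.tanh(student_actor(states))
    losses = [F.mse_loss(student_a, torch.tanh(t(states)).detach())
              for t in teacher_actors]
    return sum(losses) / len(losses)

# Dummy usage with a random transition batch and two stand-in teacher policies.
batch = 4
s, s2 = torch.randn(batch, state_dim), torch.randn(batch, state_dim)
r, d = torch.randn(batch, 1), torch.zeros(batch, 1)
y = critic_target(r, s2, d)            # regression target for both critics
kd = multi_teacher_distill_loss(actor, [mlp(state_dim, action_dim) for _ in range(2)], s)
print(y.shape, kd.item())              # torch.Size([4, 1]) and a scalar loss
```

Taking the minimum over the two target critics is what suppresses overestimation, and updating the actor only every `policy_delay` critic steps reduces variance in the policy gradient; the distillation term averages the student's deviation from each teacher so the compressed agent tracks all UAV policies at once.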
Keyword:
Reprint's Address:
Source:
IEEE Internet of Things Journal
ISSN: 2327-4662
Year: 2024
Issue: 12
Volume: 11
Page: 21325-21336
Impact Factor: 8.200 (JCR@2023)
Affiliated Colleges: