Open Access

ARTICLE


Deep Reinforcement Learning Empowered Edge Collaborative Caching Scheme for Internet of Vehicles

by Xin Liu1, Siya Xu1, Chao Yang2, Zhili Wang1,*, Hao Zhang3, Jingye Chi1, Qinghan Li4

1 State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing, 100876, China
2 Information and Communication Branch, State Grid Liaoning Electric Power Co., Ltd., Shenyang, 110000, China
3 China Global Energy Interconnection Research Institute, Nanjing, 210000, China
4 Syracuse University, New York, 13244, USA

* Corresponding Author: Zhili Wang.

Computer Systems Science and Engineering 2022, 42(1), 271-287. https://doi.org/10.32604/csse.2022.022103

Abstract

With the development of the internet of vehicles (IoV), the traditional centralized content caching mode transmits content through the core network, which causes a large delay and cannot meet the demands of delay-sensitive services. To solve these problems, we propose an edge collaborative caching scheme on the basis of the vehicle caching network. Road side units (RSUs) and mobile edge computing (MEC) are used to collect vehicle information and to predict and cache popular content, thereby providing low-latency content delivery services. However, the storage capacity of a single RSU severely limits edge caching performance, and a single RSU cannot handle intensive content requests on its own. Through content sharing, collaborative caching can relieve the storage burden on caching servers. Therefore, we integrate RSUs and collaborative caching to build a MEC-assisted vehicle edge collaborative caching (MVECC) scheme, so as to realize collaborative caching among the cloud, the edge and vehicles. MVECC uses deep reinforcement learning to predict what needs to be cached on RSUs, which enables RSUs to cache more popular content. In addition, MVECC introduces a mobility-aware cache replacement scheme at the edge network to reduce redundant caching and improve cache efficiency, allowing an RSU to dynamically replace cached content in response to the mobility of vehicles. The simulation results show that the proposed MVECC scheme improves caching performance in terms of energy cost and content hit rate.
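The abstract does not give implementation detail, so as a rough illustration of the mobility-aware replacement idea, the minimal Python sketch below weights a content item's predicted popularity (which in MVECC would come from the deep reinforcement learning model, stubbed here as a plain number) by the expected remaining sojourn time of the requesting vehicles, and evicts the lowest-scoring entry when the RSU cache is full. All class, field and content names in this sketch are hypothetical and are not taken from the paper.

# Illustrative sketch only: a mobility-aware RSU cache replacement policy.
# Names (RsuCache, CachedItem, sojourn_time, popularity) are hypothetical;
# the DRL popularity predictor from MVECC is replaced by a fixed value.
from dataclasses import dataclass, field

@dataclass
class CachedItem:
    content_id: str
    popularity: float      # e.g., predicted request rate from a learned model
    sojourn_time: float    # expected remaining time (s) requesters stay in RSU coverage

@dataclass
class RsuCache:
    capacity: int
    items: dict = field(default_factory=dict)  # content_id -> CachedItem

    def _score(self, item: CachedItem) -> float:
        # Content requested by vehicles about to leave coverage is worth less;
        # weight predicted popularity by expected sojourn time.
        return item.popularity * item.sojourn_time

    def insert(self, item: CachedItem) -> None:
        if item.content_id in self.items or len(self.items) < self.capacity:
            self.items[item.content_id] = item
            return
        # Cache full: evict the lowest-scoring item only if the new one scores higher.
        victim = min(self.items.values(), key=self._score)
        if self._score(item) > self._score(victim):
            del self.items[victim.content_id]
            self.items[item.content_id] = item

    def hit(self, content_id: str) -> bool:
        return content_id in self.items


if __name__ == "__main__":
    cache = RsuCache(capacity=2)
    cache.insert(CachedItem("video_a", popularity=0.9, sojourn_time=30.0))
    cache.insert(CachedItem("map_b", popularity=0.4, sojourn_time=120.0))
    # Cache is now full; scores are video_a = 27, map_b = 48, news_c = 48,
    # so inserting news_c evicts video_a, the lowest-scoring entry.
    cache.insert(CachedItem("news_c", popularity=0.8, sojourn_time=60.0))
    print(sorted(cache.items))     # ['map_b', 'news_c']
    print(cache.hit("video_a"))    # False

In the full scheme the popularity estimate would be produced by the deep reinforcement learning agent and the sojourn time by the mobility-awareness component; the scoring rule above is only one plausible way to combine the two.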

Keywords


Cite This Article

APA Style
Liu, X., Xu, S., Yang, C., Wang, Z., Zhang, H. et al. (2022). Deep reinforcement learning empowered edge collaborative caching scheme for internet of vehicles. Computer Systems Science and Engineering, 42(1), 271-287. https://doi.org/10.32604/csse.2022.022103
Vancouver Style
Liu X, Xu S, Yang C, Wang Z, Zhang H, Chi J, et al. Deep reinforcement learning empowered edge collaborative caching scheme for internet of vehicles. Comput Syst Sci Eng. 2022;42(1):271-287. https://doi.org/10.32604/csse.2022.022103
IEEE Style
X. Liu et al., “Deep Reinforcement Learning Empowered Edge Collaborative Caching Scheme for Internet of Vehicles,” Comput. Syst. Sci. Eng., vol. 42, no. 1, pp. 271-287, 2022. https://doi.org/10.32604/csse.2022.022103



Copyright © 2022 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.