Open Access
ARTICLE
Proactive Caching at the Wireless Edge: A Novel Predictive User Popularity-Aware Approach
1 College of Computer Science, Chongqing University, Chongqing, 400044, China
2 School of Computer and Software Engineering, Xihua University, Chengdu, 610039, China
3 School of Computer and Information Engineering, Jiangxi Normal University, Nanchang, 330022, China
4 Electric Power Research Institute of State Grid Ningxia Electric Power Co., Ltd., Yinchuan, 750002, China
5 College of Mechanical and Vehicle Engineering, Chongqing University, Chongqing, 400030, China
6 School of Computer Science and Technology, Beijing Institute of Technology, Beijing, 100083, China
7 School of Computer Science and Technology, Dongguan University of Technology, Dongguan, 523808, China
8 School of Emergency Management, Xihua University, Chengdu, 610039, China
9 College of Computer and Information Science, Chongqing Normal University, Chongqing, 401331, China
* Corresponding Author: Yunni Xia. Email:
(This article belongs to the Special Issue: Machine Learning Empowered Distributed Computing: Advance in Architecture, Theory and Practice)
Computer Modeling in Engineering & Sciences 2024, 140(2), 1997-2017. https://doi.org/10.32604/cmes.2024.048723
Received 16 December 2023; Accepted 07 March 2024; Issue published 20 May 2024
Abstract
Mobile Edge Computing (MEC) is a promising technology that provides on-demand computing and efficient storage services as close to end users as possible. In an MEC environment, servers are deployed near mobile terminals to exploit storage infrastructure, improve content delivery efficiency, and enhance user experience. However, due to the limited capacity of edge servers, meeting users' time-varying and highly diversified content demands remains a significant challenge. Recently, techniques for caching content at the edge have become popular for addressing these challenges: edge caching fills the communication gap between users and content providers while relieving pressure on remote cloud servers. However, existing static caching strategies are still inefficient in handling the dynamics of time-varying content popularity and meeting users' demands for highly diversified entity data. To address this challenge, we introduce a novel method for content caching over MEC, i.e., PRIME. It synthesizes a content popularity prediction model, which takes users' stay time and their request traces as inputs, with a deep reinforcement learning model that yields dynamic caching schedules. Experimental results demonstrate that PRIME, when tested upon the MovieLens 1M dataset for user request patterns and the Shanghai Telecom dataset for user mobility, outperforms its peers in terms of cache hit rates, transmission latency, and system cost.
Keywords
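To illustrate the popularity-aware caching idea the abstract describes, the following is a minimal, self-contained sketch of an edge cache that weights each request by the user's stay time and evicts the least-popular item. The class name, scoring rule, and eviction policy here are illustrative assumptions, not the paper's PRIME predictor or its deep-reinforcement-learning scheduler, which are substantially more elaborate.

```python
from collections import defaultdict


class PopularityAwareCache:
    """Toy edge cache: evicts the content with the lowest popularity score.

    Popularity is estimated as a stay-time-weighted request count, a crude
    stand-in for a learned popularity prediction model (hypothetical design,
    for illustration only).
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = set()              # content IDs currently cached
        self.score = defaultdict(float) # stay-time-weighted request counts

    def fetch(self, content_id, stay_time=1.0):
        """Return True on a cache hit, False on a miss (content is then cached)."""
        # Weight each request by the requesting user's stay time at the edge.
        self.score[content_id] += stay_time
        if content_id in self.store:
            return True
        if len(self.store) >= self.capacity:
            # Evict the cached item with the lowest popularity score.
            victim = min(self.store, key=lambda c: self.score[c])
            self.store.remove(victim)
        self.store.add(content_id)
        return False
```

A dynamic scheduler in the spirit of the paper would replace the fixed eviction rule with decisions learned from predicted future popularity; this sketch only captures the stay-time-weighted scoring ingredient.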
This work is licensed under a Creative Commons Attribution 4.0 International License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.