Mobility-Aware Service Caching in Mobile Edge Computing for Internet of Things

Authors
  • Wei, Hua
  • Luo, Hong
  • Sun, Yan
Publication Date
Jan 22, 2020
Source
MDPI
Language
English
License
Green
Abstract

The mobile edge computing architecture successfully addresses the high latency of cloud computing. However, current research focuses on computation offloading and pays little attention to service caching. To address the service caching problem, especially in high-mobility sensor-network scenarios, we study a mobility-aware service caching mechanism. Our goal is to maximize the number of users served by the local edge cloud, which requires predicting each user's target location to avoid invalid service requests. First, we propose an idealized geometric model to predict the target area of a user's movement. Since all the data the model needs are difficult to obtain in practical applications, we use frequent patterns to mine local movement-track information. Then, by combining the trajectory-mining results with the proposed geometric model, we predict the user's target location. Based on the prediction result and the existing service cache, the service request is forwarded to the appropriate base station by a service allocation algorithm. Finally, to train on and predict the most popular services online, we propose a service cache selection algorithm based on a back-propagation (BP) neural network. Simulation experiments show that our service caching algorithm reduces service response time by about 13.21% on average compared with other algorithms, and increases the local service proportion by about 15.19% on average compared with the same algorithm without mobility prediction.
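The pipeline the abstract describes (mine frequent movement patterns, predict the next base station, then select which services to cache there) can be sketched in simplified form. This is not the paper's method: the transition-counting predictor stands in for the geometric model plus frequent-pattern mining, and a greedy popularity ranking stands in for the BP-network cache selection; all function names and thresholds are illustrative assumptions.

```python
from collections import Counter

def mine_frequent_transitions(trajectories, min_support=2):
    """Count base-station-to-base-station transitions across user
    trajectories and keep those seen at least `min_support` times
    (a simplified stand-in for frequent-pattern mining of tracks)."""
    counts = Counter()
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[(a, b)] += 1
    return {t: c for t, c in counts.items() if c >= min_support}

def predict_next_station(current, frequent):
    """Predict the user's next base station as the most frequent
    mined transition out of the current one; None if no pattern."""
    candidates = [(c, b) for (a, b), c in frequent.items() if a == current]
    return max(candidates)[1] if candidates else None

def select_cache(service_requests, capacity):
    """Greedy popularity ranking as a stand-in for the BP-network
    cache selection: cache the `capacity` most-requested services."""
    return [s for s, _ in Counter(service_requests).most_common(capacity)]

# Toy scenario: three users who mostly move A -> B -> C.
trajectories = [["A", "B", "C"], ["A", "B", "D"], ["A", "B", "C"]]
frequent = mine_frequent_transitions(trajectories)
print(predict_next_station("A", frequent))   # "B"
print(predict_next_station("B", frequent))   # "C"
print(select_cache(["nav", "nav", "video", "chat", "nav", "video"], 2))
```

A service request from a user currently at station A would then be forwarded to the predicted station B, provided B's cache already holds the requested service.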
