A graph-based big data optimization approach using hidden Markov model and constraint satisfaction problem

Authors
  • Sassi, Imad1
  • Anter, Samir1
  • Bekkhoucha, Abdelkrim1
  • 1 Computer Science Laboratory (LIM), FSTM, Hassan II University, Casablanca, Morocco
Type
Published Article
Journal
Journal of Big Data
Publisher
Springer Nature
Publication Date
Jun 29, 2021
Volume
8
Issue
1
Identifiers
DOI: 10.1186/s40537-021-00485-z
Source
Springer Nature
Disciplines
  • Research
License
Green

Abstract

To address the challenges of big data analytics, several works have focused on big data optimization using metaheuristics. The constraint satisfaction problem (CSP) is a fundamental concept in metaheuristics that has shown great efficiency in several fields. Hidden Markov models (HMMs) are powerful machine learning algorithms that are widely used in time series analysis. However, one issue in forecasting time series with HMMs is how to reduce the search space (the state and observation spaces). To address this issue, we propose a graph-based big data optimization approach that uses a CSP to enhance the results of the learning and prediction tasks of HMMs. This approach takes full advantage of both HMMs, with the richness of their algorithms, and CSPs, with their many powerful and efficient solver algorithms. To verify the validity of the model, the proposed approach is evaluated on real-world data using the mean absolute percentage error (MAPE) and other metrics as measures of prediction accuracy. The experiments show that the proposed model outperforms the conventional model: it reduces the MAPE by 0.71% and offers a particularly good trade-off between computational cost and the quality of results for large datasets. It is also competitive with benchmark models in terms of running time and prediction accuracy. Further comparisons substantiate these experimental findings.
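For reference, the MAPE reported in the abstract is the standard mean absolute percentage error. The sketch below shows only that standard formula; it is not the authors' code, and the sample values are hypothetical.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Illustrative values only (not from the paper):
actual = [120.0, 135.0, 150.0, 160.0]
forecast = [118.0, 140.0, 149.0, 166.0]
print(f"MAPE = {mape(actual, forecast):.2f}%")
```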
