


Subject: Big Data, ETL, Cloud computing, Spot price prediction, ARIMA, Spark


Year: 2021


Type: Article



Title: Cost optimization for big data workloads based on dynamic scheduling and cluster-size tuning


Author: Grzegorowski, Marek
Author: Zdravevski, Eftim
Author: Janusz, Andrzej
Author: Lameski, Petre
Author: Apanowicz, Cas
Author: Ślęzak, Dominik



Abstract: Analytical data processing has become the cornerstone of today's business success, and it is facilitated by Big Data platforms that offer virtually limitless scalability. However, minimizing the total cost of ownership (TCO) for the infrastructure can be challenging. We propose a novel method to build resilient clusters on cloud resources that are fine-tuned to the particular data processing task. The presented architecture follows the infrastructure-as-code paradigm so that the cluster can be dynamically configured and managed. It first identifies the optimal cluster size to perform a job in the required time. Then, by analyzing spot instance price history and using ARIMA models, it optimizes the schedule of the job execution to leverage the discounted prices of the cloud spot market. In particular, we evaluated savings opportunities when using Amazon EC2 spot instances compared to on-demand resources. The performed experiments confirmed that the prediction module significantly improved the cost-effectiveness of the solution: up to 80% savings compared to on-demand prices and, in the worst case, only 1% more cost than the absolute minimum. The production deployments of the architecture show that it is invaluable for minimizing the total cost of ownership of analytical data processing solutions.
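Note: The scheduling idea summarized in the abstract (forecasting spot prices with ARIMA and picking a cheap execution window) could be sketched roughly as below. This is a minimal illustration, not the authors' implementation; the function name cheapest_window, the ARIMA order, and the forecast horizon are illustrative assumptions.

    # Minimal sketch: forecast spot prices with ARIMA and choose the cheapest
    # contiguous window of the required length to schedule a job.
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    def cheapest_window(price_history: pd.Series, job_hours: int, horizon: int = 48):
        """price_history: hourly spot prices; job_hours: expected job duration.
        Returns the timestamp ending the cheapest forecasted window and its mean price.
        The order (2, 1, 2) is a placeholder, not a tuned value."""
        fitted = ARIMA(price_history, order=(2, 1, 2)).fit()
        forecast = fitted.forecast(steps=horizon)
        # Mean forecasted price of every window of length job_hours.
        window_means = forecast.rolling(job_hours).mean().dropna()
        best_end = window_means.idxmin()  # window start is job_hours - 1 steps earlier
        return best_end, window_means.loc[best_end]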


Publisher: Elsevier


Relation: Big Data Research



Identifier: oai:repository.ukim.mk:20.500.12188/21011
Identifier: http://hdl.handle.net/20.500.12188/21011


