Autoscaling Pods on an On-Premise Kubernetes Infrastructure QoS-Aware

Cloud systems and microservices are becoming powerful tools for businesses. The evidence of the advantages of offering infrastructure, hardware or software as a service (IaaS, PaaS, SaaS) is overwhelming. Microservices and decoupled applications are increasingly popular. These architectures, based on containers, have facilitated the efficient development of complex SaaS applications. A big challenge is to manage and design microservices with a massive range of different facilities, from processing and data storage to computing predictive and prescriptive analytics. Computing providers are mainly based on data centers formed of massive and heterogeneous virtualized systems, which are continuously growing and diversifying over time. Moreover, these systems need to integrate with existing systems while meeting Quality of Service (QoS) constraints. The primary purpose of this work is to present an on-premise architecture based on Kubernetes and Docker containers aimed at improving QoS regarding resource usage and service level objectives (SLOs). The main contribution of this proposal is its dynamic autoscaling capabilities to adjust system resources to the current workload while improving QoS.
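As a rough illustration of the kind of pod autoscaling the abstract refers to, the sketch below registers a stock CPU-based HorizontalPodAutoscaler through the official Kubernetes Python client. The Deployment name ("web-api"), namespace, replica bounds and 70% CPU threshold are placeholder assumptions rather than values from the paper, and the QoS/SLO-aware policy the authors propose is not reproduced here.

# Minimal sketch: a plain CPU-based HorizontalPodAutoscaler created with the official
# Kubernetes Python client (pip install kubernetes). The target Deployment, namespace
# and CPU threshold are hypothetical placeholders, not taken from the paper.
from kubernetes import client, config

def create_cpu_hpa(deployment: str = "web-api", namespace: str = "default") -> None:
    config.load_kube_config()  # reuse the local kubeconfig of the on-premise cluster

    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name=f"{deployment}-hpa"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name=deployment
            ),
            min_replicas=1,
            max_replicas=10,
            # Add pods when average CPU usage exceeds 70% of the requested CPU.
            target_cpu_utilization_percentage=70,
        ),
    )
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace=namespace, body=hpa
    )

if __name__ == "__main__":
    create_cpu_hpa()

This is only the built-in horizontal scaling mechanism on which such a deployment would rest; it says nothing about how the paper weighs resource usage against SLOs when deciding to scale.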

Bibliographic Details
Main Authors: Lluis Mas Ruiz, Pere Pinol Pueyo, Jordi Mateo-Fornes, Jordi Vilaplana Mayoral, Francesc Solsona Tehas
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Subjects: Cloud; microservices; Kubernetes; SLO; QoS
Online Access: https://ieeexplore.ieee.org/document/9732997/
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3158743
Published in: IEEE Access, vol. 10, pp. 33083-33094, 2022
Author Affiliations: Department of Computer Science and INSPIRES, University of Lleida, Lleida, Spain (all five authors)
ORCID iDs: Lluis Mas Ruiz (0000-0002-9163-5364); Jordi Mateo-Fornes (0000-0002-8188-4914); Jordi Vilaplana Mayoral (0000-0002-1660-0380); Francesc Solsona Tehas (0000-0002-4830-9184)