Optimizing Resource Efficiency in the Cloud-Edge Continuum

LAHMER, SEYYID AHMED
2025

Abstract

The rise of data-intensive applications such as \gls{ai}, \gls{iot}, and 5G, coupled with the disaggregation of data centers, has placed immense pressure on the network infrastructure. This has led to the cloud-edge continuum, where computational resources span from edge devices to centralized data centers. This thesis addresses the challenge of optimizing resource efficiency across this continuum. First, we explore how \gls{ai} can be leveraged for intelligent orchestration of network resources. Although \gls{ai}-driven orchestration holds great promise, it often overlooks the cost of learning: the resources consumed by \gls{ai} tasks such as training and decision making. To bridge this gap, we introduce a cost-aware resource management model that balances resource allocation efficiency against \gls{ai} performance. This model ensures that \gls{ai} tasks (the learning plane) do not consume resources needed for user-facing services (the data plane). Building on this, the second part addresses execution efficiency to complement intelligent scheduling. Although \gls{ai}-driven orchestration optimizes resource utilization, its gains can be overshadowed by inefficiencies in the CPU-based execution of virtual network functions. As data volumes grow, CPUs struggle to meet processing demands, leading to high per-packet costs in latency, power consumption, and other resources. This challenge highlights the need for programmable hardware such as FPGAs and SmartNICs. Although these specialized accelerators offer high performance, they often lack runtime flexibility, limiting adaptability to changing workloads. To overcome this, we introduce an abstraction layer that virtualizes match tables, a critical component of packet processors. This framework provides runtime configurability and flexibility, enabling scalable and efficient networking systems.
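To make the learning-plane idea concrete, the toy Python sketch below shows one way a node could cap the resources granted to \gls{ai} tasks so that the data plane keeps what it needs. All names and figures (NodeBudget, learning_cap, the CPU numbers) are hypothetical illustrations, not the actual model proposed in the thesis.

# Illustrative only: a toy cost-aware split of a node's CPU budget between
# user-facing services (data plane) and AI tasks (learning plane).
from dataclasses import dataclass

@dataclass
class NodeBudget:
    cpu_total: float      # total CPU share available on the node
    learning_cap: float   # max fraction of the node reserved for the learning plane

    def admit_learning_task(self, data_plane_load: float, task_demand: float) -> bool:
        """Admit an AI task only if the data plane keeps the resources it needs."""
        headroom = self.cpu_total - data_plane_load
        learning_budget = min(headroom, self.learning_cap * self.cpu_total)
        return task_demand <= learning_budget

budget = NodeBudget(cpu_total=16.0, learning_cap=0.25)
print(budget.admit_learning_task(data_plane_load=12.0, task_demand=3.0))  # True
print(budget.admit_learning_task(data_plane_load=14.0, task_demand=3.0))  # False

Similarly, a minimal software analogue of a virtualized match table hints at what runtime configurability means in the second part: the match key and the entries can be changed while the system is running. This is a simplified, assumed illustration; the thesis targets hardware packet processors such as FPGAs and SmartNICs, not a Python dictionary.

# Illustrative only: a software model of a virtualized match table whose
# key fields and entries can be reconfigured at runtime.
class VirtualMatchTable:
    def __init__(self, key_fields):
        self.key_fields = tuple(key_fields)   # e.g. ("dst_ip", "dst_port")
        self.entries = {}                     # match key tuple -> action

    def reconfigure(self, key_fields):
        """Change the match key at runtime; existing entries are dropped."""
        self.key_fields = tuple(key_fields)
        self.entries.clear()

    def insert(self, fields, action):
        key = tuple(fields[f] for f in self.key_fields)
        self.entries[key] = action

    def lookup(self, packet):
        key = tuple(packet[f] for f in self.key_fields)
        return self.entries.get(key, "default_action")

table = VirtualMatchTable(["dst_ip", "dst_port"])
table.insert({"dst_ip": "10.0.0.2", "dst_port": 443}, "forward_port_1")
print(table.lookup({"dst_ip": "10.0.0.2", "dst_port": 443, "src_ip": "10.0.0.9"}))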
Optimizing Resource Efficiency in the Cloud-Edge Continuum / Lahmer, SEYYID AHMED. - (2025 Mar 24).
File in this record:
thesis_Seyyidahmed_LAHMER-1.pdf (open access)
Description: thesis_seyyidahmed_lahmer
Type: Doctoral thesis
Size: 1.69 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3550708