Revision as of 15:26, 3 April 2025
Emerging Research Directions
== 7.1 Task and Resource Scheduling ==
* Q. Luo, S. Hu, C. Li, G. Li, and W. Shi, "Resource Scheduling in Edge Computing: A Survey," IEEE Communications Surveys & Tutorials, vol. 23, no. 4, pp. 2131-2165, 2021. doi:10.1109/COMST.2021.3106401. https://ieeexplore.ieee.org/document/9519636
* C. Jiang, T. Fan, H. Gao, W. Shi, L. Liu, C. Cérin, and J. Wan, "Energy Aware Edge Computing: A Survey," Computer Communications, vol. 151, pp. 556-580, 2020. doi:10.1016/j.comcom.2020.01.004. https://www.sciencedirect.com/science/article/pii/S014036641930831X
*: Abstract: Edge computing is an emerging paradigm for the increasing computing and networking demands from end devices to smart things. Edge computing allows the computation to be offloaded from the cloud data centers to the network edge and edge nodes for lower latency, security, and privacy preservation. Although energy efficiency in cloud data centers has been broadly investigated, energy efficiency in edge computing is largely left uninvestigated due to the complicated interactions between edge devices, edge servers, and cloud data centers. In order to achieve energy efficiency in edge computing, a systematic review on energy efficiency of edge devices, edge servers, and cloud data centers is required. In this paper, we survey the state-of-the-art research work on energy-aware edge computing, and identify related research challenges and directions, including architecture, operating system, middleware, applications services, and computation offloading.
* https://onlinelibrary.wiley.com/doi/10.1002/spe.3340
* W. Z. Khan, E. Ahmed, S. Hakak, I. Yaqoob, and A. Ahmed, "Edge Computing: A Survey," Future Generation Computer Systems, vol. 97, pp. 219-235, 2019. doi:10.1016/j.future.2019.02.050. https://www.sciencedirect.com/science/article/pii/S0167739X18319903
*: Abstract: In recent years, the Edge computing paradigm has gained considerable popularity in academic and industrial circles. It serves as a key enabler for many future technologies like 5G, Internet of Things (IoT), augmented reality, and vehicle-to-vehicle communications by connecting cloud computing facilities and services to the end users. The Edge computing paradigm provides low latency, mobility, and location awareness support to delay-sensitive applications. Significant research has been carried out in the area of Edge computing, which is reviewed in terms of latest developments such as Mobile Edge Computing, Cloudlet, and Fog computing, resulting in providing researchers with more insight into the existing solutions and future applications. This article is meant to serve as a comprehensive survey of recent advancements in Edge computing highlighting the core applications. It also discusses the importance of Edge computing in real-life scenarios where response time constitutes the fundamental requirement for many applications. The article concludes with identifying the requirements and discussing open research challenges in Edge computing.
* A. Islam, A. Debnath, M. Ghose, and S. Chakraborty, "A Survey on Task Offloading in Multi-access Edge Computing," Journal of Systems Architecture, vol. 118, 102225, 2021. doi:10.1016/j.sysarc.2021.102225. https://www.sciencedirect.com/science/article/pii/S1383762121001570
*: Abstract: With the advent of new technologies in both hardware and software, we are in need of a new type of application that requires huge computation power and minimal delay. Applications such as face recognition, augmented reality, virtual reality, automated vehicles, and industrial IoT belong to this category. Cloud computing technology is one of the candidates to satisfy the computation requirement of resource-intensive applications running in UEs (User Equipment) as it has ample computational capacity, but the latency requirement for these applications cannot be satisfied by the cloud due to the propagation delay between UEs and the cloud. To solve the latency issues for delay-sensitive applications, a new network paradigm has emerged recently, known as Multi-Access Edge Computing (MEC) (also known as mobile edge computing), in which computation can be done at the network edge of UE devices. To execute the resource-intensive tasks of UEs in the MEC servers hosted in the network edge, a UE device has to offload some of the tasks to MEC servers. Few survey papers talk about task offloading in MEC, but most of them do not have in-depth analysis and classification exclusive to MEC task offloading. In this paper, we provide a comprehensive survey on the task offloading schemes for MEC proposed by many researchers. We also discuss issues, challenges, and future research directions in the area of task offloading to MEC servers.
== 7.2 Edge for AR/VR ==
== 7.3 Vehicle Computing ==
== 7.4 Energy-Efficient Edge Architectures ==
The exponential growth of Internet of Things (IoT) devices, coupled with the emergence of artificial intelligence (AI) and high-speed communication networks (5G/6G), has led to the proliferation of edge computing. In the edge computing paradigm, data processing is distributed away from centralized cloud data centers and relocated closer to the data source or end users. This architectural shift offers benefits such as reduced network latency, efficient bandwidth usage, and real-time analytics. However, distributing processing across a large number of geographically dispersed devices has profound implications for energy consumption. Although large-scale data centers have been the subject of extensive research concerning energy efficiency, smaller edge nodes, including micro data centers, IoT gateways, and embedded systems, also generate significant carbon emissions.
=== High-Level Edge-Fog-Cloud Architecture ===
Figure~\ref{fig:arch} illustrates a conceptual view of an edge-fog-cloud architecture, which underpins many modern IoT and AI-driven systems. Data generated by IoT sensors and devices typically undergo initial processing at edge nodes or micro data centers. This approach reduces the volume of data transmitted to the cloud, alleviating network bottlenecks and latency constraints. Fog nodes may aggregate data from multiple edge nodes for more sophisticated analytics or buffering. Finally, cloud data centers handle large-scale data storage and complex computational tasks that exceed the capacity of edge or fog layers.
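The tiered flow just described can be sketched as a toy placement policy. All task attributes, thresholds, and tier boundaries below are illustrative assumptions, not values from the text or the cited figure.

```python
# Toy placement policy for an edge-fog-cloud hierarchy. The thresholds
# are invented for illustration; real orchestrators also weigh current
# load, network conditions, and data-privacy constraints.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    input_mb: float        # size of the raw data to process
    deadline_ms: float     # latency budget
    compute_units: float   # abstract measure of required compute

def place(task: Task) -> str:
    """Pick the lowest tier that can satisfy the task."""
    if task.compute_units <= 1.0 and task.deadline_ms < 50:
        return "edge"      # local pre-processing near the sensor
    if task.compute_units <= 10.0 and task.input_mb <= 100:
        return "fog"       # aggregation / buffering across edge nodes
    return "cloud"         # large-scale storage and heavy analytics

print(place(Task("filter", 0.5, 20, 0.3)))     # edge
print(place(Task("aggregate", 50, 500, 5)))    # fog
print(place(Task("train", 5000, 10_000, 80)))  # cloud
```

The key point the sketch captures is that escalation to a higher tier trades latency and bandwidth for capacity, which is exactly the trade-off the architecture is designed around.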
=== Lifecycle of an Edge Device ===
Another key consideration in evaluating carbon footprint is the full lifecycle of an edge device, illustrated in Figure~\ref{fig:lifecycle}. Manufacturing often involves significant energy consumption and raw materials. During deployment and operation, issues of energy efficiency and cooling are paramount. Maintenance and updates can prolong device lifespans, whereas end-of-life disposal or recycling raises additional environmental concerns. Each stage presents opportunities to reduce carbon emissions by adopting strategies such as modular upgrades, use of recycled materials, and eco-friendly disposal.
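A minimal sketch of how the lifecycle stages combine into a per-device carbon tally, assuming placeholder figures (the embodied, power, and grid-intensity numbers are invented, not measured values from the text):

```python
# Lifecycle carbon tally for a single edge device: embodied emissions
# from manufacturing, operational emissions over the service life, and
# an end-of-life term. All numeric inputs below are hypothetical.

def lifecycle_emissions_kg(
    embodied_kg: float,      # manufacturing + raw materials
    power_w: float,          # average operating draw
    lifetime_years: float,
    grid_kg_per_kwh: float,  # carbon intensity of the local grid
    end_of_life_kg: float,   # disposal / recycling overhead
) -> float:
    hours = lifetime_years * 365 * 24
    operational_kg = power_w / 1000 * hours * grid_kg_per_kwh
    return embodied_kg + operational_kg + end_of_life_kg

# Extending lifetime from 3 to 5 years amortizes the embodied carbon
# over more useful hours -- the rationale behind modular upgrades.
three_y = lifecycle_emissions_kg(40, 5, 3, 0.4, 2) / 3
five_y = lifecycle_emissions_kg(40, 5, 5, 0.4, 2) / 5
print(round(three_y, 1), round(five_y, 1))  # annualized kg CO2e
```

Under these toy numbers the annualized footprint drops when the device lives longer, which is why maintenance and modular upgrades appear as emission-reduction levers in the lifecycle view.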
=== Hardware-Level Approaches ===
Research on hardware-focused strategies for reducing the carbon footprint at the edge has been extensive. Xu et al.~\cite{Xu2019} examined systems-on-chip (SoCs) designed specifically for energy efficiency, integrating ultra-low-power states and selective core activation. Mendez and Ha~\cite{Mendez2020} evaluated heterogeneous multicore processors for embedded systems, highlighting the benefits of activating only the cores necessary to meet real-time performance requirements. Similarly, the introduction of custom AI accelerators has been shown to yield significant power savings for neural network inference tasks~\cite{Ramesh2022}.
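The "activate only the cores you need" idea can be sketched with a simple sizing calculation; the per-core throughput and core count below are illustrative assumptions, not figures from the cited work:

```python
# Given a workload's required throughput and a per-core throughput,
# power on the minimum number of cores and leave the rest in a
# low-power state. All numbers are hypothetical.

import math

def cores_needed(required_ops_per_s: float, ops_per_core_s: float,
                 max_cores: int) -> int:
    n = math.ceil(required_ops_per_s / ops_per_core_s)
    if n > max_cores:
        raise ValueError("workload exceeds the SoC's capacity")
    return n

# A 4-core SoC at 1e9 ops/s per core serving a 2.5e9 ops/s stream
# needs 3 active cores; the fourth can stay parked.
print(cores_needed(2.5e9, 1e9, 4))  # 3
```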
Bae et al.~\cite{Bae2021} emphasized that sustainable manufacturing practices and the use of recycled materials can reduce the overall lifecycle emissions of edge devices. Kim et al.~\cite{Kim2023} explored biologically inspired materials to enhance heat dissipation at the package level, while Liu and Zhang~\cite{Liu2019} demonstrated that compact liquid-cooling solutions are viable even for micro data centers near the edge.
=== Software-Level Optimizations ===
Energy-aware software design is integral to achieving sustainability in edge computing. Wan et al.~\cite{Wan2018} initiated the discussion on applying dynamic voltage and frequency scaling (DVFS) within edge-based real-time systems. Martinez et al.~\cite{Martinez2021} refined DVFS strategies by incorporating reinforcement learning methods that adaptively tune voltage and frequency according to workload fluctuations, illustrating substantial improvements in power efficiency. On the task scheduling front, Li et al.~\cite{Li2019} proposed multi-objective algorithms to distribute computing workloads among heterogeneous IoT gateways, balancing performance, latency, and energy considerations.
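The core DVFS mechanism can be illustrated with a simple threshold governor. This is a deliberately minimal feedback controller, not the reinforcement-learning policies of Martinez et al.; the frequency levels and utilization thresholds are invented for the sketch:

```python
# Minimal DVFS-style governor: step the frequency up when utilization
# is high (deadlines at risk), down when it is low (power to save).
# P-state list and thresholds are hypothetical.

LEVELS_MHZ = [600, 1200, 1800, 2400]

def next_level(current_idx: int, utilization: float) -> int:
    """Return the index of the next frequency level to use."""
    if utilization > 0.85 and current_idx < len(LEVELS_MHZ) - 1:
        return current_idx + 1   # busy: speed up
    if utilization < 0.40 and current_idx > 0:
        return current_idx - 1   # idle headroom: slow down
    return current_idx

# A bursty workload ramps the clock up, then lets it decay.
idx = 1
for util in [0.9, 0.95, 0.5, 0.2, 0.1]:
    idx = next_level(idx, util)
print(LEVELS_MHZ[idx])  # 1200
```

Learning-based approaches like those cited replace the fixed thresholds with a policy trained to anticipate workload fluctuations rather than merely react to them.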
Partial offloading techniques have also gained traction, particularly in AI inference. Zhang et al.~\cite{Zhang2022} presented a partitioning mechanism whereby only computationally heavy layers of a neural network are offloaded to specialized infrastructure, while simpler layers run on the edge device. Hassan et al.~\cite{Hassan2021} and Moreno et al.~\cite{Moreno2023} examined lightweight containerization at the edge, demonstrating that resource overhead can be minimized through optimized container runtimes such as Docker, containerd, and CRI-O.
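The partitioning idea can be sketched as choosing a single split point in a sequential model: early, cheap layers run on-device and everything after the cut is offloaded. Layer names and FLOP counts below are invented for illustration; the cited mechanism is more sophisticated:

```python
# Choose a split point in a sequential network: keep layers local until
# the device's compute budget is exhausted, offload the rest. The model
# description and budget are hypothetical.

def split_point(layers, device_budget_mflops):
    """Index of the first layer to offload; earlier layers run locally."""
    used = 0.0
    for i, (name, mflops) in enumerate(layers):
        if used + mflops > device_budget_mflops:
            return i
        used += mflops
    return len(layers)  # everything fits on-device

model = [("conv1", 20), ("conv2", 150), ("conv3", 300), ("fc", 10)]
cut = split_point(model, device_budget_mflops=180)
local = [n for n, _ in model[:cut]]
remote = [n for n, _ in model[cut:]]
print(local, remote)  # ['conv1', 'conv2'] ['conv3', 'fc']
```

A production partitioner would also account for the size of the intermediate activations crossing the cut, since transmitting a large feature map can cost more energy than computing the layer locally.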
=== System-Level Coordination and Policy Frameworks ===
A holistic perspective that spans hardware, network, and orchestration layers has been pivotal in advancing carbon footprint reduction. Chiang et al.~\cite{Chiang2018} and Yang and Li~\cite{Yang2023} introduced integrated edge-fog-cloud architectures, showing how workload migration across geographically distributed nodes can leverage variations in carbon intensity. Qiu et al.~\cite{Qiu2020} and Nguyen et al.~\cite{Nguyen2021} developed adaptive networking protocols to reduce base-station energy consumption, such as utilizing sleep modes during off-peak periods or coordinating workload consolidation across neighboring edge gateways. These system-wide efforts are increasingly driven by AI-based methods, where machine learning algorithms predict resource utilization or carbon intensity to trigger proactive power management~\cite{Tang2019,He2022}.
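The carbon-intensity-driven migration described above can be sketched as a placement choice over sites: send a deferrable job wherever grid intensity times per-job energy is lowest. Site names and all numbers below are hypothetical:

```python
# Carbon-aware placement: pick the site minimizing the job's emissions,
# i.e. grid carbon intensity (gCO2/kWh) times the site's energy per job
# (kWh). Sites and figures are invented for illustration.

def pick_site(grid_gco2_per_kwh, kwh_per_job):
    """Return (site name, grams of CO2 the job would emit there)."""
    name = min(grid_gco2_per_kwh,
               key=lambda s: grid_gco2_per_kwh[s] * kwh_per_job[s])
    return name, grid_gco2_per_kwh[name] * kwh_per_job[name]

grid = {"us-east": 420, "eu-north": 45, "ap-south": 700}
energy = {"us-east": 0.8, "eu-north": 1.0, "ap-south": 0.6}
site, grams = pick_site(grid, energy)
print(site, grams)  # eu-north 45.0
```

Note that the cleanest grid wins here even though its per-job energy is the highest, which is why carbon-aware schedulers track intensity and efficiency jointly rather than optimizing energy alone.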
Policy and regulation also play a crucial role. White et al.~\cite{White2020} underscored the need for standardized carbon footprint metrics in edge infrastructures, while Gao et al.~\cite{Gao2023} examined regional regulations enforcing minimum energy efficiency levels for gateways and micro data centers. Johnson et al.~\cite{Johnson2021} explored how carbon credits or tax benefits can incentivize low-power chipset adoption, and Schaefer et al.~\cite{Schaefer2022} investigated the impact of green certifications on consumer purchasing behaviors. Devic et al.~\cite{Devic2024} integrated eco-design principles, such as modular battery packs and real-time energy monitoring, to extend hardware life and reduce e-waste.