User:Farhan: Difference between revisions

From Edge Computing Wiki
='''Edge Computing Architecture and Layers'''=
==='''Introduction'''===
Edge computing is a paradigm that moves computation close to where data is generated in order to reduce latency and bandwidth consumption. Unlike traditional cloud-based systems, which rely on centralized data centers for processing, edge computing distributes computational capacity across devices and local nodes at the network edge. This architectural shift is essential for real-time applications with stringent high-speed processing demands, such as autonomous vehicles, health monitoring, and industrial automation.


The edge computing infrastructure is composed of multiple layers, each playing its own role. At the ground level, edge devices such as sensors and actuators gather data and perform initial processing. Above them, fog computing forms an intermediary layer that consolidates data from multiple edge nodes, pre-processes it, and forwards it to the cloud. The cloud, at the top, handles heavy computation and long-term data storage.


The confluence of IoT, mobile computing, digital twins, and cloud infrastructure creates a robust edge computing architecture. This chapter presents the basic building blocks and layers of edge computing: IoT and digital twins, cloud infrastructure, fog computing, and the edge-cloud continuum.
----
=='''2.1 IoT, Mobile, and Digital Twins'''==
==='''Internet of Things (IoT)'''===


The Internet of Things (IoT) is a network of physical objects that communicate over the internet. Examples range from everyday consumer devices such as smart thermostats to industrial equipment, wearable health trackers, and components of urban infrastructure. IoT enables greater connectivity and automation through continuous data generation and analysis.


==='''Key Features of IoT:'''===
#'''Connectivity:''' Uses protocols such as Wi-Fi, Bluetooth, Zigbee, and LoRaWAN to transmit data.
#'''Data Processing:''' Employs cloud and edge computing platforms to process the data gathered by sensors.
#'''Automation:''' Employs AI-driven or rule-based systems for autonomous decision-making.
#'''Examples:''' Smart homes, healthcare wearables, industrial IoT (IIoT), smart cities, and connected vehicles.
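The interplay of these features can be sketched in a few lines of Python. The example below is purely illustrative: the simulated sensor, the threshold, and the action names are hypothetical stand-ins for a real device's connectivity, processing, and automation logic.

```python
import random
import statistics

def read_temperature_sensor():
    """Simulate one reading from a hypothetical IoT temperature sensor."""
    return random.uniform(18.0, 30.0)

def rule_based_controller(reading, threshold=25.0):
    """Rule-based automation: decide an action from a single reading."""
    return "cooling_on" if reading > threshold else "cooling_off"

# Collect a batch of readings and summarize locally before transmission,
# so only a compact record (not every raw sample) crosses the network.
readings = [read_temperature_sensor() for _ in range(10)]
summary = {
    "mean": round(statistics.mean(readings), 2),
    "max": round(max(readings), 2),
    "action": rule_based_controller(max(readings)),
}
print(summary)
```

The local summarization step is what distinguishes an edge-style IoT node from a device that simply streams every raw sample upstream.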


==='''Mobile Cloud Computing (MCC)'''===
Mobile Cloud Computing (MCC) integrates mobile devices with cloud computing to extend processing capability and storage. The paradigm enables computation-intensive tasks to be offloaded from mobile devices to cloud servers, conserving battery life while improving performance.


==='''Benefits of MCC:'''===
*'''Extended Battery Life:''' Offloading processing tasks to the cloud reduces local energy consumption.
*'''Higher Processing Power:''' Supports complex applications such as AI and machine learning.
*'''Scalability:''' Dynamically scales to accommodate fluctuating user demand.
*'''Real-Time Access:''' Provides access to cloud-hosted applications from anywhere.
*'''Applications:''' Mobile AI assistants (such as Google Assistant), cloud gaming, and AR/VR applications.
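The offloading trade-off at the heart of MCC can be sketched as a simple heuristic: offload when upload time plus cloud compute time beats local compute time, or when the battery is low. All device speeds and thresholds below are invented for illustration, not measurements of any real platform.

```python
def should_offload(task_cycles, battery_pct, uplink_mbps, payload_mb,
                   edge_mips=500, cloud_mips=50_000):
    """Toy MCC offloading heuristic: returns True when the task should
    be sent to the cloud. All capacity figures are illustrative."""
    local_time = task_cycles / (edge_mips * 1e6)          # seconds on device
    upload_time = payload_mb * 8 / uplink_mbps            # seconds to upload
    remote_time = upload_time + task_cycles / (cloud_mips * 1e6)
    if battery_pct < 20:
        return True            # preserve battery regardless of timing
    return remote_time < local_time

# A heavy task on a good link is offloaded; a trivial task stays local.
print(should_offload(5e9, 80, 50, 2))   # → True
print(should_offload(1e6, 80, 50, 2))   # → False
```

Real offloading frameworks also weigh energy per transmitted byte and link variability, but the structure of the decision is the same.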


==='''Digital Twins'''===
===='''What Is a Digital Twin?'''====
A digital twin is a virtual replica of a physical object, system, or process that is continuously updated with real-time data. It uses sensors, connectivity, and analytics to mirror and simulate the behavior and performance of its physical counterpart. This enables organizations to monitor, diagnose, predict, and optimize operations virtually before applying changes in the real world.

===='''How Digital Twins Relate to IoT'''====
The Internet of Things plays a foundational role in enabling digital twin technology. IoT devices such as sensors, actuators, smart meters, and wearables collect data from the physical environment and stream it into the digital twin model in real time. The connection works as follows:
#'''Data Collection:''' IoT sensors gather metrics such as temperature, vibration, pressure, humidity, or motion.
#'''Data Transmission:''' These metrics are transmitted over a network to a central system.
#'''Real-Time Sync:''' The digital twin receives this data and updates its model, reflecting the real-time status of the asset.
#'''Insight Generation:''' Analytics, AI/ML models, and simulations applied to the digital twin enable predictive maintenance, performance tuning, and scenario testing without affecting the physical system.

===='''Practical Use Cases'''====
*'''Smart Manufacturing:''' IoT-enabled machinery is mirrored by digital twins to optimize workflows, reduce downtime, and simulate design changes.
*'''Smart Cities:''' Digital twins of buildings, transport systems, and power grids leverage data from IoT sensors to improve efficiency and public safety.
*'''Healthcare:''' Wearables and connected medical devices provide biometric data to patient-specific digital twins for personalized treatment simulations.

===='''Benefits:'''====
* Real-time monitoring
* Predictive analytics and maintenance
* Virtual experimentation
* Enhanced decision-making

This foundation makes it easy to later explore where digital twins should be deployed (edge vs. cloud) based on '''latency, bandwidth, data sensitivity''', and '''computational demand'''.

In edge computing, digital twins run at edge nodes to process information locally, minimizing the need to send large volumes of data to the cloud. This configuration is especially valuable in cases requiring real-time analytics, such as industrial automation and healthcare monitoring.
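The collect-transmit-sync-analyze loop described above can be sketched as a small Python class. The asset name, sensor fields, and vibration threshold are hypothetical; a production twin would mirror far richer state and run real models.

```python
class DigitalTwin:
    """Minimal digital-twin sketch: mirrors a physical asset's state from
    streamed IoT readings and flags anomalies for predictive maintenance."""

    def __init__(self, asset_id, vibration_limit=7.0):
        self.asset_id = asset_id
        self.vibration_limit = vibration_limit  # illustrative threshold
        self.state = {}                         # latest mirrored values
        self.alerts = []

    def sync(self, reading):
        """Real-time sync step: merge one IoT reading into the model,
        then run a trivial 'insight generation' check on the new state."""
        self.state.update(reading)
        if self.state.get("vibration_mm_s", 0) > self.vibration_limit:
            self.alerts.append(f"{self.asset_id}: vibration above limit")

twin = DigitalTwin("pump-01")
twin.sync({"temperature_c": 61.5, "vibration_mm_s": 4.2})
twin.sync({"vibration_mm_s": 8.9})   # anomaly triggers an alert
print(twin.state, len(twin.alerts))
```

Running this twin on an edge node keeps the raw sensor stream local; only alerts or periodic summaries would need to travel to the cloud.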


'''Diagram 1: Digital Twin Edge Network Architecture'''
[[File:Intro_to_edge.png|500px|thumb|center|Real-Time Edge Adoption]]


This diagram illustrates a layered architecture where virtual twins communicate with physical IoT devices through edge nodes. It showcases intelligent transportation systems, 6G networks, and IoT applications as part of a hierarchical edge-cloud framework.


----


=='''2.2 Cloud'''==
 
Cloud computing is a model for providing ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources, such as networks, servers, storage, applications, and services, that can be rapidly provisioned and released with minimal management effort. It gives users access to high-capacity computational resources without owning or maintaining physical infrastructure.

===='''Why the Continuum Will Dominate the Future'''====

'''1. Proximity and Real-Time Responsiveness'''
Edge computing brings computation closer to data sources, which is ideal for latency-sensitive applications such as autonomous vehicles, augmented reality, and real-time video analytics. The continuum optimizes where computation occurs (at the edge, near the edge, or in the cloud) based on performance needs.
'''2. Scalability Across Diverse Infrastructures'''
From smart homes to industrial automation to smart cities, edge devices vary widely in capabilities. The continuum provides a unified model to scale applications from small edge sensors to large cloud-hosted AI models.
 
'''3. Flexibility and Adaptability'''
Workloads can dynamically shift based on resource availability, cost, network conditions, or privacy constraints. For instance, video streams can be pre-processed at the edge and fully analyzed in the cloud only when needed, reducing bandwidth usage and improving efficiency.
 
'''4. Support for Advanced Use Cases'''
Emerging applications such as connected autonomous systems, Industry 4.0, and real-time federated learning rely on the interplay between edge and cloud. The continuum supports these by allowing AI, analytics, and orchestration tools to operate fluidly across environments.
 
'''5. Improved Security, Privacy, and Data Sovereignty'''
The continuum allows sensitive data to be processed locally, enforcing data residency laws and reducing exposure to breaches. Meanwhile, global-scale analytics can be performed on anonymized or aggregated data in the cloud.
 
'''6. Optimized Resource Utilization and Cost Efficiency'''
The continuum benefits from statistical multiplexing, allowing workloads to be scheduled and shifted to underutilized resources whether at the edge or in the cloud, enhancing overall utilization and reducing operational costs.
 
===='''Technical Innovations:'''====
*'''Federation Standards (e.g., IEEE 2302-2021):''' Enable trust and interoperability across diverse environments.
*'''Zero-Trust Security Models:''' Secure workloads dynamically across distributed nodes.
*'''Microservices and Containers:''' Allow modular, portable deployments across cloud and edge.
*'''Orchestration Platforms (e.g., Kubernetes and KubeEdge):''' Manage application components across the continuum.
 
===='''Broader Impact and Vision'''====
The edge–cloud continuum supports global digital transformation. It’s essential to sectors like '''smart healthcare, national disaster response systems, autonomous transportation, and cyber-physical systems,''' all of which need '''reliability, responsiveness, and scalability''' that neither cloud nor edge can provide alone.


===='''Characteristics of Cloud Computing:'''====
#'''On-Demand Self-Service:''' Users can independently access computing resources as needed, without requiring human intervention from the service provider.
#'''Broad Network Access:''' Cloud services are accessible over the network and support heterogeneous client platforms such as mobile phones, laptops, and workstations.
#'''Resource Pooling:''' The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model.
#'''Rapid Elasticity:''' Capabilities can be elastically provisioned and released to scale rapidly with demand.
#'''Measured Service:''' Resource usage is metered, controlled, and reported, providing transparency for both the provider and the consumer.


===='''Cloud Service Models:'''====
*'''Infrastructure as a Service (IaaS):''' Offers fundamental computing resources such as virtual machines, storage, and networks (e.g., AWS EC2, Google Compute Engine).
*'''Platform as a Service (PaaS):''' Provides environments to develop, test, and deploy applications, abstracting the underlying infrastructure so developers can focus on application development (e.g., AWS Elastic Beanstalk, Heroku).
*'''Software as a Service (SaaS):''' Delivers applications over the internet on a subscription basis, eliminating the need to install or maintain software (e.g., Microsoft 365, Salesforce).


===='''Deployment Models:'''====
#'''Public Cloud:''' Operated by third-party providers, offering services over the public internet. Suitable for low-security needs and high scalability.
#'''Private Cloud:''' Dedicated to a single organization, providing enhanced control and security. Hosted either on-premises or by a third party.
#'''Hybrid Cloud:''' Combines public and private clouds, enabling data and application sharing between them.
#'''Multi-Cloud:''' Uses services from multiple cloud providers to avoid vendor lock-in and increase reliability.


===='''Challenges in Cloud Computing:'''====
*'''Latency Issues:''' Cloud-based processing can introduce delays, especially in real-time applications.
*'''Data Privacy Concerns:''' Transmitting sensitive data to the cloud raises potential security risks.
*'''Bandwidth Costs:''' Frequent data transfer between devices and the cloud can incur significant costs.
*'''Downtime and Outages:''' Service disruptions in cloud infrastructure can affect availability.
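The bandwidth-cost concern can be made concrete with a back-of-the-envelope calculation. The data volumes and the per-gigabyte egress price below are assumptions chosen for illustration, not quoted rates from any provider.

```python
# Streaming raw sensor data to the cloud vs. sending edge-filtered
# summaries. All figures are illustrative assumptions.
RAW_GB_PER_DAY = 50        # assumed raw sensor/camera stream per site
FILTERED_GB_PER_DAY = 0.5  # assumed volume after edge-side filtering
EGRESS_USD_PER_GB = 0.09   # assumed network transfer price

def monthly_cost(gb_per_day, price=EGRESS_USD_PER_GB, days=30):
    """Monthly transfer cost for a constant daily data volume."""
    return gb_per_day * days * price

raw = monthly_cost(RAW_GB_PER_DAY)
filtered = monthly_cost(FILTERED_GB_PER_DAY)
print(round(raw, 2), round(filtered, 2), round(raw / filtered))
# → 135.0 1.35 100
```

Under these assumptions, filtering at the edge cuts transfer cost by two orders of magnitude, which is the economic argument behind edge pre-processing.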


==='''Cloud-Edge Integration:'''===


Edge computing complements cloud computing by performing computation closer to the data source while leveraging the cloud for deeper analytics and long-term storage. In autonomous vehicles, for example, real-time tasks such as obstacle detection are computed locally, while training data for machine-learning models is sent to the cloud.


'''Diagram 2: Edge Computing Architecture'''
[[File:Diagram_2_Edge_Computing_Architecture.png|500px|thumb|center|Real-Time Edge Adoption]]

This diagram represents the layered structure of edge computing, highlighting how the cloud layer interacts with network and application edges. The cloud primarily handles heavy computation and data storage, while edge devices focus on local data processing.
----


=='''2.3 Edge and Fog'''==
===='''Edge Computing:'''====
Edge computing processes data at or close to where it is generated, minimizing the latency of transmitting data to centralized cloud servers. It is critical for applications that need real-time data processing and response, including autonomous vehicles, industrial automation, and medical monitoring.


#'''Reduced Latency:''' Processing data locally avoids the delay of transmitting it over long distances.
#'''Improved Data Privacy:''' Personal data is processed closer to the source, reducing the risk of exposure.
#'''Bandwidth Efficiency:''' Only processed data or key information is transmitted to the cloud, which minimizes data traffic.
#'''High Reliability:''' Edge devices can operate independently even when network connectivity with the cloud is severed.
===='''Applications of Edge Computing:'''====
*'''Autonomous Vehicles:''' Real-time data processing for navigation and obstacle avoidance.
*'''Smart Cities:''' Adaptive lighting networks and traffic monitoring.
*'''Healthcare:''' Real-time remote monitoring equipment to analyze patient information.
*'''Industrial IoT:''' Predictive maintenance through sensor data.


==='''Fog Computing:'''===
Fog computing extends edge computing by creating a middle layer between the cloud and the edge devices. It enables data collection, computation, and decision-making to be performed nearer to the data source, but not on the device itself.


===='''Features of Fog Computing:'''====
#'''Intermediate Processing:''' Relieves edge devices of some processing tasks before data reaches the cloud.
#'''Data Aggregation:''' Combines data from various sources to eliminate redundancy.
#'''Enhanced Scalability:''' Enables massive IoT networks by distributing computation tasks.
#'''Real-Time Analytics:''' Processes data in real time without depending on distant servers.
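The aggregation role of a fog node can be sketched in a few lines of Python: merge per-device readings from many edge nodes into one compact summary per metric before anything is forwarded to the cloud. The device names and metrics are hypothetical.

```python
from collections import defaultdict
from statistics import mean

def aggregate_at_fog(edge_reports):
    """Fog-layer aggregation sketch: collapse many raw edge reports
    into one summary record per metric, eliminating redundancy."""
    grouped = defaultdict(list)
    for report in edge_reports:
        grouped[report["metric"]].append(report["value"])
    return {m: {"mean": round(mean(v), 2), "count": len(v)}
            for m, v in grouped.items()}

reports = [
    {"device": "s1", "metric": "temp_c", "value": 21.0},
    {"device": "s2", "metric": "temp_c", "value": 23.0},
    {"device": "s3", "metric": "humidity", "value": 40.0},
]
print(aggregate_at_fog(reports))
# → {'temp_c': {'mean': 22.0, 'count': 2}, 'humidity': {'mean': 40.0, 'count': 1}}
```

Three raw records become two summaries here; across thousands of sensors the same pattern is what makes fog layers scale.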


===='''Comparison: Edge vs. Fog:'''====
*'''Processing Location:''' Edge processes directly on devices; fog on intermediate nodes.
*'''Latency:''' Edge has ultra-low latency; fog has low but comparatively higher latency.
*'''Scalability:''' Fog is more scalable because of its capacity for data aggregation.
*'''Data Aggregation:''' Edge deals with individual data, while fog consolidates data from multiple sources.


'''Diagram 3: Edge-Fog-Cloud Network Architecture'''
[[File:Diagram_3_Edge_Fog_Cloud_Network_Architecture.png|500px|thumb|center|Real-Time Edge Adoption]]

The diagram shows the interplay between cloud, fog, and edge layers, emphasizing the data flow from localized edge processing through fog nodes to the centralized cloud for deeper analytics.
----


=='''2.4 Edge-Cloud Continuum'''==
The edge-cloud continuum is an adaptive, dynamic computational paradigm that balances workloads across edge, fog, and cloud. By integrating these environments, applications can shift between local and remote processing depending on workload, latency constraints, and network status.


===='''Core Functions:'''====
#'''Dynamic Workload Distribution:''' Real-time tasks are processed at the edge, while less critical operations are pushed to the cloud.
#'''Real-Time Data Processing:''' Edge devices manage time-sensitive data, such as sensor readings, without sending it to remote servers.
#'''Data Filtering and Aggregation:''' Fog nodes filter data before forwarding it to the cloud, minimizing data transfer.
#'''Scalable and Elastic Resource Management:''' Allocates resources adaptively according to computational demand.
#'''Security and Privacy Management:''' Implements localized data encryption and access control.
#'''Interoperability:''' Facilitates smooth interaction between edge, fog, and cloud systems through intelligent orchestration.
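The workload-distribution function above can be sketched as a toy placement rule that picks a tier from latency budget, data sensitivity, and compute demand. The tier names, thresholds, and the 0-1 demand scale are illustrative; real orchestrators (e.g., Kubernetes-based schedulers) use far richer signals.

```python
def place_workload(latency_budget_ms, data_sensitive, cpu_demand):
    """Toy edge-cloud continuum scheduler: choose a tier for one task.
    cpu_demand is a normalized 0-1 estimate; all rules are illustrative."""
    if latency_budget_ms < 20 or data_sensitive:
        return "edge"    # hard real-time or private data stays local
    if cpu_demand > 0.7:
        return "cloud"   # heavy analytics go to the data center
    return "fog"         # everything else lands on intermediate nodes

print(place_workload(10, False, 0.9))    # → edge (tight latency wins)
print(place_workload(500, False, 0.9))   # → cloud (heavy but tolerant)
print(place_workload(500, False, 0.3))   # → fog
```

Note the precedence: latency and privacy constraints override compute demand, mirroring the principle that the continuum places work where its hardest constraint can be met.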


===='''Applications:'''====
*'''Smart Grids:''' Real-time monitoring and load balancing between local power sources and centralized control.
*'''Smart Agriculture:''' Local analysis of environmental data, with broader data analytics performed in the cloud.
*'''Healthcare Monitoring:''' Real-time patient monitoring at the edge, with historical analysis in the cloud.


By blending edge and cloud processing, the continuum provides a flexible infrastructure that adapts to real-time requirements while maintaining efficient long-term data management.

Latest revision as of 08:59, 25 April 2025

Edge Computing Architecture and Layers[edit]

Introduction[edit]

Edge computing is a new model that locates computer processing near the source of data generation in an effort to restrict latency and bandwidth consumption. Edge computing is distinct from traditional cloud-based systems with centralized data centers employed for processing data in that computational power is spread across devices and local nodes within the network at the edge. The architectural change is a necessity in real-time applications with stringent high-speed data processing demands like autonomous cars, health monitoring, and industrial automation.

The infrastructure of edge computing consists of various layers, and each one performs its function. At the ground level, there are edge devices like sensors and actuators that collect data and perform the initial processing. Then there is fog computing, which is an intermediate layer between various edge nodes, collects data, pre-processes, and transmits it to the cloud. The upper cloud performs the heavy computation and long-term storage of data.

The confluence of IoT, mobile computing, digital twins, and cloud infrastructure creates a robust edge computing architecture. We present in this chapter the basic building blocks and layers of edge computing, such as IoT and digital twins, cloud infrastructure, fog computing, and the edge-cloud continuum.


2.1 IoT, Mobile, and Digital Twins[edit]

Internet of Things (IoT)[edit]

The Internet of Things (IoT) is a network of physical objects that interact via the internet. Examples range from common consumer devices such as smart thermostats, manufacturing equipment, health wearable sensors, and pieces of urban infrastructure. IoT provides additional connectivity and automation by way of ongoing data creation and analysis.

Key Features of IoT:[edit]

  1. Connectivity: Implements protocols like Wi-Fi, Bluetooth, Zigbee, and LoRaWAN for data transmission,
  2. Data Processing: Employs cloud and edge computing platforms to process the data that is gathered by sensors,
  3. Automation: Employs AI-driven or rule-based systems to make decisions autonomously,
  4. Examples: Smart homes, wearables in healthcare, industrial IoT (IIoT), smart cities, and connected vehicles.

Mobile Cloud Computing (MCC)[edit]

Mobile Cloud Computing (MCC) combines cloud computing and mobile phones to maximize processing power and storage. The system enables computation-intensive processing to be offloaded from mobile phones to cloud servers, thereby conserving battery life while improving performance.

Benefits of MCC:

  • Extended Battery Life: Offloading processing tasks to the cloud reduces local energy consumption.
  • Higher Processing Power: Enables sophisticated applications such as AI and machine learning.
  • Scalability: Adjusts dynamically to changing user loads.
  • Real-Time Access: Provides access to cloud-hosted applications from remote locations.
  • Applications: Cloud-hosted AI personal assistants (e.g., Google Assistant), cloud gaming, and AR/VR applications.
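The offloading trade-off behind these benefits can be sketched as a simple energy comparison: offload a task when transmitting its input costs the device less energy than computing locally. The per-cycle and per-byte energy figures below are hypothetical placeholders, not measurements.

```python
# Sketch of an MCC offloading decision: run a task locally or offload it
# to the cloud, whichever costs less device energy. All energy constants
# are illustrative assumptions.

def should_offload(task_cycles: float,
                   local_joules_per_cycle: float = 1e-9,
                   tx_bytes: float = 0.0,
                   joules_per_byte: float = 1e-7) -> bool:
    """Offload when transmitting the input costs less energy than computing locally."""
    local_energy = task_cycles * local_joules_per_cycle
    offload_energy = tx_bytes * joules_per_byte
    return offload_energy < local_energy

# A heavy task with a small input favors offloading...
print(should_offload(task_cycles=5e9, tx_bytes=100_000))    # True
# ...while a light task with a large input favors local execution.
print(should_offload(task_cycles=1e6, tx_bytes=1_000_000))  # False
```

Real MCC schedulers also weigh latency and network conditions, but the same cost-comparison structure applies.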

Digital Twins

What Is a Digital Twin?

A Digital Twin is a virtual replica of a physical object, system, or process that is continuously updated with real-time data. It uses sensors, connectivity, and analytics to mirror and simulate the behavior and performance of its physical counterpart. This enables organizations to monitor, diagnose, predict, and optimize operations virtually before applying changes in the real world.

How Digital Twins Relate to IoT

The Internet of Things (IoT) plays a foundational role in enabling Digital Twin technology. IoT devices—such as sensors, actuators, smart meters, and wearables—collect data from the physical environment and stream it into the digital twin model in real time. Here’s how the connection works:

  1. Data Collection: IoT sensors gather metrics such as temperature, vibration, pressure, humidity, or motion.
  2. Data Transmission: These metrics are transmitted over a network to a central system.
  3. Real-Time Sync: The digital twin receives this data and updates its model to reflect the real-time status of the asset.
  4. Insight Generation: Analytics, AI/ML models, and simulations applied to the digital twin enable predictive maintenance, performance tuning, and scenario testing without affecting the physical system.
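The steps above can be sketched as a minimal twin object that mirrors incoming sensor readings and applies a simple predictive-maintenance rule. The class name, field names, and vibration threshold are hypothetical, chosen only to illustrate the sync-and-analyze loop.

```python
# Minimal digital-twin sketch: the twin keeps a mirrored state that is
# updated from streamed IoT readings, and a simple analytic flags when
# the asset drifts out of its normal operating range. Field names and
# thresholds are illustrative assumptions.

class PumpTwin:
    def __init__(self, max_vibration: float = 5.0):
        self.state = {}                  # mirrored real-time state
        self.max_vibration = max_vibration

    def sync(self, reading: dict) -> None:
        """Real-time sync: update the virtual model from a sensor reading."""
        self.state.update(reading)

    def needs_maintenance(self) -> bool:
        """Insight generation: predictive-maintenance rule on the mirrored state."""
        return self.state.get("vibration_mm_s", 0.0) > self.max_vibration

twin = PumpTwin()
twin.sync({"temperature_c": 61.2, "vibration_mm_s": 3.1})
print(twin.needs_maintenance())  # False
twin.sync({"vibration_mm_s": 7.8})
print(twin.needs_maintenance())  # True
```

Production twins replace the rule with simulations or ML models, but the structure (continuous sync, then analysis on the virtual copy) is the same.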

Practical Use Cases

  • Smart Manufacturing: IoT-enabled machinery is mirrored by digital twins to optimize workflows, reduce downtime, and simulate design changes.
  • Smart Cities: Digital twins of buildings, transport systems, or power grids leverage data from IoT sensors to improve efficiency and public safety.
  • Healthcare: Wearables and connected medical devices provide biometric data to patient-specific digital twins for personalized treatment simulations.

Benefits:

  • Real-Time Monitoring
  • Predictive Analytics & Maintenance
  • Virtual Experimentation
  • Enhanced Decision-Making

This foundation makes it easy to later explore where digital twins should be deployed (edge vs cloud), based on latency, bandwidth, data sensitivity, and computational demand.


Digital twins are virtual representations of physical objects that are continuously updated with real-time data in order to replicate, predict, and optimize the behavior of their physical counterparts. They are increasingly applied in manufacturing, healthcare, and smart city projects, where they enable predictive maintenance and systems optimization.

Digital twins bridge physical devices and virtual spaces. In edge computing, digital twins run on edge nodes to process information locally, minimizing the need to send large volumes of data to the cloud. This configuration is particularly suited to cases requiring real-time analytics, such as industrial automation and healthcare monitoring.

Diagram 1: Digital Twin Edge Network Architecture


This diagram illustrates a layered architecture where virtual twins communicate with physical IoT devices through edge nodes. It showcases intelligent transportation systems, 6G networks, and IoT applications as part of a hierarchical edge-cloud framework.


2.2 Cloud

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources, such as networks, servers, storage, applications, and services, that can be rapidly provisioned and released with minimal administrative effort. It gives users access to large-scale computational power without the need to own and maintain physical infrastructure.

Why the Continuum Will Dominate the Future

1. Proximity and Real-Time Responsiveness: Edge computing brings computation closer to data sources, which is ideal for latency-sensitive applications such as autonomous vehicles, augmented reality, and real-time video analytics. The continuum optimizes where computation occurs (at the edge, near the edge, or in the cloud) based on performance needs.

2. Scalability Across Diverse Infrastructures: From smart homes to industrial automation to smart cities, edge devices vary widely in capability. The continuum provides a unified model to scale applications from small edge sensors to large cloud-hosted AI models.

3. Flexibility and Adaptability: Workloads can shift dynamically based on resource availability, cost, network conditions, or privacy constraints. For instance, video streams can be pre-processed at the edge and fully analyzed in the cloud only when needed, reducing bandwidth usage and improving efficiency.

4. Support for Advanced Use Cases: Emerging applications such as connected autonomous systems, Industry 4.0, and real-time federated learning rely on the interplay between edge and cloud. The continuum supports these by allowing AI, analytics, and orchestration tools to operate fluidly across environments.

5. Improved Security, Privacy, and Data Sovereignty: The continuum allows sensitive data to be processed locally, enforcing data residency laws and reducing exposure to breaches. Meanwhile, global-scale analytics can be performed on anonymized or aggregated data in the cloud.

6. Optimized Resource Utilization and Cost Efficiency: The continuum benefits from statistical multiplexing, allowing workloads to be scheduled and shifted to underutilized resources, whether at the edge or in the cloud, enhancing overall utilization and reducing operational costs.

Technical Innovations:

  • Federation Standards (e.g., IEEE 2302-2021): Enable trust and interoperability across diverse environments.
  • Zero-Trust Security Models: Secure workloads dynamically across distributed nodes.
  • Microservices and Containers: Allow modular, portable deployments across cloud and edge.
  • Orchestration Platforms (e.g., Kubernetes and KubeEdge): Manage application components across the continuum.

Broader Impact and Vision

The edge–cloud continuum supports global digital transformation. It’s essential to sectors like smart healthcare, national disaster response systems, autonomous transportation, and cyber-physical systems, all of which need reliability, responsiveness, and scalability that neither cloud nor edge can provide alone.

Characteristics of Cloud Computing:

  1. On-Demand Self-Service: Consumers can provision computing capabilities on their own, without requiring human interaction with the provider.
  2. Broad Network Access: Services are available over the network and support heterogeneous client platforms such as laptops, mobile phones, and workstations.
  3. Resource Pooling: The provider's computing resources are pooled to serve multiple consumers using a multi-tenant model.
  4. Rapid Elasticity: Capabilities can be elastically provisioned and released to scale rapidly with demand.
  5. Measured Service: Resource usage is metered, controlled, and reported, providing transparency for both the provider and the consumer.

Cloud Service Models:

  • Infrastructure as a Service (IaaS): Offers raw computing resources such as virtual machines, storage, and networks (e.g., AWS EC2, Google Compute Engine).
  • Platform as a Service (PaaS): Offers environments in which applications can be built, tested, and run, abstracting away the underlying infrastructure so developers can focus on application development (e.g., AWS Elastic Beanstalk, Heroku).
  • Software as a Service (SaaS): Delivers applications over the internet on a subscription or pay-per-use basis, without the need to install or maintain software (e.g., Microsoft 365, Salesforce).

Deployment Models:

  1. Public Cloud: Managed by third-party providers and delivered over the public internet; suited to workloads with high scalability needs and lower security requirements.
  2. Private Cloud: Dedicated to a single organization, offering more control and security; hosted on-premises or by a third party.
  3. Hybrid Cloud: Combines public and private clouds, enabling applications and data to move between them.
  4. Multi-Cloud: Uses services from multiple cloud vendors to avoid vendor lock-in and enhance reliability.

Challenges in Cloud Computing:

  • Latency Problems: Cloud processing may introduce delays, especially in real-time systems.
  • Data Privacy: Transmitting data to the cloud can pose security risks.
  • Bandwidth Cost: Continuous data transfer between devices and the cloud is costly.
  • Outages and Downtime: Interruptions in cloud services affect availability.

Integration of Cloud and Edge:

Edge computing complements cloud computing: computation takes place closer to the source, while the cloud is leveraged for intensive analytics and long-term storage. In autonomous vehicles, for example, real-time computation such as object detection happens locally, while training data for machine learning models is sent to the cloud.
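The split described in this example can be sketched as a pipeline in which detection runs locally on every frame, while only a sampled subset of frames is queued for cloud-side training. The `detect` function is a hypothetical stand-in for a real on-device model, and the sampling rate is an illustrative assumption.

```python
# Sketch of the edge/cloud split: object detection runs at the edge on
# every frame (real-time path), while only every Nth frame is queued for
# later upload to the cloud as training data. detect() is a stand-in for
# a real on-device model; the sampling rate is an illustrative assumption.

def detect(frame: bytes) -> bool:
    """Stand-in local detector: flags frames containing a marker byte."""
    return b"\xff" in frame

def process_stream(frames, sample_every=3):
    detections, training_queue = [], []
    for i, frame in enumerate(frames):
        detections.append(detect(frame))    # real-time, stays at the edge
        if i % sample_every == 0:
            training_queue.append(frame)    # uploaded to the cloud later
    return detections, training_queue

frames = [b"\x00\xff", b"\x01", b"\xff\xff", b"\x02", b"\x03"]
dets, queue = process_stream(frames)
print(dets)        # [True, False, True, False, False]
print(len(queue))  # 2 frames queued for cloud training
```

The latency-critical path never waits on the network; only the slow, bandwidth-heavy training path touches the cloud.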

Diagram 2: Edge Computing Architecture


This diagram represents the layered structure of edge computing, highlighting how the cloud layer interacts with network and application edges. The cloud primarily handles heavy computation and data storage, while edge devices focus on local data processing.


2.3 Edge and Fog

Edge Computing:

Edge computing processes data at or near where it is generated, minimizing the latency of transmitting data to centralized cloud servers. It is critical for applications that need real-time data processing and response, including autonomous vehicles, industrial automation, and medical monitoring.

  1. Reduced Latency: Processing data locally avoids the delay of transmitting it over long distances.
  2. Improved Data Privacy: Personal data is processed closer to the source, reducing the risk of exposure.
  3. Bandwidth Efficiency: Only processed data or key information is transmitted to the cloud, minimizing data traffic.
  4. High Reliability: Edge devices can operate independently even when network connectivity with the cloud is severed.
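Bandwidth efficiency in particular can be illustrated with a small edge-side filter that forwards a reading to the cloud only when it differs meaningfully from the last forwarded value. This is a sketch; the change threshold and sample values are assumptions.

```python
# Sketch of edge-side filtering: instead of streaming every sample to the
# cloud, the edge node forwards a reading only when it differs from the
# last forwarded value by more than a threshold. Threshold is illustrative.

def filter_for_upload(samples, threshold=1.0):
    forwarded = []
    last_sent = None
    for s in samples:
        if last_sent is None or abs(s - last_sent) > threshold:
            forwarded.append(s)   # would be transmitted to the cloud
            last_sent = s
    return forwarded

samples = [20.0, 20.2, 20.1, 22.5, 22.4, 19.0]
print(filter_for_upload(samples))  # [20.0, 22.5, 19.0]
```

Here six raw samples shrink to three uploads; on a high-rate sensor stream the same deadband filter can cut traffic by orders of magnitude.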

Applications of Edge Computing:

  • Autonomous Vehicles: Real-time data processing for navigation and obstacle avoidance.
  • Smart Cities: Adaptive lighting networks and traffic monitoring.
  • Healthcare: Real-time remote monitoring equipment to analyze patient information.
  • Industrial IoT: Predictive maintenance through sensor data.

Fog Computing:

Fog computing extends edge computing by creating a middle layer between the cloud and the edge devices. It enables data collection, computation, and decision-making to be performed near the data source, but not on the device itself.

Features of Fog Computing:

  1. Intermediate Processing: Relieves edge devices of some processing tasks before data reaches the cloud.
  2. Data Aggregation: Combines data from various sources to eliminate redundancy.
  3. Enhanced Scalability: Supports massive IoT networks by distributing computation tasks.
  4. Real-Time Analytics: Processes data in real time without depending on distant servers.
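The aggregation role of a fog node can be sketched as combining readings from several edge sensors into a single summary record before anything travels to the cloud. The record layout and sensor IDs are illustrative assumptions.

```python
# Sketch of fog-layer aggregation: readings from multiple edge sensors
# are combined into one summary record, eliminating redundancy before
# the data is forwarded to the cloud. Sensor IDs are illustrative.

from statistics import mean

def aggregate(readings):
    values = [r["value"] for r in readings]
    return {
        "sensors": sorted({r["sensor_id"] for r in readings}),
        "count": len(values),
        "mean": mean(values),
        "min": min(values),
        "max": max(values),
    }

edge_batch = [
    {"sensor_id": "s1", "value": 20.0},
    {"sensor_id": "s2", "value": 22.0},
    {"sensor_id": "s1", "value": 21.0},
]
summary = aggregate(edge_batch)
print(summary["mean"], summary["count"])  # 21.0 3
```

One compact summary replaces many raw readings, which is exactly the "Data Aggregation" feature listed above.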

Comparison: Edge vs. Fog:

  • Processing Location: Edge processes directly on devices; Fog processes on intermediate nodes.
  • Latency: Edge offers ultra-low latency; Fog latency is low but comparatively higher.
  • Scalability: Fog is more scalable because it can aggregate data from many devices.
  • Data Aggregation: Edge handles individual device data, while Fog consolidates data from multiple sources.

Diagram 3: Edge-Fog-Cloud Network Architecture

The diagram shows the interplay between cloud, fog, and edge layers, emphasizing the data flow from localized edge processing through fog nodes to the centralized cloud for deeper analytics.

2.4 Edge-Cloud Continuum

The Edge-Cloud Continuum is an adaptive, dynamic computational paradigm that balances workloads across edge, fog, and cloud resources. By integrating these environments, applications can shift between local and remote processing depending on workload, latency constraints, and network status.

Core Functions:

  1. Dynamic Workload Distribution: Real-time tasks are processed at the edge, while less critical operations are pushed to the cloud.
  2. Real-Time Data Processing: Edge devices manage time-sensitive data, such as sensor readings, without sending it to remote servers.
  3. Data Filtering and Aggregation: Fog nodes filter data before forwarding it to the cloud, minimizing data transfer.
  4. Scalable and Elastic Resource Management: Resources are allocated adaptively according to computational demand.
  5. Security and Privacy Management: Implements localized data encryption and access control.
  6. Interoperability: Facilitates smooth interaction between edge, fog, and cloud systems through intelligent orchestration.
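Dynamic workload distribution, the first function above, can be sketched as a placement decision that routes each task to the most central tier whose latency still fits the task's deadline. The per-tier round-trip latencies here are hypothetical estimates, not measurements.

```python
# Sketch of dynamic workload distribution across the edge-cloud continuum:
# each task is placed on the most central tier whose estimated round-trip
# latency fits within the task's deadline. The latency figures are
# illustrative assumptions.

TIER_LATENCY_MS = {"edge": 5, "fog": 20, "cloud": 100}

def place_task(deadline_ms: float) -> str:
    """Prefer central tiers (cheap, elastic) when the deadline allows it."""
    for tier in ("cloud", "fog", "edge"):
        if TIER_LATENCY_MS[tier] <= deadline_ms:
            return tier
    return "reject"  # no tier can meet this deadline

print(place_task(200))  # 'cloud'  - batch analytics can go central
print(place_task(50))   # 'fog'    - moderate deadline
print(place_task(10))   # 'edge'   - time-critical sensor control
print(place_task(1))    # 'reject'
```

Real orchestrators also weigh cost, privacy constraints, and current load, but latency-aware placement is the core of the continuum's scheduling decision.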

Applications:

  • Smart Grids: Real-time monitoring and load balancing between local power sources and centralized control.
  • Smart Agriculture: Local analysis of environmental data, with broader data analytics performed in the cloud.
  • Healthcare Monitoring: Real-time patient monitoring at the edge, with historical analysis in the cloud.

By blending edge and cloud processing, the continuum provides a flexible infrastructure that adapts to real-time requirements while maintaining efficient long-term data management.