== 5.3.1 Key Aggregation Algorithms ==

{| class="wikitable"
|+ Comparison of Aggregation Algorithms in Federated Learning
! Algorithm !! Description !! Handles Non-IID Data !! Server-Side Optimization !! Typical Use Case
|-
| FedAvg || Performs weighted averaging of client models based on dataset size. Simple and communication-efficient. || Limited || No || Basic federated setups with IID or mildly non-IID data.
|-
| FedProx || Adds a proximal term to the local loss function to prevent client drift. Stabilizes training with diverse data. || Yes || No || Suitable for edge deployments with high data heterogeneity or resource-limited clients.
|-
| FedOpt || Applies adaptive optimizers (e.g., FedAdam, FedYogi) to aggregated updates. Enhances convergence in dynamic systems. || Yes || Yes || Used in large-scale systems or settings with unstable participation and gradient variability.
|}

Aggregation is the cornerstone of FL: locally computed model updates from edge devices are combined into a single global model. The most widely adopted aggregation method is Federated Averaging (FedAvg), introduced in the foundational work by McMahan et al.<sup>[1]</sup> FedAvg averages the model parameters received from participating clients, typically weighting each client by the size of its local dataset. This simple yet powerful method reduces communication frequency by allowing each device to perform multiple local updates before sending its updated parameters to the server. However, FedAvg performs optimally only when data across clients is balanced and independent and identically distributed (IID); these conditions are rarely satisfied in edge computing environments, where client datasets are often highly non-IID, sparse, or skewed.<sup>[2]</sup>

To address these limitations, several advanced aggregation algorithms have been proposed. One notable extension is FedProx, which modifies the local optimization objective by adding a proximal term that penalizes large deviations from the global model. This constrains local training and improves stability in heterogeneous data scenarios. FedProx also tolerates flexible participation by clients with limited resources or intermittent connectivity, making it more robust in practical edge deployments.

Another family of aggregation algorithms is FedOpt, which includes adaptive server-side optimization techniques such as FedAdam and FedYogi. These algorithms build on optimizers used in centralized training and apply them at the aggregation step, enabling faster convergence and improved generalization under complex, real-world data distributions. Collectively, these aggregation variants address critical FL challenges such as slow convergence, client drift, and update divergence caused by heterogeneity in both data and device capabilities.<sup>[1]</sup><sup>[3]</sup>
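The sketch below is purely illustrative and is not taken from the cited works or from any particular FL framework; the function names (<code>fedavg_aggregate</code>, <code>fedprox_local_objective</code>, <code>FedAdamServer</code>) and the hyperparameter values (<code>mu</code>, <code>lr</code>, <code>tau</code>, the beta coefficients) are assumptions chosen for readability. Using plain NumPy arrays to stand in for model parameters, it shows how FedAvg's size-weighted averaging, FedProx's proximal penalty on the local loss, and a FedAdam-style server update applied to the averaged pseudo-gradient can be expressed.

<syntaxhighlight lang="python">
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """FedAvg: weighted average of client parameter vectors,
    weighted by each client's local dataset size n_k."""
    total = float(sum(client_sizes))
    agg = np.zeros_like(client_params[0])
    for w_k, n_k in zip(client_params, client_sizes):
        agg += (n_k / total) * w_k
    return agg

def fedprox_local_objective(task_loss, w_local, w_global, mu=0.01):
    """FedProx local objective: the client's task loss plus a proximal
    term (mu/2) * ||w_local - w_global||^2 that discourages client drift.
    (mu is an illustrative hyperparameter value.)"""
    return task_loss + (mu / 2.0) * float(np.sum((w_local - w_global) ** 2))

class FedAdamServer:
    """FedOpt-style server: treats the averaged client update as a
    pseudo-gradient and applies an Adam-like step to the global model.
    All hyperparameter defaults here are illustrative assumptions."""
    def __init__(self, w_global, lr=0.1, beta1=0.9, beta2=0.99, tau=1e-3):
        self.w = w_global
        self.lr, self.beta1, self.beta2, self.tau = lr, beta1, beta2, tau
        self.m = np.zeros_like(w_global)  # first-moment estimate
        self.v = np.zeros_like(w_global)  # second-moment estimate

    def step(self, client_params, client_sizes):
        # Pseudo-gradient: averaged client parameters minus current global model.
        delta = fedavg_aggregate(client_params, client_sizes) - self.w
        self.m = self.beta1 * self.m + (1 - self.beta1) * delta
        self.v = self.beta2 * self.v + (1 - self.beta2) * delta ** 2
        self.w = self.w + self.lr * self.m / (np.sqrt(self.v) + self.tau)
        return self.w

# Example round: three clients with unequal dataset sizes.
clients = [np.array([1.0, 2.0]), np.array([0.5, 1.5]), np.array([2.0, 0.0])]
sizes = [100, 300, 50]
print(fedavg_aggregate(clients, sizes))
server = FedAdamServer(w_global=np.zeros(2))
print(server.step(clients, sizes))
</syntaxhighlight>

In a real deployment the same per-round structure would typically be applied to each parameter tensor of a neural network, with client sampling, secure transport, and dropout handling layered around these core aggregation steps.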