Craft-DFL: Navigating the Performance of Decentralized Federated Learning Deployments

dc.contributor.author: Jiang, Chengyan
dc.contributor.copyright-release: Not Applicable
dc.contributor.degree: Master of Computer Science
dc.contributor.department: Faculty of Computer Science
dc.contributor.ethics-approval: Not Applicable
dc.contributor.external-examiner: n/a
dc.contributor.manuscripts: Not Applicable
dc.contributor.thesis-reader: Yujie Tang
dc.contributor.thesis-reader: Saurabh Dey
dc.contributor.thesis-supervisor: Israat Haque
dc.date.accessioned: 2024-09-04T15:20:57Z
dc.date.available: 2024-09-04T15:20:57Z
dc.date.defence: 2024-08-13
dc.date.issued: 2024-08-27
dc.description.abstract: The widespread adoption of smartphones and smart wearable devices has driven the use of Centralized Federated Learning (CFL) for training powerful machine learning models while preserving data privacy. However, CFL's reliance on a central server limits it, affecting both latency and system robustness. Decentralized Federated Learning (DFL) addresses these challenges by enabling direct collaboration among participating devices without a central server: each device independently connects with other devices and shares model parameters. This work explores the crucial factors influencing the convergence and generalization capacity of DFL models, emphasizing network topologies, non-IID data distributions, and training strategies. We first derive the convergence rate of different DFL model deployment strategies. We then comprehensively analyze various network topologies (e.g., linear, ring, star, and mesh) under different degrees of non-IID data and evaluate them with widely adopted machine learning models (e.g., classical models, deep neural networks, and Large Language Models) on real-world datasets. The results reveal that models converge to the optimum under IID data, whereas the convergence rate is inversely proportional to the degree of non-IID data distribution. Our findings serve as practical guidelines for designing effective DFL model deployments.
dc.identifier.uri: http://hdl.handle.net/10222/84561
dc.language.iso: en
dc.subject: Decentralized Federated Learning
dc.title: Craft-DFL: Navigating the Performance of Decentralized Federated Learning Deployments
dc.type: Thesis
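
The abstract describes DFL as direct, serverless parameter sharing among devices arranged in a topology (linear, ring, star, or mesh). The Python snippet below is a minimal illustrative sketch of that idea — not the thesis's Craft-DFL implementation, which this record does not include. It runs one neighbor-averaging round per communication step over each of the four named topologies; the function names (make_topology, dfl_round) and the toy parameter vectors are illustrative assumptions.

    # Minimal illustrative sketch of decentralized (serverless) model
    # averaging over the topologies named in the abstract. Not Craft-DFL.
    import numpy as np

    def make_topology(kind: str, n: int) -> list[set[int]]:
        # Neighbor sets for the four topologies named in the abstract.
        if kind == "linear":
            return [{j for j in (i - 1, i + 1) if 0 <= j < n} for i in range(n)]
        if kind == "ring":
            return [{(i - 1) % n, (i + 1) % n} for i in range(n)]
        if kind == "star":
            return [set(range(1, n))] + [{0} for _ in range(1, n)]
        if kind == "mesh":  # fully connected
            return [set(range(n)) - {i} for i in range(n)]
        raise ValueError(f"unknown topology: {kind}")

    def dfl_round(params: list[np.ndarray], neighbors: list[set[int]]) -> list[np.ndarray]:
        # One communication round: every node averages its own parameters
        # with those of its direct neighbors -- no central server involved.
        return [np.mean([params[i]] + [params[j] for j in neighbors[i]], axis=0)
                for i in range(len(params))]

    # Toy run: five nodes whose parameter vectors start far apart (a
    # stand-in for models trained on non-IID local data), averaged for
    # ten rounds under each topology.
    rng = np.random.default_rng(0)
    start = [rng.normal(size=4) for _ in range(5)]
    for topo in ("linear", "ring", "star", "mesh"):
        p = list(start)
        for _ in range(10):
            p = dfl_round(p, make_topology(topo, len(p)))
        spread = max(np.linalg.norm(a - b) for a in p for b in p)
        print(f"{topo:>6}: max pairwise disagreement after 10 rounds = {spread:.4f}")

In this toy run, the fully connected mesh reaches exact agreement after a single round (every node averages all five parameter vectors), while the sparse linear chain still disagrees after ten — a small-scale version of the topology sensitivity the thesis evaluates.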

Files

Original bundle

Name: ChengyanJiang2024.pdf
Size: 657.32 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed to upon submission