DFL: High-Performance Blockchain-Based Federated Learning

Yongding Tian, Zhuoran Guo, Jiaxuan Zhang, Zaid Al-Ars

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Many researchers have proposed replacing the aggregation server in federated learning with a blockchain system to improve privacy, robustness, and scalability. In this approach, clients upload their updated models to the blockchain ledger and use a smart contract to perform model averaging. However, the significant delay and limited computational capabilities of blockchain systems make it inefficient to support machine learning applications on the blockchain. In this article, we propose a new public blockchain architecture called DFL, which is specially optimized for distributed federated machine learning. Our architecture inherits the merits of traditional blockchain systems while achieving low latency and low resource consumption by waiving global consensus. To evaluate the performance and robustness of our architecture, we implemented a prototype and tested it on a physical four-node network, and also developed a simulator for larger networks and more complex scenarios. Our experiments show that the DFL architecture can reach over 90% accuracy on non-I.I.D. datasets, even in the presence of model poisoning attacks, while the blockchain component consumes less than 5% of hardware resources.
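For illustration, below is a minimal sketch of the kind of model-averaging step the abstract refers to (FedAvg-style weighted averaging of client model updates). The function name, data layout, and sample-count weighting are assumptions made for this example only; they are not taken from the DFL implementation or its smart-contract code.

```python
# Minimal sketch of federated model averaging (FedAvg-style), assuming each
# client submits its updated model as a dict of NumPy weight arrays together
# with the number of local training samples. This illustrates the aggregation
# step described in the abstract, not the actual DFL smart-contract logic.
import numpy as np

def average_models(client_updates):
    """Return the sample-weighted average of client model updates.

    client_updates: list of (weights, num_samples) tuples, where `weights`
    is a dict mapping layer names to NumPy arrays.
    """
    total_samples = sum(n for _, n in client_updates)
    averaged = {}
    for layer in client_updates[0][0]:
        averaged[layer] = sum(
            w[layer] * (n / total_samples) for w, n in client_updates
        )
    return averaged

# Example: two clients contributing updates for a single-layer model.
client_a = ({"fc": np.array([1.0, 2.0])}, 100)
client_b = ({"fc": np.array([3.0, 4.0])}, 300)
print(average_models([client_a, client_b]))  # -> {'fc': array([2.5, 3.5])}
```

In a blockchain-based setting such as the one the abstract describes, an aggregation routine of this kind would operate on the client-submitted model updates recorded on the ledger rather than on a central aggregation server.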
Original language: English
Article number: 20
Pages (from-to): 1-25
Number of pages: 25
Journal: Distributed Ledger Technologies: Research and Practice
Volume: 2
Issue number: 3
DOIs
Publication status: Published - 2023

Keywords

  • Federated machine learning
  • blockchain
  • partial consensus
  • reputation
