Computer Science

Echo-CGC: A Communication-Efficient Byzantine-Tolerant Distributed Machine Learning Algorithm in Single-Hop Radio Network

Document Type

Conference Paper

Abstract

In the past few years, many Byzantine-tolerant distributed machine learning (DML) algorithms have been proposed in the point-to-point communication model. In this paper, we focus on a popular DML framework: the parameter server computation paradigm and iterative learning algorithms that proceed in rounds, e.g., [11, 8, 6]. One limitation of prior algorithms in this domain is their high communication complexity. All the Byzantine-tolerant DML algorithms that we are aware of require sending n d-dimensional vectors from the worker nodes to the parameter server in each round, where n is the number of workers and d is the number of dimensions of the feature space (which may be on the order of millions). In a wireless network, power consumption is proportional to the number of bits transmitted. Consequently, it is extremely difficult, if not impossible, to deploy these algorithms on power-limited wireless devices. Motivated by this observation, we aim to reduce the communication complexity of Byzantine-tolerant DML algorithms in the single-hop radio network [1, 3, 14].
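As a back-of-the-envelope illustration of the communication cost the abstract describes (this sketch is not from the paper, and all concrete numbers in it are hypothetical), the following estimates the per-round worker-to-server traffic of the naive parameter-server scheme, in which each of the n workers transmits a full d-dimensional gradient vector:

```python
# Illustrative estimate of per-round uplink traffic in the naive
# parameter-server scheme: each of n workers sends one d-dimensional
# gradient vector per round. Example values below are hypothetical.

def naive_round_traffic_bits(n_workers: int, dims: int, bits_per_coord: int = 32) -> int:
    """Total bits sent from all workers to the parameter server in one round."""
    return n_workers * dims * bits_per_coord

# Example: 20 workers, a model with one million parameters, 32-bit floats.
bits = naive_round_traffic_bits(n_workers=20, dims=1_000_000)
print(bits)                  # total bits per round
print(bits / 8 / 1_000_000)  # the same traffic expressed in megabytes
```

Even at this modest scale the round cost is tens of megabytes, which motivates the paper's goal of reducing the number of bits each worker must transmit.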

Publication Title

Leibniz International Proceedings in Informatics, LIPIcs

Publication Date

2021

Volume

184

ISSN

1868-8969

ISBN

9783959771764

DOI

10.4230/LIPIcs.OPODIS.2020.7

Keywords

Byzantine fault, communication complexity, distributed machine learning, parameter server, single-hop radio network, wireless communication

APA Citation

Zhang, Q., & Tseng, L. (2021). Echo-CGC: A communication-efficient Byzantine-tolerant distributed machine learning algorithm in single-hop radio network. Leibniz International Proceedings in Informatics, LIPIcs, 184. https://doi.org/10.4230/LIPIcs.OPODIS.2020.7
