In a learning task where the data is distributed among several parties, communication is one of the fundamental resources that the parties would like to minimize.
We present a distributed boosting algorithm that is resilient to a limited amount of noise.
Our algorithm is similar to classical boosting algorithms, but it is equipped with a new component, inspired by Impagliazzo’s hard-core lemma (Impagliazzo, 1995), which is the source of its robustness.
We complement this result by showing that resilience to any asymptotically larger amount of noise is not achievable by a communication-efficient algorithm.
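For context, one standard formulation of Impagliazzo’s hard-core lemma reads as follows (the notation is generic rather than taken from this work, and the exact circuit-size loss varies between proofs): if every circuit of size $s$ errs on at least a $\delta$ fraction of inputs in computing $f\colon\{0,1\}^n\to\{0,1\}$, then for every $\varepsilon>0$ there exists a set $H\subseteq\{0,1\}^n$ with $|H|\ge\delta 2^n$ such that every circuit $C$ of size $s'$, where $s'$ is polynomially smaller than $s$ in terms of $\varepsilon$ and $\delta$, satisfies
\[
  \Pr_{x\sim H}\bigl[C(x)=f(x)\bigr]\le \frac{1}{2}+\varepsilon.
\]
In boosting language, the lemma yields a dense set of examples on which every weak hypothesis has advantage at most $\varepsilon$; this classical duality between boosting and hard-core sets is the kind of connection from which robustness arguments of this flavor draw.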