Distributed Self-Paced Learning in Alternating Direction Method of Multipliers
Xuchao Zhang, Liang Zhao, Zhiqian Chen, Chang-Tien Lu
Abstract
Self-paced learning (SPL) mimics the cognitive process of humans, who generally learn from easy samples to hard ones. One key issue in SPL is that the training of each instance weight depends on all the other samples, so the process cannot easily be run in a distributed manner on large-scale datasets. In this paper, we reformulate the self-paced learning problem into a distributed setting and propose a novel Distributed Self-Paced Learning method (DSPL) to handle large-scale datasets. Specifically, both the model and the instance weights can be optimized in parallel for each batch based on a consensus alternating direction method of multipliers. We also prove the convergence of our algorithm under mild conditions. Extensive experiments on both synthetic and real datasets demonstrate that our approach outperforms existing methods.
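The sketch below illustrates the general idea described in the abstract, not the paper's exact DSPL algorithm: each data batch updates its instance weights and a local model in parallel, and a consensus variable ties the local models together via ADMM-style averaging and dual updates. It assumes a hard-threshold (binary) self-paced regularizer and squared loss; all names (`spl_weights`, `local_update`, `dspl_sketch`, `lambda_`, `rho`, `mu`) are illustrative.

```python
# Hedged sketch: self-paced learning with a consensus-ADMM-style split.
# Assumes a binary self-paced regularizer and least-squares loss; this is
# an illustration of the idea, not the paper's DSPL implementation.
import numpy as np

def spl_weights(X, y, w, lambda_):
    """Easy samples (loss below the age parameter lambda_) get weight 1."""
    losses = (X @ w - y) ** 2
    return (losses < lambda_).astype(float)

def local_update(X, y, v, z, u, rho):
    """Solve the per-batch ADMM subproblem:
       min_w  sum_i v_i (x_i^T w - y_i)^2 + (rho/2) ||w - z + u||^2."""
    d = X.shape[1]
    A = 2 * X.T @ (v[:, None] * X) + rho * np.eye(d)
    b = 2 * X.T @ (v * y) + rho * (z - u)
    return np.linalg.solve(A, b)

def dspl_sketch(batches, d, lambda_=1.0, rho=1.0, mu=1.3, iters=50):
    """batches: list of (X_b, y_b) shards; returns the consensus model z."""
    B = len(batches)
    z = np.zeros(d)
    W = [np.zeros(d) for _ in range(B)]
    U = [np.zeros(d) for _ in range(B)]
    for _ in range(iters):
        # Local step: instance weights and models are independent across
        # batches, so this loop could run in parallel on workers.
        for b, (Xb, yb) in enumerate(batches):
            v = spl_weights(Xb, yb, z, lambda_)
            W[b] = local_update(Xb, yb, v, z, U[b], rho)
        # Consensus step: average local models plus scaled duals.
        z = np.mean([W[b] + U[b] for b in range(B)], axis=0)
        # Dual step.
        for b in range(B):
            U[b] += W[b] - z
        lambda_ *= mu  # grow the model "age" so harder samples enter later
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=5)
    batches = []
    for _ in range(4):
        X = rng.normal(size=(200, 5))
        y = X @ w_true + 0.1 * rng.normal(size=200)
        batches.append((X, y))
    print(np.round(dspl_sketch(batches, d=5), 2), np.round(w_true, 2))
```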
Xuchao Zhang, Liang Zhao, Zhiqian Chen, Chang-Tien Lu: Distributed Self-Paced Learning in Alternating Direction Method of Multipliers. IJCAI 2018: 3148-3154
Publication Details
- Date of publication: 2018
- Conference: International Joint Conference on Artificial Intelligence (IJCAI 2018)
- Page number(s): 3148-3154