Meta-probability Weighting for Improving Reliability of DNNs to Label Noise

Abstract

Training noise-robust deep neural networks (DNNs) under label noise is a crucial task. In this paper, we first demonstrate that DNNs trained with label noise over-fit the noisy labels because they are over-confident in their own learning capacity. More significantly, they also potentially under-learn the samples with clean labels; DNNs should essentially pay more attention to the clean samples than to the noisy ones. Inspired by the sample-weighting strategy, we propose a meta-probability weighting (MPW) algorithm that weights the output probabilities of the DNN to prevent it from over-fitting to label noise and to alleviate under-learning on the clean samples. MPW performs an approximate optimization to adaptively learn the probability weights from data under the supervision of a small clean dataset, and alternates between optimizing the probability weights and the network parameters via a meta-learning paradigm. Ablation studies substantiate the effectiveness of MPW in preventing DNNs from over-fitting to label noise and in improving their learning capacity on clean samples. Furthermore, MPW achieves competitive performance with other state-of-the-art methods on both synthetic and real-world noise.
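The abstract describes a bi-level loop: an inner "virtual" update of the network under candidate weights, an outer update of the weights against a small clean meta set, then a real update of the network. Below is a minimal sketch of that loop under stated assumptions: it follows the general learning-to-reweight pattern (Ren et al., ICML 2018) and, for brevity, weights per-sample losses rather than output probabilities, so it is not the authors' MPW implementation. `ToyNet`, `meta_step`, the learning rates, and the synthetic data are all illustrative names, not anything from the paper.

```python
# Sketch of one meta-weighting training step (NOT the authors' MPW code).
# Assumptions: loss-level (not probability-level) weighting, a linear ToyNet,
# and a learning-to-reweight-style outer update on a small clean batch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyNet(nn.Module):  # hypothetical stand-in for the DNN
    def __init__(self, dim=10, classes=3):
        super().__init__()
        self.fc = nn.Linear(dim, classes)

    def forward(self, x, weight=None, bias=None):
        # Accept "virtual" parameters so the inner meta step can evaluate
        # the network at a one-step-lookahead parameter value.
        w = self.fc.weight if weight is None else weight
        b = self.fc.bias if bias is None else bias
        return F.linear(x, w, b)

def meta_step(model, opt, noisy_x, noisy_y, clean_x, clean_y, inner_lr=0.1):
    # 1) Per-sample weights, initialised to zero and tracked for gradients.
    eps = torch.zeros(noisy_x.size(0), requires_grad=True)
    losses = F.cross_entropy(model(noisy_x), noisy_y, reduction="none")
    weighted = (eps * losses).sum()

    # 2) Virtual SGD step; create_graph keeps the dependence on eps so we
    #    can later backpropagate through this update.
    g_w, g_b = torch.autograd.grad(
        weighted, [model.fc.weight, model.fc.bias], create_graph=True)
    w_virt = model.fc.weight - inner_lr * g_w
    b_virt = model.fc.bias - inner_lr * g_b

    # 3) Meta loss on the small clean set, evaluated at the virtual parameters.
    meta_loss = F.cross_entropy(model(clean_x, w_virt, b_virt), clean_y)

    # 4) Samples whose up-weighting would reduce the clean loss get positive
    #    weight (clipped negative gradient, then normalised).
    grad_eps = torch.autograd.grad(meta_loss, eps)[0]
    w_tilde = torch.clamp(-grad_eps, min=0)
    denom = w_tilde.sum()
    w = w_tilde / denom if denom > 0 else w_tilde

    # 5) Real update of the network with the learned weights.
    final_loss = (w.detach()
                  * F.cross_entropy(model(noisy_x), noisy_y, reduction="none")).sum()
    opt.zero_grad()
    final_loss.backward()
    opt.step()
    return final_loss.item()

# Toy usage with random data: a noisy batch of 32 and a clean meta batch of 8.
torch.manual_seed(0)
model = ToyNet()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss = meta_step(model, opt,
                 torch.randn(32, 10), torch.randint(0, 3, (32,)),
                 torch.randn(8, 10), torch.randint(0, 3, (8,)))
print(f"weighted training loss: {loss:.4f}")
```

The key design point the abstract hints at is visible in step 4: the weights are not hand-tuned but derived from how each noisy sample's influence affects performance on the trusted clean set, so clean-consistent samples are emphasised and noisy ones suppressed as training alternates between the two parameter sets.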
