Soft Dropout and Its Variational Bayes Approximation

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP), October 2019
Issue Date:
2019-10-13
Soft dropout, a generalization of standard "hard" dropout, is introduced to regularize the parameters of neural networks and prevent overfitting. We replace the "hard" dropout mask, which follows a Bernoulli distribution, with a "soft" mask following a beta distribution, so that hidden nodes are attenuated to different degrees rather than either kept or dropped outright. Soft dropout thus uses continuous mask coefficients in the interval [0, 1] instead of only zero and one. To make the dropout rate adaptive through adaptive distribution parameters, we approximate the beta-distributed masks with half-Gaussian and half-Laplace distributed variables, respectively, and optimize the distribution parameters with the stochastic gradient variational Bayes (SGVB) algorithm, a variant of variational Bayes optimization. In the experiments, the adaptive soft dropout methods generally improve performance compared with standard soft dropout using a fixed dropout rate. In addition, the proposed soft dropout and its adaptive versions outperform the reference methods on both image classification and regression tasks.
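To make the core idea concrete, the following is a minimal NumPy sketch of the "soft" masking step described above: a beta-distributed mask in [0, 1] scales each hidden node instead of a Bernoulli 0/1 mask. The function name `soft_dropout` and the fixed shape parameters `alpha` and `beta` are illustrative assumptions; the paper's adaptive variants instead learn the mask distribution parameters via SGVB, which this sketch does not implement.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_dropout(activations, alpha=0.5, beta=0.5, train=True):
    """Apply a beta-distributed 'soft' mask to hidden activations.

    Unlike standard dropout, whose Bernoulli mask is exactly 0 or 1,
    the mask here takes continuous values in [0, 1], so each node is
    attenuated to a different degree rather than fully kept or dropped.
    alpha and beta are fixed, illustrative shape parameters.
    """
    if not train:
        # At test time, scale by the mask's expected value
        # E[m] = alpha / (alpha + beta), analogous to the rescaling
        # used with standard dropout.
        return activations * (alpha / (alpha + beta))
    mask = rng.beta(alpha, beta, size=activations.shape)
    return activations * mask

# Example: mask a small batch of hidden activations.
h = rng.standard_normal((4, 8))
h_train = soft_dropout(h, alpha=0.5, beta=0.5, train=True)
h_test = soft_dropout(h, alpha=0.5, beta=0.5, train=False)
```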