Learning in uncertain, noisy, or adversarial environments is a challenging task for deep neural networks (DNNs). We propose a new theoretically grounded and efficient approach for robust learning that builds upon Bayesian estimation and variational inference. We formulate the problem of density propagation through the layers of a DNN and solve it using an Ensemble Density Propagation (EnDP) scheme. The EnDP approach allows us to propagate moments of the variational probability distribution across the layers of a Bayesian DNN, enabling the estimation of the mean and covariance of the predictive distribution at the output of the model. Our experiments using the MNIST and CIFAR-10 datasets show a significant improvement in the robustness of the trained models to random noise and adversarial attacks.
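The core idea of propagating a predictive mean and covariance through a Bayesian DNN can be illustrated with a minimal sketch. This is not the paper's EnDP implementation: it assumes a toy two-layer network with a mean-field Gaussian variational posterior over the weights, and approximates the propagated density with a naive sampled ensemble; all names, shapes, and hyperparameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network. Variational posterior over each weight matrix is
# mean-field Gaussian: q(W) = N(mu, diag(sigma^2)). Dimensions are arbitrary.
layer_dims = [(4, 8), (8, 3)]
mus = [rng.normal(0.0, 0.5, size=d) for d in layer_dims]
sigmas = [0.1 * np.ones(d) for d in layer_dims]

def relu(x):
    return np.maximum(x, 0.0)

def ensemble_predict(x, n_ensemble=500):
    """Ensemble-style density propagation (sketch): draw an ensemble of
    weight samples from the variational posterior, push the input through
    each sampled network, and summarize the predictive distribution by
    the empirical mean and covariance of the ensemble outputs."""
    outputs = []
    for _ in range(n_ensemble):
        h = x
        for mu, sigma in zip(mus, sigmas):
            W = rng.normal(mu, sigma)  # one weight sample per layer
            h = relu(h @ W)
        outputs.append(h)
    outputs = np.stack(outputs)        # shape: (n_ensemble, out_dim)
    mean = outputs.mean(axis=0)        # predictive mean
    cov = np.cov(outputs, rowvar=False)  # predictive covariance
    return mean, cov

x = rng.normal(size=4)
mean, cov = ensemble_predict(x)
```

The output covariance gives a per-input uncertainty estimate, which is what makes the predictive distribution useful for detecting noisy or adversarial inputs.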
G. Carannante, D. Dera, G. Rasool, N. C. Bouaynaya and L. Mihaylova, "Robust Learning via Ensemble Density Propagation in Deep Neural Networks," 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP), 2020, pp. 1-6, doi: 10.1109/MLSP49062.2020.9231635.