I am an incoming postdoctoral researcher at Purdue University, advised by Prof. Ziran Wang and Prof. Ruqi Zhang. I received my Ph.D. in Computer Science from Shanghai Jiao Tong University in June 2024, supervised by Prof. Haibing Guan and co-supervised by Prof. Yang Hua, Prof. Tao Song, and Prof. Ruhui Ma.
PhD in Computer Science and Technology, 2019.9 - 2024.6
Shanghai Jiao Tong University
BSc in Computer Science and Technology (IEEE Honor Class), 2015.9 - 2019.6
Shanghai Jiao Tong University
Electrical Engineering International Intensive Program, 2017.6 - 2017.8
University of Washington
In this paper, we propose the Information Bound as a metric of the amount of information in Bayesian neural networks. Unlike mutual information on deterministic neural networks, which usually requires modifying the network structure or the input data, the Information Bound can be estimated directly on existing Bayesian neural networks without any change to the network architecture or the training process.
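The Information Bound estimator itself is defined in the paper and is not reproduced here. Purely as an illustration of the kind of information-style quantity that can be read off an unmodified Bayesian neural network by Monte Carlo sampling, the sketch below computes a standard BALD-style mutual-information estimate from repeated stochastic forward passes; the function name and sample count are illustrative assumptions, not the paper's method.

```python
# Illustrative only: a BALD-style mutual-information estimate from Monte Carlo
# samples of a Bayesian neural network's posterior predictive. This is NOT the
# Information Bound from the paper; it only shows that such quantities can be
# computed from an unmodified BNN.
import torch

def mc_mutual_information(bnn, x, n_samples=20):
    """Estimate I(y; w | x) from n_samples stochastic forward passes.

    Assumes `bnn(x)` returns class logits and draws fresh weights from the
    posterior on every call (e.g. a variational BNN kept in sampling mode).
    """
    probs = torch.stack(
        [torch.softmax(bnn(x), dim=-1) for _ in range(n_samples)]
    )                                             # (S, B, C)
    mean_probs = probs.mean(dim=0)                # posterior predictive, (B, C)
    # Entropy of the averaged prediction: total predictive uncertainty.
    total = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(-1)
    # Average entropy of individual samples: aleatoric part.
    aleatoric = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean(0)
    return total - aleatoric                      # epistemic / mutual information
```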
In this paper, we argue that the randomness of sampling in Bayesian neural networks introduces errors into the parameter updates during training and yields sampled models that perform poorly at test time. We propose training Bayesian neural networks with an Adversarial Distribution as a theoretical solution, and further present the Adversarial Sampling method as a practical approximation.
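The exact Adversarial Sampling procedure is given in the paper; the sketch below shows only one plausible reading for illustration: draw several posterior samples per training step, keep the worst-performing one, and update on that sample. The helper name, the candidate count, and the use of the global RNG to reproduce a sample are all assumptions made for this sketch.

```python
# Hypothetical sketch of a worst-case ("adversarial") sampling step for a
# variational BNN. Assumes the BNN's forward pass samples weights via torch's
# global RNG, so re-seeding reproduces the same weight sample.
import torch
import torch.nn.functional as F

def adversarial_sampling_step(bnn, x, y, optimizer, n_candidates=4):
    # Evaluate several posterior samples and remember the one with highest loss.
    worst_loss, worst_seed = None, None
    for seed in range(n_candidates):
        torch.manual_seed(seed)                   # make this sample reproducible
        with torch.no_grad():
            loss = F.cross_entropy(bnn(x), y)
        if worst_loss is None or loss > worst_loss:
            worst_loss, worst_seed = loss, seed
    # Re-draw the worst sample with gradients enabled and update on it.
    torch.manual_seed(worst_seed)
    optimizer.zero_grad()
    F.cross_entropy(bnn(x), y).backward()
    optimizer.step()
```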
We propose Spectral Expectation Bound Regularization (SEBR) to enhance the robustness of Bayesian neural networks. Our theoretical analysis shows that training with SEBR improves robustness to adversarial noise, and we further prove that it reduces the epistemic uncertainty of the model.
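The precise SEBR term is derived in the paper. Purely as a sketch of what a spectral-expectation regularizer could look like in PyTorch, the snippet below averages the largest singular value of sampled weight matrices over a few posterior draws; the `sample_weight()` method on each Bayesian layer and the way the penalty is added to the loss are assumptions for illustration.

```python
# Hypothetical sketch: penalize the Monte Carlo expectation of each layer's
# spectral norm (largest singular value) under the weight posterior. Not the
# exact SEBR term from the paper.
import torch

def expected_spectral_penalty(bnn, n_samples=4):
    """Average the top singular value of sampled weight matrices."""
    penalty = 0.0
    for _ in range(n_samples):
        for layer in bnn.modules():
            if hasattr(layer, "sample_weight"):   # assumed API: draws W ~ q(W)
                w = layer.sample_weight()
                if w.dim() >= 2:                  # spectral norm needs a matrix
                    penalty = penalty + torch.linalg.matrix_norm(
                        w.flatten(1), ord=2
                    )
    return penalty / n_samples

# Usage sketch: loss = nll + kl + lambda_sebr * expected_spectral_penalty(bnn)
```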
Mentor: Justin Ding. Responsibilities include:
Responsibilities include: