65th SAMS Congress
06-08 December 2022
Stellenbosch University

Efficient and robust optimization methods for training binarized deep neural networks
Bubacarr Bah\(^*\), African Institute for Mathematical Sciences (AIMS) South Africa & Stellenbosch University
Jannis Kurtz, University of Siegen, Germany

SAMS Subject Classification Number: 23

Compared to classical deep neural networks, their binarized versions are, among other things, useful for applications on resource-limited devices due to their reduced memory consumption and computational demands. In this work we study deep neural networks with binary activation functions and continuous or integer weights (BDNN). We show that the BDNN can be reformulated as a mixed-integer linear program with a bounded weight space, which can be solved to global optimality by classical mixed-integer programming solvers. Additionally, a local search heuristic is presented to calculate locally optimal networks. Furthermore, to improve efficiency, we present an iterative data-splitting heuristic which iteratively splits the training set into smaller subsets using the k-means method. All data points in a given subset are then forced to follow the same activation pattern, which leads to a much smaller number of integer variables in the mixed-integer programming formulation and therefore to computational improvements. Finally, for the first time, a robust model is presented which enforces robustness of the BDNN during training. All methods are tested on random and real datasets, and our results indicate that all models can often compete with or even outperform classical DNNs on small network architectures, confirming their viability for applications with restricted memory or computing power; see [1] for details.
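To illustrate the kind of reformulation involved, here is a minimal sketch (not the exact formulation of [1]) of how a single neuron with binary activation \(\sigma(v) = \mathbb{1}[v \ge 0]\) can be linearized; the indicator variable \(u_{ij}\), big-\(M\) constant (available because the weight space is bounded), and margin \(\varepsilon > 0\) are illustrative assumptions:

\[
u_{ij} \in \{0,1\}, \qquad
w_j^\top x_i \;\ge\; -M\,(1 - u_{ij}), \qquad
w_j^\top x_i \;\le\; M\,u_{ij} - \varepsilon\,(1 - u_{ij}).
\]

Here \(u_{ij} = 1\) forces \(w_j^\top x_i \ge 0\) and \(u_{ij} = 0\) forces \(w_j^\top x_i \le -\varepsilon\); the data-splitting heuristic then shares a single indicator across all points of a cluster, which is what reduces the number of binary variables.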

References

[1] J. Kurtz and B. Bah, Efficient and Robust Mixed-Integer Optimization Methods for Training Binarized Deep Neural Networks. arXiv preprint arXiv:2110.11382, 2021.