
A Quantitative Analysis Of The Robustness Of Neural Networks For Tabular Data

Kavya Gupta (1, 2), Beatrice Pesquet-Popescu (3), Fateh Kaakai (3), Jean-Christophe Pesquet (1, 2)

Abstract

This paper presents a quantitative approach to assessing the robustness of neural networks for tabular data, which form the backbone of the data structures found in most industrial applications. We analyse the effect of various techniques widely used in neural network practice, such as weight regularization, the addition of noise to the data, and positivity constraints. The analysis relies on three state-of-the-art techniques that provide mathematical proofs of robustness, expressed in terms of the Lipschitz constant of feed-forward networks. The experiments are carried out on two prediction tasks and one classification task. Our work brings insights into building robust neural network architectures for safety-critical systems that require certification or approval from a competent authority.
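To make the notion of a Lipschitz-based robustness certificate concrete, the sketch below computes the classical (often loose) upper bound on the Lipschitz constant of a feed-forward network: the product of the spectral norms of its weight matrices, valid when all activations are 1-Lipschitz (e.g. ReLU). This is a minimal illustration of the general idea, not the specific certification techniques evaluated in the paper; the layer sizes and weights are hypothetical.

```python
import numpy as np

def lipschitz_upper_bound(weights):
    """Product of layer spectral norms: an upper bound on the Lipschitz
    constant of a feed-forward network with 1-Lipschitz activations."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # ord=2: largest singular value
    return bound

# Hypothetical two-layer network with random weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 8)), rng.standard_normal((8, 4))]
bound = lipschitz_upper_bound(weights)
```

A small `bound` certifies that a perturbation of the input of norm `eps` can move the output by at most `bound * eps`, which is the quantity the regularization and noise-injection techniques discussed in the paper aim to control.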
Main file: ICASSP_2021 (3).pdf (248.66 KB). Origin: Files produced by the author(s)

Dates and versions

hal-03527634, version 1 (20-01-2022)


Cite

Kavya Gupta, Beatrice Pesquet-Popescu, Fateh Kaakai, Jean-Christophe Pesquet. A Quantitative Analysis Of The Robustness Of Neural Networks For Tabular Data. ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Jun 2021, Toronto, Canada. pp.8057-8061, ⟨10.1109/ICASSP39728.2021.9413858⟩. ⟨hal-03527634⟩