Conference papers

A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses

Abstract: Statistical methods that protect sensitive information or the identity of the data owner have become critical to ensuring the privacy of individuals as well as of organizations. This paper investigates anonymization methods based on representation learning and deep neural networks, motivated by novel information-theoretic bounds. We introduce a novel training objective for simultaneously training a predictor over target variables of interest (the regular labels) while preventing an intermediate representation from being predictive of the private labels. The architecture is based on three sub-networks: one going from input to representation, one from representation to predicted regular labels, and one from representation to predicted private labels. The training procedure aims at learning representations that preserve the relevant part of the information (about the regular labels) while discarding information about the private labels, which correspond to the identity of a person. We demonstrate the success of this approach on two distinct classification-versus-anonymization tasks (handwritten digits and sentiment analysis).
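
To make the three-sub-network setup concrete, below is a minimal sketch of how such a training objective could be wired up. It assumes PyTorch; the layer sizes, the gradient-reversal trick used to make the representation non-predictive of the private labels, and the weighting `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates gradients in the backward
    pass. One common mechanism (assumed here) for pushing a
    representation away from being predictive of private labels."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output

# Three sub-networks, as described in the abstract (sizes are placeholders).
encoder = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU())  # input -> representation
task_head = nn.Linear(256, 10)       # representation -> predicted regular labels
private_head = nn.Linear(256, 100)   # representation -> predicted private labels

ce = nn.CrossEntropyLoss()
params = (list(encoder.parameters()) + list(task_head.parameters())
          + list(private_head.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

def training_step(x, y_task, y_private, lam=1.0):
    z = encoder(x)
    # Keep information about the regular labels.
    loss_task = ce(task_head(z), y_task)
    # The private head is trained to predict the private labels, but the
    # reversed gradient drives the encoder to discard that information.
    loss_private = ce(private_head(GradReverse.apply(z)), y_private)
    loss = loss_task + lam * loss_private
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

With gradient reversal, a single optimizer suffices; an alternating min-max scheme between the encoder and the private-label predictor would be a natural variant under the same three-sub-network architecture.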

https://hal-centralesupelec.archives-ouvertes.fr/hal-03351131
Contributor: Pablo Piantanida
Submitted on: Tuesday, September 21, 2021 - 11:26:09 PM
Last modified on: Monday, September 27, 2021 - 9:15:04 AM

File

cross_entropy_metric_learning....
Files produced by the author(s)

Citation

Malik Boudiaf, Jérôme Rony, Imtiaz Masud Ziko, Ismail Ben Ayed, Eric Granger, et al. A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses. 16th European Conference on Computer Vision (ECCV), Aug 2020, Glasgow (virtual), United Kingdom. pp.548-564, ⟨10.1007/978-3-030-58539-6_33⟩. ⟨hal-03351131⟩

Metrics

Record views: 19
File downloads: 10