A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses - Archive ouverte HAL
Conference paper, year: 2020

A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses


Abstract

Statistical methods protecting sensitive information or the identity of the data owner have become critical to ensuring the privacy of individuals as well as of organizations. This paper investigates anonymization methods based on representation learning and deep neural networks, motivated by novel information-theoretic bounds. We introduce a novel training objective for simultaneously training a predictor over target variables of interest (the regular labels) while preventing an intermediate representation from being predictive of the private labels. The architecture is based on three sub-networks: one going from input to representation, one from representation to predicted regular labels, and one from representation to predicted private labels. The training procedure aims at learning representations that preserve the relevant part of the information (about regular labels) while dismissing information about the private labels, which correspond to the identity of a person. We demonstrate the success of this approach for two distinct classification-versus-anonymization tasks (handwritten digits and sentiment analysis).
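The objective described above combines two terms: a cross-entropy loss that rewards predicting the regular label, and a penalty on the predictability of the private label from the same representation. The following is a minimal sketch of such a two-term objective in plain Python; it is not the authors' implementation, and the function names, the scalar form, and the trade-off weight `lam` are assumptions made for illustration.

```python
import math

def cross_entropy(probs, label):
    # Negative log-likelihood of the true label under a predicted
    # probability distribution (a list summing to 1).
    return -math.log(probs[label])

def anonymization_loss(p_regular, y_regular, p_private, y_private, lam=1.0):
    # Hypothetical scalar form of the objective: minimize the error on
    # the regular label while maximizing the error on the private label,
    # so that the shared representation stops encoding private information.
    return cross_entropy(p_regular, y_regular) - lam * cross_entropy(p_private, y_private)
```

For example, a representation whose private-label predictor is reduced to chance (uniform probabilities) contributes the maximum-entropy penalty term, which is the desired fixed point of the training procedure.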
Main file: cross_entropy_metric_learning.pdf (629.79 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03351131, version 1 (21-09-2021)

Identifiers

Cite

Malik Boudiaf, Jérôme Rony, Imtiaz Masud Ziko, Ismail Ben Ayed, Eric Granger, et al. A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses. 16th European Conference on Computer Vision (ECCV), Sep 2021, Glasgow (virtual), United Kingdom. pp. 548-564, ⟨10.1007/978-3-030-58539-6_33⟩. ⟨hal-03351131⟩