

Poster in Workshop: Workshop on Spurious Correlation and Shortcut Learning: Foundations and Solutions

Privacy Risks and Memorization of Spurious Correlated Data

Chenxiang Zhang · Jun Pang · Sjouke Mauw

Keywords: [ spurious correlation ] [ privacy ] [ membership inference attacks ]


Abstract:

Neural networks are vulnerable to privacy attacks aimed at stealing sensitive data. The risks are amplified in real-world scenarios where models are trained on limited and biased data. In this work, we investigate the impact of spurious correlation bias on privacy vulnerability. We introduce spurious privacy leakage, a phenomenon where spurious groups are more vulnerable to privacy attacks than other groups. Through empirical analysis, we counterintuitively demonstrate that reducing spurious correlation fails to address the privacy disparity between groups. This leads us to introduce a new perspective on privacy disparity based on data memorization. We show that mitigating spurious correlation does not reduce the degree of data memorization and therefore does not reduce the privacy risks. Our findings highlight the need to rethink privacy in the presence of spurious learning.
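To make the notion of group-wise privacy vulnerability concrete, here is a minimal sketch (not the authors' method) of a loss-threshold membership inference attack evaluated separately per group, so that attack success on spurious groups can be compared against other groups. The function names, the fixed threshold rule, and the balanced-accuracy metric are illustrative assumptions.

```python
import numpy as np

def loss_threshold_mia(member_losses, nonmember_losses, threshold):
    """Predict 'member' when the loss is below the threshold; return balanced attack accuracy."""
    tp = np.mean(member_losses < threshold)       # members correctly flagged
    tn = np.mean(nonmember_losses >= threshold)   # non-members correctly rejected
    return 0.5 * (tp + tn)

def per_group_vulnerability(losses, is_member, groups, threshold=1.0):
    """Compare attack accuracy across groups (e.g., spurious vs. non-spurious)."""
    results = {}
    for g in np.unique(groups):
        mask = groups == g
        results[g] = loss_threshold_mia(
            losses[mask & is_member], losses[mask & ~is_member], threshold
        )
    return results
```

In this sketch, a larger gap in attack accuracy for one group would indicate that its training samples are easier to distinguish from held-out samples, i.e., that the group is more exposed to membership inference.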


