CERIAS 2025 Annual Security Symposium


2025 Symposium Posters


Diffstats: Mitigating Data Poisoning Attacks to Local Differential Privacy


Primary Investigator:
Wenhai Sun

Project Members:
Xiaolin Li, Wenhai Sun
Abstract
Local Differential Privacy (LDP) has emerged as a widely adopted privacy-preserving tool, with practical deployments by major industry players such as Google and Apple. However, recent data poisoning attacks pose significant threats to LDP systems: adversaries inject malicious reports to inflate the estimated frequency of target items, undermining the reliability of LDP-based statistical tasks. Existing detection methods have shown limited effectiveness against such poisoning attacks. In this work, we propose a novel anomalous-user detection method, Diffstats, which leverages the statistical differences in bit settings between attackers and benign users. Our experimental results demonstrate that Diffstats achieves substantial improvements in F1-score over current detection approaches (e.g., FIAD) on real-world datasets against the state-of-the-art poisoning attack, the Maximal Gain Attack (MGA).
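To make the underlying idea concrete, the sketch below simulates a toy version of the setting: benign users perturb one-hot vectors with the Optimized Unary Encoding (OUE) protocol, while attackers submit hand-crafted reports that set only their target bits. It then flags users whose number of set bits deviates from the count expected of honest OUE reports. This is only an illustration of bit-setting-based anomaly detection, not the actual Diffstats algorithm; all parameter values (domain size, epsilon, attacker strategy, z-score threshold) are assumptions for the demo, and a realistic MGA pads its reports to evade exactly this kind of count test.

```python
import numpy as np

rng = np.random.default_rng(0)

d, eps = 32, 1.0                      # toy domain size and privacy budget (assumptions)
p = 0.5                               # OUE: keep the true bit with prob 1/2
q = 1.0 / (np.exp(eps) + 1.0)         # OUE: flip each other bit on with prob 1/(e^eps + 1)

def oue_report(item: int) -> np.ndarray:
    """Perturb a one-hot vector for `item` under OUE."""
    bits = (rng.random(d) < q).astype(int)
    bits[item] = int(rng.random() < p)
    return bits

n_benign, n_attack = 2000, 100
targets = [3, 7, 11]                  # items the attacker wants to boost (hypothetical)

benign = np.array([oue_report(int(rng.integers(d))) for _ in range(n_benign)])

# Naive poisoning sketch: each fake user sets exactly the target bits.
attack = np.zeros((n_attack, d), dtype=int)
attack[:, targets] = 1

reports = np.vstack([benign, attack])
ones = reports.sum(axis=1)            # per-user count of set bits

# An honest OUE report has mean p + (d-1)q set bits; flag large deviations.
mu = p + (d - 1) * q
sd = np.sqrt(p * (1 - p) + (d - 1) * q * (1 - q))
flagged = np.abs(ones - mu) > 2 * sd  # simple 2-sigma rule (assumption)

print(f"attackers flagged: {flagged[n_benign:].mean():.2f}, "
      f"benign false-positive rate: {flagged[:n_benign].mean():.3f}")
```

With these toy parameters every naive attacker lands far from the expected bit count and is flagged, while only a small fraction of benign users fall outside the 2-sigma band; a stealthier attacker would match the expected count, which is why a practical detector must look at finer-grained bit statistics than the raw total.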