DPFace: Formal Privacy for Facial Images
Principal Investigator: Chris Clifton
Tao Li, Rohan Ashok, Chris Clifton
Abstract
There is growing concern about image privacy due to the popularity of social media and camera-equipped devices, along with the increasing use of face recognition systems. However, established image de-identification techniques are either breakable and subject to re-identification, produce photos that are insufficiently realistic, or both. We present a definition of formally private image de-identification based on concepts from differential privacy. We also present a novel approach to image obfuscation that adds random noise to the latent space of an unconditionally trained generative model capable of synthesizing high-resolution, photo-realistic facial images; at low privacy levels (little noise) the original image is reproduced, and as the noise increases the output diverges from the original while remaining a plausible facial image. To our knowledge, this is the first approach to image privacy that satisfies a noise-based formal privacy definition for the individual.
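To make the high-level mechanism concrete, the sketch below is a minimal illustration, not the paper's implementation: the latent dimensionality, the sensitivity bound, and the use of a Laplace mechanism are all assumptions, and the encoder/generator of the pretrained model is replaced by a stand-in latent vector. It only shows how calibrated noise added to a latent code would behave: at high epsilon (little noise) the code, and hence the decoded face, stays close to the original; at low epsilon the code drifts toward a different but still decodable face.

```python
import numpy as np


def privatize_latent(z: np.ndarray, epsilon: float, sensitivity: float,
                     rng: np.random.Generator) -> np.ndarray:
    """Add Laplace noise to a latent code (illustrative sketch only).

    `sensitivity` is assumed to bound how much the latent code can change
    between any two input faces (e.g., after clipping the encoder output).
    Smaller epsilon -> larger noise scale -> stronger obfuscation.
    """
    scale = sensitivity / epsilon
    noise = rng.laplace(loc=0.0, scale=scale, size=z.shape)
    return z + noise


# Stand-in for an encoder output; in the described pipeline this would be the
# latent code of an unconditionally trained generative model (e.g., a
# StyleGAN-like generator), and the noised code would be decoded back into a
# photo-realistic face.
rng = np.random.default_rng(0)
z_original = rng.standard_normal(512)  # hypothetical 512-d latent code

z_low_privacy = privatize_latent(z_original, epsilon=10.0, sensitivity=1.0, rng=rng)
z_high_privacy = privatize_latent(z_original, epsilon=0.1, sensitivity=1.0, rng=rng)

print("drift at low privacy (eps=10) :", np.linalg.norm(z_low_privacy - z_original))
print("drift at high privacy (eps=0.1):", np.linalg.norm(z_high_privacy - z_original))
```

Running the snippet shows the intended qualitative behavior from the abstract: the distance between the noised and original latent codes grows as epsilon shrinks, which is what lets the same mechanism span the range from near-exact reproduction to a substantially different, yet plausible, synthesized face.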