Abstract: Recognizers based on convolutional neural networks (CNNs) have been widely used in face recognition because of their high recognition rates, but their abuse also raises privacy concerns. In this paper, we propose a local background area-based adversarial attack on faces (BALA), which can serve as a privacy-protection scheme against CNN face recognizers. Adding perturbations to a local background region avoids the loss of original facial features caused by existing methods that perturb the foreground face region. BALA uses a two-stage loss function together with graying and homogenization methods to generate adversarial patches more effectively and to improve the adversarial effect after the digital-to-physical domain conversion. In photo-retake and live-scene experiments, BALA's attack success rate (ASR) against the VGG-FACE recognizer is 12% and 3.8% higher, respectively, than that of existing methods.
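To make the core idea concrete, the sketch below shows a generic background-restricted adversarial-patch optimization against a face-recognition CNN. It is not the paper's implementation: the paper's two-stage loss and the graying/homogenization steps for digital-to-physical transfer are not reproduced, and the names `recognizer`, `image`, `bg_mask`, and `target_id` are hypothetical inputs assumed for illustration.

```python
# Minimal sketch (assumptions noted above): learn a perturbation that is applied
# only inside a binary background mask, and push down the recognizer's score
# for the true identity (untargeted evasion).
import torch

def optimize_background_patch(recognizer, image, bg_mask, target_id,
                              steps=200, lr=0.05):
    # image: 1x3xHxW tensor in [0, 1]; bg_mask: 1x1xHxW binary mask of the
    # local background region; recognizer: CNN returning identity logits.
    patch = torch.zeros_like(image, requires_grad=True)  # perturbation to learn
    optimizer = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        # Perturb only the background region, keep pixels in a valid range.
        adv = torch.clamp(image + patch * bg_mask, 0.0, 1.0)
        logits = recognizer(adv)
        # Minimize the true-identity score so the recognizer no longer matches it.
        loss = logits[:, target_id].mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return torch.clamp(image + patch.detach() * bg_mask, 0.0, 1.0)
```

Restricting the update to `bg_mask` is what keeps the foreground face pixels, and hence the original facial features, untouched; the face region itself is never modified during optimization.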