Incomplete Physical Adversarial Attack on Face Recognition

HU Weitao , XU Wujun

Journal of Donghua University(English Edition) ›› 2025, Vol. 42 ›› Issue (4) : 442 -448.



Abstract

In recent work, adversarial stickers have been widely used to attack face recognition (FR) systems in the physical world. However, the performance of physical attacks is difficult to evaluate because such experiments depend on volunteers, who are hard to recruit. In this paper, a simple attack method called the incomplete physical adversarial attack (IPAA) is proposed to simulate physical attacks. Unlike a physical attack, an IPAA embeds a photo of the adversarial sticker into a facial image, which is then used as the input to attack FR systems; this yields results similar to those of physical attacks without requiring any volunteers. The results show that IPAA resembles physical attacks more closely than digital attacks do, indicating that IPAA can be used to evaluate the performance of physical attacks. IPAA is also effective in quantitatively measuring the impact of the sticker location on attack results.
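The core step the abstract describes, embedding a photo of an adversarial sticker into a facial image before feeding it to an FR system, can be sketched as follows. This is a minimal illustration, not the authors' implementation: `embed_sticker` and the toy random images are hypothetical stand-ins, and a real evaluation would pass both images through an FR model (e.g. FaceNet or ArcFace) rather than comparing raw pixels.

```python
import numpy as np

def embed_sticker(face, sticker, top, left):
    """Paste a photographed sticker patch into a facial image at a
    chosen location (the IPAA-style digital embedding step)."""
    out = face.copy()
    h, w = sticker.shape[:2]
    out[top:top + h, left:left + w] = sticker
    return out

def cosine_similarity(a, b):
    """Cosine similarity between two flattened feature vectors."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins: a 112 x 112 "face" and a 24 x 24 photographed sticker.
rng = np.random.default_rng(0)
face = rng.integers(0, 256, size=(112, 112, 3), dtype=np.uint8)
sticker = rng.integers(0, 256, size=(24, 24, 3), dtype=np.uint8)

attacked = embed_sticker(face, sticker, top=30, left=44)
# Here raw pixels stand in for FR embeddings; a lower similarity after
# embedding the sticker would indicate a stronger attack.
similarity = cosine_similarity(face, attacked)
```

Because the sticker location (`top`, `left`) is an explicit parameter, sweeping it and recording the resulting similarity is one way such a setup could quantify how sticker placement affects attack strength.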

Keywords

physical attack / digital attack / face recognition / interferential variable / adversarial example

Cite this article

HU Weitao, XU Wujun. Incomplete Physical Adversarial Attack on Face Recognition. Journal of Donghua University (English Edition), 2025, 42(4): 442-448. DOI: 10.19884/j.1672-5220.202407001


