Invisible Mask: Practical Attacks on Face Recognition with Infrared

03/13/2018
by Zhe Zhou, et al.

Accurate face recognition makes a series of critical applications possible: police can use it to retrieve criminals' faces from surveillance video streams, and cross-border travelers can pass a face-authentication inspection line without the involvement of officers. Nonetheless, when public security relies heavily on such intelligent systems, their designers must carefully consider emerging attacks that aim to mislead face recognition. We propose a new kind of attack against face recognition systems: the subject is illuminated with infrared light according to adversarial examples computed by our algorithm, so that the recognition system is bypassed or misled while the infrared perturbations remain invisible to the naked eye. With this attack, an attacker can not only dodge surveillance cameras but, more importantly, impersonate a target victim and pass a face-authentication system, provided only that a photo of the victim has been acquired. The attack is also unobservable by nearby people, because the light is invisible and the device we built to launch it is small enough to go unnoticed. In our study on a large dataset, attackers succeed in finding an adversarial example that can be implemented with infrared with a success rate of over 70%. To the best of our knowledge, our work is the first to shed light on the severity of the threat posed by infrared adversarial examples against face recognition.
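At a high level, the attack searches for an adversarial perturbation that can be physically realized as infrared illumination on the attacker's face. The following is a minimal, hypothetical sketch of that idea: a few smooth light spots are rendered onto a face image, and their positions, sizes, and intensities are optimized by gradient descent so that a face-embedding network maps the illuminated attacker close to a target victim. The ToyFaceEmbedder network, the Gaussian spot model, and all hyperparameters below are illustrative assumptions, not the algorithm described in the paper.

# Hypothetical sketch: optimize a few smooth "infrared" light spots on a face
# image so that its embedding moves toward a target victim's embedding.
# The embedding network, spot model, and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyFaceEmbedder(nn.Module):
    """Stand-in for a real face-recognition embedding network (assumption)."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=1)

def ir_spots(params, h, w):
    """Render smooth Gaussian light spots, mimicking how an IR LED appears
    as a bright blob on a camera sensor."""
    ys, xs = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                            torch.arange(w, dtype=torch.float32), indexing="ij")
    canvas = torch.zeros(h, w)
    for cx, cy, sigma, amp in params:
        cx_p = torch.sigmoid(cx) * w                 # spot center (pixels)
        cy_p = torch.sigmoid(cy) * h
        sig = 3.0 + torch.sigmoid(sigma) * 20.0      # spot radius (pixels)
        a = torch.sigmoid(amp) * 0.5                 # bounded brightness
        canvas = canvas + a * torch.exp(
            -((xs - cx_p) ** 2 + (ys - cy_p) ** 2) / (2 * sig ** 2))
    # Same blob added to all channels (brightness-only perturbation).
    return canvas.clamp(0, 1).unsqueeze(0).expand(3, -1, -1)

def craft_attack(attacker_img, victim_img, embedder, n_spots=3, steps=300, lr=0.05):
    """Impersonation: push the illuminated attacker's embedding toward the victim's."""
    h, w = attacker_img.shape[-2:]
    params = [torch.randn(4, requires_grad=True) for _ in range(n_spots)]
    opt = torch.optim.Adam(params, lr=lr)
    with torch.no_grad():
        target = embedder(victim_img.unsqueeze(0))
    for _ in range(steps):
        lit = (attacker_img + ir_spots(params, h, w)).clamp(0, 1)
        emb = embedder(lit.unsqueeze(0))
        loss = 1.0 - F.cosine_similarity(emb, target).mean()  # maximize similarity
        opt.zero_grad()
        loss.backward()
        opt.step()
    return params

if __name__ == "__main__":
    torch.manual_seed(0)
    embedder = ToyFaceEmbedder().eval()
    for p in embedder.parameters():
        p.requires_grad_(False)
    attacker = torch.rand(3, 112, 112)   # placeholders for aligned face crops
    victim = torch.rand(3, 112, 112)
    spots = craft_attack(attacker, victim, embedder)
    print([p.detach().numpy().round(2) for p in spots])

In a real setting, the optimized spot parameters would have to be translated into physical LED placement and brightness on the attacker's face, and the embedding network would be the deployed recognition model or a surrogate for it; both steps are outside the scope of this sketch.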
