Chinese and American teams succeed in "changing faces" and deceiving AI face recognition.

As artificial intelligence (AI) technology continues to improve, image recognition systems have become widespread, and some devices now use face recognition for identity authentication. But is the security of this technology really impeccable?

As with earlier adversarial examples such as psychedelic stickers and manipulated photographs, researchers are trying to make image recognition technology safer by probing potential hacking methods. Research teams from China and the United States recently published a paper on the arXiv platform detailing how they managed to deceive face recognition applications.

▲ Infrared LEDs placed inside the hat. (Source: Zhe Zhou)

In fact, the team's concept is very simple: the most direct way to deceive a face recognition application is to "replace" the face of the person being identified. To achieve this, the team first used a deep neural network to analyze facial images, and then, through miniature infrared LEDs mounted inside a baseball cap, projected the computed pattern onto the attacker's face as numerous infrared spots, thereby disguising the wearer's identity.
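The loop behind such an attack can be illustrated conceptually. The sketch below is a toy illustration, not the paper's actual method: the face-embedding network is replaced by a fixed random linear map, and a simple random search (rather than the paper's deep-network-based optimization) adjusts the positions and intensities of a few "infrared spots" so that the spot-lit attacker face moves closer to a target identity in embedding space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face-embedding network: a fixed random linear map
# from a 32x32 grayscale face to a normalized 64-d embedding.
W = rng.normal(size=(64, 32 * 32))

def embed(face):
    """Embed a 32x32 face image as a unit vector."""
    v = W @ face.ravel()
    return v / np.linalg.norm(v)

def add_spots(face, spots):
    """Overlay infrared spots, given as (row, col, intensity) triples."""
    out = face.copy()
    for r, c, a in spots:
        out[int(r) % 32, int(c) % 32] += a
    return np.clip(out, 0.0, 1.0)

attacker = rng.uniform(size=(32, 32))  # the face the camera actually sees
target = rng.uniform(size=(32, 32))    # the identity to impersonate
target_emb = embed(target)

def loss(spots):
    """Distance between the spot-lit attacker face and the target identity."""
    return np.linalg.norm(embed(add_spots(attacker, spots)) - target_emb)

# Random-search optimization over 8 spots, starting from zero intensity
# (i.e. the unmodified face) and only accepting improvements.
spots = rng.uniform(0, 32, size=(8, 3))
spots[:, 2] = 0.0
best = loss(spots)
for _ in range(500):
    cand = spots + rng.normal(scale=1.0, size=spots.shape)
    cand[:, 2] = np.clip(cand[:, 2], 0.0, 1.0)
    l = loss(cand)
    if l < best:
        spots, best = cand, l

print("distance without spots:", loss(np.zeros((8, 3))))
print("distance with spots:   ", best)
```

Because the search starts from the unmodified face and only accepts improvements, the final distance can never be worse than having no spots at all; against a real recognition network, the attacker would instead need gradient information or many queries.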

Of course, since what is projected is another person's face, the same technique could in principle do more than hide the attacker's identity: wherever face recognition is used for identity authentication, a successful projection could let the attacker impersonate someone else.

To test the theory, the research team selected four random photos to try to deceive facial recognition software, including a photo of the American musician Moby. In the experiments, the researchers found that as long as the attacker's face was somewhat similar to the face being projected, the recognition system could be fooled about 70% of the time.
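The reason similarity matters is that verification systems typically compare face embeddings against a similarity threshold rather than requiring an exact match. The following minimal sketch (with made-up vectors and a hypothetical threshold of 0.6, not values from the paper) shows the kind of decision rule the attack only has to nudge past:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe_emb, enrolled_emb, threshold=0.6):
    """Accept the claimed identity if similarity clears the threshold."""
    return cosine_similarity(probe_emb, enrolled_emb) >= threshold

rng = np.random.default_rng(1)
enrolled = rng.normal(size=128)                     # stored identity
near = enrolled + rng.normal(scale=0.1, size=128)   # probe close to it
far = rng.normal(size=128)                          # unrelated probe

print("similar probe accepted:  ", verify(near, enrolled))
print("unrelated probe accepted:", verify(far, enrolled))
```

An attacker whose own face already sits close to the target's embedding needs only a small infrared perturbation to cross the threshold, which is consistent with the reported dependence on attacker-target similarity.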

▲ The first column shows the distance between the identified person and the source of the projected face; the second column is the theoretical gap, and the third is the actual gap. (Source: Zhe Zhou)

Since infrared is invisible light, it cannot be detected by the naked eye, and the LEDs attached to the inside of the brim are very small and could even be hidden in other wearables. Although the light is harmless to humans, it is difficult for anyone to notice that the system is being deceived, which increases the possibility of the attack being exploited.

Of course, it must be mentioned that this is only a small study that has not been peer-reviewed, so the results may be contested. Based on these findings and tests, however, the team believes that face recognition technology still has a long way to go before it is safe and reliable enough for critical authentication and surveillance scenarios.

"Researchers in this field should pay more attention to the threat posed by infrared."
