
Facial Recognition Systems Bypassed Via Machine Learning Weakness

Security researchers have published findings on how modern facial recognition systems can be fooled by abusing a weakness in the machine learning algorithms they rely on. A successful exploit of this technique makes the system believe it is seeing the target person and grants access to the protected resources.




Facial Recognition Systems Fooled in New Abuse Technique

Computer security researchers have uncovered a new method that can effectively bypass contemporary facial recognition technologies. Their findings, published in a research article called Dopple-ganging up on Facial Recognition System, rely on a weakness in the machine learning algorithms implemented in these systems. The algorithms are designed to judge whether the images they see are authentic and legitimate before unlocking the protected resources.

One of the discovered methods relies on special software designed to generate photorealistic faces, and the attack model makes use of several such image-generation frameworks. Machine learning also depends on extensive training: data sets are fed into the system until it reaches the intended level of accuracy and security.

For this reason the proposed attack model abuses that very same training process. Instead of feeding the system safe, valid images from an authentic and legitimate set, the attackers can supply their own malicious images. The main goal is to minimize the distance between a legitimate facial image and a crafted one. When the facial recognition system sees a very small distance, a misclassification can occur, leading it to verify the maliciously crafted face.
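
To make the distance-minimization idea concrete, the short Python sketch below models face verification as a threshold on the distance between face embeddings. The embed() function, the random-projection stand-in for a recognition network, the threshold value and the pixel-nudging loop are all illustrative assumptions rather than the researchers' actual pipeline; a real attack would optimize a generated face against a trained face-recognition model.

    import numpy as np

    # Illustrative sketch only: a hypothetical "embedding network" built from a
    # fixed random projection, standing in for a real face-recognition model.
    rng = np.random.default_rng(0)
    PROJECTION = rng.normal(size=(128, 1024))

    def embed(face_pixels):
        """Map a flattened face image (1024 values) to a unit 128-d embedding."""
        v = PROJECTION @ face_pixels
        return v / np.linalg.norm(v)

    def verify(probe, enrolled, threshold=0.3):
        """Accept the probe when its embedding lies close enough to the enrolled face."""
        return np.linalg.norm(embed(probe) - embed(enrolled)) < threshold

    # The attack crafts an image whose embedding sits close to the victim's.
    # Here we imitate that by nudging a random starting image toward the victim
    # until the embedding distance drops below the acceptance threshold.
    victim = rng.random(1024)
    crafted = rng.random(1024)
    for step in range(200):
        crafted = crafted + 0.05 * (victim - crafted)  # stand-in for the real optimization
        if verify(crafted, victim):
            print(f"step {step}: crafted image accepted as the victim (misclassification)")
            break

Running the sketch shows the crafted input being accepted once its embedding distance falls under the threshold, which is the misclassification the attack aims for.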

The researchers demonstrated the concept by training a system on a set of 1,500 images of themselves, captured from video and presented as stills. Multiple facial expressions were included in order to accurately represent how valid passport photos are fed into such systems.

The demonstration shows that relying on facial recognition technology alone for security is not recommended. The main recommendation is for vendors and device manufacturers that include such technology in their products to oversee the creation of security standards that protect against such model hacking techniques.

Martin Beltov

Martin graduated with a degree in Publishing from Sofia University. As a cyber security enthusiast he enjoys writing about the latest threats and mechanisms of intrusion.
