Researchers at NYU Tandon have shown that a machine learning system can fool fingerprint sensors, including those used on smartphones. They trained a neural network on human fingerprints and used it to create fake fingerprints that can cheat touch-based authentication systems.
Much the way that a master key can unlock every door in a building, these “DeepMasterPrints” use artificial intelligence to match a large number of prints stored in fingerprint databases and could thus theoretically unlock a large number of devices.
The work builds on earlier research led by Nasir Memon, professor of computer science and engineering and associate dean for online learning at NYU Tandon. Memon, who coined the term “MasterPrint,” described how fingerprint-based systems use partial fingerprints, rather than full ones, to confirm identity. Devices typically allow users to enroll several different finger images, and a match with any saved partial print is enough to unlock the device. Partial fingerprints are less likely to be unique than full prints, and Memon’s work demonstrated that enough similarities exist between partial prints to create MasterPrints capable of matching many stored partials in a database.
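To see why matching against any of several enrolled partials is risky, consider a rough back-of-the-envelope sketch (not from the paper). Assuming, hypothetically, that each comparison between an attack print and one stored partial succeeds independently with some small false-accept rate, the chance of unlocking a device grows quickly with the number of enrolled partials:

```python
# Hypothetical numbers, for illustration only: each comparison of an
# attack print against one stored partial template succeeds with
# probability `far`, independently; the device accepts if ANY of the
# user's enrolled partial templates matches.

def attack_success_probability(far: float, partials_per_user: int) -> float:
    """Probability that at least one enrolled partial template matches."""
    return 1.0 - (1.0 - far) ** partials_per_user

# Example: a 0.1% false-accept rate per comparison, with 30 stored
# partials (several fingers, several touches each), gives roughly a
# 3% chance of unlocking a given device with a single attack print.
print(attack_success_probability(far=0.001, partials_per_user=30))
```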
“Fingerprint-based authentication is still a strong way to protect a device or a system, but at this point, most systems don’t verify whether a fingerprint or other biometric is coming from a real person or a replica,” said Philip Bontrager, a doctoral student at NYU Tandon and lead author of the paper. “These experiments demonstrate the need for multi-factor authentication and should be a wake-up call for device manufacturers about the potential for artificial fingerprint attacks.”
This research has applications in fields beyond security. Julian Togelius noted that the Latent Variable Evolution (LVE) method used here to generate fingerprints can also be used to create designs in other industries, notably game development. The technique has already been used to generate new levels in popular video games.
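For the curious, a minimal sketch of the LVE idea follows. The generator and matcher below are hypothetical stand-ins operating on toy data; the actual research evolved the latent vector of a GAN trained on fingerprint images against a real fingerprint matcher, using CMA-ES rather than the simple evolutionary loop shown here:

```python
# Latent Variable Evolution (LVE), sketched: black-box search over the
# latent space of a trained generator, scoring each candidate latent
# vector by how many enrolled templates its output matches.
import numpy as np

LATENT_DIM = 32
rng = np.random.default_rng(0)

# Frozen random weights standing in for a trained GAN generator.
W = np.random.default_rng(1).normal(size=(LATENT_DIM, 64))

def generator(z: np.ndarray) -> np.ndarray:
    """Stand-in for a trained generator: latent vector -> 'image' vector."""
    return np.tanh(z @ W)

def match_count(image: np.ndarray, templates: np.ndarray) -> int:
    """Stand-in matcher: count stored templates whose cosine similarity
    to the candidate image exceeds an arbitrary toy threshold."""
    sims = templates @ image / (np.linalg.norm(templates, axis=1)
                                * np.linalg.norm(image) + 1e-9)
    return int((sims > 0.2).sum())

# Toy "database" of enrolled templates.
templates = rng.normal(size=(1000, 64))

# Simple (mu + lambda) evolution over latent vectors; any black-box
# optimizer over z fits the LVE recipe.
population = rng.normal(size=(50, LATENT_DIM))
for _ in range(100):
    scores = np.array([match_count(generator(z), templates) for z in population])
    parents = population[np.argsort(scores)[-10:]]  # keep the 10 best
    population = (np.repeat(parents, 5, axis=0)
                  + 0.1 * rng.normal(size=(50, LATENT_DIM)))

best = max(population, key=lambda z: match_count(generator(z), templates))
print("templates matched by evolved print:", match_count(generator(best), templates))
```

The key design point, which carries over to level generation in games, is that the evolutionary search never touches pixels directly: it only searches the generator’s latent space, so every candidate it proposes stays on the manifold of realistic-looking outputs.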
This is a striking case of technology being used to defeat technology.