The move to biometric authentication started as far back as 2004, when IBM began shipping laptops with integrated fingerprint readers. Biometrics went mainstream when Apple released Touch ID on the iPhone in 2013.
Since then, nearly a decade of innovation has brought biometric authentication methods for the enterprise to the Slope of Enlightenment on the latest Gartner Hype Cycle for Identity and Access Management Technologies, 2021.
Still, the jury is out on facial recognition technologies. Done right, as an authentication method integrated into a zero trust architecture, they produce an excellent user experience, preserve privacy, facilitate secure access, and resist spoofing and hacks. But facial recognition bolted onto an antiquated technology stack arguably causes more harm than good. Let’s dig in.
Recently, two specific issues with facial recognition have made news. The first relates to the importance of securing user biometrics and controlling access to personal information. The second raises the issue of identity decisioning bias: algorithms designed to verify identity but built in a way that disadvantages certain populations. Both are serious concerns, and both should be examined in light of the fundamental reason for replacing passwords with biometrics, namely to improve security and deliver a more convenient user experience. Privacy and efficacy are core to these objectives. I’ll address each issue in turn.
First, let’s cover security. There is, and always will be, a need to balance security and convenience. With user PII, it is certainly convenient to store and manage the information centrally, as has been the trend over the last two decades. Unfortunately, when hackers breach the perimeter and gain access to admin credentials, they gain access to a honeypot of PII. We’ve seen this happen all too often, and every incident shows this trade of security for convenience to be a poor one, made all the worse when the PII contains biometric data. The solution?
Store PII in a distributed ledger (such as a private, permissioned blockchain) and access and administration fundamentally change. There is no honeypot: a hacker who obtains admin credentials cannot access PII that simply isn’t there. As for convenience, the entire landscape shifts. With no central administration of data, administration simplifies to managing the servers that support the distributed ledger. This makes blockchain an ideal technology for storing and managing PII.
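To make the "no honeypot" property concrete, here is a minimal sketch of the core mechanism behind any distributed ledger: an append-only chain of records, each linked to the hash of its predecessor, so tampering with any earlier record is detectable. This is a toy in Python, not a production ledger; the class name, field names, and payload are illustrative, and in a real deployment only encrypted references to PII (never raw PII) would be written to the chain.

```python
import hashlib
import json
import time

def _hash(block: dict) -> str:
    """Deterministic SHA-256 over the block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class ToyLedger:
    """Append-only chain of records, each linked to its predecessor's hash."""

    def __init__(self):
        genesis = {"index": 0, "prev_hash": "0" * 64, "payload": None, "ts": 0}
        self.blocks = [genesis]

    def append(self, payload: dict) -> dict:
        block = {
            "index": len(self.blocks),
            "prev_hash": _hash(self.blocks[-1]),
            "payload": payload,  # e.g. an encrypted credential reference, never raw PII
            "ts": time.time(),
        }
        self.blocks.append(block)
        return block

    def verify(self) -> bool:
        """Tampering with any earlier block breaks every later hash link."""
        return all(
            self.blocks[i]["prev_hash"] == _hash(self.blocks[i - 1])
            for i in range(1, len(self.blocks))
        )
```

Because each block commits to the hash of the one before it, an attacker with write access to one server still can’t silently rewrite history: `verify()` fails the moment any stored record diverges from what the chain committed to.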
This leads me to my next point: privacy. Privacy has become a big problem for organizations to manage. Why? Because it’s complicated. Regulations vary across regions and carry heavy fines for non-compliance. When an organization centrally manages user PII, questions arise about who has access to that data. Is the information truly private? And, again, what risks does the organization assume when access credentials to this information are compromised?
When end users control their data with public-private cryptographic key pairs, user privacy is greatly simplified. But storage isn’t the only challenge. Users’ personal information needs to be kept private at every point in a process. You can’t transfer PII in the clear, replicate it to other systems without proper controls, and still call it private just because it ends up stored in a secure location.
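The control model behind user-held key pairs can be shown in a few lines: data encrypted to a user’s public key is readable only by the holder of the matching private key, no matter where the ciphertext is stored or replicated. This sketch uses textbook RSA with tiny primes purely for readability; the numbers are insecure by design, the `"ssn:123"` payload is a hypothetical PII fragment, and a real system would use a vetted library with hybrid encryption rather than anything like this.

```python
# Toy RSA with textbook-sized numbers -- illustrative only, NOT secure.
# The point is the control model: ciphertext encrypted to a user's
# public key is readable only by that user's private key.

p, q = 61, 53                  # tiny primes, chosen for readability
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent (shared with anyone)
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

def encrypt_byte(m: int) -> int:
    """Encrypt one byte (< n) with the user's PUBLIC key (e, n)."""
    return pow(m, e, n)

def decrypt_byte(c: int) -> int:
    """Decrypt with the user's PRIVATE key (d, n), held only by the user."""
    return pow(c, d, n)

pii = b"ssn:123"               # hypothetical PII fragment
ciphertext = [encrypt_byte(b) for b in pii]
recovered = bytes(decrypt_byte(c) for c in ciphertext)
```

An organization (or ledger) holding only `ciphertext` learns nothing useful; the user alone, holding `d`, can recover the plaintext, which is exactly the shift in control the paragraph above describes.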
When it comes to using a user biometric in place of passwords, standards bodies such as the FIDO Alliance and NIST have published certification guidelines to ensure privacy at every step of the process. This makes it important that any solution used for biometric enrollment and/or authentication not only comply with standards, but undergo rigorous certification testing to verify that compliance. Without these certifications, organizations really can’t rely on those solutions to comply at a business level with directives such as Know Your Customer (KYC) and Anti-Money Laundering (AML).
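The privacy property these standards enforce is worth sketching: in a FIDO-style flow, the biometric only unlocks a key held on the device, and the server sees a signed challenge, never the biometric itself. The sketch below is a simplification, with assumptions to flag: real FIDO2/WebAuthn uses public-key signatures, whereas HMAC over a device-held secret stands in here so the example runs on the Python standard library alone, and the function names are my own, not part of any FIDO API.

```python
import hashlib
import hmac
import secrets

# Simplified FIDO-style challenge/response. Key point: the biometric
# match happens locally and only gates use of the device-held key;
# no biometric data is ever transmitted or stored server-side.

DEVICE_KEY = secrets.token_bytes(32)   # provisioned at enrollment

def device_sign(challenge: bytes, biometric_ok: bool):
    """The authenticator responds only after a successful local biometric match."""
    if not biometric_ok:
        return None                    # no match, no response
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response) -> bool:
    """The server validates the response; it never sees biometric data."""
    if response is None:
        return False
    expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)    # fresh nonce per login attempt
response = device_sign(challenge, biometric_ok=True)
```

Because every login uses a fresh random challenge, a captured response can’t be replayed, and because only a derived response crosses the wire, a server breach exposes no biometric templates.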
In my next blog, I’ll continue this discussion by covering the importance of user experience and identity decisioning bias, and what both mean for deploying identity verification techniques.