BIOMETRIC BIAS
While biometric bias can be part of an identity lifecycle, biometric technology itself is not inherently biased – it is the design of biometric technology that can introduce discrimination, explains Mitek CTO Stephen Ritter.
“Biometric systems analyse the physiological or behavioural traits of an individual for the purposes of identity verification and authentication. This is often conducted through fingerprint and facial recognition technology built on machine learning and AI – all powered by algorithms. Bias occurs when the algorithm operates in a discriminatory fashion, which often stems from how the algorithm is built, designed or tested.”
First solution: Testing standards
“First,” Ritter says, “we need a way to evaluate biometric bias. There is currently no standardised, third-party measurement for evaluating demographic bias in biometric technologies.
“The industry needs a way to evaluate the equity and inclusion of biometric technologies. This would give service providers a way to ensure that their solution is equitable, regardless of whether it was built in-house or based on third-party technology from a vendor. This benchmark would provide the public with the information they need to select a service provider that’s more equitable.”
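To illustrate the kind of measurement such a benchmark might formalise, the sketch below computes verification error rates separately for each demographic group at a fixed decision threshold. The dataset structure, group labels, scores and threshold are hypothetical assumptions for illustration only, not part of Ritter’s remarks or any existing standard; large gaps in error rates between groups would be one signal of demographic bias.

```python
"""Minimal sketch: per-group error rates for a biometric verification system.

Assumptions (not from the article): each verification trial has a similarity
score, a genuine/impostor label, and a self-reported demographic group.
All names and numbers below are hypothetical.
"""
from collections import defaultdict

THRESHOLD = 0.80  # hypothetical similarity threshold for declaring a "match"

# Hypothetical trials: (similarity_score, is_same_person, demographic_group)
trials = [
    (0.91, True,  "group_a"), (0.62, True,  "group_b"),
    (0.85, False, "group_b"), (0.40, False, "group_a"),
    (0.88, True,  "group_a"), (0.79, True,  "group_b"),
    (0.83, False, "group_b"), (0.30, False, "group_a"),
]

def per_group_error_rates(trials, threshold):
    """Return false non-match rate (FNMR) and false match rate (FMR)
    per demographic group at the given threshold."""
    counts = defaultdict(lambda: {"genuine": 0, "fnm": 0, "impostor": 0, "fm": 0})
    for score, same_person, group in trials:
        c = counts[group]
        if same_person:
            c["genuine"] += 1
            if score < threshold:      # genuine pair wrongly rejected
                c["fnm"] += 1
        else:
            c["impostor"] += 1
            if score >= threshold:     # impostor pair wrongly accepted
                c["fm"] += 1
    return {
        group: {
            "FNMR": c["fnm"] / c["genuine"] if c["genuine"] else None,
            "FMR": c["fm"] / c["impostor"] if c["impostor"] else None,
        }
        for group, c in counts.items()
    }

if __name__ == "__main__":
    for group, rates in per_group_error_rates(trials, THRESHOLD).items():
        print(group, rates)
```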
Second solution: Global AI guidelines
“Determining ‘what is right’ goes beyond creating accuracy benchmarks – we also need to create ethical guidelines,” Ritter explains. “Until there are ethical guidelines for the use of this technology, there is no way to understand what is ‘right.’
“AI ethical guidelines would serve to solidify the rights and freedoms of individuals using or subject to data-driven biometric technologies.
“Until we define what is and is not an ethical use of biometric technology, there is no metric or benchmark that will exist to gauge the quality of the technology.”