


Mitigating bias and equity in use of facial recognition technology by the U.S. Customs and Border Protection


By Nicol Turner Lee

Chairwoman Barragán, Ranking Member Higgins, and distinguished members of the House Subcommittee on Border Security, Facilitation, & Operations, thank you for the invitation to testify as part of today’s hearing on the use of facial recognition technology by the U.S. Customs and Border Protection (CBP). I intend to center my concerns on diversity, equity, and transparency in how this technology is applied in various contexts.

I am Dr. Nicol Turner Lee, Senior Fellow in Governance Studies and Director of the Center for Technology Innovation at the Brookings Institution. With a history of over 100 years, Brookings is committed to evidence-based, nonpartisan research in a range of focus areas.

My research encompasses data collection and analysis around the regulatory and legislative policies that govern telecommunications and high-tech industries, along with the impacts of broadband access, the digital divide, artificial intelligence, and machine-learning algorithms on vulnerable consumers. My forthcoming book, Digitally invisible: How the internet is creating the new underclass (Brookings, 2022), addresses these topics and more. The opinions I share with you today are my own.

CBP and emerging technological adoption and use

As an agency, CBP is primarily responsible for border management and control. Its responsibilities also include customs and immigration matters, as well as the required verification of the identities of travelers coming in and out of the United States. In 2013, CBP received funding to improve biometric identification and, with that, moved to adopt facial recognition technology (FRT) to streamline existing matching processes, with the aim of modernizing and increasing efficiency for travelers and the federal government “without sacrificing safety and security by reducing the reliance on manual identity verification processes.”1

Since its inception, CBP has been transparent in its adoption and use of facial recognition technologies as part of its national security efforts. Generally, the agency uses face detection and facial recognition technologies to confirm the identities of domestic and foreign travelers at Ports of Entry (POEs) for land, air, and sea borders. Over 187 million travelers have undergone such biometric screenings since the program began.2

For air POEs, usually airports, CBP uses two processes: Simplified Arrival, for travelers entering the U.S., and air exit, the program for travelers departing from the country.3

As of December 2019, CBP had spent $1.241 billion on the rollout of facial recognition technology, which is also referred to as “Biometric Facial Comparison Technology.”4 However, the widespread adoption and use of FRT by CBP has not come without challenges.

For my testimony, I focus on the intended and unintended consequences of FRT and its implications for the human rights and civil liberties that the agency should further consider as it expands these programs. In the spirit of common language before Congress and my fellow witnesses today, I define facial recognition technologies in accordance with the National Institute of Standards and Technology, whose focus is on the comparison of “an individual’s facial features to available images for verification or identification purposes.”5 I will offer three points in my statement regarding:

(1) the general efficacy and accuracy of facial recognition technologies among diverse populations;

(2) the sociological implications and trade-offs imposed on consumers when the technology is applied in commercial and public safety contexts; and

(3) recommendations on what Congress and other policymakers can do to make these systems more fair, equitable, and responsible in public safety and national security contexts.

Taken together, these aspects of my testimony can help facilitate improved dialogues on how to make FRT more diverse, equitable, and fair, especially for subjects who are already over-surveilled due to their racial and ethnic differences and other cultural stereotypes.
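To make the verification/identification distinction in that definition concrete, the sketch below is a minimal, purely illustrative example in Python: one-to-one verification compares a live capture against a single reference photo, while one-to-many identification searches a live capture against a gallery of enrolled images. The toy embeddings, gallery names, and 0.6 threshold are hypothetical assumptions for illustration only; they do not describe CBP's Traveler Verification Service or any vendor's actual pipeline.

```python
# Illustrative sketch only: 1:1 verification vs. 1:N identification using
# cosine similarity over face embeddings. The embeddings and the 0.6
# threshold are hypothetical; real systems use learned deep-network
# features and operationally tuned thresholds.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live_embedding, reference_embedding, threshold=0.6):
    """1:1 verification: does the live capture match the reference photo?"""
    return cosine_similarity(live_embedding, reference_embedding) >= threshold

def identify(live_embedding, gallery, threshold=0.6):
    """1:N identification: return the best gallery match above threshold, if any."""
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine_similarity(live_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical toy embeddings standing in for deep-network face features.
rng = np.random.default_rng(0)
doc_photo = rng.normal(size=128)
live_capture = doc_photo + rng.normal(scale=0.1, size=128)  # same traveler, new capture
gallery = {"traveler_a": doc_photo, "traveler_b": rng.normal(size=128)}

print(verify(live_capture, doc_photo))   # likely True: same underlying face vector
print(identify(live_capture, gallery))   # likely ("traveler_a", high score)
```

The distinction matters for policy because error rates, and the populations most exposed to them, differ between the two modes: a 1:1 check can only fail against one person's document, while a 1:N search can misidentify a traveler as someone else in the gallery.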



From: brookings
URL: https://www.brookings.edu/testimonies/mitigating-bias-and-equity-in-use-of-facial-recognition-technology-by-the-u-s-customs-and-border-protection/

