[House Hearing, 117 Congress]
[From the U.S. Government Publishing Office]


                     FACIAL RECOGNITION TECHNOLOGY:
                  EXAMINING ITS USE BY LAW ENFORCEMENT

=======================================================================

                                HEARING

                               BEFORE THE

                   SUBCOMMITTEE ON CRIME, TERRORISM,
                         AND HOMELAND SECURITY

                                 OF THE

                       COMMITTEE ON THE JUDICIARY

                     U.S. HOUSE OF REPRESENTATIVES

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                               __________

                         TUESDAY, JULY 13, 2021

                               __________

                           Serial No. 117-33

                               __________

         Printed for the use of the Committee on the Judiciary
         
[GRAPHIC NOT AVAILABLE IN TIFF FORMAT]         


               Available via: http://judiciary.house.gov
               
                                __________
 
                    U.S. GOVERNMENT PUBLISHING OFFICE                    
48-329                     WASHINGTON : 2022                     
          
-----------------------------------------------------------------------------------                  

                      COMMITTEE ON THE JUDICIARY

                    JERROLD NADLER, New York, Chair
                MADELEINE DEAN, Pennsylvania, Vice-Chair

ZOE LOFGREN, California              JIM JORDAN, Ohio, Ranking Member
SHEILA JACKSON LEE, Texas            STEVE CHABOT, Ohio
STEVE COHEN, Tennessee               LOUIE GOHMERT, Texas
HENRY C. ``HANK'' JOHNSON, Jr.,      DARRELL ISSA, California
    Georgia                          KEN BUCK, Colorado
THEODORE E. DEUTCH, Florida          MATT GAETZ, Florida
KAREN BASS, California               MIKE JOHNSON, Louisiana
HAKEEM S. JEFFRIES, New York         ANDY BIGGS, Arizona
DAVID N. CICILLINE, Rhode Island     TOM McCLINTOCK, California
ERIC SWALWELL, California            W. GREG STEUBE, Florida
TED LIEU, California                 TOM TIFFANY, Wisconsin
JAMIE RASKIN, Maryland               THOMAS MASSIE, Kentucky
PRAMILA JAYAPAL, Washington          CHIP ROY, Texas
VAL BUTLER DEMINGS, Florida          DAN BISHOP, North Carolina
J. LUIS CORREA, California           MICHELLE FISCHBACH, Minnesota
MARY GAY SCANLON, Pennsylvania       VICTORIA SPARTZ, Indiana
SYLVIA R. GARCIA, Texas              SCOTT FITZGERALD, Wisconsin
JOE NEGUSE, Colorado                 CLIFF BENTZ, Oregon
LUCY McBATH, Georgia                 BURGESS OWENS, Utah
GREG STANTON, Arizona
VERONICA ESCOBAR, Texas
MONDAIRE JONES, New York
DEBORAH ROSS, North Carolina
CORI BUSH, Missouri

        PERRY APELBAUM, Majority Staff Director & Chief Counsel
              CHRISTOPHER HIXON, Minority Staff Director 
                              
                              ------                                

        SUBCOMMITTEE ON CRIME, TERRORISM, AND HOMELAND SECURITY

                    SHEILA JACKSON LEE, Texas, Chair
                    CORI BUSH, Missouri, Vice-Chair

KAREN BASS, California               ANDY BIGGS, Arizona, Ranking 
VAL DEMINGS, Florida                     Member
LUCY McBATH, Georgia                 STEVE CHABOT, Ohio
MADELEINE DEAN, Pennsylvania         LOUIE GOHMERT, Texas
MARY GAY SCANLON, Pennsylvania       W. GREGORY STEUBE, Florida
DAVID CICILLINE, Rhode Island        TOM TIFFANY, Wisconsin
TED LIEU, California                 THOMAS MASSIE, Kentucky
LOU CORREA, California               VICTORIA SPARTZ, Indiana
VERONICA ESCOBAR, Texas              SCOTT FITZGERALD, Wisconsin
STEVE COHEN, Tennessee               BURGESS OWENS, Utah

                   JOE GRAUPENSPERGER, Chief Counsel
                    JASON CERVENAK, Minority Counsel
                         
                         C O N T E N T S

                              ----------                              

                         Tuesday, July 13, 2021

                                                                   Page

                           OPENING STATEMENTS

The Honorable Sheila Jackson Lee, Chair of the Subcommittee on 
  Crime, Terrorism, and Homeland Security from the State of Texas     1
The Honorable Andy Biggs, Ranking Member of the Subcommittee on 
  Crime, Terrorism, and Homeland Security from the State of 
  Arizona........................................................     4
The Honorable Jerrold Nadler, Chair of the Committee on the 
  Judiciary from the State of New York...........................    24
The Honorable Jim Jordan, Ranking Member of the Committee on the 
  Judiciary from the State of Ohio...............................    25

                               WITNESSES

Gretta L. Goodwin, Director, Homeland Security and Justice, U.S. 
  Government Accountability Office
  Oral Testimony.................................................    28
  Prepared Testimony.............................................    31
Barry Friedman, Jacob D. Fuchsberg Professor of Law and Faculty 
  Director, Policing Project, New York University School of Law
  Oral Testimony.................................................    49
  Prepared Testimony.............................................    51
Robert Williams, Farmington Hills, Michigan
  Oral Testimony.................................................    64
  Prepared Testimony.............................................    66
Bertram Lee Jr., Media and Tech Policy Counsel, The Leadership 
  Conference on Civil and Human Rights
  Oral Testimony.................................................    71
  Prepared Testimony.............................................    73
Kara Frederick, Research Fellow, Center for Technology Policy, 
  The Heritage Foundation
  Oral Testimony.................................................    80
  Prepared Testimony.............................................    82
Jennifer E. Laurin, The Wright C. Morrow Professor of Law, The 
  University of Texas at Austin School of Law
  Oral Testimony.................................................    89
  Prepared Testimony.............................................    91
Brett Tolman, Executive Director, Right on Crime
  Oral Testimony.................................................    94
  Prepared Testimony.............................................    96
Cedric L. Alexander, Former Public Safety Director, Dekalb 
  County, Georgia and Former Member of President Obama's Task 
  Force on 21st Century Policing
  Oral Testimony.................................................   100
  Prepared Testimony.............................................   102

          LETTERS, STATEMENTS, ETC. SUBMITTED FOR THE HEARING

Materials submitted by the Honorable Andy Biggs, Ranking Member 
  of the Subcommittee on Crime, Terrorism, and Homeland Security 
  from the State of Arizona, for the record
  A letter from Matthew Feeney, Director, Project on Emerging 
    Technologies, Cato Institute, July 12, 2021..................     8
  An article entitled, ``Capitol Police to use Army surveillance 
    gear to monitor Americans and `identify emerging threats,' '' 
    American Military News.......................................    12
  An article entitled, ``Did agents raid home of wrong woman over 
    Jan. 6 riot? Maybe,'' AP News................................    14
  An article entitled, ``Search warrant reveals why the FBI 
    raided an Alaska couple's home,'' Alaska's News Source.......    18
  An article entitled, ``Homer residents said home was searched 
    by FBI in connection to the Jan. 6 Capitol riots,'' Homer 
    News.........................................................    21
Materials submitted by the Honorable Jerrold Nadler, Chair of the 
  Committee on the Judiciary from the State of New York, for the 
  record
  A document entitled, ``Civil Rights Concerns Regarding Law 
    Enforcement Use of Face Recognition Technology,'' June 3, 
    2021.........................................................   114
  A letter from Don Erickson, CEO, Security Industry Association, 
    July 13, 2021................................................   123
  A statement from Tawana Petty, National Organizing Director, 
    Data for Black Lives.........................................   125
  A statement from the Electronic Frontier Foundation............   129
  A statement from the Project on Government Oversight...........   132
Materials submitted by the Honorable Sheila Jackson Lee, Chair of 
  the Subcommittee on Crime, Terrorism, and Homeland Security 
  from the State of Texas, for the record
  A report entitled, ``Facial Recognition Technology: Federal Law 
    Enforcement Agencies Should Have Better Awareness of Systems 
    Used By Employees,'' United States Government Accountability 
    Office.......................................................   168
  An article entitled, ``The Problem of Bias in Facial 
    Recognition,'' Center for Strategic and International Studies   194
  An article entitled, ``How America's surveillance networks 
    helped the FBI catch the Capitol mob,'' The Washington Post..   200
  An article entitled, ``Federal study confirms racial bias of 
    many facial-recognition systems, casts doubt on their 
    expanding use,'' The Washington Post.........................   208

                                APPENDIX

Matrials submitted by the Honorable Sheila Jackson Lee, Chair of 
  the Subcommittee on Crime, Terrorism, and Homeland Security 
  from the State of Texas, for the record
  A letter from Rashad Robinson, President, Color Of Change, July 
    16, 2021.....................................................   214
  A letter from the NAACP Legal Defense and Educational Fund, 
    Inc., July 20, 2021..........................................   221

                 QUESTIONS AND RESPONSES FOR THE RECORD

Questions for Gretta L. Goodwin, Brett Tolman, and Kara 
  Frederick, submitted by the Honorable Louie Gohmert, a Member 
  of the Subcommittee on Crime, Terrorism, and Homeland Security 
  from the State of Texas, for the record........................   236
Answers from Gretta L. Goodwin, submitted by the Honorable Louie 
  Gohmert, a Member of the Subcommittee on Crime, Terrorism, and 
  Homeland Security from the State of Texas, for the record......   239
Answers from Brett Tolman, submitted by the Honorable Louie 
  Gohmert, a Member of the Subcommittee on Crime, Terrorism, and 
  Homeland Security from the State of Texas, for the record......   242
Answers from Kara Frederick, submitted by the Honorable Louie 
  Gohmert, a Member of the Subcommittee on Crime, Terrorism, and 
  Homeland Security from the State of Texas, for the record......   244

 
  FACIAL RECOGNITION TECHNOLOGY: EXAMINING ITS USE BY LAW ENFORCEMENT

                              ----------                              


                         Tuesday, July 13, 2021

                        House of Representatives

        Subcommittee on Crime, Terrorism, and Homeland Security

                       Committee on the Judiciary

                             Washington, DC

    The Subcommittee met, pursuant to call, at 10:01 a.m., via 
Zoom, Hon. Sheila Jackson Lee [Chair of the Subcommittee] 
presiding.
    Present: Representatives Jackson Lee, Nadler, Bass, 
Demings, McBath, Dean, Scanlon, Cicilline, Lieu, Correa, Cohen, 
Biggs, Jordan, Chabot, Gohmert, Steube, Tiffany, Massie, 
Fitzgerald, and Owens.
    Staff Present: Arya Hariharan, Chief Oversight Counsel; 
John Doty, Senior Advisor; Moh Sharma, Director of Member 
Services & Outreach and Policy Advisor; Jacqui Kappler, 
Oversight Counsel; Priyanka Mara, Professional Staff Member and 
Legislative Aide; Cierra Fontenot, Chief Clerk; Atarah McCoy, 
Staff Assistant; Gabriel Barnett, Staff Assistant; Merrick 
Nelson, Digital Director; Kayla Hamedi, Deputy Communications 
Director; Ben Hernandez-Stern, Counsel, Subcommittee on Crime, 
Terrorism, and Homeland Security; Ken David, Minority Counsel; 
Jason Cervenak, Minority Chief Counsel for Crime; and Kiley 
Bidelman, Minority Clerk.
    Ms. Jackson Lee. The Subcommittee will come to order. 
Without objection, the Chair is authorized to declare recesses 
of the Subcommittee at any time.
    We welcome everyone to this morning's hearing on ``Facial 
Recognition Technology: Examining Its Use by Law Enforcement.''
    Before we begin, I would like to remind Members that we 
have established an email address and distribution list 
dedicated to circulating exhibits, motions, or other written 
materials that Members might want to offer as part of our 
hearing today. If you would like to submit materials, please 
send them to the email address that has been previously 
distributed to your offices and we will circulate the materials 
to Members and staff as quickly as we can.
    I would also ask all Members to mute your microphones when 
you are not speaking. This will help prevent feedback and other 
technical issues. You may unmute yourself any time you seek 
recognition.
    I will now recognize myself for an opening statement.
    As I said, good morning and welcome to a very important 
hearing, one that I hope will generate not only an 
understanding, but as I will ask a number of Witnesses, 
legislative fixes that may be relevant and important in this 
rise of technology matched against a document that has been a 
living document for more than a century, obviously, centuries, 
and that is the Constitution of the United States of America.
    Like many other leaps forward in technology, facial 
recognition technology offers our society both promise and 
peril. Proponents extol the potential benefits of modernized 
policing, greater certainty in law enforcement, safer borders, 
and ultimately a more efficient criminal justice system.
    At the same time, facial recognition systems have clear 
potential for misuse and have been found to be inaccurate at 
identifying certain groups of people.
    Moreover, the technology and how it has been used in 
prosecution raises, as I said earlier, constitutional concerns 
that call into question the basic tenets of fairness and due 
process underpinning all criminal prosecution.
    The criminal consequences implicated by these darker 
elements of facial recognition technology warrant closer 
examination by this body. Congress has found itself at a 
disadvantage in understanding this powerful tool because the 
information on how law enforcement agencies have adopted facial 
recognition technology remains underreported or nonexistent.
    Today, we are bringing facial recognition technology out of 
the shadows and into the light. This dialogue must start with 
the simple question of whether this technology is sufficiently 
accurate to justify its use by the police and if the risks of 
depriving someone of their liberty unjustly are too great.
    To add untested and unvetted facial recognition technology 
to our policing would only serve to exacerbate the systemic 
issues still plaguing our criminal justice system.
    Worse, large-scale adoption of this technology would inject 
further inequity into a system at a time when we should be 
moving to make the criminal justice system more equitable for 
Americans, as we're working very hard to pass the George Floyd 
Justice in Policing Act that was passed out of this particular 
Committee some many, many months ago.
    There are many unknowns, but we can be certain of one 
thing: Most, if not all, facial recognition systems are less 
accurate for people of color and women. For the most part, we 
can be confident that the darker your skin tone, the higher the 
error rate.
    Studies have found error rates in facial recognition 
software to be up to 34 percent higher for darker-skinned women 
than lighter-skinned men. It is not just sometimes wrong, it 
can be wrong up to a third of the time.
    Additionally, a 2019 study by the National Institute of 
Standards and Technology found empirical evidence that most 
facial recognition algorithms exhibit demographic differentials 
that can worsen their accuracy based on a person's age, gender, 
or race. According to the study, systems varied widely in their 
accuracy and depending on the algorithm and type of search.
    America is a multicultural, multi-ethnic Nation. This is 
particularly important as relates to these findings.
    Asian and African-American people were up to 100 times more 
likely to be misidentified than White men. Native Americans had 
the highest false positive rate of all ethnicities.
    Women were more likely to be falsely identified than men. 
The elderly and children were more likely to be misidentified 
than those in other age groups. It could play a really strong 
role in trying to find missing persons, for example, if it was 
to be used in that.
    Middle-aged White men generally benefited from the highest 
accuracy rate.
    Yet, in spite of these flaws, facial recognition technology 
is being rolled out across the country and has in many 
instances proven valuable to law enforcement. From the largest 
Federal law enforcement agencies to small local police 
departments, facial recognition systems are quietly be 
incorporated into American policing.
    There lies the complexity, because you have small 
departments, 18,000 in the United States, that may not have the 
training and are using these technologies.
    We have heard the success stories, such as facial 
recognition technology being used to identify the shooter in 
the tragic Capital Gazette massacre and to help identify 
victims of sex trafficking by matching missing persons photos 
to online ads.
    Most recently we have learned that facial recognition 
technology has been used to identify several of the 
insurrectionists who stormed the citadel of American democracy 
on January 6.
    However, while we must applaud the utilization of new 
technology for the apprehension of domestic terrorists, this 
must be weighed against the fact that many of the systems, both 
privately owned or owned by government entities, operate with 
minimal transparency, limited oversight, and questionable 
informational security.
    Let me remind you again, 18,000 different police 
departments across America of varying sizes, varying training, 
and varying technology capability. Again, it is noted that they 
are using technology, again, with minimal transparency, limited 
oversight, and questionable informational security.
    In some cases, individual employees are testing this 
private software without the knowledge or approval of their 
superiors or supervising agencies.
    Without notice, many communities find facial recognition 
systems deployed in their neighborhoods. The adoption and use 
of facial recognition systems must involve community 
consideration, technological evaluation, and sober legal 
analysis.
    As these trends have developed, the Federal Government has 
been largely absent. What we do not know towers over what we 
do, and that needs to change. Seemingly little thought has been 
put into the widespread adoption of a technology that could 
materially alter community interactions with law enforcement.
    This is the issue this Committee hopes to correct today and 
going forward. This hearing is not meant to prejudge facial 
recognition technology or its value to law enforcement, but 
rather it is an important first step to proactively engage with 
this technology and adapt our laws accordingly.
    As I indicated, it is important that out of this Committee 
hearing we also work on productive and constructive legislation 
as needed.
    I am looking forward to engaging with the Witnesses about 
the benefits and pitfalls inherent in facial recognition 
technology and to forging a way forward towards transparency, 
equity, and accountability in its use.
    It is now my pleasure to recognize the Ranking Member of 
the Subcommittee, the gentleman from Arizona, Mr. Biggs, for 
his opening statement.
    Mr. Biggs. Madam Chair, thank you for yielding time to me 
and I thank you for holding this very important hearing on 
facial recognition technology and its use by law enforcement.
    I think there is so much of what you just said that I can 
associate myself with. There are some things that we can work 
together on a bipartisan basis, not just you and I, but 
hopefully all the Members of the United States Congress.
    Facial recognition technology is a powerful tool when used 
properly. When used improperly or relied upon too heavily, it 
raises some very serious concerns and has some very real world 
negative consequences.
    Today, we'll hear from a Witness who was detained after the 
Detroit Police Department relied on facial recognition 
technology that pointed a finger wrongly at this man.
    Facial recognition technology is not perfect--and actually, 
quite honestly, it's far from it. To be reliable, images must 
be captured in the most favorable conditions with lighting, 
angle, aging, and other factors ultimately reducing 
reliability.
    Worse yet, facial recognition algorithms perform poorly 
when tasked with analyzing the faces of women, people of color, 
elderly, and children.
    I am also concerned about the potential First and Fourth 
Amendment erosions that facial recognition technology can 
cause. Law enforcement agencies could potentially use the 
systems for the surveillance of individuals not involved in any 
suspicious activity whatsoever.
    Moreover, Federal law enforcement agencies with assistance 
from their State and local law enforcement partners have access 
to images of an overwhelming number of otherwise law-abiding 
citizens. State and motor vehicle agencies possess high-quality 
photographs of most citizens that are a natural source for 
facial recognition programs and could easily be combined with 
public surveillance or other cameras in the construction of a 
comprehensive system of identification and tracking.
    Federal law enforcement agencies are also not limited to 
facial recognition technology systems they control or those 
controlled by other law enforcement partners at the State and 
local level. Federal law enforcement can use nongovernment 
facial recognition service providers, such as Vigilant 
Solutions or Clearview AI.
    For example, law enforcement officers with a Clearview AI 
account can use a computer or smartphone to upload a photo of 
an unknown individual with Clearview AI's facial recognition 
system. The system then returns search results that show 
potential photos of the unknown individual as well as links to 
sites where the photos were obtained, such as Facebook or 
Instagram.
    The use of facial recognition technology is largely 
unregulated and raises numerous concerns. The government could 
use facial recognition technology to monitor or target people 
exercising their First Amendment rights, including freedom of 
association.
    Facial recognition technology also poses significant 
privacy concerns, and improper use could violate the Fourth and 
First Amendment rights protecting people from unreasonable 
searches and seizures and guaranteeing freedom of association.
    With these constitutional reliability concerns with this 
technology, the GAO just issued a report indicating that not 
only had the Federal law enforcement agencies not assessed 
privacy and other risks associated with the use of this 
technology, but some of them don't also even know which systems 
their agents are using.
    So, in sum, I think this is a real opportunity for us to 
gain some important information and knowledge today. I have 
enormous concerns that I will summarize here.
    The technology here is absolutely problematic and 
inconsistent. It's not been perfected. Images of law-abiding 
people have been acquired. The security of the databases is an 
issue, as we have seen with recent breaches of databases. 
Private companies acquiring and scraping information and data 
and selling it to the feds and other law enforcement. 
Manipulation and enhancements of photos can take place. It 
expands the surveillance State that we have been railing 
against. The GAO report gave some important concerns.
    Those are just some of my--that's a brief summary of some 
of these things that I am concerned with. I do think it's a 
place where we can work, Madam Chair, to find common ground.
    If we're talking about nationalizing police forces, I don't 
think we'll get there. If we're talking about finding some kind 
of meaningful regulation and oversight of the implementation of 
facial recognition technology, which I think is what the Chair 
is alluding to, then I think we can find a lot of common ground 
here.
    With that, Madam Chair, I would like to submit for the 
record a letter to the Committee dated July 12, 2021, from 
Matthew Feeney of the Cato Institute.
    Ms. Jackson Lee. You want to finish the letter explanation?
    Mr. Biggs. Yes. Yes, if I can, please.
    Also, I wish to submit an article from American Military 
News from July of 2021 regarding the Capitol Police adoption of 
Army surveillance gear to monitor Americans and identify 
threats.
    I also wish to include several articles about the Alaskan 
couple that through facial recognition technology that was 
erroneously used. One would be an AP news article on riots, 
technology, capitol siege, and government and politics, another 
one from Alaska's News Source, and another one from 
Homenews.com, if it's possible, Madam Chair.
    Ms. Jackson Lee. Without objection, so ordered.
    [The information follows:]

                        MR. BIGGS FOR THE RECORD

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

    Ms. Jackson Lee. I thank the Ranking Member of the 
Subcommittee for his presentation and his opening statement.
    I do believe that, as I indicated, we should be engaged in 
both oversight and legislative response, and I would like to 
find common ground with our colleagues. Because I think this is 
very, very important both in terms of law enforcement and the 
utilization thereof, but also for the constitutional principles 
which we all believe in.
    It's now my pleasure to recognize the distinguished Chair 
of the Full Committee, the gentleman from New York, Chair 
Nadler, for his opening statement.
    Chair Nadler. Well, thank you very much. I thank Chair 
Jackson Lee for convening this hearing on the use of facial 
recognition by law enforcement.
    Facial recognition technology is now able to capture images 
and analyze them in ways that would have been completely 
impossible only a decade ago. In some cases, the technology 
enables the real-time surveillance and matching of individuals 
in a crowd.
    As facial recognition technology has proliferated in a 
manner largely unchecked by Congress, we now live in a world 
where biometric data is collected by private businesses, 
schools, State and local agencies, and as is the focus of this 
hearing, by law enforcement agencies at all levels of 
government.
    There is a tension that we must address. On the one hand, 
this technology is now a commonplace fixture in our lives. On 
the other hand, most Americans have little understanding of how 
the police use facial recognition technology to conduct 
surveillance of communities across the country--for better for 
or worse.
    Thanks to the work of the Government Accountability Office, 
we have our first glimpse into Federal law enforcement's use of 
facial recognition technology. In its recent study, GAO found 
that 20 of the 42 Federal law enforcement agencies surveyed 
used some form of facial recognition technology.
    This report, although an important first step, leaves many 
unanswered questions about how the technology is used, when it 
is used, and what type of mechanisms are in place to guard 
against abuse or to protect sensitive data.
    For example, of the 14 Federal agencies that said they used 
non-Federal systems, only one agency could provide the GAO with 
complete information about what systems their employees were 
actually using in the field.
    The picture at the State and local level is even more 
opaque, but we have received several disturbing reports of the 
misapplication of facial recognition technology by local police 
departments. For example, this technology has proven 
particularly unreliable when attempting to match women and 
people of color.
    In some cases, facial recognition is deployed without 
proper consideration of due process rights. One Witness here 
today, Mr. Williams, will testify to the profound impact that 
misidentification under these circumstances can have on an 
individual's life.
    Simply put, Mr. Williams deserves better from the law 
enforcement agencies entrusted to use a technology that we all 
know is less accurate when applied to citizens who look like 
him.
    Facial recognition technology can be an important crime-
fighting tool. Like all such tools, we need to ensure that the 
adoption of this new technology does not further erode trust 
between law enforcement and the communities they serve and does 
not compound the unfairness of a criminal justice system that 
too often disproportionately impacts people of color.
    Today, this Subcommittee begins to ask the critical 
questions surrounding police use of facial recognition 
technology. The hearing will not answer every question about 
the adoption of facial recognition technology by law 
enforcement, but my hope is that we will take another important 
step toward balancing the needs of law enforcement agencies, 
the priority of keeping Americans safe from crime, and the 
valid concerns about this technology expressed by communities 
across this country.
    Our task is not an easy one. We want an America where every 
individual knows that they and their families are safe from 
harm, while our fundamental rights are protected. We want an 
America where law enforcement has the tools it needs to protect 
and serve, but with clear guidelines and rules that prevent 
those tools from being abused.
    Understanding how to make that vision real at a moment when 
technology keeps changing and evolving will take a tremendous 
amount of work, but this is work worth doing.
    Again, I thank the Chair for calling this hearing. I look 
forward to hearing from our Witnesses, especially Bertram Lee, 
who once worked for my office way back when.
    It's good to see you, and I look forward to your testimony.
    I thank all the Witnesses for taking their time to engage 
with us on this important topic, and I look forward to their 
testimony.
    I yield back the balance of my time.
    Ms. Jackson Lee. I thank the Chair very much for his 
important presentation.
    Now, it's my pleasure to recognize the distinguished 
Ranking Member of the Full Committee, the gentleman from Ohio, 
Mr. Jordan, for his opening statement.
    Mr. Jordan. Thank you, Madam Chair.
    I look forward to today's hearing on facial recognition 
technology. I am hopeful we can approach this issue in a truly 
bipartisan manner, as has been mentioned by some previous 
statements.
    Last Congress, we worked closely with our colleague, the 
late Chair Elijah Cummings, on this very issue. While we had 
many vigorous debates on a variety of issues, facial 
recognition technology was one issue where we shared common 
ground. So much so, that we held three bipartisan hearings in 
the Oversight Committee on this technology.
    In those hearings, we learned many important facts about 
facial recognition technology. There are serious First and 
Fourth Amendment concerns about the use of FRT by law 
enforcement.
    There are also due process concerns about how law 
enforcement uses this technology. We learned, for example, that 
over 20 States' departments of motor vehicles have handed over 
their driver's license data to the FBI.
    No American citizen gave their consent to have their data 
and images turned over to the FBI. No elected official ever 
voted to allow this to happen. Every American should be 
troubled by this. I know I am, and I hope my colleagues on this 
Committee are as well.
    Almost two weeks ago, the GAO issued a new report about how 
Federal law enforcement uses FRT. This report makes clear that 
the Federal law enforcement agencies using facial recognition 
technology haven't even assessed the risk when using this 
technology. If they haven't even assessed the risks, they 
certainly aren't taking them into consideration when they're 
using the technology.
    GAO has surveyed Federal law enforcement agencies on their 
use of facial recognition technology; 14 agencies that reported 
using the technology to support criminal investigations also 
reported using systems owned by non-Federal entities. Only one 
agency had awareness of what non-Federal systems are used by 
its employees.
    In other words, not only are Federal law enforcement 
agencies not assessing the risk of using facial recognition 
technology, most don't even know which systems their employees 
are using.
    I imagine it's difficult for a Federal agency to assess the 
risk of using a system if it doesn't even know which system it 
is using.
    I look forward to learning more from GAO on this recent 
report and our other Witnesses on the path forward.
    With that, Madam Chair, I yield back.
    Ms. Jackson Lee. I thank the gentleman for his opening 
statement.
    Without objection, all other opening statements of our 
Members will be included in the record.
    I will now introduce today's Witnesses, and let me again 
thank them for their participation. This is a virtual hearing 
that is no less crucial and important, and we thank you for 
participating in that manner of technology.
    The first Witness is Director Gretta Goodwin. Gretta L. 
Goodwin is a Director in the Homeland Security and Justice team 
at the U.S. Government Accountability Office where she leads 
GAO's work on justice and law enforcement issues. Her portfolio 
includes Federal law enforcement oversight and training, civil 
liberties, civil rights, vulnerable populations, the Federal 
prison system, and the Federal judiciary.
    Director Goodwin has a Ph.D. and a master's degree in 
economics from the University of Nebraska, Lincoln, and a 
bachelor's degree in economics from the University of Houston.
    I am glad to see there's a Houston connection and the 
university that I happen to represent.
    Welcome.
    Professor Barry Friedman serves as a Faculty Director of 
the Policing Project at New York University School of Law where 
he is the Jacob D. Fuchsberg Professor of Law and affiliated 
professor of politics.
    Professor Friedman has taught, litigated, and written about 
constitutional law for the Federal courts, policing, and 
criminal procedure for over 30 years, and has produced numerous 
books and articles, appearing in various scholarly journals and 
popular media outlets.
    Professor Friedman graduated with honors from the 
University of Chicago and received his law degree magna cum 
laude from Georgetown University Law Center.
    Welcome.
    Robert Williams is a 43-year-old man who lives in 
Farmington Hills, Michigan, with his wife and two young 
daughters. He works as a logistics planner in the automotive 
industry. Mr. Williams was wrongfully arrested by the Detroit 
Police Department in 2020 when facial recognition technology 
misidentified him.
    Mr. Williams, we apologize to you and very grateful for the 
testimony that you will be presenting here today.
    Welcome.
    Bertram Lee, Jr., is a counsel for media and technology at 
the Leadership Conference on Civil and Human Rights where he 
works to advance the interests of marginalized communities in 
technology and media policy. His portfolio includes broadband 
access, media diversity, facial recognition, law enforcement 
surveillance technology, section 230 of the Communications 
Decency Act, and algorithmic bias in artificial intelligence.
    Previously, Mr. Lee worked as Policy Counsel at Public 
Knowledge and at the U.S. House Committee on Education and 
Labor under Chair Robert C. Scott.
    He received his J.D. from Howard University School of Law 
and his B.A. from Haverford College.
    Welcome.
    Kara Frederick is a Research Fellow at the Center for 
Technology Policy at the Heritage Foundation. Her research 
focuses on Big Tech and emerging technology policy.
    Among her prior roles, Ms. Frederick was a fellow at the 
Center for New American Security, served as a Senior 
Intelligence Analyst for a U.S. Naval Special Warfare Command, 
and helped create and lead Facebook's Global Security 
Counterterrorism Analysis Program.
    She received her M.A. in war studies from King's College 
London and her B.A. in foreign affairs and history from the 
University of Virginia.
    Welcome.
    Professor Jennifer Laurin serves as the Wright C. Morrow 
Professor at the University of Texas School of Law where she 
studies and writes about how law and institutional design shape 
the functioning of criminal justice institutions.
    She also currently serves as reporter to the American Bar 
Association's Criminal Justice Standards Task Force and is a 
former Chair of the Texas Capital Punishment Assessment team 
organized under the auspices of the American Bar Association.
    Professor Laurin received her undergraduate degree in 
politics from Earlham College and her J.D. from Columbia Law 
School.
    Welcome to you from Texas.
    Brett Tolman is the Executive Director for Right on Crime. 
He is the founder of the Tolman Group and previously served as 
a shareholder at the law firm of Ray Quinney & Nebeker PC.
    Previously, Mr. Tolman also served as the U.S. attorney for 
the District of Utah and was selected by Attorney General 
Michael Mukasey to serve as a special adviser to the Attorney 
General on national and international policy issues affecting 
United States attorneys in the Department of Justice.
    Prior to becoming U.S. attorney, Tolman served as legal 
counsel to the United States Senate Judiciary Committee.
    He received his B.A. and his J.D. from Brigham Young 
University.
    Welcome.
    Dr. Cedric Alexander is a law enforcement expert with over 
40 years of experience in public safety. He has appeared on 
national media networks to provide comment on police-community 
relations and has written numerous editorials and books on the 
subject.
    Previously, Dr. Alexander has served as Deputy Commissioner 
of the New York State Division of Criminal Justice Services, as 
Chief of Police of DeKalb County, Georgia, and as national 
President of the National Organization of Black Law Enforcement 
Executives, a very powerful law enforcement organization. He 
was also appointed to serve on President Obama's Task Force on 
21st Century Policing.
    He received a B.A. and a master's degree from St. Thomas 
University in Miami, Florida, and a doctorate from Wright State 
University.
    Welcome.
    Usually, I stand when I provide the oath, but we welcome 
our distinguished Witnesses and we thank them for their 
participation today. So, I will begin by swearing in our 
Witnesses.
    If you would please turn on your audio. Make sure I can see 
your face and your raised right hand while I administer the 
oath. Please unmute and let me see your raised right hand.
    Do you swear or affirm, under penalty of perjury, that your 
testimony you are about to give today to be true and correct to 
the best of your knowledge, information, and belief, so help 
you God?
    I need to hear an audible ``I do.''
    Thank you so very much.
    Let the record show the Witnesses have answered in the 
affirmative.
    Thank you so very much.
    Please note that each of your written statements will be 
entered into the record in its entirety. Accordingly, I ask 
that you summarize your testimony in five minutes. There is a 
timer in the Zoom view that should be visible on your screen.
    Director Goodwin, you may begin. Welcome. You are 
recognized for five minutes.

                 STATEMENT OF GRETTA L. GOODWIN

    Dr. Goodwin. Thank you Chair Nadler, Chair Jackson Lee, 
Ranking Member Jordan, Ranking Member Biggs, and Members of 
this Subcommittee, I am pleased to be here today to discuss 
Federal law enforcement's use of facial recognition technology.
    Use of this technology has expanded in recent years, and 
questions exist regarding the accuracy of the technology, the 
transparency in its usage, and the protection of privacy and 
civil liberties when the technology is used.
    Today, I will discuss:

        (1)  The ownership and use of facial recognition technology by 
        Federal agencies that employ law enforcement officers;
        (2)  the types of activities these agencies use the technology 
        to support; and
        (3)  the extent that these agencies track employee use of the 
        technology, especially those owned by non-Federal entities, as 
        well as the potential privacy and accuracy implications.

    We surveyed 42 Federal agencies that employ law enforcement 
officers. Twenty of them report that they owned a system with 
facial recognition technology or used another entity's system.
    Of these 20 agencies, three own their own system, 12 only 
used another entity's system, and five agencies both owned a 
system and used another entity's. These agencies noted that 
some systems can include hundreds of millions or billions of 
photos.
    Federal agencies reported using facial recognition 
technology to support various activities, such as criminal 
investigations, surveillance, and managing business operations 
during the COVID-19 pandemic.
    Of the 20 agencies that owned or used the technology, 14 
reported using it to support criminal investigations, including 
investigations of violent crimes, credit card and identity 
fraud, and missing persons.
    Looking more closely at these 14 agencies, some of them 
told us that they used the technology last summer to support 
criminal investigations related to civil unrest, riots, or 
protests following the killing of Mr. George Floyd in May of 
2020.
    In addition, a few of these agencies told us they used the 
technology on images from the U.S. Capitol attack on January 6 
of this year to generate leads for criminal investigations.
    Of these 14 agencies, we found that 13 did not have 
complete, up-to-date information on what non-Federal systems 
are being used by their employees. So, agencies have not fully 
assessed the potential risk related to privacy and accuracy of 
using these systems.
    To be clear, some agencies knew that employees were using 
certain systems. For example, agencies may have had formal 
agreements with States or contracts with private companies.
    However, many agencies had to poll their employees to 
answer our questions. Multiple agencies initially told us they 
didn't use the technology and later changed their answers after 
talking with employees.
    What these 13 agencies all have in common is that they 
don't have a process to track what Federal systems employees 
are using.
    Although the accuracy of facial recognition technology has 
increased dramatically in recent years, risk still exists that 
searches will produce inaccurate results. If a system is not 
sufficiently accurate, it could unnecessarily identify innocent 
people as investigative leads.
    The system could also miss investigative leads that would 
otherwise have been revealed.
    Moreover, the technology's high error rates for certain 
demographic groups could result in adverse consequences for 
individuals.
    Accordingly, we recommended that agencies implement a 
mechanism to track what non-Federal systems with facial 
recognition technology are used by employees to support 
investigative activities and assess the risk of using such 
systems, including privacy- and accuracy-related risks.
    Facial recognition technology's capabilities are only going 
to grow stronger. If law enforcement agencies do not know if or 
how their employees are using the technology, then they cannot 
ensure that the appropriate protections are in place.
    Chair Nadler, Chair Jackson Lee, Ranking Member Jordan, 
Ranking Member Biggs, and Members of this Subcommittee, this 
concludes my remarks. I am happy to answer any questions you 
have.
    [The statement of Dr. Goodwin follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Thank you for your testimony.
    I am now pleased to recognize Professor Barry Friedman for 
five minutes.

                  STATEMENT OF BARRY FRIEDMAN

    Mr. Friedman. Chair Jackson Lee, Chair Nadler, Ranking 
Members Jordan and Biggs, Members of the Subcommittee, I want 
to thank you for giving me the opportunity to testify today.
    As my introduction indicated, I am the founding director of 
the Policing Project at New York University School of Law. I 
want to tell you our mission because it relates directly to 
what you're doing today.
    We are an organization that partners with all stakeholders. 
We work in impacted communities, communities impacted by law 
enforcement, we also work closely with law enforcement 
agencies, with the goal of bringing to policing democratic 
accountability, transparency, and equity, particularly around 
issues of race.
    I want to stress two parts of that because I think they 
relate directly to this hearing.
    First, the issue of democratic accountability. It's our 
position--this is one my key points that I'd like to make 
today--that these technologies should not be used without 
proper democratic accountability. Accordingly, I commend this 
Subcommittee for having this hearing to pursue what I think is 
essential legislation in this area.
    Second, I want to stress that we do work with all 
stakeholders across the ideological spectrum. I was incredibly 
gratified to listen to all the Members who spoke to the need 
and the ability to work in a bipartisan way on this issue.
    I believe that there are strong, passionate feelings about 
this technology. It's very powerful technology. There is a way 
forward on the technology, and the question is, how do we get 
there? So, I want to turn to that.
    The approach that we use at the Policing Project, 
particularly for emerging technologies around policing, is one 
of cost-benefit analysis.
    We look at the benefits of the technology. As many of you 
said, there are definitely benefits that can be obtained from 
this technology.
    Then we look at the costs. As many of you have indicated, 
there are very, very serious costs, very, very serious 
potential harms. There are racial harms from the disparities. 
There are privacy harms. There are harms of giving too much 
power to governments, as we can all see by use of this 
technology by totalitarian governments. There are concerns 
about the First Amendment.
    Now, some people look at cost-benefit analysis and they 
think, okay, well, we weigh the costs, we weigh the benefits, 
we decide whether to do something or not.
    That's exactly the wrong approach. The value of cost-
benefit analysis or benefit-cost analysis is that we can find a 
nuanced way to regulate that can maximize the benefits and 
minimize or mitigate entirely the harms. That's what I want to 
talk about.
    I have spoken to many, many people over the last months 
and, in fact, years about this issue. There are strong views on 
both sides, including some people who after much reason think 
that the technology should be banned.
    The fact of the matter is that when I am talking to law 
enforcement or when I am talking to civil libertarians or 
racial justice advocates, whether I am talking about to people 
in the technology companies themselves, even though they 
disagree about the use of the technology and about many things, 
there's a wide swath of agreement about what isn't okay in the 
use of the technology, about what are shoddy and dangerous 
practices that should be regulated.
    I want to just dive very quickly into some of the nuance of 
those practices, and you have a longer enumeration in my 
testimony, and I am happy to answer any questions.
    So, first, issues of accuracy and bias. We've heard a lot 
about that. I think it's time, and I hope that we will all stop 
saying things like the technology is 87, 93, or 95 percent 
accurate. That doesn't mean anything. We should be talking 
about false negatives and false positives. That's what matters.
    The truth of the matter is we don't know anything about the 
accuracy of this technology when law enforcement uses it. That 
is because law enforcement uses the technology in a way 
different than it has been tested by NIST. Law enforcement uses 
different probe images than NIST has tested. Law enforcement 
uses much larger databases than NIST has tested. As you think 
about it, the error rates just increase as the databases get 
larger. We don't even know if NIST has tested the actual 
algorithms that law enforcement is using.
    Now, a lot of people say, ``Okay. Not to worry. There's a 
human in the loop.'' Now, I would appeal to your common sense. 
We don't have enough science, but we have some. Humans in the 
loop can be a good thing or a bad thing.
    Humans in the loop would be a good thing if an independent 
individual who had no idea what the algorithm had suggested 
identified the same person.
    Humans in the loop are problematic if all they're doing is 
confirming what the algorithm told them.
    So, again, we need testing and methods of ensuring that if 
there's a human in the loop, that human is actually working in 
a way to make sure that we eliminate concerns about accuracy 
and demographic--particularly racial--bias.
    Finally, I want to urge you to think very seriously about 
regulating the vendors themselves. That's the touch point. 
Those are the folks that know how the technology works and what 
can be done to make it work. They could work on what are 
acceptable probe photos, what are the right settings to make 
sure we get accurate results.
    I have a long list of things the vendors could regulate, 
but I am out of time. So, I am just going to thank you for 
listening to my testimony, and I am happy to answer questions.
    [The statement of Mr. Friedman follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Thank you very much, Professor Friedman.
    It's now my pleasure to recognize Mr. Robert Williams.
    Welcome, Mr. Williams. Again, I indicated my apologies. We 
are the presiders over law enforcement in this Nation, and, 
certainly, we welcome your very important testimony. Thank you.
    You are recognized for five minutes.

                  STATEMENT OF ROBERT WILLIAMS

    Mr. Williams. Thank you. Okay. Chair Nadler, Chair Jackson 
Lee, Ranking Member Jordan, Ranking Member Biggs, and Members 
of the Subcommittee, thank you for the opportunity to testify 
today.
    Last--well, not last, but January 2020, I got a call from 
my wife, and she told me that some detectives called her to 
tell me to turn myself in. She said they'd be calling me 
shortly because they didn't have my phone number.
    They called me. I answered and said--I asked them what it 
was about. He told me he couldn't tell me. All we know is that 
I need to turn myself in.
    I assumed it was a prank call, and I hung up and called my 
wife and told my wife to disregard it, who was irate and 
thinking that maybe I was trying to hide something from her.
    So, I told her to call our local police department. She 
did. They said they had no warrants or anything for my arrest 
and I should disregard the call.
    So, I proceeded to drive home. When I got home, there was a 
Detroit police car parked on my street. When I pulled in the 
driveway, he pulled in my driveway and blocked me in as if I 
was going to make a run for it.
    He pulled up. He said, ``Are you Robert Williams?'' I said, 
``Yes, I am.'' He said, ``You're under arrest.'' I said, 
``Whoa, you can't just arrest me for that. What am I under 
arrest for?'' He said, ``Don't worry about it,'' and proceeded 
to put the handcuffs on me.
    I told my wife and my kids who were coming out of the house 
at the time, ``Don't worry about it, they are making a mistake. 
I'll be right back.''
    Unfortunately, I wasn't right back. I ended up spending 30 
hours in jail--well, at the detention center.
    While I was in the car, I asked them again, ``Why am I 
being arrested?'' They said, ``We can't tell you that.'' They 
showed me the warrant that said ``felony larceny'' on it. I'm 
like, ``I didn't steal anything.'' I'm like, ``How do we get 
the warrant?'' He said that a detective will be in to talk to 
me.
    I was in the detention center in an overcrowded cell. The 
garbage was overflowing. There was a lot of less desirables in 
there talking about things that they had done and things that 
they had got away with. I am just like, ``Why am I here?'' 
Nobody ever came to tell me.
    So, the next day, when I went to court, I pleaded not 
guilty. They took a recess, and then some detectives came and 
asked me some questions. When they came to ask questions, they 
had me sign a paper saying that I will waive my right to 
counsel so they could show me what was on the papers. I signed.
    He showed me a picture. He said, ``When was the last time 
you was at the Shinola store?'' I said, ``A couple of years 
ago.'' He said, ``So, that's not you?'' I said, ``No, that's 
not me.''
    He turns over another piece of paper. He said, ``So, I 
guess that's not you either?'' I held that piece paper up to my 
face and said, ``I hope you don't think all Black people look 
alike.''
    He turned over another paper and said, ``So, I guess the 
computer got it wrong.'' I am like, ``I guess so. Am I free to 
go?'' He was like, ``I'm sorry for this mishandling, but we're 
not the detectives on your case so we can't let you go,'' so 
they sent me back to my cell.
    I got out. I received a bond, and I got out. My wife 
contacted the ACLU, and we started getting the information 
about the case, and we realized that it was a wrong facial 
recognition technology used.
    I just have been fighting for it ever since. I just don't 
think it's right that my picture was used in some type of 
lineup, and I have never been in trouble.
    With that being said, I would just like to say thank you 
for the opportunity to share my testimony. I hope that Congress 
does something about this law. I am happy to take any more 
questions that you might have for me.
    [The statement of Mr. Williams follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Powerful statement, Mr. Williams. You are 
tracking what I began with as I opened this hearing, that not 
only should we engage in oversight, but I do think we should 
engage in the right kind of fix. I am very apologetic to you 
and your family for having to go through this.
    Our next Witness is Mr. Bertram Lee, Jr. You are recognized 
for five minutes.

                 STATEMENT OF BERTRAM LEE, JR.

    Mr. Lee. Chair Jackson Lee, Chair Nadler, Ranking Member 
Jordan, Ranking Member Biggs, and Members of the Subcommittee, 
thank you for the opportunity to testify today.
    Thank you, Chair Jackson Lee and Chair Nadler, for calling 
this hearing and shining a light on the serious and undeniable 
threat that facial recognition technology poses, especially to 
Black and Brown people and other marginalized communities.
    We are at a turning point. We must reject policies and 
practices rooted in discrimination towards marginalized 
communities and move toward a new paradigm for public safety 
that respects the humanity, dignity, and human rights of all 
people.
    The Leadership Conference has spoken out against law 
enforcement use of facial recognition since 2016, how we're 
adding the inherent bias of these tools and their disparate 
impact on marginalized communities that are already 
overpoliced.
    Last month, The Leadership Conference, along with Upturn 
and New America's Open Technology Institute, released a 
statement signed by 40 advocacy organizations that outlined six 
of our most pressing civil rights concerns about facial 
recognition technology, including that its use exacerbates the 
harms of policing in communities that are already targeted by 
police; threatens individual and community privacy by allowing 
invasive and persistent tracking and targeting; chills First 
Amendment-protected activities; violates due process rights and 
otherwise infringes upon procedural justice; often relies on 
face prints that have been obtained without consent.
    Lastly, in addition to racial bias with respect to how law 
enforcement uses facial recognition, the technology itself 
poses a disproportionate risk of misidentification for Black, 
Asian, and Indigenous people.
    The possibility for misidentification and misclassification 
poses a tremendous threat to the health, safety, and well-being 
of communities. The reason for these problems and inherent 
biases varies. In some cases, the cause of the bias lives 
within the database that an image is searched against. In 
others, it is due to the historical bias built into the 
algorithm itself.
    The end result, however, is the same: Black people are 
disproportionately represented in law enforcement facial 
recognition databases, and inequitable error rates further 
entrench racial disparities in the criminal legal system.
    For example, in a comparison of match rates by country of 
origin, photos of people from East African countries had false 
rates a hundred times higher than the baseline rate. 
Researchers also found that some facial analysis algorithms 
misclassified Black women nearly 35 percent of the time while 
always getting it right for White men.
    Even if the accuracy of facial recognition technology was 
improved, the fundamental issue remains: Facial recognition 
technology dangerously expands the scope and power of law 
enforcement.
    When combined with existing networks of surveillance 
cameras dotting our urban and suburban landscapes, facial 
recognition algorithms could enable governments to track the 
public movements, habits, and associations of all people at all 
times, merely with the push of a button.
    According to a report from the Georgetown Center on Privacy 
and Technology, more than 133 million American adults are 
included in facial recognition networks across the country, and 
at least one in four State or local police departments can run 
facial recognition searches through their own network or the 
network of another agency.
    As facial recognition technology rapidly emerges in 
everyday police activities, safeguards to ensure this 
technology is used fairly and responsibly are virtually 
nonexistent.
    As Congress considers how our Nation's systems of policing 
and punishment disproportionately harm communities of color, it 
must decelerate law enforcement's ability to wield such 
powerful technology, particularly when the history and recent 
abuses provide little reason to think it would be used 
responsibly.
    We are safer when our communities are healthy, well-
resourced, and thriving. Policies of mass criminalization and 
over policing are fueled by White supremacy, not a belief in 
justice, and tools like facial recognition technology are about 
social control, not public safety.
    The Leadership Conference urges Members of Congress to Act 
now to protect the public from this technology, specifically by 
implementing a ban or moratorium on law enforcement use of 
facial recognition tools.
    Families and communities will continue to endure 
unspeakable harm and tragedy on our government's watch as long 
as we allow these policies to continue. Time is of the essence.
    The Leadership Conference looks forward to working closely 
with the Subcommittee to address the serious concerns with this 
technology and work toward a new vision of justice that truly 
keeps communities safe.
    Thank you.
    [The statement of Mr. Lee follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Thank you very much for your testimony, 
and as well for that insightful constitutional analysis, which 
is extremely important.
    Now, I would like to recognize Ms. Kara Frederick for five 
minutes.

                  STATEMENT OF KARA FREDERICK

    Ms. Frederick. Chair Nadler, Chair Jackson Lee, Ranking 
Member Jordan, Ranking Member Biggs, thank you for the 
opportunity to testify today.
    In four years, a projected six billion people will have 
access to the internet, with 80 billion devices connected to 
the Web. This creates profound digital surveillance 
opportunities. Facial recognition is a standout capability.
    Some uses of facial recognition comprise legitimate public 
safety imperatives--finding the Capital Gazette shooter, as the 
Chair mentioned, and detecting individuals using fake 
passports. The potential for abuse by agents of the State is 
also high. Risks are manifold, like inaccurate algorithms and 
data security vulner-
abilities.
    I want to focus on three risks in particular.

        (1)  The circumscription of civil liberties and individual 
        privacy. The FBI is leading the way in the use of facial 
        recognition for domestic surveillance. Today, the Bureau has 
        access to over 640 million photos, in some cases through the 
        use of private companies that scrape social media sites 
        unbeknownst to the subject.
        (2)  Outsourcing that surveillance to unaccountable private 
        companies. Reports that the Biden Administration intends to 
        expand the use of private companies unencumbered by 
        constitutional strictures and with a history of reckless 
        privacy practices are troubling.

    Although government entities, like DHS, have long used 
private firms to identify patterns in publicly available 
information, a renewed push to make use of outside expertise 
for domestic spying on the heels of the new White House plan to 
counter domestic extremism portends potential Fourth Amendment 
concerns.
    Such impulses to outsource domestic surveillance can lead 
to more expansive monitoring by law enforcement, which leads to 
the third risk:

        (3)  The potential integration of facial recognition data with 
        other PII through the expansion of mass surveillance.

    The technical capabilities are already here. Now, multiple 
data sources can be aggregated and synchronized to allow 
governments to look for patterns in citizens' behavior. Faster 
networks with lower latency provide quick transmission and 
higher throughput to handle increased data flows.
    More compute power and options, along with developments in 
machine learning and analytics that extract value from data, 
all fit together in mutually reinforcing ways, combining to 
weave diverse data streams into a grid of total surveillance if 
desired.
    This can engender a climate of fear, self-censorship, and 
the chilling of free speech and the right to peaceably assemble 
in public places.
    While authoritarian powers like China are at the bleeding 
edge of using facial recognition for internal control, the 
demonstrated inclination by governments to expand these powers 
in democratic nations renders the slope a slippery one. We know 
that once these powers expand, they almost never contract.
    The trend lines are foreboding. The United States nearly 
matches China in its surveillance coverage with one camera for 
every 4.6 people compared to China's one for 4.1 individuals.
    Municipalities are using COVID as justification for 
expanded surveillance, as in the case of Peachtree Corners, 
Georgia, which became the first U.S. city to use AI-driven 
smart cameras to monitor social distancing and the use of masks 
this year.
    Combined with near-historic low levels of public trust in 
the government to do what is right, the unfettered deployment 
of these technologies will continue to strain the health of the 
body politic without immediate and genuine safeguards.
    Technology has long outpaced our efforts to govern it. To 
constrain abuse and bound expansion, Congress should establish 
a Federal data protection framework with appropriate standards 
and oversight for how U.S. user data is collected, stored, and 
shared by Federal, State, and local entities.
    The initial focus of the effort should establish clear 
policies on data retention, categorize biometric surveillance 
data as sensitive data, and limit interoperability and data 
integration practices to stymie mass surveillance.
    Congress should ensure that U.S. identity management 
systems are secure and reliable, based on proper standards and 
measurements, and in accordance with NIST guidelines. 
Programmers should build in data privacy protections in the 
design phase, testing new methods of encryption or differential 
privacy, decentralized models of data storage, or federated 
models of machine learning for these systems.
    In sum, the debate over public safety and privacy trade-
offs in our Republic provides an opportunity for us to embed 
technology with privacy protections from the outset and to 
shore up a system of checks and balances to address privacy 
infringements that do occur.
    We are balanced on a razor's edge. How we decide to use and 
safeguard these technologies today either draws us closer to 
the free and open Republic we should be or closer to the 
authoritarian states whose surveillance practices we profess to 
abhor.
    I look forward to taking your questions.
    [The statement of Ms. Frederick follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Thank you very much for your testimony.
    We're now able to recognize Professor Laurin for five 
minutes.

                STATEMENT OF JENNIFER E. LAURIN

    Ms. Laurin. Chair Jackson Lee, Chair Nadler, Ranking Member 
Biggs, Ranking Member Jordan, and Members of the Subcommittee, 
thank you very much for the opportunity to address the 
Subcommittee on the topic of facial recognition technology.
    I will speak to a small but important sliver of this vast 
and complex topic that intersects with my own research and 
teaching on forensic science and criminal investigation and 
adjudication. The three bottom line points I want leave you 
with are these.

        (1)  Although facial recognition technology matches have not 
        been held to be admissible as evidence in criminal cases, law 
        enforcement use of facial recognition to identify suspects does 
        form the basis for arrests, criminal convictions, and periods 
        of incarceration in the Federal and State systems.
        (2)  While facial recognition has the capacity to be highly 
        accurate under ideal conditions, there is greatest cause for 
        concern about accuracy when used in criminal investigations.
        (3)  That the criminal legal system is not itself well-designed 
        to screen out mistaken or unreliable fruits of facial 
        recognition technology.

    To begin with the first point, facial recognition 
technology does lead to deprivations of liberty. Courts do not 
currently admit testimony or other evidence of facial 
recognition matches to establish that a defendant is the 
perpetrator of a crime.
    However, evidence of facial recognition matches has been 
relied upon by courts in stages of criminal cases that are not 
governed by rules of admissibility, for example, in bail 
hearings to support decisions ordering pretrial detention, and 
in sentencing to find facts that enhance a defendant's 
sentence.
    Perhaps more critically, the overwhelming majority of 
criminal prosecutions begin and end with no presentation of 
evidence before a jury at all. I refer, of course, to the 
nearly 98 percent of Federal criminal convictions obtained 
through guilty pleas.
    In those cases, given the decidedly low evidentiary 
threshold of probable cause to arrest, facial recognition-
generated matches can generate a criminal conviction without 
any courtroom testing of the government's evidence.
    Moreover, even when facial recognition is used by law 
enforcement only to generate an initial lead, that early facial 
recognition match has the potential, particularly given the 
stickiness of cognitive biases and the very real potential for 
facial recognition matches to influence subsequent 
identifications, as Professor Friedman indicated, the capacity 
to send investigators down an erroneous path that even if 
ultimately corrected has enormous consequence for a wrongly 
accused individual like Mr. Williams.
    The fact that facial recognition is shaping outcomes in 
criminal investigation and adjudication would be of little 
concern if there were no reason to question its accuracy as a 
means of identifying suspects. As others have already noted 
today, despite significant advances, significant concerns 
remain about facial recognition's accuracy.
    There are at least two reasons why the criminal 
investigative context raises special worries.

        (1)  As has already been discussed, the fact that there is 
        significant variability and accuracy among facial recognition 
        vendors, paired with the disturbing GAO findings that Federal 
        agencies rely on a wide variety of--and sometimes unknown--
        facial recognition systems, points to an urgent need for 
        disclosure of and perhaps restriction on what public and 
        private facial recognition systems are being used.
        (2)  Aside from variability and disparity in accuracy across 
        algorithms, law enforcement use of facial recognition to 
        identify criminal suspects raises special accuracy concerns 
        because it is particularly likely that less than ideal images 
        will be used in such procedures.

    Studies of facial recognition that find low error rates do 
so only when using cooperative staged images. Declines in 
accuracy are consistently seen when using images not captured 
under ideal circumstances, precisely the types of images that 
are used in criminal investigations.
    Finally, none of this might give rise to a need for 
legislative intervention if the criminal legal system already 
possessed adequate means to police investigative use of facial 
recognition. It does not.
    Partly, this is due to the fact that as long as facial 
recognition matches are used as investigative tools and not 
relied on as evidence in a criminal case, there's little 
opportunity for a defendant who is identified to challenge that 
practice in court.
    Additionally, however, restricted discovery, the sharing of 
case information between the prosecution and defense, means 
that defendants are severely hobbled in litigating law 
enforcement use of facial recognition.
    In contrast to the regime in an increasing number of 
States, defendants in Federal criminal cases do not have access 
to the government's investigative file. Indeed, defendants 
might not even learn that an early facial recognition match 
procedure was performed.
    Moreover, the trial-based timing of disclosure in Federal 
criminal cases means little opportunity exists for the defense 
to vet the reliability of government evidence before entering a 
plea.
    Additionally, to date, defense claims that they are 
constitutionally entitled to information about facial 
recognition under the holding of the case Brady v. Maryland 
have gained little or no traction in courts.
    In sum, facial recognition is an alluring forensic tool 
with potential to accurately open otherwise unavailable 
investigative avenues. Troublingly, the Federal criminal legal 
system is not well-designed to smoke out questionable uses of 
the technology.
    I look forward to your questions.
    [The statement of Ms. Laurin follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Thank you very much, Professor, for your 
testimony.
    Now, I recognize Mr. Brett Tolman for five minutes. Mr. 
Tolman, you are recognized. Thank you.

                   STATEMENT OF BRETT TOLMAN

    Mr. Tolman. Thank you, Chair Jackson Lee, Ranking Member 
Biggs, and Members of the Subcommittee. Thanks for the 
opportunity to testify today.
    My name is Brett Tolman. I'm the Executive Director of 
Right on Crime, a conservative organization dedicated to the 
promotion of criminal justice policies that promote public 
safety, individual liberty, and whole communities. I've 
previously served as United States Attorney, as a Federal 
prosecutor, and as chief counsel for crime and terrorism for 
the United States Senate Judiciary Committee.
    The first issue to address is concerning lack of 
transparency and basic information relating to law 
enforcement's use of facial recognition technology. Many of the 
fundamental questions that you likely want and deserve answers 
to, are currently unanswerable, questions, simple questions, 
such as how many law enforcement entities use facial 
recognition? How often do they use it? Who is in their 
databases and why?
    Beyond those Federal law enforcement agencies, one might 
suspect of using facial recognition, like the FBI, there are a 
host of others with less apparent need who also use the 
technology. For example, the U.S. Fish and Wildlife Service and 
the Internal Revenue Service.
    Also, discomforting are the sources of the photos 
supporting this technology. For example, the government 
utilizes millions of photos of law-abiding individuals 
collected from driver's licenses and passports; but private 
technology companies that contract with law enforcement have 
harvested billions of photos posted by unsuspecting users on 
platforms such as Facebook, YouTube, and even Venmo. This 
collection is an unprecedented invasion of privacy that places 
enormous undue control in the hands of the government and Big 
Tech, two entities not always known for their light touch or 
responsible use of power.
    Walking out the door in the morning can be an exercise in 
skipping from one camera to another. To this perpetual passive 
surveillance, law enforcement can potentially add recordings 
from body worn cameras and the Smartphone in an officer's 
pocket.
    In short, there are very few instances where law 
enforcement will not have the opportunity to subject a person 
of interest to facial recognition technology.
    Inevitably, the first temptation will be, and has been, to 
use facial recognition by default rather than necessity. 
Instead of limiting the practice to serious public safety risks 
or only after due process has been afforded an individual, 
officers may use and have already used the practice for run-of-
the-mill interactions in minor cases.
    Our Founding Fathers deliberately and prudently enshrined 
in the Bill of Rights prescriptions on the wanton search of 
Americans as a necessary bulwark for freedom. It is hard to 
square those notions and protections with the unfettered use of 
a technology that can instantaneously reveal an individual's 
identity as well as potentially their association and prior 
movements. The unrestricted use of facial recognition bears 
little difference to allowing police to collect fingerprints or 
DNA from anyone and everyone they come across, an abuse that we 
clearly do not and should not tolerate.
    Furthermore, we cannot ignore the risk that facial 
recognition technology will be used to target certain 
Americans. Facial recognition can instantly strip away the 
anonymity of crowds and ultimately threaten our constitutional 
rights of assembly and association. Consider the chilling 
effect facial recognition could have if used to identify 
individuals at a political rally or a government protest.
    This practice grossly lacks accountability. Right now, we 
have little more than vague assurances that we should trust the 
government to safely use the incredible power of facial 
recognition technology without such information or oversight.
    Not long ago, I was tasked with leading the effort in the 
United States Senate to reauthorize the PATRIOT Act. We heard 
similar assurances years ago by those leading the Department of 
Justice and the FBI about FISA and those surveillance 
authorities not being turned against honest Americans. We have 
seen how that worked out as outlined in the recent and 
disturbing Inspector General reports. None of this is to say 
that law enforcement should never have access to facial 
recognition technology.
    While China's unconscionable use of facial recognition 
technology to enhanced and accelerated its undemocratic control 
of its citizenry is a warning of the cost of failure, I do not 
believe it is an inevitable consequence of any use of facial 
recognition technology by law enforcement.
    Further, it is unrealistic to expect law enforcement 
officials to permanently deny themselves a tool that is 
increasingly prevalent in the commercial sector and which has 
such powerful capacity to improve public safety. It is easy to 
contrive a scenario in which we would very much want law 
enforcement to have this technology, searching for a known 
terrorist loose in our city presenting an eminent risk to the 
public, or seeking to identify a mass murder suspect, for 
example.
    However, acknowledging there are credible uses for facial 
recognition technology and voicing support for law enforcement 
is not the same as writing a blank check for power and then 
looking the other way. If nothing else, we need a reset in 
which law enforcement agencies must gain affirmative permission 
from relevant, democratically elected representatives to use 
this technology prior to its use. That way transparency, 
accountability, and proper use guidelines can be established.
    I look forward to answering your questions.
    [The statement of Mr. Tolman follows:]
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Mr. Tolman, we thank you for your 
testimony.
    It's now my pleasure and privilege to recognize Dr. Cedric 
Alexander for five minutes.
    Dr. Alexander, you are recognized.

                STATEMENT OF CEDRIC L. ALEXANDER

    Dr. Alexander. Thank you, and good morning, Madam Chair, 
and those who are Ranking Members and Subcommittee Members, for 
the opportunity to be here with you. I hail from the great 
State of Florida, Pensacola more specifically, so it's great 
being here with you.
    I'm going to try not to be redundant, because maybe some of 
what I'm going to say has already been mentioned in the time 
that has been allocated; but let me start by saying this: On 
July 9, Portland, Maine Press Herald reported on two newly 
enacted acts, laws that make Maine the first U.S. State to 
enact a broad ban on government use of facial recognition 
technology.
    In Maine, State, county, and municipal governments will not 
be allowed to use any sort of FRT. Law enforcement may use the 
technology for investigating certain serious crimes, but State 
law enforcement agencies are barred from implementing their own 
FRT systems. They may request FRT searches from the FBI and the 
State Bureau of Motor Vehicles in certain cases.
    A year earlier, in August of 2020, the Portland City 
Council totally banned the use of facial recognition by the 
police. Indeed, currently there are approximately 24 American 
cities that ban police FRT use, and many other organizations 
has called for a nationwide ban because of evidence that false 
positive identification in the case of people of color exceed 
the rate of Whites. So, the Federal Government has enacted no 
law on how law enforcement may or may not use facial 
recognition technology.
    The benefits of facial recognition systems for policings 
are quite evident. Some have already been mentioned here. It 
is, in fact, that technology that does aid in the detection and 
prevention in crime. For an example, facial recognition is 
effectively used for issuing identity documents and is usually 
combined with long-accepted biometric technologies, such as 
fingerprints and iris scans.
    FRT face matching is used at border checks to compare the 
portrait of digitized biometric passports, and FRT, as we all 
know, was used to identify suspects in the January 6 violent 
breach of the U.S. Capitol, and high-definition digital 
photograph and video-graphy, sometimes deployed from aerial 
drones may be used increasingly to identify faces in mass 
events.
    Currently, the FBI's a leading law enforcement agency that 
uses FRT, and it has developed technology and best practices to 
promote the intelligent use of technology to reduce error and 
protect constitutional rights.
    In addition, the Facial Identification Scientific Working 
Group operating under the National Institute of Standards and 
Technology Organization of Scientific Area Committees is 
working to develop standards for facial recognition technology. 
Facial recognition technology has been useful and law 
enforcement, I believe, will continue to develop technically 
and, therefore, become even more useful. Blanket bans on FRT in 
policings are unwarranted and deny to police agencies a tool 
that is important to aid in public safety.
    Make no mistake, there are urgent constitutional issues 
relating to privacy protection from unreasonable searches, due 
process, and the presumption of innocence. Especially 
concerning are false positives and negative identification 
results.
    In addition, police do not always use FRT correctly. For 
instance, on May 16, 2019, The Washington Post reported that 
some agencies use altered photos, forensic art sketches, and 
even celebrity look-alikes for facial recognition searches. In 
one case, the Post's Drew Hall wrote, ``New York police 
detectives believed a suspect looked like the actor Woody 
Harrelson.'' So, they ran the actor's image through a search 
and then arrested a man that system had suggested might be a 
match. Forgive me for being blunt, but that was pretty, pretty 
egregious, quite frankly.
    Using artificial technology to confer upon a highly 
subjective visual impression, a halo of digital certainty is 
neither fact-based, prudent, efficient, or just. Under Federal 
law, at least it is not illegal for the simple reason that no 
Federal laws govern the use of facial recognition.
    I'm going to stop here and be more than glad to answer any 
questions that you may have, particularly from a very practical 
perspective as being a former chief in two major cities in this 
country, and a former Federal security director for the U.S. 
Department of Homeland Security in Dallas, Texas, for 5.5 
years. So, I understand the importance of this technology, but 
I also understand how it could be easily abused if those are 
not properly trained, certified in the utilization and best 
practices are not evident.
    So, thank you, Madam Chair.
    [The statement of Dr. Alexander follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
    Ms. Jackson Lee. Dr. Alexander, thank you and thank you to 
all the Witnesses. We obviously have opened up in the parochial 
and often used term, nonlegal, a can of worms that we need to 
certainly address.
    We will now be able to proceed with Members questions, and 
I'm very grateful for the attendance to all you Members, both 
on the minority side, and as well on the majority.
    We will now proceed under the five-minute rule with 
questions.
    I will now begin by recognizing myself for five minutes.
    Director Goodwin, you obviously know of the report, the GAO 
report of 2019, indicating that facial recognition algorithms 
tested accuracy in a real way by race, ethnicity, or country of 
origin, as well as gender, and it differed in accuracy.
    What are some of the concerns related to facial recognition 
system accuracy, Director?
    Dr. Goodwin. Thank you for that, Madam Chair.
    So, a number of concerns have been raised around the report 
around accuracy, and I will just focus on the main one. So, 
just in terms of if you think about a person's face, it's 
distinct; it typically doesn't change. If a person's face is--
that information is breached or hacked, that is totally 
different than your password being breached. Well, you can 
change your password, but you cannot change your face.
    So, in the report, we note this particular concern. The 
misidentification based on a facial image is of great concern, 
and we note that in our report.
    Ms. Jackson Lee. Thank you very much.
    Mr. Williams, you were wrongfully arrested in front of your 
wife and daughters, detained for 30 hours in a filthy, 
overcrowded detention center because of facial recognition. 
Your story is stunning, in particular holding up a picture 
alongside of your face, your good comment, ``Do we all look 
alike?'' You are a suspect allegedly in a theft investigation. 
Prosecutors dropped the charges, and the Detroit Police Chief 
apologized for sloppy detective work.
    Tell us, what impact did your arrest and detention have on 
you and your family? You are obviously a gentleman with 
employment, a family. How did that impact you?
    Mr. Williams. Ah, well, thank you. Well, the first thing 
when I came home was on the drive home, my wife was saying how 
she had to figure out how to get in contact with the detention 
center because they kept hanging up on her and telling her they 
can't answer any questions because they don't have any 
information. All they know is that I am there, and that is all 
they can tell her.
    When I got home, I asked my wife what was going on, and my 
daughter had took family pictures of us and turned them around 
because she said she couldn't bear to look at me while I was in 
jail, and I had to go talk to a five-year old at the time about 
a mistake that had been made by the police, and she--it was 
hard to make her understand that sometimes they make mistakes 
because you tell her as a child that the police are your 
friend, and if you ever get in trouble, this is who you go to 
and this was going to help. I had to deal with that, and we 
have thought about even taking her to a psychiatrist, because 
when we replay the video from what happened, she can't stand to 
watch it. She gets all emotional, and it's understandable. I 
have to go through that, I mean--
    Ms. Jackson Lee. Painful. Thank you so very much.
    Mr. Williams. Yeah.
    Ms. Jackson Lee. Thank you so very much.
    Professor Friedman, you have heard that story. My time is 
short, and I have one or two other questions.
    What are the privacy implications and risk to communities 
if a private company has access to the details of law 
enforcement investigations? You might add your thought about a 
legislative fix or thought about how we address this question.
    Professor Friedman?
    Mr. Friedman. Thank you, Chair. That was an incredibly 
moving story. I have kids, and I can only imagine.
    The problem, as I think Dr. Alexander pointed out, is that 
the use of this technology is a bit like the wild west. There's 
no regulation. When there's no regulation, we are going to have 
mistakes, and we are going to have breaches, and we are going 
to have all kinds of trouble. There are uses of the technology 
for law enforcement, but you need a full set of regulations.
    I've set out a bunch of them in my testimony. I think you 
could do a lot of this by regulating the vendors directly. As a 
product moving in interstate commerce, you could set a number 
of standards, including that the companies themselves have to 
regulate and identify what are effective probe images, what are 
the right size of the databases, what are the thresholds to 
avoid false positives and false negatives under different 
circumstances, that the technology is self-auditing so that we 
could know what how it was actually used by law enforcement. 
So, there's a lot of things that can be done.
    Ms. Jackson Lee. Thank you very much.
    Just, Dr. Alexander, what are your suggestions on how law 
enforcement authorities can reduce the phraseology that the 
Detroit chief had to acknowledge, sloppy detective work, 
without greatly reducing investigative efficacy?
    Dr. Alexander?
    Dr. Alexander. Thank you, Madam Chair.
    I guess having been a former chief and I had to respond in 
that way, that would certainly suggest to me that I have a 
larger systemic issue around the way investigations are 
conducted, and how people are interviewed.
    Here's the issue, though, for me. The technology is one in 
which we use every day. We pick up our cell phone. We use it to 
access calls or text messages, whatever the case may happen to 
be. The biggest issue here for me, and will continue to be, 
when you have victims such as Mr. Robert Williams, people who 
are innocently going along their way, because of a failed 
ability for us in policing, quite frankly--and I say ``we'' 
because I have been there--the inability to train people 
appropriately. They have to be trained. There has to be 
certifications. There has to be best practices, and there needs 
to be certification both, I believe, certainly at a State 
level, and a Federal level, that has to be ongoing, because one 
thing that we know currently is that we don't train enough.
    When you are utilizing that type of technology that can 
easily infringe upon someone's Fourth Amendment rights, I think 
we have to be very careful. We also have to remember where we 
came from, even in DNA technology, where we there, too, have 
seen over the years, particularly in the past, we have seen DNA 
labs that were shut down because of people's inability to carry 
out that professional function based on certification and 
accuracy. We don't want to find ourselves using this technology 
doing the same things that we have done in the past, because 
what it will continue to do, Congresswoman, is drive that 
proverbial wedge even further apart between good policing and 
communities across this country, and I don't think that is a 
road we want to continue to go around--to go down.
    Ms. Jackson Lee. Thank you so very much.
    My time has expired. I know that there are many other 
questions that I have interest in, but let me yield to the 
Ranking Member of the Subcommittee, Mr. Biggs, for five 
minutes.
    Mr. Biggs. Thank you, Madam Chair.
    Dr. Goodwin, thank you for being here today.
    In your most recent report, you found that 13 of the 14 
agencies that you surveyed that reported using non-Federal 
systems did not have complete, up-to-date information about 
what systems their employees were even using.
    Is that a correct interpretation?
    Dr. Goodwin. Yes, that's correct, sir.
    Mr. Biggs. So, in fact, if I understand right, even one of 
the agencies you worked with had to poll its employees to see 
whether they were using the system, and what systems they were 
using. Is that accurate, too?
    Dr. Goodwin. That's correct, Congressman.
    Then one of the agencies polled, yes, they initially told 
us that they didn't use the technology. When they polled their 
employees, they found that the technology had been used in over 
1,000 searches.
    Mr. Biggs. Well, if agency leadership does not even know 
what systems their employees are using, or whether they are 
using it, how can they be sure that those systems are not being 
misused, and that our constitutional rights of American 
citizens are not being violated?
    Dr. Goodwin. So, this speaks to the heart of the 
recommendations that we made, that agencies have an awareness. 
It's important that they have an awareness of what systems they 
are using, particularly the non-Federal systems that their 
employees are using. Given that awareness, then they can then 
begin to ensure that the appropriate protections are in place 
related to privacy and accuracy.
    Mr. Biggs. Ms. Frederick, we know Federal agencies are 
using private searches, and they are collecting literally 
billions of images. How are those images acquired?
    Ms. Frederick. Well, it depends on the private company. A 
lot of these companies--Clearview AI was mentioned earlier--
they claim to scrape the Internet, social media sites, things 
you might upload, what your grandma is still using Flickr to 
upload. There's an IBM scandal there as well. So, they are 
basically just scraping what is publicly available on the 
internet when people are unwittingly uploading these photos, 
clicking yes to some of the terms of use on these private 
companies. They are not consenting to allow these images to be 
used by law enforcement; but by using those non-Federal private 
companies, these contractors like Clearview AI, like Babel 
Street has been mentioned, they are just using your regular 
photos from your grandma, family, sisters, and your daughters. 
So, it is pretty much a government-infused wild west out there.
    Mr. Biggs. Mr. Tolman, these systems, these private 
systems--let's leave off Federal systems for a moment, or law 
enforcement systems--what steps are they even taking to secure 
their systems?
    Mr. Tolman. Well, the reality is they are more interested 
in the dollars that are behind this technology than they are 
securing civil liberties, and that will always be the case. 
When private companies are contracting and utilizing a powerful 
technology, their interest is going to be driven by their 
bottom line.
    Mr. Biggs. So, Mr. Tolman, back to you still, these 
agencies, like Federal agencies that are using private 
services, is it just easier for them to contract out than to 
develop their own system, I assume?
    Mr. Tolman. Yes. The capability is hundreds of millions of 
photos versus what they would be able to gather themselves with 
limited staff and resources.
    Mr. Biggs. So, how are we going to ensure that Americans 
are not being targeted by Federal agencies using the facial 
recognition technology? What are your policy prescriptions?
    Mr. Tolman. So, there certainly is--as others have said, 
there's going to have to be a complete shift, meaning the 
legislature is going to have to enact regulations that require 
any agency, government agency, whether they are contracting or 
not with Big Tech or other companies, are going to have to have 
policy that allow for the proper use, the legal use of this, 
ones in which you all say is a legal use.
    I prosecuted the kidnapper of Elizabeth Smart. I can 
imagine during that case, when I was a prosecutor, we would 
have clearly sought out law enforcement's ability to use that 
to identify Elizabeth and try to find her. That is a far cry 
than just downloading all from social media what you can and 
then targeting others without the legal justification or the 
regs or policies that have been approved by Congress.
    Mr. Biggs. Mr. Lee, I'm going to ask you the same question. 
What prescriptions, policy prescriptions do you advocate that 
we can ensure that Americans are not being targeted by Federal 
agencies or other law enforcement agencies?
    Mr. Lee. Ranking Member Biggs, thank you for the question.
    The Leadership Conference supports a ban, or moratorium, on 
the use of these technologies. We support a ban or moratorium 
because:

        (1)  These communities are already--the communities where 
        facial recognition technology is used are already over-policed 
        and over-surveilled.
        (2)  Facial recognition technology is disproportionately 
        inaccurate for Black and Brown communities, particularly, 
        communities of color.
        (3)  The community has not been involved in these processes. 
        Congress has not engaged in legislation on these topics. 
        Communities have not been involved in highlighting what they 
        think policing should be, specifically within the context of 
        facial recognition technology.

    Mr. Biggs. Thank you.
    My time is up, Madam Chair. Thank you.
    Ms. Jackson Lee. Thank you very much.
    Now, I'm pleased to recognize Mr. Nadler, Chair of the 
Committee.
    Mr. Nadler, you are recognized.
    Chair Nadler. Thank you, Madam Chair.
    Director Goodwin, I would like to focus on the FBI's use of 
facial recognition technology. In your report, you found that 
15 agencies reported using systems owned by another Federal 
entity, and 14 reported using systems owned by State, local, 
Tribal, or territorial entities, including the FBI.
    Can you briefly describe for us how the FBI uses State 
systems to expand its facial recognition search capacity?
    Dr. Goodwin. Thank you, sir, for that question.
    So, I will briefly talk about how the FBI used the 
technology around the events following the killing of George 
Floyd in May 2020.\1\ So, the FBI has a media tip line, as I'm 
sure you are aware of, and so what they did, they asked for 
people to submit images to the media tip line. The images that 
were submitted to the tip line they then inserted into an FRT 
system to try to get some potential matches.
---------------------------------------------------------------------------
    \1\ Director Goodwin requested to update this sentence. The 
original statement was ``So, I will briefly talk about how the FBI used 
the technology around the events of January 6.''
---------------------------------------------------------------------------
    So, that is one way the FBI has used the technology.
    We did a report a few years ago on FBI specific use of the 
technology, and we had issued a number of recommendations that 
spoke to them complying with the Privacy Impact Assessment, 
otherwise known as PIA, and the Systems of Record Notice, 
otherwise known as the SORN. The FBI--ultimately, they agreed 
with our recommendations, and they implemented the 
recommendations. So, they have--I think it was Dr. Alexander 
who mentioned that the FBI has done a number of things to 
ensure that the accuracy of the FRT systems that they are using 
has improved.
    Chair Nadler. Thank you.
    Also, Director Goodwin, your report describes how agencies 
use nongovernment-owned systems. What are the potential 
security risks associated with a Federal law enforcement agency 
using a non-Federal Government-owned, or, even a State- or 
local government-owned FRT system?
    Dr. Goodwin. So, I will focus on the non-Federal systems, 
and it's something that we speak to in the report. As I 
mentioned earlier, the Privacy Impact Assessments and the 
Systems of Record Notice, the PIA and the SORN, those are two 
standards that typically, if a government agency is using their 
own system, they have to follow. So, the concern that we 
mentioned in our report is that if they are using a non-Federal 
entity for facial recognition technology, that entity, that 
non-Federal entity may or may not have to comply with the 
recommendations and the guidelines laid out under the PIA and 
the SORN, and, so, that is a concern. So, with the PIA, that, 
of course, speaks to how the images are being used. The SORN 
speaks to the fact that if there's a change in the technology, 
that information has to be made public and available.
    Chair Nadler. Thank you.
    Professor Laurin, for those who are arrested following law 
enforcement officers' use of facial recognition technology to 
match them to a suspect's photo, what protections are in place 
for defendants, who are presumed innocent at that point, to 
challenge the use of the facial recognition technology that led 
to their arrest?
    Ms. Laurin. Well, in many instances today, it's quite 
limited, the opportunity to do that. In fact, practically 
nonexistent to the extent that the practice has been for law 
enforcement agencies to use the facial recognition match as a 
lead that then is corroborated through other evidence. The 
risk, though, is that that corroboration ends up not truly 
being independent corroboration.
    So, for example, in a case in Florida, a facial recognition 
match was made based on an image in a store, and subsequently a 
police officer who knew that the match had been made, looks at 
the image himself and said, yeah, these two guys look alike. 
So, already knowing that the computer had assisted a match sort 
of doubled down on it, which is not an independent 
corroboration.
    What courts have said is that defendants really don't have 
the right to challenge the facial recognition match when the 
government isn't relying on it in court; but that misses the 
fact that the match could taint what is showing up in court.
    Chair Nadler. Thank you.
    Facial recognition systems have been found to be less 
effective in detecting people of color and women. As we heard 
from Mr. Williams, he experienced this technological flaw in 
real time when he was misidentified despite his photo not 
resembling the suspect.
    Professor Friedman and Mr. Lee, can facial recognition 
technology be incorporated fairly into criminal investigative 
and public safety strategies despite this inherent flaw in the 
algorithms?
    Professor Friedman?
    Mr. Friedman. Go ahead.
    Mr. Lee. First, Chair, it's a pleasure to see you again, 
especially under these circumstances. Secondly, the Leadership 
Conference does not. Currently the issues of accuracy alone 
would say that--would hold that a ban or moratorium would be 
the most appropriate use. In addition, these technologies are 
being used in communities that are already over-policed and 
over-surveilled. So, there is no way to get around the accuracy 
and over surveillance issues that come with law enforcement use 
of facial recognition technology.
    Chair Nadler. Mr. Friedman?
    Mr. Friedman. Chair, so I want to echo the grave concern 
about technologies that have these demographic differentials, 
as I said in the past, that they ought not be used unless that 
can be addressed. Actually, join Mr. Tolman in thinking 
strongly that we shouldn't be using the technologies until 
there's regulation. There may be ways to deal with the 
demographic disparities if we understand what they actually 
are, but we don't have that testing, so we have learning to do. 
Then, we have to think about what it means to have a human in 
the loop because many people offer that up as a way to ensure 
that these things don't go wrong. As Mr. Williams' story all 
too graphically demonstrates, the way we use humans in the loop 
right now don't do that. It might be possible to develop 
protocols and best practices to do that; but we have not made 
that happen yet, and that is why we suggest that we need 
regulation.
    Chair Nadler. Thank you.
    Madam Chair, before I yield back, I ask unanimous consent 
to place the following into the record: A statement from over 
40 civil rights groups, a letter from the Security Industry 
Association, and individual statements from Data for Black 
Lives, the Electronic Frontier Foundation, and the Project on 
Government Oversight.
    Ms. Jackson Lee. Without objection, so ordered.
    [The information follows:]

                      CHAIR NADLER FOR THE RECORD

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

    Chair Nadler. Thank you.
    I yield back.
    Ms. Jackson Lee. Thank you very much.
    Mr. Chabot, you are recognized for five minutes.
    Mr. Chabot. Thank you, Madam Chair.
    I would like to lead off by thanking our Ranking Member, 
Mr. Jordan, for allowing me to go in front of him. I have a 
noon speech that I am supposed to give in my district. It's 
going to take me awhile to get there, so I really do appreciate 
him switching spots with me.
    In addition to being a Senior Member of the Judiciary 
Committee for quite a few years, I also serve on the Foreign 
Affairs Committee, and I'm the Ranking Member and past Chair of 
the Asia and Pacific Subcommittee, and in that capacity, we, 
both Republicans and Democrats, on the Committee have been very 
closely watching, observing, monitoring what the PRC and the 
Chinese Communist Party is up to there, and the threat that 
they pose, not only to Americans, but to their own citizens.
    It's well known, as was mentioned earlier in the statement 
of one or two of the Witnesses, that the Chinese Communist 
Party uses a vast civilian surveillance network to track nearly 
every movement of every single person there, and there's 1.4 
billion people who they are tracking.
    So, it's a pretty extensive effort that they have 
undertaken. For example, they force their citizens to download 
apps that monitor them. They track social media posts to look 
for what they consider to be negative content. They stop people 
at random to check their devices for apps deemed to be 
dangerous to the state, and they use facial recognition 
technology, technology similar to what we are discussing here 
today, to pick individuals up that they are looking for in 
crowds.
    All of this is to build a so-called great firewall between 
China and the rest of the world, so they can keep their people 
under control, and to gather personal data on their citizens 
which can be used to flag and potentially torment the people of 
China, individuals they believe that are going to be unhelpful 
to the regime.
    So, I would like to ask Ms. Frederick and Mr. Tolman, would 
you care to comment on the assessment you see relative to 
China, their use of facial recognition, other technologies, to 
monitor their citizens and how the CCP is using facial 
recognition technology to oppress individuals and certainly 
various groups. The Uyghur community comes to mind because they 
have obviously been in the news, not nearly enough of late when 
you consider that a million people are in their gulags.
    Are there any lessons that we should learn from this? I'm 
not, by any stretch of the imagination, suggesting that our 
government has anything like this in mind; but I would like to 
hear from those two, if we could.
    Ms. Frederick. Certainly. Thank you for the question. I 
will begin, if you don't mind, Mr. Tolman.
    So, we should say yes, we have a system in the United 
States here that matters, sufficient rule-of-law protections, 
openness and engaged citizenry, a free press, and an 
independent judiciary that Act as guardrails for now against 
the propagation of technology for political ends that are 
anywhere near like China.
    It is important to look to them. As I said before, they are 
the bleeding edge of using these technologies for internal 
control and internal stability. They pervade a path for ethnic 
targeting in the western region of Xinjiang as we talked about, 
where 1.5 million Uyghurs are imprisoned in reeducation camps 
often through the use of these facial recognition systems.
    It's not just about facial recognition alone. I think the 
ability to integrate seemingly disparate data sets as they do 
in Xinjiang, they use something that Human Rights Watch has 
reverse-engineered. It's called the Integrated Joint Operations 
Platform, and what they do is they take all this data, they 
fuse it. It's license plate readers, data doors, these things 
that pick-up SIM cards, and all kinds of biometrics, iris 
scans, just all the digital exhausts that we are constantly 
emitting that people, even in these regions, are constantly 
emitting. They put them altogether, and they use this 
information. They use new technology, artificial intelligence--
this is one of the first places that they started using it--for 
political identification of dissidents, of Uyghur minorities.
    What I see coming down the pike is the use of DNA 
genotyping. China is already trying to do this. They are trying 
to use an individual's DNA to predict what their faces will 
look like. They are experimenting with voice recognition 
technologies, and the idea is to combine all this into this 
digital panopticon and use it for basically an iron hand of 
digital control. We have the technology right now to do so.
    They have visited some of this in Hong Kong, as we have 
seen the diminishing of freedoms there, how does the democracy 
die in full view of us, and China is using technology to enact 
it. We know that Russia is doing the same things when it comes 
to pro-Navalny protests in January of this year. So, all these 
things are something to keep in mind. As Paul Mozur from The 
New York Times said, which I think is something that I would 
leave you with, ``what China is trying to do is totalized 
control.'' They are trying to link digital reality with 
physical reality, and that chasm that used to exist is growing 
closer and closer and closer together. China wants them to be 
completely interconnected. We have to guard against that here 
in America.
    Mr. Chabot. Thank you.
    Madam Chair, my time is expired. Perhaps Mr. Tolman could 
respond in writing just so we don't hold the Committee up at 
this time.
    Ms. Jackson Lee. Thank you very much, Mr. Chabot, for 
yielding back.
    I now recognize the gentlelady from California, Ms. Bass, 
for five minutes.
    Ms. Bass. Thank you, Madam Chair, and let me thank the 
Ranking Member and our Witnesses for being with us today.
    I wanted to know if the Witnesses--and anyone could speak--
could distinguish between using facial recognition to 
investigate versus facial recognition for arrest. Is it more 
productive to use it that way versus all the complications for 
arrest?
    Also, it's very well-known, and it has been said by 
numerous panelists and Members, of the challenges of facial 
recognition with people of color, but I'm not sure, unless I 
missed it, if anybody said, ``Does it work better if it's White 
men?'' How does it perform?
    Dr. Alexander, would you like to go?
    Dr. Alexander. Well, I will try to respond, to the first 
part of your question, whether the technology should be 
utilized with the idea of making an arrest based on the 
identification itself, can it be used as an investigative tool?
    When I think about it as an investigative tool, if that 
person's image happened to be a match, or similar match and we 
are working with zero leads in a case, it may prove to be of 
some value, right? The thing we have to be very careful--and I 
think someone mentioned it earlier and articulated it far 
better than I can--that we have to make sure we cognitively 
don't approach that individual already with our implicit biases 
in mind that they are the exact person we are looking for. Do 
you follow what I'm saying, Congresswoman? So, I think we have 
to be very, very careful with that.
    Ideally, to be perfectly honest with you, I was here two 
years ago under the leadership of another hearing with the late 
Congressman Elijah Cummings having these same conversations, 
and I recollect very clearly there was a woman there who was a 
professor, researcher at MIT who spoke very eloquently around 
these issues and was able to provide some insight in terms of 
the depth of this technology and how it does not work to the 
advantage of people of color and women.
    There's an overwhelming amount of evidence--and I think 
there are those on this panel who can talk further about this--
where, in the case of White males, there was the least likely 
opportunity for misidentification. Certainly, those of color 
and women, they tend to be more problematic.
    Ms. Bass. You said least likely, meaning that it's more 
accurate with White men?
    Dr. Alexander. Yes. It's more accurate with White men. I'm 
sorry, yes.
    Ms. Bass. Maybe someone else could respond to that. I don't 
know whether it's Mr. Friedman or Ms. Goodwin. What about that 
in terms of the accuracy and its use as an investigative tool? 
In the George Floyd Justice in Policing Act, we essentially say 
there should not be Federal resources in facial recognition.
    Mr. Williams, you have your hand raised.
    Mr. Williams. I was just going to say about the question 
you had previously asked that it was supposed to be used for an 
investigative tool at that point, and they came and arrested me 
and didn't even tell me anything. I don't even live in Detroit, 
and Detroit police came to my house in Farmington Hills, and 
basically carted me off. Right. So, I don't have a lot to say 
about that. All I'm saying is that the way that it's used, they 
are not doing any due diligence at all. That, like the chief 
said, was shoddy investigative work done and--
    Ms. Bass. Actually, excuse me, Mr. Williams, to me you are 
an example of what I wasn't saying. I
    Mr. Williams. I was talking about when you asked the 
question.
    Ms. Bass. Right. They used it for your arrest--
    Mr. Williams. Right.
    Ms. Bass. --which I think it's clear is problematic; but I 
was referring to use it to investigate, not in real time for an 
arrest.
    Mr. Friedman, do you have anything you want to say in the 
last few seconds?
    Mr. Friedman. Sure. I think there are things that this body 
could do regulatorily, and then there are things that it could 
learn and try to get us to learn. One of the problems we have 
is that NIST has not tested the kinds of uses that law 
enforcement is using. What we need--and I'm happy to elaborate, 
but I don't want to go over time--is the kind of testing that 
actually reflects what's happening in law enforcement news 
rather than a set of abstract possibilities for the accuracy 
and bias.
    Ms. Bass. Thank you, Madam Chair.
    Ms. Jackson Lee. Thank you so very much.
    Mr. Ranking Member, I'm not going to take up your time, but 
I am just moved to say that it appears like unreasonable search 
and seizure under the Fourth Amendment. I will not take up your 
time, and I'm delighted to recognize Mr. Jordan for five 
minutes.
    Mr. Jordan. Thank you, Madam Chair.
    Let me go over to Mr. Tolman. How often does it happen, how 
often do we get the wrong person? Where Mr. Williams told his 
compelling story of how they were just wrong and he was 
arrested and detained, I think he said for 30 hours--I know the 
Hueper family, the recent story of the couple in Alaska who 
were not in the Capitol on January 6, but their door was kicked 
in. They were put in handcuffs. They were at gunpoint at one 
point during the interrogation. It was like a four-hour 
interrogation, and it was all based on facial recognition. So, 
how often do we get the wrong person?
    Mr. Tolman. Well, one time would be too many, but 
certainly, some estimate it as high as 30 percent, but I 
actually believe it's more than that, Representative Jordan. 
It's nice to see you again, sir.
    Mr. Jordan. You too.
    Mr. Tolman. It is, I believe, higher than that. I have 
represented two individuals, one of whom I'm fairly certain--
and it's not--we don't know about the ones that aren't 
publicized or the ones that aren't coming forward. I have a 
client who was accused of going inside of the Capitol. He was 
not in the Capitol. I believe it was facial recognition 
technology that put him in law enforcement's investigation, and 
we certainly had to work backwards to show that he was not.
    So, I think it's more often than we know because it's not 
being reported.
    Mr. Jordan. Mr. Lee, do you know? Is there any kind of--Mr. 
Tolman said 30 percent. Is that--and we know it's a 
disproportionate impact on African Americans. Do you think it's 
a higher number than that, or is that probably where that 30 
percent of the time this technology gets the wrong person?
    Mr. Lee. Ranking Member Jordan, thank you for the question.
    We don't know, and that's a part of the problem. These 
technologies lack the requisite transparency and the requisite 
oversight for us to even know. Of the cases like Mr. Williams' 
that we do know those are terrifying. What about the cases that 
are maybe on the knife's edge that we don't know about right 
now? So, we do need more transparency and more oversight, 
specifically with facial recognition technology and its use by 
law enforcement.
    Mr. Jordan. Yes, thank you.
    I'm going to yield to our Ranking Member, but I'm 
concerned, in a general sense, that just the overall 
surveillance State and individual Americans' just fundamental 
liberties, fundamental rights, but when you add on top of that 
the fact that it's wrong so often, to me, this is why if we 
can't come together in a bipartisan fashion as I know we have 
been trying now for a couple of years, if we can't do that and 
figure out a way to put restrictions on this, limit this, 
particularly when you have all these agencies using the 
technology, they don't even know which technology they're 
using, and whether it's private or what have you--there's just 
a host of problems here.
    So, I appreciate Chair Jackson Lee having this hearing, and 
I'm going to yield the remaining two minutes to our Ranking 
Member, Mr. Biggs.
    Mr. Biggs. I thank the gentleman for yielding.
    My question at this time is to you, Ms. Frederick. 
Peachtree Corners, Georgia, used AI-driven Smart cameras to 
monitor social distancing, social distancing, and use of 
mastering COVID-19.
    With current technology, what prevents Federal agencies or 
State and local government from using facial recognition not 
just to use it for identification, but to track movement and 
behavior of law-abiding citizens?
    Ms. Frederick. I fear that's what is coming down the pike, 
and right now, there are specific transparency measures and 
lukewarm, I would say, strictures on some of these uses. Ms. 
Goodwin talked very well about the PIA and the SORN, and JO's 
recommendations. Those are great start points, but at this 
point, there are no hard constraints on the expansion of the 
remit for these cameras. So, the public safety remit, it's 
huge. It could encompass all sorts of things, and, clearly, 
certain municipalities are taking advantage of that to 
interpret public safety as they wish.
    So, unless Federal entities are required to really submit 
those efficacy reports, those transparency reports that don't 
yet exist in a very rigorous fashion, then I fear many official 
entities will be allowed to interpret the use of facial 
recognition, interpret the use of digital surveillance in any 
way they can.
    I will say quickly, when I was at Facebook, the ethos of 
private companies in shipping these products--and this is the 
problem with private companies being used by Federal entities 
you want to ship it. You want to ship it. You deploy these 
technologies as fast as you can and solve problems in the 
rollout in later iterations. That ethos is--
    Mr. Biggs. So, Ms. Frederick, thank you for answering that. 
I need to ask one more question because it deals with--and I 
want to talk to Professor Laurin really quick. She had her hand 
raised with regard to the question on investigation versus 
arrest. What has come to my mind is this notion of tainted 
photo lineups, tainted identification being used in 
investigation as predicate for making an arrest.
    Did you want to elaborate on that, Professor Laurin?
    Ms. Laurin. Thank you so much for the opportunity, Mr. 
Ranking Member.
    I will say just a bit more, and that's to say that I do 
think that it's possible in theory to think about a distinction 
in the use of the technology as an investigative tool versus 
forming the basis for an arrest or put differently, a facial 
recognition match by itself forming probable cause, right, to 
arrest.
    I think that from a regulatory standpoint, one needs to 
take great care in thinking about what strictures need to be 
put in place to create a meaningful distinction between those 
two things, right? So, for example, there are a number of 
agencies that currently say facial recognition itself cannot be 
the basis for probable cause. Then there are very few, if any, 
sort of strictures around what further investigation is 
required to build an independent case, right? Are there, for 
example, blinding procedures so that individuals who are then 
doing a subsequent lineup don't know that facial recognition 
initially matched the individual? If not, right, that is not a 
particularly meaningful independent check.
    The last thing I will say is that I think there's an 
important sort of intersection here between facial recognition 
technology, and what we already know about the fallibility of 
eyewitness identification, right, which, in over 75 percent of 
known DNA exonerations, there were now known to be wrong 
eyewitness identification that occurred.
    So, that is an investigative tool that itself is already 
quite fragile, right? When used as a corroborating piece of 
evidence with the fragile facial identification, that should 
give reason for concern as well.
    So, thinking about what strictures need to be in place to 
create a meaningful check on the investigative use of facial 
recognition is an important regulatory move.
    Mr. Biggs. Madam Chair and Ranking Member Jordan, my time 
has long expired. Thank you.
    Ms. Jackson Lee. Thank you very much.
    It is an issue that warrants a lot of involvement, and we 
thank our Members.
    I'm delighted now to recognize Ms. Demings, the gentlelady 
from Florida, who as a fellow Floridian on the panel, for five 
minutes.
    Ms. Demings.
    Ms. Demings. Let me say good morning to all of you. Thank 
you, Madam Chair, and thank you, Ranking Member, for this very 
important discussion. Thank you to all the Witnesses for being 
here. It is good to have a fellow Floridian on the panel as 
well, so thank you very much.
    As a former police executive, I'm always interested in new 
technologies that can help to keep the public safe, help to 
keep law enforcement safe, and improve the delivery of police 
service, and we know that we have work to do in all those 
areas.
    It was a decade ago that we were testing body cameras at 
the Orlando Police Department, and while we weren't really sure 
of the road ahead, I do know that the public was very excited 
about that possibility. Certainly, it is one of the 
recommendations that is coming to the forefront now, and so, we 
should always seek technologies that will enable us to do a 
better job.
    Dr. Alexander, you said that technology has long outpaced 
attempts to govern it, and I think that is such a powerful 
statement, because not only does it appear that the use of 
facial recognition outpaced our ability to govern it, but we 
also see that with other technologies as well.
    Dr. Alexander, you talked about some best practices, until 
we can find ourselves at a point where we can keep up, if that 
will ever happen, through regulation and other things. Could 
you just expand a little bit on some of those best practices?
    Dr. Alexander. Well, I think that--and thank you very much, 
Congresswoman Demings.
    I think it is very important for us to recognize that 
technology does move along very quickly. If we think about the 
development and the implementation of technology over the last 
10, 15, 20 years, or even the last five years, it oftentimes 
outpaces us as human beings. We are not prepared, oftentimes, 
for the unintended consequences that comes with the technology 
that we are all utilizing every day in our lives.
    So, yes, we have to develop best practices. What is some of 
those best practices? I think we can really only clearly define 
them once we make some determination about the real legitimacy 
of this facial recognition technology. Look, it can be 
innocently used with our cell phones just to open it up and 
conduct our personal business. When it comes to the 
constitutional right of American citizens across this country, 
regardless of where they are and what neighborhoods they are 
in, that becomes more problematic. Can we use it as an 
investigative tool? You know as well as I do, Congresswoman, 
oftentimes we might embark upon an investigation where we have 
zero leads, but maybe we have a piece of this facial 
recognition that suggests, could that be Cedric Alexander?
    What we can't do is what happened to Mr. Williams. We can 
look, well, let's look into the background of Cedric Alexander 
a little further. Let's explore some other things that we did 
not even think about, even had a suspect at all, or a person of 
interest. The problem even lies there, as I've already stated, 
is that sometimes, we can identify an individual that may be a 
likely person of interest, a suspect. If our own implicit 
biases get in the way, it will overwhelm the entire 
investigation, and it leads us down this dark path of where now 
we are really doing injury and hurt to people.
    Here's the bigger thing I would mention to all of you in 
Congress, is that as we go through this place in America right 
now around police reform, one thing we do not need is a piece 
of technology that is going to continue to drive a separation 
between police and community. People want good police. We all 
want great technology. The idea around this technology really 
is great. It's well-founded. It has to become more accurate. It 
has to have the type of legitimacy, trust in this technology 
that is needed for the American people to feel like it's not 
going to infringe upon the rights and hurt people in any type 
of way.
    I sit on the National Innocence Project Board, and I 
understand the importance of DNA technology and how it's used 
to free individuals, quite frankly, who should have never been 
incarcerated, like Mr. Williams. We have to do a better job in 
advancing this technology, but understanding that sometimes the 
technology moves along faster than we do as human beings, 
because we still have got a lot of catching up to do around 
many of the social challenges we have in our great Nation.
    So, I hope in some kind of small way that addressed your 
question, Congresswoman.
    Ms. Demings. It did. Thank you so much again. Thank you to 
all our Witnesses.
    With that, I yield back, Madam Chair.
    Ms. Jackson Lee. The gentlelady's time--thank you very 
much--has expired.
    We now recognize Mr. Gohmert for five minutes.
    Mr. Gohmert. Thank you, Madam Chair.
    I've got to say, this really warms my heart, to hear 
Witnesses from both sides having so much good input. It was my 
observation during the Bush years that the Department of 
Justice was able to convince Republicans--many of them, a 
majority of them--``No, we don't need any reform right now,'' 
and really push back and convince Republicans that were in 
charge to back off some reforms we should have made.
    I noticed resistance like that. I'm sure a lot of pressure 
from the Obama Administration on Democrats that were in charge 
not to do some reform that was needed.
    So, when I hear the testimony and the questions and hear 
this in such a great, bipartisan way, something that is so 
serious, it really does warm my heart. Some great comments and 
great questions.
    Dr. Alexander, I note you're careful to make sure that you 
note that this technology can be helpful, and then it sends me 
thinking back about our FBI.
    Do you think the technology on recording statements, 
instead of the FBI having their agents not record statements, 
audio, or video, but just write down their version of what the 
Witness, do you think the technology has gone too fast for the 
FBI, that maybe we could get them to start recording with audio 
at least, maybe video? I am being sarcastic.
    That is an area of concern for me. If you combine facial 
recognition mistakes with the FBI's ability to only record, 
only write down what their version of what a Witness says, you 
combine flaws like that, then we are looking at some serious 
problems at the Federal level.
    I am sure your local law enforcement which you oversaw 
recorded statements, so that wouldn't be such a problem.
    Can you see regulations, Dr. Alexander, that would help us 
be able to confine facial recognition just to the investigative 
stage?
    As a former judge, I know sometimes law enforcement will 
come in and they will go, ``Now, Judge, this is in the 
affidavit, but, hey, we did also . . . . Well, if you didn't 
swear to it, I don't want to hear it,'' That discrimination of 
real sworn evidence to unsworn evidence isn't always 
distinguished by some judges.
    Do you see some regulations we could utilize to actually 
get some teeth into that?
    Dr. Alexander. Thank you for that question, Congressman 
Gohmert.
    First, let me say, the FBI has the tendency, has the 
ability, has the training, and also has the fidelity of a 
Federal law enforcement agency, that is, well-trusted in 
everything that they have pursued, particularly around 
technology. They really stood up on technology, if you go back 
a hundred years with them, in terms of the work that they did 
back then, and they do today, forensically.
    They do have great capability of being able to access that 
type of technology and use it in a way that is standardized, 
one in which they have best practices, one in which they really 
are the leaders around the utilization, because they use this 
technology to go after foreign actors who want to do harm to 
this country or to utilize this even in domestic cases.
    Mr. Gohmert. Dr. Alexander, my time is short, but I am very 
concerned about some of their abuses. When we see the FISA 
Court that meets in secret, and then there doesn't seem to be 
any checks.
    Like I say, as a former judge, I'm very offended that some 
judges have not been outraged, like Mr. Williams' treatment and 
others. I think that is a problem when the judiciary is not 
upset enough to actually take action and bring law enforcement 
into line.
    Let me ask, Mr. Tolman, do you see a way to get specific 
regulations that would discern between investigation and 
actually going to court and getting a warrant and picking up 
somebody like Mr. Williams?
    Mr. Tolman. Yeah, I think you could implement things such 
as not allowing for the facial recognition technology to launch 
an investigation, that they would have to have the independent 
basis to launch the investigation. Then, ultimately, if it went 
to trial, you could utilize facial recognition technology as 
additional evidence.
    You can't have it the other way around, because then you 
have concerns about whether the evidence that is corroborating 
the facial recognition technology is simply influenced by the 
subjectivity of the investigators.
    Mr. Gohmert. Well, thank you.
    My time has expired. Thank you very much, Chair Jackson 
Lee. I really appreciate this hearing.
    Ms. Jackson Lee. Thank you so very much, Mr. Gohmert. We're 
more and more finding some common ground. Should be some good 
opportunity for us to move forward.
    I am delighted to recognize now, the gentlelady from 
Georgia for five minutes. Ms. McBath is recognized.
    Ms. McBath. Thank you so much, Chair. I really, really 
appreciate you today.
    Thank you to each and every one of you that is here today, 
and just thank you so much for all your timely resource and 
information, and your expertise.
    I would like to go ahead and just start a really quick 
question.
    Dr. Alexander, looking at how we might regulate facial 
recognition technology, how can we seek to build trust between 
law enforcement and the communities that they serve, especially 
communities of color?
    Dr. Alexander. Well, fundamentally, that's where a lot of 
the issue lies, Congresswoman McBath, because currently in the 
environment that we're in--and this is not new, it's just more 
exacerbated now over the last several years--there is certainly 
a great deal of mistrust between police and community.
    You're already dealing with that in the current context of 
where we are right now in this country, but as we continue to 
introduce new technology and ways of seeing things that are 
questionable, such as facial recognition, if we don't do it 
right and we end up with continued victims like Mr. Williams, 
then we continue to create that separation where legitimacy 
around policing continues to be questioned.
    This is the fight that I find myself fighting every day in 
the roles that I play now, is really trying to build those 
relationships, but it certainly does have its challenges 
associated with them.
    We've got to get the technology right, because if we don't, 
it's not going to help the cause in building those 
relationships. Because if we don't have that collaboration 
between police and community--and I don't care where it is--
we're going to continue to find ourselves here and continue, 
all of us, to continue to be at risk.
    Ms. McBath. Thank you so much and stay in the fight.
    Dr. Alexander. Yes, ma'am.
    Ms. McBath. Professor Laurin, your testimony, now it said 
the rules of discovery, the sharing of case information between 
the prosecution and the defense are restrictive. So, 
prosecutors don't actually have to mention what facial 
recognition technologies they might have used to investigate, 
and our courts don't get much of an opportunity to advance the 
law by sorting out what are acceptable and unacceptable uses of 
this technology.
    Could different kind of discovery rules help with this kind 
of problem?
    Ms. Laurin. Thank you so much for the question, 
Congresswoman.
    I think the short answer is yes. Just at the risk of going 
a little bit into the weeds, one thing to understand about 
criminal discovery in the United States, just like criminal law 
generally in the United States, is that there's tremendous 
diversity of systems, because we have at least 51 criminal 
justice systems in the United States, given the States and the 
Federal government.
    So, if you look across those systems, the Federal criminal 
discovery is really among the most restrictive. Rule 16 is very 
limited in terms of what information the government is required 
to disclose to the defense.
    There are a number of States that have, particularly in 
recent years, gone quite in the opposite direction, New York 
most recently, but my own State of Texas in passing the Michael 
Morton Act moved to essentially what's called an open file 
discovery system.
    Under that system, the defense typically would have access 
to essentially the entirety of the nonprivileged, anyway, 
investigative file, would know the steps that investigators 
took, regardless of whether the prosecution was going to 
present that evidence, and regardless of whether a court deemed 
it to be exculpatory or favorable to the defense.
    So, I actually think, in addition to some of the points 
that Mr. Tolman was just making about regulations that one 
would want to have in place about how evidence could be used, 
particularly in thinking about the Federal system, that 
thinking about opening discovery more with regard to facial 
recognition technology is really an essential part of achieving 
transparency and really vetting the reliability of this in the 
course of criminal adjudication.
    Ms. McBath. Thank you.
    In just the few moments I have left, Professor Friedman, 
should Congress regulate private and government-owned systems? 
I guess I know that the GAO report shows, as others have noted 
today, that facial recognition systems owned by the government, 
as well as those owned by private companies and used by 
government actors, aren't always regulated. Does the ownership 
of these systems play a role in how they are used?
    Mr. Friedman. Sure. Thank you for the question.
    I think it's essential that there be regulation. In fact, I 
think you're hearing all of us say that it's essential that 
there be regulation. The likeliest way for Congress to regulate 
is actually the vendors, the manufacturers of the products that 
are distributing on the interstate commerce. That's a 
constitutional hook to get at the regulation. So, I think it's 
essential.
    Now, you do want to regulate the use of those technologies 
by law enforcement agencies. There's been a lot of discussion 
about trust between communities and law enforcement, but you're 
just not going to see that until there's a serious set of 
regulations for how the technologies are used with very strong 
guardrails. Again, I could say more, but I see the time has 
expired.
    Ms. McBath. Well, thank you so much, each of you.
    I yield back the balance of my time, which there is none. 
Thank you.
    Ms. Jackson Lee. Your time has expired. Thank you so very 
much.
    I now recognize Mr. Tiffany for five minutes.
    Mr. Tiffany. Thank you very much, Madam Chair.
    Ms. Frederick, I want to just frame my question with the 
background that the Biden Administration is reported in the 
last couple of days to be coming out that they're going to 
enlist the wireless phone carriers to vet people's text 
messages.
    I mention that because some of my constituents believe we 
are a short leap to the Chinese Communist Party's use of 
technology in their country where they have created their 
social credit system.
    Are my constituents right to be concerned about where we're 
at here in the United States?
    Ms. Frederick. Well, I will say, as somebody who worked in 
the intelligence community for almost a decade, I think your 
constituents are right to be concerned about where these 
technologies can take us.
    What I think is particularly disconcerting is what you 
mentioned: SMS contents potentially being parsed. Email content 
has been mentioned as the next frontier. Podcasts have been 
mentioned as having a portion of unfettered conversations.
    These are bastions that shore up our culture of free 
speech. I definitely think it is of concern to see where 
authoritarian governments are taking us. Again, we have that 
system of guardrails.
    It is important, especially for conservatives and those 
expressing heterodox views, to be vigilant and to watch where 
this technology naturally goes and guard against it. I think 
we're all advocating here for certain guards in some respect.
    Mr. Tiffany. Dr. Alexander, you mentioned the State of 
Maine, in particular Portland, Maine. We also heard from Mr. 
Lee, commented communities have not been involved.
    Isn't there a role for city councils, State governments, 
those governments that are at the State and local level to get 
involved with this? Because they're responsible for their 
police forces in all instances, correct?
    Sir, I think you are muted.
    Dr. Alexander. Yes, thank you, Congressman.
    Yes, you are absolutely right, because there at the local 
level is where it all starts, in locally elected officials, 
mayors, city managers, county managers, whatever the case may 
happen to be. They certainly do need to have a voice in this.
    Because what you want to always make sure of is that your 
police agencies are operating at the highest level of training 
and certifications. We don't have that yet across the country, 
and that's why you find some communities that are backing off 
of this technology until more concrete findings are done.
    Mr. Tiffany. Yeah. Thank you very much for that.
    I am just thinking about local elected officials. I have 
many constituents, though I am in the Wisconsin, that work in 
the Twin Cities, in Minneapolis. I mean, we saw a real failure 
on the part of in particular the mayor of Minneapolis and the 
city council in addressing the riots that overwhelmed the city 
of Minneapolis last year. A lot of it is because they didn't 
take control of their police department. There were a few bad 
cops in that police department, and they needed to take care of 
them. They did not do it.
    This is the same thing. They need to address the 
technology. It isn't just the city of Minneapolis. It's cities 
across the country, as well as States. They have the purview, 
they have the obligation to oversee their police departments, 
and they should do that, including in regard to using this type 
of technology.
    Final thing that I would say before I am going to yield to 
Mr. Biggs.
    This is the example of the limits of technology. I think 
about simple things, like with children. They get better 
reading retention when they read off from paper versus off from 
an electronic format.
    I think about police the same way here. You cannot replace 
boots on the ground and good investigative work with technology 
in all instances.
    I yield to Mr. Biggs.
    Dr. Alexander. That's correct.
    Mr. Biggs. I thank the gentleman for yielding.
    My question for you, Mr. Tolman, is this. U.S. Capitol 
Police has been given Army surveillance equipment to start 
monitoring Americans. We have a report from a group called 
Surveillance Technology Oversight Project that says that they 
think that represents an expansion of police power and 
surveillance.
    Since I just have a few seconds, and then I am going to get 
to ask the question to you, they point out the expansive nature 
of the surveillance state.
    How do you see this role of actually using military 
surveillance, giving equipment and technology to law 
enforcement? How does that expand the surveillance State?
    Mr. Tolman. Well, in this particular instance it's of grave 
concern because right now the Capitol Police are exempt from 
the Freedom of Information Act. So, not only would it be 
difficult under rules of discovery for a defense counsel, you 
also would not be able to get transparency.
    So, it creates a very powerful surveillance State in an 
agency that probably is unaccustomed and does not have the 
history of being these surveillers of United States citizens. 
So, it's a recipe for disaster.
    Mr. Biggs. Thank you.
    My time has expired, Mr. Tiffany.
    Mr. Tiffany. Yes. I will yield back with just one final 
thing.
    I think we should put in the mix here what is happening 
with these text messages, that they're talking about enlisting 
the phone companies, the wireless carriers, to intercept 
people's SMS messages. I think that's a very chilling thing and 
that should not be done by the Biden Administration. We should 
throw that into the mix in dealing with this issue as we go 
forward.
    Thank you very much, Madam Chair.
    Ms. Jackson Lee. Well, thank you very much. I know that 
these issues reach beyond Administrations and deal with 
important questions that this Committee is now bringing 
forward. Thank you so very much.
    It's now my pleasure to yield to the gentlelady from 
Pennsylvania, Ms. Dean, for five minutes.
    Ms. Dean. Thank you, Madam Chair. Thank you for convening 
this hearing on innovative technology and its appropriate use 
and regulation under the law.
    I also thank all our testifiers and advocates today for the 
enlightening things that you're able to share with us.
    Mr. Lee, I would like to start with you. Companies like 
Clearview claim that through their database it can 
instantaneously identify people with, quote, ``unprecedented 
accuracy.'' Yet, Mr. Williams' story isn't unique. Such 
technology has resulted, as we know, in at least two false 
arrests in Michigan and one in New
Jersey.
    Organizations such as ACLU have expressed concern around 
the risk that Clearview's technology may have on undocumented 
immigrants, communities of color, survivors of domestic 
violence, and other vulnerable people and communities. Could 
you share with us some of the dangers of this face print and 
its misuse?
    Mr. Lee. Absolutely. Thank you for the question, 
Congresswoman.
    There are a number of civil rights and civil liberties 
concerns with private access to law enforcement investigations, 
and also with law enforcement's ability to purchase private 
data. There is little to no transparency, as we said before.
    I think a fundamental issue that we need to talk about with 
respect to this hearing is that these technologies are not 
accurate, and I think we are having a conversation as if they 
are. There are significant error rates across the spectrum, 
particularly for marginalized and multi-marginalized 
communities.
    So, with little oversight, with little guidance, with no 
community input, and no meaningful ways to engage in checks and 
balances, any conversation around the use of facial recognition 
technology within the criminal legal process is something that 
The Leadership Conference and many advocacy organizations push 
back against wholeheartedly.
    We should be thinking about a way in which to envision law 
enforcement and community engagement with law enforcement that 
does not require surveillance, near-constant surveillance, nor 
near-constant over policing. That is something that we need to 
push forward towards.
    Ms. Dean. Thank you so much for that. So, transparency, 
recognition that this is not a foolproof technology, and the 
error rate, there's an awful lot more we need to learn.
    Mr. Williams, of course we are all moved by your story. 
Your young daughters watched as you were arrested for a crime 
you did not commit.
    How did your arrest and detainment affect your children, 
family, and community?
    Mr. Williams. Thank you for the question.
    Well, it mainly affected my oldest daughter. My other 
daughter was only two, so she didn't know what was going on. 
She cried, too. She just was crying because her sister was 
crying.
    Like I said earlier, I was trying to tell the story. It was 
a short time. We thought about taking her to see a psychiatrist 
because every time she sees the story on TV it makes her very 
emotional and she starts crying, telling us to turn the TV off. 
I am like, ``I am not going to jail every time you see it on 
TV, baby. It's just they're reporting the story again.''
    I had wrote something down when somebody else had asked a 
question. The police actually did investigate it. With all 
those eyeballs on the pictures, nobody said, ``These two guys 
don't look alike.'' I don't know how the human part goes into 
it, but--
    Ms. Dean. Have you ever heard any explanation for that?
    Mr. Williams. No. Nobody told me anything.
    Then we basically had to--well, I can't say force them, but 
we asked them over and over and over again for a proper 
explanation or a way to say it with some type of--I don't know 
what the word for it is. They just didn't seem like they were 
sorry. Like, they didn't want to be apologetic about it. They 
were like, ``It happens. We made a mistake.'' I'm like, ``That 
doesn't do anything for me, because all I did was come home 
from work and got arrested.''
    Then, it's other stuff, like I didn't get dinner that 
night. They had already served food at the prison. So, I got 
off work, I got to go sit in a cell with no food and no 
drinking water. It was terrible for somebody that--I wasn't 
ready for it.
    Ms. Dean. I have to say, Mr. Williams, thank you for 
sharing that with us. I asked about your children, but we have 
to realize what impact this has had on you, a long-term impact 
on you, to be picked up after work and arrested and detained so 
inhumanely and so incorrectly.
    So, thank you, Madam Chair. I see my time--
    Mr. Williams. I didn't mean to cut you off. I don't know if 
you can hear that I got a small speech impediment right at this 
moment. Last year, I also suffered some strokes. They're 
looking into that to see what actually caused the strokes. I 
have no idea at this point. I had one doctor say it could have 
been attributed to stress or whatnot. I don't know if he has a 
way to prove it. I don't know. I did have strokes last year 
after.
    Ms. Dean. God bless you.
    Mr. Williams. Thank you.
    Ms. Dean. Please, I pray for your restored health.
    Thank you. I yield back.
    Ms. Jackson Lee. Thank you. The gentlelady's time has 
expired.
    Mr. Williams' story should inspire all us, among others, to 
finding a bipartisan pathway forward.
    Let me now yield five minutes to Mr. Massie.
    You're recognized, Mr. Massie, for five minutes.
    Mr. Massie. Thank you, Madam Chair, for holding this very 
important hearing on a very important topic.
    Before I jump into facial recognition, I want to talk about 
name recognition.
    Imagine a system, a computerized system, that instead of 
basing the adjudication of your civil rights on the distance 
between your eyes or the shape of your nose or where your ears 
are in relation to your cheekbones, it's based on the spelling 
of your first and last name and your birthday and whether you 
might share a same name with somebody who has been convicted of 
a felony or may be in prison today.
    Well, such a system exists, and there is no judge or jury 
involved. It's called the NICS Background Check System.
    The troubling thing about facial recognition as it exists 
today, is there may be racial disparities in it. We know that 
there are racial disparities in the NICS Instant Background 
Check System. That's the system that you have to go through to 
purchase a firearm.
    My friend John Lott, who worked at the DOJ recently, has 
found evidence that this system by a factor of three to one 
produces false denials for Black males compared to White males 
because they may share a similar name to somebody who is 
incarcerated within their ethnic group.
    So, Ms. Goodwin, I appreciate the good work that the GAO 
does. You all do great work. I served on the Oversight 
Committee. We covered this topic before there, the facial 
recognition topic.
    The GAO released a report that said there were 112,000 
denials based on the instant background check and only 12 
Federal prosecutions. Ostensibly, 100,000 people committed a 
crime by trying to buy a firearm when they were ineligible.
    What we know from that is actually that they didn't commit 
a crime, but there were a lot of false positives.
    I got the commitment of FBI Director Wray in one of our 
recent hearings to look into this issue about whether there are 
racial disparities built into the Instant Background Check 
System. This is not very technologically advanced at all. He 
said he would work on it.
    I hope that this comes across your desk and that you have 
an opportunity that you might look into it. Is this something 
that your part of the GAO would examine?
    Dr. Goodwin. Thank you, Congressman. Actually that was my 
report. I worked on that report.
    Mr. Massie. Excellent.
    Dr. Goodwin. The prohibited purchases report. So, we are 
happy to have a conversation with you about additional work we 
can do in that area.
    Mr. Massie. Wonderful. Thank you so much. Thank you for 
that report.
    They do collect race information on the Form 4473. So, if 
we could just take another level, it wouldn't have to be as 
deep and thorough as the report that you did already, but it 
would be very helpful, I think.
    I noticed that the beginning of this recent report on 
facial recognition--or your testimony--started with: We 
surveyed 42 Federal agencies with law enforcement employees or 
internal law enforcement. I thought, ``Whoa, whoa, whoa, stop 
right there.'' That to me is troubling, that we have 42 Federal 
agencies that have law enforcement.
    Maybe this is something the Democrats and Republicans could 
get together on. Maybe we have too many organizations. If the 
Department of Education has a police force, then maybe we have 
too many police forces at the Federal level.
    Moving on, I wanted to ask one of our professors here, 
maybe Professor Laurin, about Brady material, possible 
exculpatory evidence that's produced and that the prosecution 
may not want to share with the defense.
    You mentioned that the way they get around some of the 
typical safeguards is they say, ``Oh, we just used it to 
identify the suspect and we're not using it for the 
conviction.''
    Can you talk about how that's problematic and whether 
defense attorneys can get Brady material from facial 
recognition companies?
    Ms. Laurin. Thank you for the question.
    I'll start with the last piece that you mentioned, which is 
getting Brady material from companies. In other words, there 
may be instances, assuming a world that we don't quite live in 
yet, where facial recognition matches, say, are actually 
introduced in evidence in a criminal case.
    In those instances, the defense would certainly want to, 
and a competent defense would, sort of interrogate: What's 
inside the Black box that generated this match?
    Well, one of the complicating features here is the 
proprietary nature of many of these technologies.
    So, we haven't seen this yet so much in the facial 
recognition context, but we have seen it actually in the 
context of DNA with certain more novel analyses of low copy 
DNA, et cetera, where the defense has been denied the ability 
to access the source code, access what is claimed by companies 
to be proprietary information protected by trade secrets. That 
intellectual property law has been invoked to quash defense 
subpoenas.
    So, one of the many complicating features here, one of the 
many limits on the ability of the defense to vet this in a 
criminal context, is this proprietary dynamic.
    Layered on top of that are a number of complications that 
derive from Brady doctrine and disagreement over the scope of 
Brady and deciding when defendants are entitled to what we 
would call Brady material ex ante versus after the fact.
    I know you have limited time. I am happy to pause there.
    Mr. Massie. I see my time has expired. I just want to say, 
I hope Congress will weigh in on this instead of, like the 
Brady doctrine, allowing the courts to write this for us, 
because it does seem to be a problem not just within facial 
recognition, or yet, but with--across all the domains.
    I yield back.
    Ms. Jackson Lee. Thank you very much.
    I now recognize the gentleman who has dealt with some of 
these tech issues from a different perspective, the gentleman 
from Rhode Island, Mr. Cicilline.
    You are recognized for five minutes.
    Mr. Cicilline. Thank you, Madam Chair, for calling this 
very important hearing.
    Thank you to the Witnesses for your really compelling 
testimony.
    I want to begin with you, Mr. Williams. I am just wondering 
whether--first, thank you for being here. Sometimes we talk 
about these issues in terms of public policy, and it's good to 
be reminded this actually impacts people's lives in a very 
profound way, and your story is one example of that.
    My question to you is, did it seem like the police had even 
bothered to conduct an actual investigation, or did they appear 
to be relying solely on what the computer told them in 
connection with your arrest?
    Mr. Williams. Okay. Thank you for the question.
    So, they did provide information. Like, I didn't find out 
any of this until after we went to court. They had a lineup of, 
I guess, six guys, and none of us looked alike. I was like, 
``Who put this together?'' I am going to guess, because I 
wasn't there, but somebody picked me and said, ``This is the 
guy right here. Do you see him in this lineup?''
    I looked nothing like the other guys. I was actually 10 
years older than all the other guys in the pictures. They 
didn't use my current driver's license photo, they used one 
from six years ago. Well, at this point it's eight years ago.
    Mr. Cicilline. Thank you.
    Director Goodwin, are there specific privacy concerns 
related to potential data breaches of facial recognition 
systems that are more concerning than breaches of other systems 
used by the Federal government?
    Dr. Goodwin. Yes, Congressman. Thank you for that question.
    So, yes, there are. So, as I mentioned earlier, your face 
is permanent, it's distinctive. If that information is 
breached, it's a lot more problematic and more concerning than 
if your password is breached, because you can change your 
password, but you can't change, for the most part, you can't 
change your face.
    So, there are major privacy concerns about what it looks 
like when the technology--when that information is breached.
    Mr. Cicilline. Thank you.
    Mr. Lee, facial recognition systems have varying levels of 
accuracy, but we know they're especially ineffective in 
identifying people accurately from communities of color and 
women.
    I am just wondering, is there even any training that might 
help law enforcement to overcome these inaccuracies, or are 
they so embedded in the system that we should ban facial 
recognition outright?
    Mr. Lee. Thank you for the question, Congressman.
    I think that we need to think about, one, contexts of 
policing that don't involve surveillance and don't involve 
overly surveilling marginalized communities and over policing 
them as well.
    Also, I would highlight that one of the things that we need 
to think about is that the technology doesn't--if the 
technology can't--is not surveilling or if the technology is 
not working to the same communities that it is trying to help, 
is this technology worth using in the first place?
    So, that's something that we need to think about, 
particularly within the context of civil rights concerns. 
Because, again, Mr. Williams was wrongfully arrested, 
wrongfully detained for a crime that he did not commit based 
off of a lineup and setup and use of facial recognition 
technology that was not authorized by the local government, was 
not known by Mr. Williams himself, was not understood under any 
circumstances by the people using it, and had varying levels of 
accuracy across the board that were not given to Mr. Williams 
at the time or to the law enforcement--
    Mr. Cicilline. I hate to interrupt you. I want to get to 
one more question.
    Professor Laurin, building on what Mr. Massie was asking 
about, what Mr. Lee just mentioned, we could just simply 
require prosecutors to inform a defendant that facial 
recognition technology was used to identify them in any part of 
the process. That's one reform we could do.
    In addition, we could require that the prosecutors disclose 
any other matches that are sort of exculpatory, statutorily, as 
Mr. Massie was saying with respect to Brady. There are things 
we could do to at least make the process more transparent and 
hopefully give defendants the opportunity to challenge this 
evidence.
    Ms. Laurin. I agree. There certainly are other items that 
could be added to that. For example, to the extent the systems 
report out confidence levels, that is, something that seems 
necessary to disclose.
    I also think this issue of being able to get at that Black 
box and actually know what's driving the results, which does 
then involve these questions around intellectual property, is 
really an important issue to get at.
    Mr. Cicilline. I just want to say, because the Chair 
recognized this, part of this is facilitated by the monopoly 
power of these large technology platforms, the fact that they 
are unregulated, the fact that they are collecting enormous 
amounts of surveillance. As Ms. Frederick said, the danger of 
integrating all this presents real challenges.
    So, I think it also requires us to regulate how private 
companies can provide this technology to law enforcement, and 
larger questions about how we have to rein in Big Tech. I look 
forward to following up with Ms. Frederick in more detail on 
her very compelling testimony.
    I yield back. Thank you, Madam Chair, for the indulgence.
    Ms. Jackson Lee. The gentleman's time has expired. Thank 
you very much.
    We have other Members who have yet to ask their questions. 
So, we will certainly move so that you can be included.
    Now, I am recognizing Mr. Fitzgerald for five minutes.
    Mr. Fitzgerald.
    Mr. Fitzgerald. Thank you, Madam Chair. Thank you very 
much. Fascinating stuff this morning.
    I just wanted to move maybe in a direction of a little bit 
away from the law enforcement aspect to just the private 
sector, which some of the Members touched on this morning.
    As we see, iPhones that use face recognition. Any time you 
are going through the airport now, you see the Clear system, 
which is allowing access for individuals in airports on a 
regular basis. Then some employers, now that we're reading, are 
using this in relationship to their employees and under the 
auspices of security. So, there's a lot of different things 
happening in the private sector.
    The question I had for Ms. Frederick and Mr. Tolman, can 
you address some of the constitutional and privacy concerns of 
private sector use, particularly related to the difference 
between using facial recognition technology for verification 
purposes rather than just flat-out identification?
    Ms. Frederick. So, I will start here. I think generally, 
from a wave-top level, it's that outsourcing of surveillance 
that portends more concrete, potential concrete violations of 
the Fourth Amendment.
    I'm not a lawyer. I don't profess to be. When I see the 
surveillance State basically outsourcing this job to private 
companies that don't fall under the remit of the United States 
Constitution, that's what starts to worry me as a citizen, 
somebody who has been behind the powerful technologies that are 
employed at a commercial level also by the surveillance States 
overseas.
    To me, it's that outsourcing to private companies that we 
need to think about when it comes to the Constitution, because 
some reports indicate that is expressly made to skirt the 
constitutional protections.
    So, I'll let Mr. Tolman give his broader expertise on this.
    Mr. Tolman. Yeah, I think that last point is where I wanted 
to go, which is oftentimes the use of third-party providers is 
an end run around the protections. We have already seen one 
court has indicated that the Baltimore Police Department 
violated the Fourth Amendment in its use of facial recognition 
technology. That would be one pressure point to push you 
towards utilization of a third-party provider, one in which you 
have a somewhat lower standard when utilizing the evidence from 
them.
    We don't know the nature of some of those partnerships. 
That's the biggest problem right here, is the lack of 
transparency in the information. We don't know what it is, 
sometimes even the questions we need to ask to find out how 
pervasive that relationship is.
    Mr. Fitzgerald. Thank you. Great answers.
    The other question that I would have for both of you would 
be there's been some discussion about a full-out ban. What 
we're seeing across the Nation is some municipalities through 
ordinance have been progressively addressing this issue as it 
emerges, as part of something that's under their control, many 
times it is a municipal court system. They're moving forward, 
city councils and even some county boards are moving forward to 
full-out bans.
    Is that practical at this point? You can probably hear it 
in my voice. It doesn't seem like it is to me at this point.
    Mr. Tolman. I would just quickly say that it is hard to 
imagine that this powerful tool is not to be utilized at all. I 
understand the calls for bans, but that's what makes this so 
difficult, is there are circumstances in which this powerful 
technology may be the only thing that may save a victim or save 
a town from some horrific crime or from a terrorist attack, for 
example. So, getting this right in how we do authorize law 
enforcement to use it is right now the most important job I 
think this Congress has.
    Ms. Frederick. I prefer to think of it more--
    Mr. Fitzpatrick. Ms. Frederick, do you care to comment on 
that.
    Ms. Frederick. Yes, sir. I prefer to think of it more as a 
tactical pause, take the time to get it right. So, I think 
there are measures with which we could do so.
    Lawsuits are being fought in the courts as well. ACLU is 
suing Clearview AI. Its Illinois arm filed a 2018 lawsuit 
against Chicago Police. So, there's other mechanisms as well as 
municipalities to take that tactical pause, take the time to 
get it right.
    Mr. Fitzgerald. Thank you.
    Thank you, Madam Chair. I yield back. Thank you.
    Ms. Jackson Lee. I thank the gentleman.
    I am now pleased to recognize the gentleman from 
California, a former AUSA, Mr. Lieu.
    You are recognized for five minutes.
    Mr. Lieu. Thank you, Chair Sheila Jackson Lee.
    I am super excited that you called this hearing. My office 
and I have been working on facial recognition for over two 
years. The reason it's taken so long is, as the Chair has aptly 
stated, it is a can of worms. When you look at this issue, it 
can get very complicated.
    We have worked on legislation that is essentially complete 
now. It basically requires a warrant for most cases of the use 
of facial recognition technology. It sets auditing and bias 
testing standards. It requires reporting. It prohibits facial 
recognition technology for most First Amendment-protected 
activity. There are a number of other provisions.
    We're going to circulate that legislation to the Members of 
this Committee, both Democrats and Republicans, for your review 
and comment and coauthorship.
    So, thank you, Chair Lee, for this hearing.
    My first question goes to Professor Laurin.
    We already know from this hearing, facial recognition 
technology implicates the First, Fourth, and Fifth Amendments. 
I want to talk about the Equal Protection Clause, because there 
is this disparity in accuracy in terms of the color of your 
skin. Would there also be an equal protection violation if we 
had government agencies deploy this knowing that similarly 
situated people are being treated differently in terms of 
surveillance?
    Ms. Laurin. So, thank you very much for the question.
    I think that in answering the question I want to sort of 
distinguish between constitutional doctrine as it stands and 
maybe a more intuitive sense of what is rightfully understood 
to be constitutionally problematic.
    So, I think the challenge with using the Equal Protection 
Clause and the doctrine around it to get at these issues of 
bias in facial recognition is that existing Supreme Court 
doctrine requires that essentially discriminatory intent be 
proved for State action to be deemed to be a violation of the 
Equal Protection Clause.
    That presents a hurdle when one is working with evidence of 
disparate impact. There's an evidentiary gap often in getting 
from the State realizes there's a disparity to the State wants 
to discriminate on the basis of race or on the basis of sex. 
That has been a real difficulty in using the courts to get at 
these problems of disparity.
    Now, that doesn't mean that it wouldn't be understandable, 
appropriate for Congress to say, ``We see a problem in these 
disparities,'' and want to legislate to require that 
disparities themselves be sufficient basis for limiting the use 
of technology, et cetera. Of course, one needs to think about 
what the hook would be for that in terms of section 5 or the 
Commerce Clause or something along those lines.
    I think that existing doctrine presents a challenge because 
of this requirement of discriminatory intent.
    Mr. Lieu. Thank you. I appreciate that.
    My next question goes to Professor Friedman.
    First, let me take a step back and say I do agree with Dr. 
Alexander and Mr. Tolman that we can't just ban this 
technology. If you look at the course of human history, nearly 
impossible to ban technology. We can regulate it.
    Professor Friedman, because there's so many different 
iterations and possible uses and situations for facial 
recognition technology, what do you think about imposing a 
warrant requirement that can actually take into account all the 
different particularized factors before deciding whether any 
specific case we would use facial recognition technology?
    Mr. Friedman. Thank you for the question.
    I just want to make a comment about the ban versus 
moratorium or strategic pause point, which I think everybody is 
recognizing. We got the cart ahead of the horse here, and 
that's the problem. So, regulation is the only shot we have of 
getting the horse back out in front.
    Warrant requirements are useful in some circumstances and 
not others, and it depends on the use case for the technology. 
So, for example, if the use case is identification, which is 
what most agencies are doing right now, they take a probe photo 
and they compare it to a database, a warrant requirement makes 
sense, but only to get permission to use the technology because 
there is a reasonable belief that the photo that you have got 
is somebody that is suspected of a crime that is permitted 
under the statute. I, for one, would only allow the use facial 
recognition for very serious crimes.
    So, that is a use of the warrant that makes sense. The 
warrant can't tell you, for example, that a particular 
individual committed the crime. It's just to use the 
technology.
    Mr. Lieu. Great. Thank you.
    I see my time is up, and I yield back.
    Ms. Jackson Lee. The gentleman's time has expired.
    I think we have present with us the distinguished gentleman 
from Utah.
    Mr. Burgess Owens is recognized for two minutes.
    Mr. Owens. Thank you, Chair Jackson Lee and Ranking Member 
Biggs. Thank you for holding this very important hearing today.
    The concerns surrounding facial recognition technology 
demand a bipartisan solution, and I look forward to working 
with my colleagues to address these concerns.
    I'd like to thank the Witnesses today.
    Mr. Williams, we all agree on the unfairness of what you 
and your family have had to endure. Thank you so much for your 
decision to stay in the fight, to share your experiences, and 
do your best to make sure no other Americans have to experience 
what you and your family are going through.
    I'd also like to give a special welcome to my friend and 
fellow Utahn, Mr. Brett Tolman. Along with other 
accomplishments mentioned previously, Brett has also been a 
very strong advocate of criminal justice reform.
    I am finding the upside of being toward the end of the 
questioning is the remarkable education that a person received 
over the last couple of hours. The downside is the questions I 
had coming into the session have been answered. So, I am going 
to yield the remainder of my time back to Ranking Member Biggs.
    Before I do, I would like to highlight something that was 
mentioned previously, and that is the heartwarming bipartisan 
concern that I am feeling and seeing as we go through this 
topic.
    I'd also like to point out that as we do confront solutions 
of solving the increased crime that we are seeing everywhere 
that we cannot rely on technology. Technology is our tool to 
help support human relationships, and all human relationships 
are based on respect.
    We cannot continue to demean and disrespect the good 
servants, the 90 percent of good men and women policing our 
communities and expect technology to make up for it.
    I am convinced that this Committee will find the solutions 
and address the concerns of facial recognition technology. 
Let's take the same passion to bring back the respect for our 
law enforcement officers, men and women, the same respect that 
I had when I experienced it as a child. Our communities will 
have the benefit of more confidence in our future innovations 
once we have that accomplished.
    With that, I would like to yield back my remaining time to 
Mr. Biggs.
    Mr. Biggs. I thank the gentleman for yielding, and Madam 
Chair.
    Just a couple of quick points. This has been a very 
enlightening hearing. We have not really addressed, in my 
opinion, enough the question on reasonable expectation of 
privacy that's been shattered by the data scraping that goes on 
by private concerns to develop these databases which are being 
referred to.
    I hope that we can expand our investigation into that, 
because I don't think the question has been settled through the 
courts yet either. If it has, I haven't seen it. We certainly 
need to get to that.
    I also want to comment on Mr. Lieu's legislation.
    I look forward to reading it, Mr. Lieu, and I appreciate 
your efforts on it. I hope that we can also integrate some of 
the work that was done in the previous iteration of the OGR 
Committee with Chair Cummings and working on that as well. So, 
I look forward to seeing it, Mr. Lieu, and hope that you can 
get that to us soon. I appreciate it.
    When I want to ask Mr. Tolman this question because it 
really gets to the heart of some of what we're talking about. 
This is just, as Mr. Lieu said, this is a can of worms when you 
open this thing up.
    When we start bringing together other technologies, we have 
discussed this SMS and email, and we have moved from just using 
this mere identification, which is in and of itself a problem, 
to the potential to track movement, this notion of science 
fiction-type, pre-crime, and stopping pre-crime.
    What's your take on that, in a very brief way, that we're 
on a spectrum that's going to keep moving to trying to stop 
people from committing crimes before a crime is ever committed 
using this technology and others combined?
    Mr. Tolman. Yeah, I share the concern. We have seen that 
when law enforcement is given additional technology their first 
instinct is to utilize it because they have a sincere and 
genuine desire to work on behalf of our communities and the 
victims of crime.
    However, they are often the last one to actually ask the 
question: Should we utilize this technology? That has to be the 
reasoned discussion of Members of the legislature that 
identifies this.
    We are already seeing that speech and social distancing 
under COVID and areas where we have not ever been, are now 
areas where we are focusing law enforcement technology and 
action. Because of that, we're on a slippery slope, and I would 
share your concern.
    Mr. Biggs. Thanks, Mr. Tolman. I hate to cut you off, but I 
want to finish from our side where we started with the Chair, 
and that's the testimony of Mr. Williams.
    So, Mr. Williams, in your testimony, your written 
testimony, you said, quote, ``I don't want anyone to walk away 
thinking that if only the technology was made more accurate its 
problems would be solved,'' close quote.
    Well, I think the issues which we have discussed today are 
not just issues with accuracy of the technology. I think we're 
talking about a total invasion of privacy, which is absolutely 
unacceptable, which at some point is going to give too much 
power to the state. We had actually seen it already in your 
case, Mr. Williams, and others that we have discussed today.
    So, I am looking forward to working with the Chair, Mr. 
Lieu, and others on this Committee who have a sincere interest 
in resolving this issue.
    Thank you, and I yield back.
    Ms. Jackson Lee. Thank you, Mr. Biggs. You are absolutely 
right, we have a sincere interest in resolving this issue, and 
there are layers of issues that we can respond to as a 
Committee in a bipartisan manner.
    Speaking of bipartisanship, I'm delighted to yield to the 
gentleman from California, a former sheriff, Mr. Correa, for 
five minutes.
    Mr. Correa. Thank you, Madam Chair, for holding this most 
important hearing.
    I want to thank all our Witnesses today for some very good 
testimony.
    Mr. Williams, I will also say, hearing your story, being in 
a cell for 30 hours, not knowing why or what you were accused 
of, is alarming, tragic at many levels.
    As we think about society today, when you give your 
fingerprints, you give permission to have your fingerprints 
taken. When you go in for a live scan, you say, ``Here are my 
prints, take the scan.'' When you give your DNA, you give it 
with permission, knowing that somebody is going to do a DNA 
scan on you.
    When it comes to facial recognition, no consent is 
necessary. In fact, most people don't even know that their 
information, their facial information, is being collected by 
somebody, a private entity out there.
    Mr. Bertram Lee mentioned that 133 million Americans--let 
me repeat that--133 million Americans have their facial 
recognition data in a database some way or another in this 
country. I presume that most, if not all, gave that information 
or gave their facial information without any consent or even 
them knowing that data was taken from them.
    So, Professor Laurin and Professor Friedman, today I have a 
question for you. We discussed the Constitution, our rights. In 
effect, are there any regulations, laws that govern to limit in 
any way possible a private entity's ability to collect facial 
recognition data? As a taxpayer, as a citizen, am I going to 
have to accept the fact that a private firm has my facial 
recognition data to sell or use as they wish? Question.
    Mr. Friedman. Professor Laurin, if I may?
    Ms. Laurin. Please go ahead.
    Mr. Friedman. I just want to appreciate the question at a 
couple of levels because I think it's so critically important.
    It's very easy to fall into the language of constitutional 
rights here, and the constitutional rights are, obviously, 
critically important. The interpretation of the Constitution by 
the courts often doesn't touch the things that we're talking 
about, in part because the technology is rushing ahead of the 
courts.
    So, you could not be more correct that there is an urgent 
need for regulation, for legislation, both as to private 
companies and as to law enforcement agencies.
    Then as to the private companies, I don't know of any 
congressional law that deals with this. Some States have 
stepped in, like Illinois in this space. For the most part 
there's a void. I think the need for legislation is essential.
    Mr. Correa. Ms. Laurin?
    Ms. Laurin. I can really only essentially echo what 
Professor Friedman said. He speaks my mind. I think that there 
is a tendency, actually, to overly prioritize the 
constitutional stuff and to miss the fact that actually the 
Constitution is a floor, not a ceiling, in terms of 
protections.
    In fact, often the constitutional protection is lower by 
design because the courts say really there are details for 
Congress to figure out, it's really not our institutional 
place.
    Mr. Correa. So, I will ask you again. Let me repeat my 
question.
    As a taxpayer, as a citizen, am I going to have to accept 
the fact that right now a private third party has my facial 
recognition information and they can do whatever they want to 
do, sell that information to whoever they want to sell it, 
without any repercussions, without any regulation, without 
violating any kind of a law? Is that the case? Is that the 
State of the laws in this country today?
    Ms. Laurin. The best answer I can give you, without 
presenting myself as an expert in every statutory scheme 
governing, for example, electronic communications, et cetera, 
is that I do think that there are at least some data privacy 
regimes, statutory data privacy regimes that would limit some 
activities.
    In the main, there is very little regulation currently. 
Certainly, the Constitution has virtually nothing to say about 
private nonstate action. That is why congressional action is so 
important.
    Mr. Correa. So, I'm left here with a very unsettling 
feeling that I essentially have no remedy for somebody using my 
facial data recognition information for whatever they want to 
do with it?
    Mr. Friedman. Well, you can create one. You have the--
Congress has the power to regulate products moving in 
interstate commerce. You could change that state-of-affairs if 
you chose to.
    Mr. Correa. Thank you, Madam Chair. Ran out of time. I 
yield.
    Ms. Jackson Lee. The individual ran out of time. Mr. 
Correa, we are all running out of time. This is such an 
important issue, and I'm very grateful for all my Members. I 
listened to--as I yield just for a moment, I listened to 
Professor Friedman over and over again about regulation, and 
that is what we are having this hearing for, because we are 
working on legislation, and I look forward to working with Mr. 
Biggs, but also with our colleagues who are, likewise, working 
on legislation to be able to address the question of a 
regulation or a construct that needs to address this question. 
So, thank you all very much.
    It's my pleasure to yield to the Vice Chair of this 
Subcommittee from Missouri, the gentlelady, Congresswoman Bush, 
for five minutes.
    Ms. Bush. Thank you, Chair, for convening this hearing.
    We as a Committee have spent significant time addressing 
the brazen and broad scope of power held by law enforcement 
agencies, from the FBI to various prosecutorial entities under 
the Department of Justice. Time and time again, we are met with 
concrete examples of unfettered law enforcement practices that 
not only undermine our fundamental civil liberties, but our 
basic principles of human dignity and security.
    When the ACLU conducted a study with Members of Congress 
and a mug shot database, 28 Members of Congress were falsely 
matched, including the late Congressman John Lewis, and this 
was Members of Congress, public officials. Imagine what it's 
like for everyday people across this country.
    What we do know of this technology is that the darker your 
skin tone, like mine, the more likely you will be falsely 
identified.
    During the Ferguson uprising following the murder of 
Michael Brown, Jr., I recall wearing bandanas as a response to 
the tear gas. It was during that time, though, that we realized 
that we needed to wear bandanas because of the likelihood of 
facial recognition technology being used against protest 
movements and surveilling each and every one of us involved in 
the Ferguson uprising.
    Director Goodwin, the GAO found that 14 agencies reportedly 
use facial recognition technology to support criminal 
investigations. Yes or no, was FRT used by the Federal law 
enforcement agencies during the protest following the torture 
and murder of George Floyd? It's just a yes or no.
    Dr. Goodwin. Yes.
    Ms. Bush. What are the collateral consequences of 
protestors being identified by these technologies?
    Dr. Goodwin. So, Congresswoman, we didn't go into that 
issue, but we raised the concerns about privacy and accuracy. 
So, of course, with accuracy, the concern is that people would 
be misidentified by the technology, and that could have adverse 
consequences. The privacy issue is that there are faces. A 
person's face could be out there and could be used a number of 
ways. So, our focus was to look at the lay of the land for how 
law enforcement were using the technology.
    Ms. Bush. Thank you.
    In 2016, it was reported that police used facial 
recognition to surveil crowds during the Freddie Gray protest 
to find people with outstanding warrants.
    Mr. Lee, are there any protections for protestors' 
identities when they are in public spaces but engaging in First 
Amendment-protected activity?
    Mr. Lee. Thank you for the question, Congresswoman.
    This would be, in our mind, a clear violation of the First 
Amendment. Also, there are serious privacy and civil liberties 
concerns with the use of facial recognition technology that we 
saw not only with the George Floyd protest, but also that we 
saw in LAPD using Ring cameras to surveil Black Lives Matter 
protestors. We have serious concerns, and there are serious 
civil rights and liberties concerns about the tracking of 
protestors, especially in these circumstances and especially 
when the technology is not well-used or very accurate.
    Ms. Bush. So, Mr. Lee, how does the use of the facial 
recognition technology used to identify protestors? How does 
that influence the public's perception or opinion of law 
enforcement, or even the high-tech policing?
    Mr. Lee. There's already a tenuous relationship between 
communities that are over-surveilled and over-policed in law 
enforcement. We have seen this time and time again over the 
past year, and that is something that I think Congress needs to 
keep in mind when looking about regulating facial recognition 
technology. These are tools used by law enforcement to 
disproportionately target Black and Brown people across the 
Nation, and we need to keep that in mind, and the civil rights 
and civil liberties of those people in mind, as we think about 
what the next steps are. That is why we, as a civil rights 
community, have asked for a ban or a moratorium and a tactical 
pause, to use the language of Ms. Frederick. We think that it 
chills protests. It chills speech. It misidentifies, and it 
does not work.
    Ms. Bush. Thank you so much, Mr. Lee.
    There are local solutions that are being [inaudible].
    Ms. Jackson Lee. Ms. Bush--are we able to get Ms. Bush 
connected?
    I think we may have lost Congresswoman Bush.
    Thank you very much, Congresswoman, for those insightful, 
very insightful points that you have raised. I have been 
impressed by the direction of our questions, the depth of our 
questions, and, certainly, the importance of our Witnesses.
    Mr. Biggs, I have questions that I'm going to utilize at 
this time, but I'm happy to yield to you have been certainly 
rewarded with time. I'm happy to yield with you before I make 
my final inquiries that I hope will be important to the record 
of this hearing and proceedings.
    Mr. Biggs?
    Mr. Biggs. Thank you, Madam Chair. I know you will be 
surprised to hear this, but even though I have many more 
questions and comments about this, I appreciate those that have 
participated; but I'm going to limit my comments right there 
and, again, thank the Members, the Witnesses, and yourself.
    Thank you, Madam Chair. I yield back.
    Ms. Jackson Lee. Thank you so very much and thank you to 
the Witnesses.
    I'm going to proceed with some questions that I did not 
proceed with. As I do so, I want to make sure that I thank all 
the Witnesses again. You will probably hear me thank the 
Witnesses more than once.
    I would like to submit into the record--and then I will 
begin just some questions that I think are very important. I 
want to submit into the record the GAO report that I believe 
was entitled, ``Facial Recognition Technology, Federal Law 
Enforcement Agencies Should Have Better Awareness of Systems 
Used by Employees.'' I think that is shocking, and I think that 
is also indicative of what I think Dr. Alexander made mention 
of in terms of the numbers of law enforcement and the 
utilization of this tool without training.
    [The information follows:]

                     MS. JACKSON LEE FOR THE RECORD

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

    Ms. Jackson Lee. I will, Mr. Lee, come to you for a broad 
question on civil rights. Let me ask, Dr. Alexander, 18,000 
police departments, one of the reasons why NOBLE has been so 
instrumental in helping us push for the George Floyd Justice in 
Policing Act to find regular order that will help police, not 
diminish police. How do you respond to the idea--also, to my 
Members believe that we should have a follow-up hearing with 
some of our Federal law enforcement representatives that were 
cited by this excellent report that Dr. Goodwin has put forth 
through the GAO, and that may be in a classified manner, but we 
will be presenting that in the future.
    Dr. Alexander, to note that it was determined through the 
Federal system that employees were using the system without 
their hierarchy knowing, what do you think that means for the 
18,000 police departments where this kind of technology comes 
into their possession and how that relates to your point about 
police community relationships, but also victims, and I'm going 
to call him a victim, and I hope that he can made whole, and 
that is Mr. Williams and his family.
    Dr. Alexander.
    Dr. Alexander. Thank you, Madam Chair.
    Yes, there's 18,000 police departments in this country, and 
oftentimes, we are finding more and more we have got 18,000 
different ways things are being done and how they are being 
responded to, how they are being disciplined, how they are 
being managed, how they are being trained, et cetera. When you 
have one organization--and all it takes is just one, quite 
frankly, in the eyes of the American public--they all see 
policing as the same. It doesn't make it right, but that is the 
reality of it. If I do something well as a police officer, we 
all jump up and down and say, hey. If I do something horrible, 
we all have to take that hit, too.
    To your question, I think it becomes important to recognize 
that appropriate training is going to have to be regulated at a 
Federal level, and there's going to have to be some Federal 
standard, I truly believe, the same way we do with fingerprint, 
and with DNA, and with other forensic methods that have proven 
to be of value in the law enforcement and public safety 
community, because the whole idea about this particular piece 
of technology is that it is very convoluted, and it's very 
complicated, and there are going to be cases that are going to 
go before the courts that we haven't even thought about yet as 
it relates to this technology, still yet to be examined.
    So, I just think, Congresswoman, that due diligence will be 
given to these local agencies across this country. If they have 
an opportunity to be well-trained, there would be the 
standardization to the utilization of this technology, there 
would be ongoing certification, there would be practices that 
are held to the highest standards of science. In addition to 
that, I think it becomes important as well that the public 
understands what facial recognition technology is, because we 
have talked about it a lot within the context of policing, and 
within Congress, but a lot of people in our communities across 
this country don't understand exactly what it means, and when 
they do hear about the cases, they hear about the cases that 
are often associated with that of Mr. Williams.
    Ms. Jackson Lee. I would think minimally that--and you 
don't have to answer. I'm just going to make this based upon 
your response--minimally, that police departments need to be 
aware of when the technology is in their possession, and 
whether or not it's a detective with sloppy work, that that is 
being used and the individuals given the authority to use it 
are well-trained and, as you indicated, certified.
    I would just like to place that on the record because the 
title of the GAO report said that our Federal agencies should 
have a better awareness of these systems being used by 
employees. That is actually shocking and amazing to me, and I 
think that can be parlayed out into the field with the other 
18,000 police departments.
    Thank you for your answer.
    I indicated that my concern has been the unreasonable 
search and seizure. I would like both Professor Laurin and, as 
well--and I think this was raised by one of our Members, but I 
would like maybe a succinct answer from both of you, Professor 
Laurin, with your academic background, you focused on criminal 
procedure. I don't think that we are without tools. We have the 
ability on the Judiciary Committee to look at the Federal 
criminal procedure as we do civil procedure.
    In any event, Mr. Tolman, as a former U.S. Attorney, in 
cases where an investigation results in prosecution, should 
prosecutors be required to inform the defendant that facial 
recognition technology was used to identify them in any part of 
the process? Obviously, it is investigative, and should 
alternate positive matches be required to be produced as that 
investigation goes forward and if it ultimately comes into 
court? I know that one of our Members mentioned the Brady 
defense, but how should we respond to that? Professor Laurin 
first, and then Mr. Tolman.
    Ms. Laurin. Thank you, Chair.
    My bottom line view is it's absolutely right, the 
observation that with respect to Federal criminal procedure 
there's some very direct interventions that Congress can make, 
and while I think the scope of Rule 16, frankly, is always sort 
of a fraught discussion, I think there are important limited 
interventions that can be made in terms of requiring access to 
facial recognition, to the fact of the search, to alternative 
matches, to information on confidence levels.
    Again, as I said, I think to the extent possible, it's 
really essential that the defense have access to the algorithms 
to what's inside the Black box to be able to evaluate the 
reliability of the technology. That could all be a matter of 
sort of statutory change without having to worry about the 
courts sorting through the morass of whether Brady Doctrine 
encompasses that.
    Ms. Jackson Lee. Thank you.
    Mr. Tolman. I would simply add that--
    Ms. Jackson Lee. Thank you. Yes, Mr. Tolman.
    Mr. Tolman. The Professor is absolutely correct. I agree 
with everything she just articulated.
    I would add that, unlike a lineup where a prosecutor is 
required to hand over alternative individuals that might have 
been placed in the lineup, and the rationale for the lineup 
that was utilized, facial recognition technology requires the 
prosecutor, and should require the prosecutor to turn over more 
if the intention is to give the defense counsel all that it 
needs to assess the reliability and the admissibility of such 
evidence.
    So, we are at a new ball game with this technology, and it 
will require more to be given from a prosecutor to a defense 
attorney under the strictures of our constitutional 
protections.
    Ms. Jackson Lee. I think that--
    Ms. Laurin. I would just add one additional note, which is 
that it is essential that timing be addressed because if 
defense counsel is only able to access this on the eve of 
trial, it does no good in terms of evaluating whether a guilty 
plea is an appropriate resolution. It does no good in terms of 
preparing to challenge the admissibility of evidence, and so 
early discovery is important as well.
    Ms. Jackson Lee. Early discovery and early knowledge? Is 
that my understanding?
    Ms. Laurin. [Nonverbal response.]
    Ms. Jackson Lee. Thank you so much.
    Professor Friedman, I just wanted to clarify, when you said 
``regulation,'' in what framework are you speaking?
    Mr. Friedman. So, you know, I think what's important here 
is comprehensive regulation. That does two general things: One 
is regulate the technology as best as we understand it right 
now, so we know that there are very serious problems with the 
technology, and they need to be addressed. Then the other is, 
there's a lot we need to learn.
    So, one of the things that Congress could do immediately is 
instruct some entity, like NIST, to actually test the 
technology under operational conditions, how law enforcement is 
using it, because there are very, very serious concerns about 
accuracy and bias, particularly racial bias, and we get numbers 
reported, but those numbers bear no reflection to how law 
enforcement is actually using the technology, and what little 
we know from this suggests that those error rates may be quite 
high as others have indicated.
    So, there's a list of things that I think we should--that 
Congress should have this study, and then I think there's a 
list of immediate safeguards we could put in place, for 
example, best-practice protocols about what it means to have a 
human in the loop that verifies what the algorithm does.
    Ms. Jackson Lee. Thank you very much.
    Ms. Frederick--and I'm going to finish with Mr. Lee. Ms. 
Frederick, you have been keen on the providers, or the tech 
entities that control this technology. From your perspective, 
what would be a regulation in that entity, in that arena?
    Ms. Frederick. Well, ma'am, as you know, I work for the 
Heritage Foundation, and we are about limited government, and 
we do not really support regulation when it comes to--or robust 
regulation when it comes to private entities. So, my solution 
would be, we have to encourage the programs within these 
companies, the leaders in these companies to institute those 
privacy-by-design mechanisms, so make sure in the design phase 
of creating an algorithm, you are in viewing it with privacy 
protections, and that is why I sort of harken to the 
differential privacy. We have methods of encryption that are 
getting more and more powerful and much better, decentralized 
models of data storage, studying models of machinery.
    So, if we as--or excuse me, if you as Congress can sort of 
impress upon these companies that they need to build in these 
data privacy protections, I think that is a good start point, 
short of actual regulation on private companies.
    Ms. Jackson Lee. Well, in the spirit of my Ranking Member, 
you know this is a bipartisan hearing, so we wanted to make 
sure we welcomed your thoughts as well, which will be very 
helpful to us.
    Let me have Mr. Williams, and then, Mr. Lee, you will have 
the last word. Mr. Williams, I think you have heard from all of 
us. We are appalled at what happened to you. We are stunned. We 
know that there are good intentions. I will be introducing some 
articles where this technology was used.
    We also heard from Dr. Alexander that there is good 
policing out there working to do positive things. In your 
instance, you fell into a very bad trap. What do you want to 
leave us with as a victim, not a victim in your own self, but a 
victim of this technology that should not have happened? Again, 
I started out by apologizing to you and your family, and we 
will be eagerly watching your lawsuit possibly pursued by the 
ACLU, if I'm correct. What do you want us to know as you end 
your testimony here?
    You need to unmute.
    Mr. Williams. Yes, I will unmute. Thank you.
    First, I just want to say thank you for inviting me to come 
tell my story, and I hope that going forward, we can figure out 
a way to, like they say, to regulate because it's already being 
used, so we need to get it fixed so that it works more in a way 
that it works properly, I should say, because in the current 
state, it's wrong. They get it wrong even when they get it 
right, and I don't know if that makes sense.
    Like, you don't have a--you really don't have a say--so, I 
mean, throughout the time that nobody has ever signed up to 
say, hey, use my picture from my ID in your mug book. Like I 
said a little earlier, they didn't even use my current driver's 
license. They used one previous from when, I guess, I might 
have looked younger at that point. I don't know. I guess all 
your information is in the system, so it's available to be 
used.
    I was asked a few times about how my daughter took it, and 
I don't know what the lasting effect is on them. All I know is 
that at this point, she is upset. Every time we look at her, 
she--so, I don't know. I'm just--I hate to say I'm thankful 
that it was a felony larceny, but it just--what if the crime 
was capital murder or something like that, and then they just 
came to my house and arrested me? I don't know if I would have 
got a bond to get out and be free to even talk to you at this 
point, because the court system is backed up that they probably 
wouldn't have even got to me yet, and I would still be locked 
up for something I didn't do. All right.
    I kind of cringe to think about it like that, because I did 
not get to say anything. I didn't have any rights read to me or 
anything. They showed up, asked me my name, and I asked for a 
warrant. They didn't just bring one. I mean, I resisted arrest, 
but I was in my driveway, and I'm, like, ``what are you all 
doing?'' He was, like, ``Are you Robert Williams?" He was, 
like, ``all right you are under arrest.'' I'm, like, ``for 
what?'' He was, like, ``I can't tell you.'' I'm, like, you 
can't even tell me what I'm under arrest for, but I'm under 
arrest. They took me immediately down to the detention center, 
and I got fingerprinted and mug-shotted right then and there, 
and I assumed that my fingerprints were clear and I would be 
out, and that didn't mean anything. It didn't mean--if my 
fingerprints went in, so what?
    All right. Now, they have a mug shot to add to it. It was 
so backwards if that makes sense, right? I just feel like I'm 
just a regular, and I thought I knew the law, but I was wrong. 
So, hopefully we get some change.
    Ms. Jackson Lee. That is powerful, Mr. Williams. You are 
seeking--I would just simply say you are seeking justice.
    Mr. Williams. Yes, ma'am.
    Ms. Jackson Lee. You deserve it, you really do.
    Mr. Lee, you obviously represented sort of a family of 
those who have been fighting these issues dealing with justice 
in the criminal justice system, and, so, having listened to Mr. 
Williams and--I know the position of a ban and a moratorium, 
but just, if you would, end us on the enormity of the problem, 
if we are to have facial recognition used randomly in this 
massive hierarchy of 18,000 police departments without some 
intervention. It may be your form. It may not be. What can 
happen to persons who wind up like Mr. Williams?
    I'm shocked that you can't get a reason why you are being 
arrested. I thought you were entitled to that. To use facial 
recognition, the people that he was in the lineup with didn't 
look like him. It didn't look like him when they put the 
picture up to him. So, how is that just an undermining of civil 
rights and civil liberties in this country, if used as it 
seemed to have been used with Mr. Williams?
    Mr. Lee. Madam Chair--
    Ms. Jackson Lee. Please, Mr. Lee.
    Mr. Lee. Yes, Madam Chair. I think there's a broader 
context of policing that we must keep in mind that is 
especially instructive when talking about the case of Mr. 
Williams. Also, why so many--the Leadership Conference and its 
coalition organization support a ban, moratorium, and, Ms. 
Frederick, I'm going to steal tactical pause from you here on 
out in my policy conversations. It's because we can't separate 
the tools from the context of the American policing system. We 
have evidence of that system over decades, centuries of how 
Black bodies, and disproportionately Black people are treated 
within the criminal legal justice system.
    We can't separate those tools. Then if you add the whole 
history of that criminal legal system to a tool that is known 
in the best of conditions to be biased against Black people and 
against women and misclassifies or misidentifies Members of the 
LGBTQ-plus community, people with disabilities, right, so many 
Members of the community that we need to protect because of 
their protected status and the history of harms that have been 
done onto their bodies throughout centuries, then we need to 
have a real conversation about whether this technology actually 
works the way that we think it does. It doesn't. That has been 
proved time and time again, whether the National Institute of 
Standards and Technology, whether the research by Joy 
Buolamwini and Timnit Gebru, right, whether we are talking 
about the ACLU's use of the Amazon facial recognition 
technology to identify even Members of Congress who are public 
figures.
    It's being used in communities that are already over-
policed and over-surveilled, and it is the least likely to work 
for those exact same communities, and the community is not 
involved in the process. There's no public airing of widespread 
use. Congress has not been--Congress has not been asked, or 
Congress has not legislated on these topics.
    Then there's even opacity with which the tech is used that 
is compounded by the opacity of policing practices in general, 
and law enforcement agencies. We don't know still, even with 
the GAO report, how deep law enforcement is using facial 
recognition technology, nor the full extent of those 
technologies' writ large.
    Community engagement is not a check box. It's a continuous 
conversation, and no one knows. There are secret hearings. 
There are classified briefings. The public does not know the 
full extent of facial recognition technology, and really 
broadly, digital evidence capacity of law enforcement capacity 
writ large.
    We will continue to consistently call for a moratorium or 
ban on these technologies, and we would be more than happy to 
work with this Committee and Congress to imagine a solution and 
imagine a world of policing that puts the civil rights and 
civil liberties of marginalized communities first, not last, 
and not using these same communities as basically test cases 
when these technologies do not work.
    Ms. Jackson Lee. Okay. Thank you so very much. Thank you so 
very much.
    Thank you to each of the Witnesses for their profound 
testimony, and thank you to my Members, or the Members of this 
Committee. Mr. Biggs, as Ranking Member, thank you for joining 
me on, I believe, a hearing that will be very constructive 
going forward as evidenced by the questions and the responses 
and, of course, the expert Witnesses that we had today.
    So, this concludes today's hearing.
    Before I conclude, I would like to submit into the record: 
Center for Strategic and International Studies, ``The Problem 
of Bias in Facial Recognition,'' May 1, 2020; Washington Post 
article, ``How America's Surveillance Networks Helped the FBI 
Catch the Capitol Mob,'' and that is dated April 2, 2021; and 
then The Washington Post responding to the ``Federal study 
confirms Racial Bias of Many Facial-Recognition Systems, Casts 
Doubt on Their Expanded Use,'' December 19, 2021. I want those 
articles submitted into the record.
    [The information follows:]

                     MS. JACKSON LEE FOR THE RECORD

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

    Ms. Jackson Lee. I thank, again, our distinguished 
Witnesses for their attendance and participation. We thank all 
those who helped prepare this hearing.
    Without objection, all Members will have five legislative 
days to submit additional written questions for the Witnesses 
or additional materials for the record.
    Again, this hearing is adjourned.
    Thank you.
    [Whereupon, at 1:29 p.m., the Subcommittee was adjourned.]

                                APPENDIX

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

                        QUESTIONS FOR THE RECORD

=======================================================================

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

                                 [all]