[House Hearing, 116 Congress]
[From the U.S. Government Publishing Office]



 
                     THE FUTURE OF WORK: PROTECTING
                      WORKERS' CIVIL RIGHTS IN THE
                              DIGITAL AGE

=======================================================================

                                HEARING

                               before the

             SUBCOMMITTEE ON CIVIL RIGHTS AND HUMAN SERVICES

                                 of the

                         COMMITTEE ON EDUCATION
                               AND LABOR
                     U.S. HOUSE OF REPRESENTATIVES

                     ONE HUNDRED SIXTEENTH CONGRESS

                             SECOND SESSION

                               __________

            HEARING HELD IN WASHINGTON, DC, FEBRUARY 5, 2020

                               __________

                           Serial No. 116-50

                               __________

      Printed for the use of the Committee on Education and Labor
      
      
      
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]      




     Available via https://edlabor.house.gov or www.govinfo.gov
    
    
                              ______                       


             U.S. GOVERNMENT PUBLISHING OFFICE 
39-731 PDF             WASHINGTON : 2021     
    
    
    
                    COMMITTEE ON EDUCATION AND LABOR

             ROBERT C. ``BOBBY'' SCOTT, Virginia, Chairman

Susan A. Davis, California           Virginia Foxx, North Carolina,
Raul M. Grijalva, Arizona            Ranking Member
Joe Courtney, Connecticut            David P. Roe, Tennessee
Marcia L. Fudge, Ohio                Glenn Thompson, Pennsylvania
Gregorio Kilili Camacho Sablan,      Tim Walberg, Michigan
  Northern Mariana Islands           Brett Guthrie, Kentucky
Frederica S. Wilson, Florida         Bradley Byrne, Alabama
Suzanne Bonamici, Oregon             Glenn Grothman, Wisconsin
Mark Takano, California              Elise M. Stefanik, New York
Alma S. Adams, North Carolina        Rick W. Allen, Georgia
Mark DeSaulnier, California          Lloyd Smucker, Pennsylvania
Donald Norcross, New Jersey          Jim Banks, Indiana
Pramila Jayapal, Washington          Mark Walker, North Carolina
Joseph D. Morelle, New York          James Comer, Kentucky
Susan Wild, Pennsylvania             Ben Cline, Virginia
Josh Harder, California              Russ Fulcher, Idaho
Lucy McBath, Georgia                 Steve Watkins, Kansas
Kim Schrier, Washington              Ron Wright, Texas
Lauren Underwood, Illinois           Daniel Meuser, Pennsylvania
Jahana Hayes, Connecticut            Dusty Johnson, South Dakota
Donna E. Shalala, Florida            Fred Keller, Pennsylvania
Andy Levin, Michigan*                Gregory F. Murphy, North Carolina
Ilhan Omar, Minnesota                Jefferson Van Drew, New Jersey
David J. Trone, Maryland
Haley M. Stevens, Michigan
Susie Lee, Nevada
Lori Trahan, Massachusetts
Joaquin Castro, Texas
* Vice-Chair

                   Veronique Pluviose, Staff Director
                 Brandon Renz, Minority Staff Director
                                 ------                                

            SUBCOMMITTEE ON CIVIL RIGHTS AND HUMAN SERVICES

                  SUZANNE BONAMICI, OREGON, Chairwoman

Raul M. Grijalva, Arizona            James Comer, Kentucky,
Marcia L. Fudge, Ohio                  Ranking Member
Kim Schrier, Washington              Glenn ``GT'' Thompson, 
Jahana Hayes, Connecticut                Pennsylvania
David Trone, Maryland                Elise M. Stefanik, New York
Susie Lee, Nevada                    Dusty Johnson, South Dakota


                            C O N T E N T S

                              ----------                              
                                                                   Page

Hearing held on February 5, 2020.................................     1

Statement of Members:
    Bonamici, Hon. Suzanne, Chairwoman, Subcommittee on Civil 
      Rights and Human Services..................................     1
        Prepared statement of....................................     4
    Comer, Hon. James, Ranking Member, Subcommittee on Civil 
      Rights and Human Services..................................     5
        Prepared statement of....................................     6

Statement of Witnesses:
    Ajunwa, Ms. Ifeoma, J.D., Ph.D., Assistant Professor of 
      Employment and Labor Law, Cornell University...............    13
        Prepared statement of....................................    15
    Lander, Ms. Esther G., J.D., Partner, Akin Gump Strauss Hauer 
      and Feld LLP...............................................    16
        Prepared statement of....................................    18
    Romer-Friedman, Mr. Peter, J.D., Principal and Head of the 
      Civil Rights and Class Actions Practice, Gupta Wessler PLLC    26
        Prepared statement of....................................    28
    Yang, Ms. Jenny R., Senior Fellow, Urban Institute...........     9
        Prepared statement of....................................    11

Additional Submissions:
    Chairwoman Bonamici:.........................................
        Letter dated February 3, 2020 from The Leadership 
          Conference on Civil and Human Rights...................    50
        Link: Help Wanted: An Examination of Hiring Algorithms, 
          Equity, and Bias.......................................    52
    Mr. Comer:...................................................
        Prepared statement from the HR Policy Association........    53
    Questions submitted for the record by:
        Chairwoman Bonamici



        Hayes, Hon. Jahana, a Representative in Congress from the 
          State of Connecticut...................................    60
    Responses submitted for the record by:
        Mr. Romer-Friedman.......................................    63
        Ms. Yang.................................................    69


                     THE FUTURE OF WORK: PROTECTING

                      WORKERS' CIVIL RIGHTS IN THE

                              DIGITAL AGE

                              ----------                              


                      Wednesday, February 5, 2020

                       House of Representatives,

            Subcommittee on Civil Rights and Human Services,

                    Committee on Education and Labor

                             Washington, DC

                              ----------                              

    The subcommittee met, pursuant to call, at 2:03 p.m., in 
Room 2175, Rayburn House Office Building, Hon. Suzanne Bonamici 
[chairwoman of the subcommittee] presiding.
    Present: Representatives Bonamici, Schrier, Lee, Comer, 
Stefanik, and Johnson.
    Also present: Representatives Scott, Foxx, Takano, and 
Blunt Rochester.
    Staff present: Tylease Alli, Chief Clerk; Ilana Brunner, 
General Counsel; Emma Eatman, Press Assistant; Eunice Ikene, 
Labor Policy Advisor; Stephanie Lalle, Deputy Communications 
Director; Andre Lindsay, Staff Assistant; Jaria Martin, Clerk/
Special Assistant to the Staff Director; Kevin McDermott, 
Senior Labor Policy Advisor; Richard Miller, Director of Labor 
Policy; Max Moore, Staff Assistant; Veronique Pluviose, Staff 
Director; Carolyn Ronis, Civil Rights Counsel; Banyon Vassar, 
Deputy Director of Information Technology; Katelyn Walker, 
Counsel; Rachel West, Senior Economic Policy Advisor; Gabriel 
Bisson, Minority Staff Assistant; Courtney Butcher, Minority 
Director of Member Services and Coalitions; Rob Green, Minority 
Director of Workforce Policy; Jeanne Kuehl, Minority 
Legislative Assistant; John Martin, Minority Workforce Policy 
Counsel; Hannah Matesic, Minority Director of Operations; 
Carlton Norwood, Minority Press Secretary; and Ben Ridder, 
Minority Professional Staff Member.
    Chairwoman BONAMICI. The Committee on Education and Labor 
will come to order. Welcome, everyone. I note that a quorum is 
present.
    The Committee is meeting today for a legislative hearing to 
hear testimony on ``The Future of Work: Protecting Workers' 
Civil Rights in the Digital Age.''
    I note for the Subcommittee that Congressman Mark Takano of 
California, Congresswoman Pramila Jayapal of Washington, 
Congresswoman Lori Trahan of Massachusetts, Congresswoman 
Yvette Clarke of New York, and Congresswoman Lisa Blunt 
Rochester of Delaware will be permitted to participate in 
today's hearing with the understanding that their questions 
will come only after all Members of this Subcommittee and then 
the Full Committee on both sides of the aisle who are present 
have had an opportunity to question the witnesses.
    I will now move to opening statements. Pursuant to 
Committee Rule 7(c), opening statements are limited to the 
Chair and the Ranking Member. This allows us to hear from our 
witnesses sooner and provides all Members with adequate time to 
ask questions. I now recognize myself for the purpose of an 
opening statement.
    Technology and automation have become entrenched in nearly 
every aspect of our society and culture. The intentions behind 
the use of technology may be noble, but our efforts to both 
assess and address the effects on our workplace have been 
inadequate.
    In recent years, employers have harnessed new digital tools 
like recruiting and hiring algorithms, computer-analyzed video 
interviews, and real-time tracking of their workers, to cut the 
cost of hiring and managing workers.
    This is our third hearing in our Future of Work series and 
today we will examine how the technologies that employers use 
for hiring and management may, intentionally or not, facilitate 
discrimination and undermine workers' civil rights. We will 
discuss how Congress, Federal agencies, and the business 
community can strengthen workplace protections to make sure 
workers are not left vulnerable to discriminatory practices.
    And to prevent discriminatory hiring, firing, and 
monitoring practices, we will investigate whether new 
technologies are designed to account for implicit and explicit 
bias and are used transparently.
    Proponents of new technologies assert that digital tools 
eliminate bias and discrimination by attempting to remove 
humans from the process. But technology is not developed or 
used in a vacuum.
    A growing body of evidence suggests that, left unchecked, 
digital tools can absorb and replicate systemic biases that are 
ingrained in the environment in which they are designed.
    For example, hiring algorithms often rely on correlations 
to make predictions about the capabilities of job candidates. 
Yet these tools can mistake correlation for causation and 
subsequently perpetuate harmful disparities.
    In 2017, an algorithm built by Amazon to hire engineers was 
scrapped after it was found to favor men over women by 
penalizing graduates of women's colleges. Because men hold the 
majority of engineering positions, the algorithm had presumed 
that being male was a key characteristic of successful 
engineers when in reality, being male does not cause one to be 
a successful engineer.
    New technologies that surveil and monitor workers can also 
exacerbate bias in the workplace. These tools may force workers 
to share their location, activities, and even private biometric 
information, sometimes without workers' knowledge or consent.
    The technologies also allow employers to access private 
information that could be used to discriminate against workers. 
For instance, through certain workplace wellness programs, an 
employer could learn of a disability, a health condition, or 
genetic condition that is otherwise protected by 
antidiscrimination law.
    Too often employers and technology vendors are not 
transparent about the design and use of digital tools, posing 
challenges for workers seeking redress for workplace 
discrimination.
    Simply put, without transparent and responsible design, 
digital tools can further perpetuate and even exacerbate long-
held biases that have led to workplace disparities, 
particularly for workers of color, women, individuals with 
disabilities, and older workers. 
Moreover, digital tools that are opaque in their design and 
operation cannot be held accountable. As traditional employment 
relationships shift dramatically in our modern economy, 
workers' antidiscrimination protections are also in jeopardy.
    As this Committee has established, new technologies have 
fundamentally restructured the workplace through the rise of 
gig and platform work.
    These platforms have provided workers with new 
opportunities, but many employers have also used new 
technologies to deny workers basic protections.
    For example, app-based companies frequently misclassify 
their employees as independent contractors, depriving them of 
protections and benefits such as minimum wage and overtime pay.
    Worker misclassification is not unique to app-based 
companies. Some app-based companies directly hire their 
employees, as we learned from a business leader in our first 
Future of Work hearing.
    Workers misclassified as independent contractors are also 
excluded from the majority of Federal workplace 
antidiscrimination laws, including protection under Title VII 
of the Civil Rights Act of 1964, the Americans with 
Disabilities Act, and the Age Discrimination in Employment Act.
    These gaps leave workers classified as independent 
contractors, whether misclassified or not, with few options to 
challenge discrimination.
    We have the responsibility on this Committee to work with 
Federal agencies and the business community to strengthen 
workplace protections in the face of changing technology. And 
this should include the right to be free from workplace 
discrimination and the right to be hired based on 
qualifications rather than age, identity, or zip code.
    We must compel employers and technology vendors to be 
transparent and accountable for new workplace technologies. We 
must invest in our key defenses against employment 
discrimination and empower the Equal Employment Opportunity 
Commission to address emerging forms of digital discrimination 
and we must identify and close the gaps in our Nation's laws 
that leave workers vulnerable to misclassification, 
discrimination, and harassment on the job.
    I request unanimous consent to enter into the record a 
letter from The Leadership Conference on Civil and Human Rights 
and Upturn, and a recent report on hiring algorithms, equity, 
and bias from Upturn.
    Without objection, so ordered.
    I look forward to our discussion today, and I now yield to 
the Ranking Member, Mr. Comer, for an opening statement and I 
do want to note, I went long so if you want to take a little 
extra time, feel free.
    [The statement of Chairwoman Bonamici follows:]

 Prepared Statement of Hon. Suzanne Bonamici, Chairwoman, Subcommittee 
                   on Civil Rights and Human Services

    Technology and automation have become entrenched in nearly every 
aspect of our society and culture. The intentions behind the use of 
technology may be noble, but our efforts to both assess and address the 
effects on our workforce have been inadequate. In recent years, 
employers have harnessed new digital tools--like recruiting and hiring 
algorithms, computer-analyzed video interviews, and real-time tracking 
of their workers--to cut the cost of hiring and managing workers.
    This is our third hearing in our Future of Work series. Today we 
will examine how the technologies that employers use for hiring and 
management may, intentionally or not, facilitate discrimination and 
undermine workers' civil rights. We will discuss how Congress, federal 
agencies, and the business community can strengthen workplace 
protections to make sure workers are not left vulnerable to 
discriminatory practices. And, to prevent discriminatory hiring, 
firing, and monitoring practices, we will investigate whether new 
technologies are designed to account for implicit and explicit bias and 
are used transparently.
    Proponents of new technologies assert that digital tools eliminate 
bias and discrimination by attempting to remove humans from the 
processes. But technology is not developed or used in a vacuum. A 
growing body of evidence suggests that, left unchecked, digital tools 
can absorb and replicate systemic biases that are ingrained in the 
environment in which they are designed.
    For example, hiring algorithms often rely on correlations to make 
predictions about the capabilities of job candidates. Yet these tools 
can mistake correlation for causation and subsequently perpetuate 
harmful disparities. In 2017, an algorithm built by Amazon to hire 
engineers was scrapped after it was found to favor men over women by 
penalizing graduates of women's colleges. Because men hold the majority 
of engineering positions, the algorithm had presumed that being male 
was a key characteristic of successful engineers. In reality, being 
male does not cause one to be a successful engineer.
    New technologies that surveil and monitor workers can also 
exacerbate bias in the workplace. These tools may force workers to 
share their location, activities, and even private biometric 
information--sometimes without workers' knowledge or consent. The 
technologies also allow employers to access private information that 
could be used to discriminate against workers. For instance, through 
certain workplace wellness programs, an employer could learn of a 
disability, health condition, or genetic condition that is otherwise 
protected by anti-discrimination law.
    Too often employers and technology 
vendors are not transparent about the design and use of digital tools, 
posing challenges for workers seeking redress for workplace 
discrimination.
    Simply put, without transparent and responsible design, digital 
tools can further perpetuate and even exacerbate long-held biases that 
have led to workplace disparities, particularly for workers of color, 
women, individuals with disabilities, and older workers. Moreover, 
digital tools that are opaque in their design and operation cannot be 
held accountable.
    As traditional employment relationships shift dramatically in our 
modern economy, workers' antidiscrimination protections are also in 
jeopardy. As this Committee has established, new technologies have 
fundamentally restructured the workplace through the rise of ``gig'' 
and ``platform'' work. These platforms have provided workers with new 
opportunities, but many employers have also used new technologies to 
deny workers basic protections.
    For example, app-based companies frequently misclassify their 
employees as ``independent contractors,'' depriving them of protections 
and benefits such as minimum wage and overtime pay. Worker 
misclassification is not unique to app-based companies. Some app-based 
companies directly hire their employees, as we learned from a business 
leader in our first Future of Work hearing.
    Workers misclassified as independent contractors are also excluded 
from the majority of federal workplace antidiscrimination laws, 
including protections under Title VII of the Civil Rights Act of 1964, 
the Americans with Disabilities Act, and the Age Discrimination in 
Employment Act. These gaps leave workers classified as independent 
contractors--whether misclassified or not--with few options to 
challenge workplace discrimination.
    We have the responsibility on this Committee to work with federal 
agencies and the business community to strengthen workplace protections 
in the face of changing technology. And this should include the right 
to be free from workplace discrimination and the right to be hired 
based on qualifications rather than age, identity, or zip code.
    We must compel employers and technology vendors to be transparent 
and accountable for new workplace technologies. We must invest in our 
key defenses against employment discrimination, and empower the Equal 
Employment Opportunity Commission to address emerging forms of digital 
discrimination. And we must identify and close the gaps in our nation's 
laws that leave workers vulnerable to misclassification, 
discrimination, and harassment on the job.
    I request unanimous consent to enter a letter from The Leadership 
Conference on Civil and Human Rights and Upturn and a recent report on 
hiring algorithms, equity, and bias from Upturn into the record.
    I look forward to our discussion today, and I now yield to the 
Ranking Member, Mr. Comer, for an opening statement.
                                 ______
                                 
    Mr. COMER. All right. Well, thank you, Madam Chair, and 
today we are here to discuss how technological advancements are 
affecting workers.
    New technologies continue to increase efficiency, reduce 
costs for employers in recruiting and hiring and lead to 
quicker job placements and enhanced job opportunities.
    In a statement to this Committee, the HR Policy Association 
noted, quote, in a recent survey 71 percent of staffing firms 
believe artificial intelligence will eliminate human bias from 
the recruitment process, unquote.
    So not only can employers utilize new technologies to 
eliminate employment bias, but they can also be used to 
decrease time and the cost of doing business.
    Technology has also driven the sharing economy which has 
created substantial opportunities for workers and job creators 
who are seeking flexible workforce arrangements so they can 
better compete in our ever-changing economy.
    Workers are seeking out the benefits and flexibility these 
arrangements provide as they recognize how significantly they 
can improve their quality of life as well as their family's. 
This is a growing trend among American workers and job seekers 
that should be encouraged, not impeded.
    Many businesses who also value flexibility and productivity 
are turning to independent contractors. The use of independent 
contractors makes sense for job creators looking to obtain 
high-quality services, for workers who want to offer their 
skills on their own terms, and for consumers who benefit from a 
reduction in the cost of goods and services.
    Simply put, online platforms and other emerging 
technologies have given American workers more control, 
flexibility, and opportunity in the workplace than they have 
previously had. Regardless of technological advancements, every 
American should have the opportunity to achieve success in the 
workplace free from discrimination.
    This is why there are important protections built into 
Federal law to prevent workplace discrimination. These 
protections are broadly written and continue to apply to new 
and emerging technologies.
    These laws protect individuals from employment 
discrimination based on age, color, disability, genetic 
information, national origin, race, religion, or sex.
    Workers in the sharing economy are also protected. For 
example, the Fair Labor Standards Act has strong remedies in 
place for employers who incorrectly classify workers and 
violate minimum wage and overtime requirements.
    All workers should be paid in full for their work. That is 
why Committee Republicans support enforcement of the FLSA. We 
shouldn't penalize Americans who work for themselves or the 
companies that do business with them.
    Instead, we should applaud these Americans for their 
entrepreneurial spirit. Our Nation's laws were written so that 
they can be and are applied to employers' use of technologies 
in ways that protect workers.
    Additionally, it should go without saying that the 
overwhelming majority of businesses follow the law and want to 
do what is expected of them. Bottom line, workers, job 
creators, and the U.S. economy are all benefitting from today's 
technological advancements.
    Madam Chair, before we hear from our witnesses, I need to 
take a moment to point out the hypocrisy of today's hearing. My 
Democrat colleagues want to talk about protecting workers' 
rights while they simultaneously push radical legislation that 
will undermine the rights of workers.
    H.R. 2474, the PRO Act, which we expect will be on the 
House floor for a vote tomorrow, is written to bail out the 
failing labor union business model that is being widely 
rejected by American workers.
    This radical legislation would penalize entrepreneurship 
by creating an expansive, one-size-fits-all definition of an 
employee, which will increase costs for business owners as well 
as consumers while limiting worker opportunities for 
individuals who desire flexibility.
    Instead, we should champion reforms that expand 
opportunities for flexibility, innovation, and entrepreneurship 
to give workers and job seekers opportunities to compete 
successfully in the modern economy.
    I thank the witnesses for being here today and I look 
forward to their testimony and, Madam Chair, I yield back.
    [The statement of Mr. Comer follows:]

Prepared Statement of Hon. James Comer, Ranking Member, Subcommittee on 
                    Civil Rights and Human Services

    ``Today, we are here to discuss how technological advancements are 
impacting workers.
    New technologies continue to increase efficiency, reduce costs for 
employers in recruiting and hiring, and lead to quicker job placements 
and enhanced job opportunities. In a statement to this Committee, the 
HR Policy Association noted: `In a recent survey, 71 percent of 
staffing firms believe artificial intelligence will eliminate human 
bias from the recruitment process.' So, not only can employers utilize 
new technologies to eliminate employment bias, but they can also be 
used to decrease the time and cost of doing business.
    Technology has also driven the sharing economy, which has created 
substantial opportunities for workers and job creators who are seeking 
flexible workforce arrangements so they can better compete in our ever-
changing economy. Workers are seeking out the benefits and flexibility 
these arrangements provide as they recognize how significantly they can 
improve their quality of life, as well as their families'. This 
growing trend among American workers and jobseekers that should be 
encouraged, not impeded. Many businesses who also value flexibility and 
productivity are turning to independent contractors. The use of 
independent contractors makes sense for job creators looking to obtain 
high-quality services, for workers who want to offer their skills on 
their own terms, and for consumers who benefit from a reduction in the 
cost of goods and services.
    Simply put, online platforms and other emerging technologies have 
given American workers more control, flexibility, and opportunity in 
the workplace than they have previously had.
    Regardless of technological advancements, every American should 
have the opportunity to achieve success in the workplace free from 
discrimination. That is why there are important protections built into 
federal law to prevent workplace discrimination. These protections are 
broadly written and continue to apply to new and emerging technologies.
    These laws protect individuals from employment discrimination based 
on age, color, disability, genetic information, national origin, race, 
religion, or sex.
    Workers in the sharing economy are also protected. For example, the 
Fair Labor Standards Act (FLSA) has strong remedies in place for 
employers who incorrectly classify workers and violate minimum wage and 
overtime requirements. All workers should be paid in full for their 
work. That is why Committee Republicans support enforcement of the 
FLSA. We shouldn't penalize Americans who work for themselves or the 
companies that do business with them. Instead, we should applaud these 
Americans for their entrepreneurial spirit.
    Our nation's laws were written so that they can be, and are, 
applied to employers' use of technologies in ways that protect workers. 
Additionally, it should go without saying that the overwhelming 
majority of businesses follow the law and want to do what is expected 
of them. Bottom line, workers, job creators, and the U.S. economy are 
all benefiting from today's technological advancements.
    Madam Chair, before we hear from our witnesses, I need to take a 
moment to point out the hypocrisy of today's hearing. My Democrat 
colleagues want to talk about protecting workers' rights while they 
simultaneously push radical legislation that will undermine the rights 
of workers.
    H.R. 2474, the PRO Act, which we expect will be on the House floor 
for a vote tomorrow, is written to bail out the failing labor union 
business model that is being widely rejected by American workers. This 
radical legislation would penalize entrepreneurship by creating an 
expansive, one-size-fits-all definition of an employee, which will 
increase costs for business owners as well as consumers, while limiting 
work opportunities for individuals who desire flexibility.
    Instead, we should champion reforms that expand opportunities for 
flexibility, innovation, and entrepreneurship to give workers and job 
seekers opportunities to compete successfully in the modern economy.
    I thank the witnesses for being here and I look forward to their 
testimony.''
                                 ______
                                 
    Chairwoman BONAMICI. Thank you, Mr. Comer. I know we will 
be having the PRO Act debate on the floor as well as in this 
Committee but now we are going to focus on the topic at hand. 
Without objection, all other Members who wish to insert written 
statements into the record may do so by submitting them to the 
Committee Clerk electronically in Microsoft Word format by 5 
p.m. on Tuesday, February 18, 2020.
    I will now introduce our distinguished panel of witnesses 
and I will introduce each witness before we begin questions. 
First, Ms. Jenny Yang served as the Chair of the U.S. Equal 
Employment Opportunity Commission from September of 2014 to 
January of 2017, and as Vice Chair and a Member of the 
Commission from 2013 to 2018.
    Under her leadership, the Commission launched the Select 
Task Force on the Study of Harassment in the Workplace to 
identify innovative solutions to prevent harassment at work. 
And she led efforts to strengthen the EEOC's annual data 
collection to include employer reporting of pay data.
    Next, we have Dr. Ifeoma Ajunwa. She is an assistant 
professor of labor and employment law in the Law, Labor 
Relations, and History Department of Cornell University's 
Industrial and Labor Relations School and an associate faculty 
member at Cornell Law School.
She is also a faculty associate at the Berkman Klein Center 
at Harvard Law School and an affiliate of The Center for the 
Study of Inequality at Cornell University. She is a 2019 
recipient of the National Science Foundation CAREER Award and a 
2018 recipient of the Derrick A. Bell award from the 
Association of American Law Schools.
    Dr. Ajunwa's research interests are at the intersection of 
law and technology with a particular focus on the ethical 
governance of workplace technologies.
    And at the discretion of the Chair, I do want to mention 
that Derrick Bell was my law school dean when I went to law 
school at the University of Oregon, so it is an honor that you 
are here with that award, that distinguished award.
    Ms. Esther Lander is a partner at Akin Gump Strauss Hauer & 
Feld LLP, Washington, D.C., where she focuses on complex 
employment litigation, high-stakes internal and government 
investigations, and client counseling.
    She previously served as the Principal Deputy Chief of the 
Employment Litigation Section within the Civil Rights Division 
at the Department of Justice.
    Mr. Peter Romer-Friedman is a principal at Gupta Wessler 
PLLC in Washington, D.C., where he heads the firm's new civil 
rights and class actions practice.
    He maintains a dynamic and innovative civil rights docket 
with an emphasis on employment discrimination and benefits, 
fair housing, credit discrimination, and constitutional rights. 
The civil rights cases often arise at the cutting edge of the 
law and focus on solving both entrenched and emerging problems 
with novel approaches.
    We appreciate all of the witnesses for being here today and 
we look forward to your testimony. Let me remind the witnesses 
that we have read your written statements and they will appear 
in full in the hearing record.
    Pursuant to Committee Rule 7(d) and Committee practice, 
each of you is asked to limit your oral presentation to a 5 
minute summary of your written statement.
    Let me remind the witnesses as well that pursuant to Title 
18 of U.S. Code Section 1001, it is illegal to knowingly and 
willfully falsify any statement, representation, writing, 
document, or material fact presented to Congress or otherwise 
conceal or cover up a material fact.
    Before you begin your testimony, please remember to press 
the button on your microphone in front of you so it will turn 
on and the Members can hear you.
    As you begin to speak, the light in front of you will turn 
green. After 4 minutes, the light will turn yellow to signal 
that you have 1 minute remaining. When the light turns red, 
your 5 minutes have expired and we ask you to wrap up. We will 
let the entire panel make their presentations before we move to 
Member questions. When answering a question, again, please 
remember to turn your microphone on, and I first recognize Ms. 
Yang for your testimony.

    TESTIMONY OF JENNY R. YANG, J.D., SENIOR FELLOW, URBAN 
                           INSTITUTE

    Ms. YANG. Thank you. Chair Bonamici, Ranking Member Comer, 
and Members of the Subcommittee, thank you for inviting me here 
today. I am a Fellow at the Urban Institute, but the views 
expressed are my own and shouldn't be attributed to Urban, its 
trustees or funders.
    I would like to start by sharing a story of Kyle Behm, a 
bright college engineering student who applied for an hourly 
job at Kroger. He had held similar positions in the past yet 
after taking a personality assessment, he was scored red and 
rejected.
    Kyle had earlier been diagnosed with bipolar disorder, so 
personality questions such as whether he experienced mood 
changes led many major retailers to reject him.
    Sadly, Kyle is no longer with us today, but his father 
Roland continues to advocate to ensure people with disabilities 
are not systematically excluded by hiring assessments.
    A new generation of AI-driven screens are transforming the 
lives of America's workers with profound implications for civil 
rights. To ensure an equitable future, we must ask the question 
who is at risk of being screened out. Otherwise, workers who 
fall outside of a set profile could be unemployable for reasons 
that aren't truly job related.
    Today, I will focus on two areas. First, I will discuss 
algorithmic hiring and discrimination. Second, I'll address new 
tech-driven civil rights concerns for workers on the job.
    Let's take a look at the stages of the hiring process 
through this hiring funnel. In the sourcing stage, employers 
recruit applicants. In the screening phase, employers assess 
applicants' abilities. In the interviewing stage, many 
employers now use video interviews to evaluate candidates. 
Finally, employers select candidates and set pay. In each 
stage, complex algorithms inform decisions.
    Today, I will focus on screening algorithms. Because of the 
dramatic rise in online applicants, employers are using chat 
bots, resume screens, online assessments, and web games to 
automate decisions.
    Some employers are seeking to increase diversity by 
measuring abilities rather than relying on proxies such as 
elite university degrees.
    Yet many employers simply attempt to automate their past 
hiring decisions which may reflect bias. Algorithmic systems 
can then replicate existing inequities on a massive scale. Bias 
can enter systems in several ways. First, bias may be 
introduced in the data used to train algorithms. 
Amazon's effort to build a resume screen highlights this 
challenge.
The computer model was trained on resumes submitted over 10 
years, which were mostly from men. The model then learned to 
prefer men and to penalize resumes containing words such as 
``women's chess club'' or the names of all-women's colleges.
    Second, bias may arise from the variables considered. 
Models may learn to utilize proxies for protected 
characteristics. For example, zip codes can be a proxy for 
race. The selection of variables can reflect the blind spots 
of developers, a particularly acute concern given the lack of 
diversity in the field.
    Finally, humans may misuse the predictions and place undue 
weight on them. To ensure safeguards, I share three strategies 
for consideration.
    First, an update to the Uniform Guidelines on Employee 
Selection Procedures of 1978 would incorporate the latest 
scientific understanding into unified government principles. 
Second, a third-party auditing system would promote 
accountability while having flexibility to evolve with 
technology and protect intellectual property.
    Third, a workers' bill of rights for algorithmic decisions 
would ensure that individuals understand how decisions are made 
and have a process to challenge them.
    Next, I'd like to turn to new tech-driven civil rights 
concerns for workers on the job. One significant concern is 
that increased surveillance and tracking of workers' 
interactions throughout the day may deter workers from coming 
together to raise civil rights concerns for fear of 
retaliation.
    Another concern is that a growing reliance on customer 
ratings by tech platforms and automated performance systems can 
introduce harmful and unchecked bias.
    Finally, online platforms have disrupted traditional 
employment relationships, classifying many workers as 
independent contractors.
    As non-employees, they aren't protected by most Federal 
antidiscrimination laws. Although Section 1981 prohibits 
intentional discrimination in contracting based on race and 
ethnicity, it doesn't prohibit other forms of discrimination 
such as sexual harassment.
    States are filling these gaps by providing protections for 
independent contractors and making it more difficult to 
misclassify workers.
    To ensure a future that advances equal opportunity, we need 
safeguards that create meaningful accountability. The focus 
cannot remain solely on optimizing processes for employers but must 
also consider the impact on workers' dignity and civil rights. 
Thank you. I look forward to your questions.
    [The statement of Ms. Yang follows:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
       
    Chairwoman BONAMICI. Thank you for your testimony. Dr. 
Ajunwa.

TESTIMONY OF IFEOMA AJUNWA, J.D., PH.D., ASSISTANT PROFESSOR OF 
          EMPLOYMENT AND LABOR LAW, CORNELL UNIVERSITY

    Ms. AJUNWA. Chair Bonamici, Ranking Member Comer, and 
members of the subcommittee, thank you for the opportunity to 
testify today.
    I am a labor and employment law professor at Cornell 
University and I have been asked to testify today on two 
topics.
    The first, employment discrimination and privacy concerns 
arising from automated hiring including automated video 
interviewing. And the second, privacy and discrimination 
concerns related to the use of workplace wellness programs and 
electronic workplace surveillance.
    These technological advancements and the potential for 
employment discrimination beg for updates to labor and 
employment law.
    I identify three major problems with automated hiring. The 
first, the design features of automated hiring platforms may 
enable employers to eliminate applicants from protected 
categories without retaining a record.
    Second, intellectual property law which protects automated 
hiring from scrutiny could allow discriminatory practices to go 
undetected.
    And third, the unrestricted portability of applicant data 
from automated hiring systems increases the chances of repeated 
employment discrimination, resulting in algorithmic 
blackballing.
    Automated video interviews are the newest trend in 
automated hiring. With this new technology, candidates' 
responses are captured on video and then evaluated based on 
word choice, speech patterns, and facial expressions.
    When video interviewing systems are trained on White male 
voices and faces, this disadvantages both racial minorities and 
White women whose facial expressions and tone of voice might be 
misinterpreted.
    Other issues associated with automated hiring include the 
unregulated collection of applicants' personal data and the 
black box nature of how such information is used.
    To date, there are no Federal regulations governing the 
collection, storage, or use of data from automated hiring. To 
remedy this, I propose three updates to labor and employment 
law.
    First is the addition of a third cause of action, the 
discrimination per se doctrine, to Title VII. Second, the 
requirements for audits and certification of automated hiring 
systems. And third, a mandate for data retention and record 
keeping design features for automated hiring systems.
    In addition to automated hiring, technology has advanced 
the capability of employers to monitor their workers through 
digital surveillance and also employee wellness programs. 
Beginning with punch card systems, advancing to GPS systems, 
and most recently microchips embedded under the skin, invasive 
workplace surveillance is now a part of life for most 
Americans.
    For example, workplace wellness programs have evolved to 
offer health risk assessments, and despite protections afforded 
by antidiscrimination laws, employers have started to offer 
genetic tests to employees.
    The introduction of genetic testing into workplace 
wellness programs contradicts both the letter and the spirit of 
the Genetic Information Nondiscrimination Act and the Americans 
with Disabilities Act.
    To protect the health privacy of workers, my coauthors and 
I have proposed two new laws. First, the Employee Privacy 
Protection Act, the EPPA, would ensure that employee monitoring 
is constrained to the workplace and actual job tasks.
    The EPPA would limit surveillance outside the workplace and 
would prohibit the monitoring of employees when they're off 
duty.
    Second, the Employee Health Information Privacy Act, the 
EHIPA, would clarify that health information generated from 
workplace wellness programs is protected information under 
existing antidiscrimination and health privacy laws.
    The EHIPA would also ensure that data collected from 
workers could not be sold without the employee's consent.
    For the future of work, the primary concern should be 
whether workers will enjoy equal opportunity for employment and 
also thrive in workplaces that respect human privacy.
    Governmental action is necessary to protect workers from 
being forced to trade their dignity in the employment 
bargain. I thank the Committee for the opportunity to 
testify today and I look forward to your questions.
    [The statement of Ms. Ajunwa follows:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
    
    Chairwoman BONAMICI. Thank you for your testimony. I now 
recognize Ms. Lander for 5 minutes for your testimony.

TESTIMONY OF ESTHER G. LANDER, J.D., PARTNER, AKIN GUMP STRAUSS 
                        HAUER & FELD LLP

    Ms. LANDER. Thank you, Chair Bonamici, Ranking Member 
Comer, and Members of the Subcommittee for allowing me to 
appear before you today.
I am a partner at the law firm Akin Gump in the firm's 
labor and employment group here in Washington, D.C. I 
previously served as the Principal Deputy Chief in the 
employment litigation section of the Department of Justice 
Civil Rights Division and I am appearing here today in my 
personal capacity.
    In my written testimony, I describe the many benefits 
associated with using technology in employee selection 
procedures.
    If used correctly, the business case is clear. Employers 
are able to harness the power of available data to efficiently 
make sound selection decisions, reduce manual labor, subject 
candidates to the same objective screening criteria, and 
eliminate the potential for implicit bias that exists with 
subjective decision making.
    With so many technology-based tools on the market however, 
concerns have been raised that AI screening is resulting in 
unlawful discrimination.
    To date, there have been few lawsuits challenging AI tools, 
and there are no published studies showing that technology-based 
selections are more likely to result in discrimination than 
more traditional paper-and-pencil tests.
    With that said, when employers implement technology to make 
selection decisions, it is important to understand the laws 
that already exist to protect applicants and candidates from 
unlawful discrimination.
    Specifically, Congress passed the Civil Rights Act of 1991 
which amended Title VII to make disparate impact discrimination 
an unlawful employment practice.
    Under the 1991 act, any selection procedure that adversely 
impacts protected groups must be justified by the employer as 
job-related and consistent with business necessity.
    To make this showing, employers must document a strong 
connection between the selection procedure and the job in 
question which typically involves a process called testing 
validation.
    Courts assess the adequacy of an employer's validation 
efforts under the Uniform Guidelines on Employee Selection 
Procedures which were adopted by the EEOC and other government 
agencies to assess the lawfulness of selection procedures under 
Title VII.
    Although the guidelines were established in 1978, which 
admittedly was a long time ago, they are well equipped to 
address the concerns expressed by other witnesses today about 
AI tools resulting in hiring decisions based on non-job-related 
correlations that screen out protected groups.
    First, the guidelines anticipate developments in hiring 
techniques and tools and make clear that all selection 
procedures need to be reviewed in light of current 
understandings which in itself is a basis to reject validation 
studies premised on non-job-related correlations.
    Second, the guidelines direct enforcement agencies to 
consider whether the selection procedure was carefully 
developed and is being used in accordance with professional 
standards. This concept is commonly referred to as competent 
test design.
    So for example if an AI tool has machine learned to 
disproportionately screen out applicants from a protected group 
because they do not share the same zip code as successful 
incumbents, an employer would not be able to show competent 
test design even if a strong correlation exists between 
performing successfully on the job and one's zip code. Third, the 
guidelines require that all validation studies include a 
complete and explicit description of the selection procedure 
that includes any measures that are being used. This written 
transparency requirement means that vendors cannot hide behind 
the so-called black box.
    A proper validation study that complies with the guidelines 
must explain what the selection procedure is measuring and then 
correlate those measures with successful job performance, 
reduced turnover, or other important job-related behaviors.
    And finally, regardless of how a selection procedure is 
validated, the guidelines require an investigation into 
fairness.
    This investigation could include taking a deeper look at 
the selection procedure to see which items--or, in the case of 
AI tools, which screening criteria--are causing adverse impact 
and to consider removing those criteria and making other 
modifications that will result in a fair selection procedure. 
    I'd also like to briefly address the gig economy, an area 
where advances in technology have created opportunities for 
workers and 
companies.
    Gig workers can take advantage of low costs, flexible 
hours, and the ability to easily build an independent business. 
The ease of technology and the volume of workers using it has 
heightened concerns about worker misclassification.
    However, there is a body of law that already exists to 
address this topic as does a comprehensive remedial scheme for 
workers who have been misclassified. The remedies for 
misclassified workers grow even more substantial when recovered 
on a class-wide basis which have served as a powerful deterrent 
against worker misclassification.
    In closing, technology advances are beneficial to workers, 
employers, companies, and the economy. As the labor force and 
businesses adapt to these changes, employment laws are 
currently in place to ensure that worker rights are protected.
    Thank you for the opportunity to speak with you today and 
share my thoughts on the important topics covered by this 
hearing. I look forward to answering your questions.
    [The statement of Ms. Lander follows:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
        
    Chairwoman BONAMICI. Thank you for your testimony. I now 
recognize Mr. Romer-Friedman for 5 minutes for your testimony.

TESTIMONY OF PETER ROMER-FRIEDMAN, J.D., PRINCIPAL AND HEAD OF 
THE CIVIL RIGHTS AND CLASS ACTIONS PRACTICE, GUPTA WESSLER PLLC

    Mr. ROMER-FRIEDMAN. Thank you. Good afternoon and thank you 
for the opportunity to testify today. My name is Peter Romer-
Friedman, I'm a principal at Gupta Wessler PLLC and the head of 
the firm's civil rights and class actions practice.
    As a civil rights lawyer, I've represented victims of 
discrimination in jobs, housing, credit, and public 
accommodations. They've included workers in many industries, 
service members, and veterans, victims of Hurricane Katrina and 
the foreclosure crisis, as well as farmers and ranchers. Lately 
I have focused on combatting digital bias.
    Sixty years ago, there were no desktop computers or 
websites, but we did have entrenched discrimination in the 
workplace, in housing, and in public spaces.
    If you picked up a newspaper in 1960, you'd see classified 
ads with segregated columns for male and female jobs, and job 
ads that stated explicit preferences based on race, gender, and 
age.
    Congress tried to put an end to this biased advertising and 
recruiting when it enacted Title VII of the Civil Rights Act 
and the Age Discrimination in Employment Act.
    Congress knew this discrimination had huge negative 
consequences. If you announce a job is for men, women are less 
likely to apply. If you primarily recruit men, mostly men will 
be hired.
    For decades it appeared that these laws were working. Overt 
discrimination and statements in newspapers disappeared. Most 
employers stopped openly recruiting based on biased 
preferences. This all changed however, when employers decided 
to harness the power of the internet and social media to 
recruit workers. Advertising platforms like Facebook enabled 
employers to discriminate in their job advertising so that they 
could target job ads only to people of certain races, genders, 
ages, zip codes, and even political interests.
    An untold number of employers deployed these very tools to 
expressly exclude workers from receiving their job ads based on 
many protected traits.
    And until recently when Facebook made changes due to a 
settlement with my clients, it was possible for employers to 
exclude people from getting job ads based on thousands of 
categories unrelated to jobs.
    For example, an employer could decide not to send their job 
ads on Facebook to people interested in Christianity, the 
Republican National Committee, the ACLU, or the AFL-CIO.
    And just a few years ago, employers could target job ads on 
Facebook to people interested in heinous things like Hitler, 
White pride, fascism, rape, and ISIS.
    There has never been a full public accounting of all the 
biased ads published on Facebook but here is what we know from 
investigative journalism and the investigation of my client, 
the Communications Workers of America.
    Hundreds if not thousands of employers routinely excluded 
women and older workers from getting job ads on Facebook. The 
same bias was common in ads for housing, credit, and other 
financial services.
    There have likely been hundreds of millions of incidents of 
digital bias. Here are a few real-life examples. T-Mobile sent 
job ads on Facebook targeting people who were only 18 to 38 
years old. Amazon sent job ads on Facebook that targeted only 
people 18 to 50. A leading security installation company called 
Defenders sent job ads targeting only men 20 to 40.
    Thankfully, many terrific advocates stepped up to challenge 
this harmful discrimination. Organizations like the CWA, the 
ACLU, National Fair Housing Alliance and my prior law firm, 
Outten & Golden.
    We took Facebook to court and filed EEOC charges against 
dozens of employers that denied job ads to women or older 
workers.
    After years of litigation, Facebook, in March of 2019, 
agreed to make sweeping changes to its platform to prevent 
advertisers from denying job, housing, and credit ads based on 
protected statuses and Facebook recently implemented those 
changes. Still, we are very concerned that Facebook's own 
algorithm may be discriminating based on age and gender when 
Facebook itself decides which users will receive job ads within 
an audience the advertiser selected.
    We are also troubled that dozens of major employers 
including Amazon, T-Mobile, and Capital One are claiming that 
Federal law does not bar them from denying job ads to workers 
based on a protected status like age.
    We believe our Federal civil rights laws already outlaw 
this crude digital bias and recently we have seen the DOJ, 
EEOC, and HUD agree that it's illegal to deny job or housing 
ads based on a person's race, gender, or age.
    But Congress can and should take critical steps to clarify 
and strengthen Federal law to stop digital bias. I have 
recommended a range of critical steps that Congress can take 
including ensuring that tech platforms like Facebook are 
covered by civil rights laws, clarifying that certain types of 
digital bias are unlawful, requiring greater disclosure of 
digital practices and bias, and making sure that the Federal 
public accommodations law applies to online spaces, and ending 
Section 230(c) immunity for commercial or paid advertising.
    In too many areas of our society, the ``move fast and break 
things'' credo of powerful technology leaders like Mark 
Zuckerberg has turned back the clock by more than half a 
century. It has upended our civil rights, our civil discourse, 
and even the most basic facts that our society can agree upon.
    Technology should not disrupt our civil rights. It 
shouldn't break equal opportunity. Technology should be a 
mechanism for making the promise of equal opportunity and 
integration a reality, especially in the workplace.
    Thank you very much. I appreciate the opportunity to answer 
any questions.
    [The statement of Mr. Romer-Friedman follows:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
       
    Chairwoman BONAMICI. Thank you so much to each of our 
witnesses today. Under Committee Rule 8(a), we will now 
question witnesses under the 5-minute rule.
    And I want to say in light of all the testimony we heard 
today, I'm sure everyone wishes for more than 5 minutes because 
we have so many questions but I will now yield myself 5 
minutes.
    Professor Ajunwa, in your written testimony, you discuss 
how companies use automated video interviewing that permits the 
employer to evaluate factors that are not job related in the 
interviewing process.
    Last year, in the Science, Space and Technology Committee, 
Joy Buolamwini, the founder of the Algorithmic Justice League, 
testified on some of these issues and discussed her experience 
with facial analysis software failing to detect her dark skin 
until she put on a white mask, which uncovered both skin type 
and gender bias in the AI services from companies like 
Microsoft, IBM, and Amazon.
    So, Professor, what characteristic can employers evaluate 
when using automated video interviewing and do individuals 
typically know these factors being evaluated as they interview?
    Ms. AJUNWA. Thank you for your question, Chair Bonamici. One of the bigger problems with automated video interviewing is that oftentimes the job applicants don't actually know that they will be evaluated based on their video.
    They just think that they're sending in a video that will be viewed by humans, but that video is actually being put through algorithms that evaluate facial expressions, tone of voice, and even word choice.
    And the problem with that, of course, is that if you look at how the training of the algorithm is done, oftentimes the training uses a very limited pool of applicants; it could be all White male applicants. In that case, women who have different tones of voice, or people who are from other cultures and therefore have different facial expressions, can actually be disadvantaged because their responses can be misinterpreted by the algorithm.
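    [Editor's note: a minimal, hypothetical Python sketch of the training-data skew Professor Ajunwa describes. This is not any vendor's actual system; the groups, the ``expressiveness'' feature, and all numbers are invented for illustration.]

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 5000

    # True job fitness is distributed identically in both groups.
    fitness_a = rng.normal(size=n)
    # A culturally shaped video feature, unrelated to fitness; past
    # hiring in the all-group-A training pool happened to reward it.
    expressiveness_a = rng.normal(loc=0.0, size=n)
    past_hired = (fitness_a + 0.8 * expressiveness_a
                  + rng.normal(size=n)) > 0

    model = LogisticRegression().fit(
        np.column_stack([fitness_a, expressiveness_a]), past_hired)

    # Group B is equally fit but expresses differently on camera.
    fitness_b = rng.normal(size=n)
    expressiveness_b = rng.normal(loc=-1.0, size=n)

    rate_a = model.predict(
        np.column_stack([fitness_a, expressiveness_a])).mean()
    rate_b = model.predict(
        np.column_stack([fitness_b, expressiveness_b])).mean()
    print(f"pass rate, group A: {rate_a:.2f}")  # about one half
    print(f"pass rate, group B: {rate_b:.2f}")  # markedly lower

    [The model never sees group membership; it penalizes group B entirely through a feature that proxies for it, which is the mechanism the witness describes.]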
    Chairwoman BONAMICI. Thank you. Both on this Committee and on the Science Committee, we have a lot of conversations about the importance of diversifying the STEM and STEAM workforce, and I think that is one step in solving this problem because, obviously, who designs the algorithm makes a difference.
    Mr. Romer-Friedman, in your testimony you said that job advertisements are often targeted based on categories that are not job related, or on proxies, and you described how individuals may be excluded from seeing job ads. Thank you for the actual visual representation.
    Would excluding an individual from seeing a job ad because their experience exceeds a maximum number of years, or because they attended a women's college, for example, be examples of targeting based on proxies?
    Mr. ROMER-FRIEDMAN. Absolutely, Chair Bonamici. These are the kinds of practices that could be illegal even without the digital procedures and processes. We see this a lot in the economy for older workers. They're excluded simply because they have too many years of experience or they graduated from college a number of years ago.
    But we are seeing the same thing accelerated and exacerbated in the digital space, and that's a problem. We think that it clearly violates the law. It not only has a disparate impact; we think you can infer intentional discrimination from these kinds of clear proxies.
    Chairwoman BONAMICI. And how could Congress best make sure that employers are not using proxies to discriminate based on sex, age, religion, or other protected categories?
    Mr. ROMER-FRIEDMAN. As I have recommended in my testimony, Congress could explicitly say that if a category used for targeting or evaluating someone is not directly related to the job or the opportunity, it is simply banned; it would be an unlawful practice.
    In the same way that it's already strictly unlawful, regardless of intent, to advertise a job that states a preference based on age or race or gender.
    Chairwoman BONAMICI. Thank you. Ms. Yang, you're a former 
Chair of the Equal Employment Opportunity Commission. What 
additional resources could Congress provide to the EEOC, the 
Commission on Civil Rights, and the Office of Federal Contract 
Compliance Programs to adequately address the problems that 
were described today with algorithmic bias and digital 
discrimination in hiring?
    Ms. YANG. Thank you, Chair Bonamici, for that question. The government plays a particularly important role in rooting out hiring discrimination because individuals typically don't know why they weren't hired.
    So the EEOC made it a priority to look at recruiting and hiring discrimination, and the agency has authority to open charges on its own initiative, even where an individual may not have enough information.
    So right now under our current law, the Federal Government 
plays an incredibly important role in investigating concerns 
about hiring screens and the agencies need more resources. They 
need to be able to hire computer scientists and data scientists 
who understand how these systems work.
    We initially started a task force over 4 years ago, back when I was at the EEOC. We had Professor Ajunwa testify and help us learn about these issues, but we didn't have the capacity on staff to fully understand how to evaluate these systems or how the Uniform Guidelines really need to be updated, and having that technical know-how within the agency would be incredibly valuable.
    Chairwoman BONAMICI. Thank you so much. I yield back and 
recognize the Ranking Member of the Full Committee, Ms. Foxx, 
from North Carolina for your questions.
    Ms. FOXX. Thank you, Madam Chairwoman, and I want to thank 
our witnesses for being here today. Ms. Lander, the Federal 
laws prohibiting employment discrimination do not explicitly 
address the technologies we are discussing today. In your 
opinion though, are these statutes readily applied in the 
modern workplace and to employers' use of search engines, 
algorithms, and AI in the recruitment and screening process? Do 
you see gaps in these laws or do you believe these laws are 
more than broad enough to cover new technologies?
    Ms. LANDER. Thank you for that question. As a practitioner 
in this area who actually has counseled clients and reviewed 
some of the tools that we're talking about today, I have not 
had any difficulty applying the Uniform Guidelines as written 
to assess these tools and to provide feedback to both my 
clients and the vendors who are selling them regarding ways in 
which they should be modified or enhanced to ensure 
nondiscriminatory selections.
    So to answer your question, yes, I do believe that it is 
not difficult to apply the Uniform Guidelines as they're 
currently written to address the concerns that are being raised 
by the panel today with regard to the technology tools that are 
on the market.
    Ms. FOXX. Thank you, Ms. Lander. Ms. Lander, what are the 
upsides for business owners of using new technologies in 
recruiting and screening job candidates from the perspective of 
complying with non-discrimination laws?
    In what ways are these technologies superior to other forms 
of job recruitment and screening when it comes to complying 
with the Federal laws prohibiting discrimination?
    Ms. LANDER. Well, in the current climate, with the volume of resumes and applications being submitted, it's quite different from the day and age when somebody had to walk in and fill out a paper application.
    Employers are sometimes bombarded with thousands of applications when they have an opening, and in its simplest form, AI tools are capable of scanning those applications or resumes simply to screen out those who don't even have the minimum qualifications for the job, which saves a substantial amount of manpower and time compared to doing that with a person.
    To answer your question, though, about how these tools can help reduce discrimination: when done correctly, these tools eliminate the risk of implicit bias in decision making, because when the criteria they're screening for are job related, the entire screen is objective and is not susceptible to what somebody might believe when they see a particular name, or when they look at somebody and see a particular race or gender.
    Ms. FOXX. Thank you, Ms. Lander. Ms. Lander, you discussed in your written testimony the Uniform Guidelines on Employee Selection Procedures, jointly written by the EEOC, the Civil Service Commission, the Department of Labor, and the Department of Justice, which provide guidance for employers, and obviously you have mentioned those in your comments just now.
    Based on your experience, do the guidelines apply to the algorithms and AI that many employers are using, which is somewhat repetitious of my first question, and do the guidelines provide useful information and best practices for employers?
    Ms. LANDER. They do. Some of the tools that we're 
discussing are recruiting tools and some are hiring or 
selection tools.
    So the guidelines are aimed at any sort of hiring test or selection device that makes decisions about whether people proceed in the hiring process.
    So when it comes to recruiting, there is a distinction: sourcing is an effort to expand your pool of eligible candidates, of applicants who meet the qualifications for the job.
    And so some of these tools are being used simply to expand and enhance traditional forms of recruiting. And if people aren't being excluded, if it's not an exclusive or the sole method of recruiting, then arguably adverse impact is not going to be an issue, and in that case, if there is no adverse impact, the guidelines don't come into play.
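    [Editor's note: the Uniform Guidelines' ``four-fifths rule'' (29 CFR 1607.4(D)) gives a rough numerical test for the adverse impact Ms. Lander refers to. A minimal sketch follows; the applicant counts are invented.]

    def selection_rate(selected: int, applicants: int) -> float:
        return selected / applicants

    def impact_ratios(rates: dict) -> dict:
        """Each group's selection rate relative to the highest rate."""
        top = max(rates.values())
        return {group: rate / top for group, rate in rates.items()}

    # Hypothetical applicant-flow data an employer might keep.
    rates = {
        "group A": selection_rate(selected=60, applicants=100),  # 0.60
        "group B": selection_rate(selected=30, applicants=100),  # 0.30
    }
    for group, ratio in impact_ratios(rates).items():
        flag = "adverse impact indicated" if ratio < 0.8 else "ok"
        print(f"{group}: ratio {ratio:.2f} -> {flag}")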
    Ms. FOXX. Quick, quick question. If an employer wrongly 
classifies an employee as an independent contractor, isn't 
there significant potential liability for the employer 
including back pay and liquidated damages under the Fair Labor 
Standards Act which provides substantial incentives not to 
classify workers incorrectly?
    Ms. LANDER. Yes. That's correct. Not only are back pay and liquidated damages available; for willful violations, the statute of limitations goes back 3 years.
    And the class action activity in this space has been substantial. It has actually changed behavior, and a lot of employers and companies and workers are all quite aware of the issues involved with misclassification, and rights are being protected and asserted on a regular basis through the courts.
    Ms. FOXX. Thank you, Madam Chairwoman, I appreciate your 
indulgence.
    Chairwoman BONAMICI. I now recognize the Chairman of the 
Full Committee, Mr. Scott from Virginia for 5 minutes for your 
questions.
    Mr. SCOTT. Thank you. Ms. Ajunwa, several of you mentioned that if a women's college gets mentioned, that could have a negative effect. Who decides what instructions are given to create the algorithm, and what happens when you get some kind of hit? Who actually designs it?
    Ms. AJUNWA. Thank you very much for your question. I guess the question is who comes up with the criteria used for programming the algorithm.
    Oftentimes algorithms are programmed by vendors who then sell them to employers. But employers can also have algorithms that I call bespoke, meaning that these algorithms are created specifically for that employer.
    This, of course, can change the liability analysis, depending on whether the algorithm has been created by the vendor or specifically created at the behest of the employer.
    My focus today is really the first scenario, when the algorithm is being created by a vendor and the employer perhaps does not know exactly what has gone into the algorithm or how it has been trained.
    So my advocacy today is for both the auditing and the certification of automated hiring systems before they are deployed, before they can actually be used in the workplace. Because I do believe, as all the witnesses have stated, that if automated hiring systems are used properly and correctly, they could be helpful.
    The problem is that they currently are not. The problem is that there are currently no regulations to actually ensure that they are being used correctly and appropriately.
    Mr. SCOTT. Well, once it's designed with the discrimination kind of embedded, if the employer bought it from a vendor, would they be immunized from any kind of intentional discrimination?
    Ms. AJUNWA. So that is a gray area. That's a gray area in terms of the law, because Title VII requires either intent or, absent intent, a showing of disparate impact.
    And both things can be hard to prove if the automated hiring system is coming from a vendor, because first, you can argue perhaps there is no intent on the side of the employer, but then there is also the issue of even establishing disparate impact, because you would need statistical proof, and the automated hiring system might have been designed not to retain all the records that you need for that proof.
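    [Editor's note: the ``statistical proof'' Professor Ajunwa mentions is often a significance test on hiring outcomes by group, which is only possible if applicant-flow records were retained. A minimal sketch with invented counts:]

    from scipy.stats import chi2_contingency

    #                hired  not hired   (hypothetical counts)
    table = [[90, 910],   # group A applicants
             [50, 950]]   # group B applicants

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Outcome gap unlikely to be chance alone.")
    # A system designed not to keep these records defeats the analysis.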
    Mr. SCOTT. Well, how does the employer know? He buys this little algorithm thing and uses it, and it turns out it's screening people. How would he know?
    Ms. AJUNWA. He wouldn't. So that is why I am advocating for 
an audit requirement for employers who do then buy automated 
hiring systems or use automated hiring systems.
    Mr. SCOTT. Thank you. Ms. Yang, the Ranking Member brought up independent contractors. If you are an independent contractor, you are not protected under the employer-employee statutes, Title VII, the ADA, and others.
    But you would be protected under Section 1981 of the Civil Rights Act of 1866, which bars discrimination. Are there limitations in Section 1981 in terms of pursuing discrimination claims if you are an independent contractor?
    Ms. YANG. Yes. Federal law provides very limited protections for independent contractors. Under Section 1981, claimants must prove intentional discrimination, which can be very difficult to show in the case of algorithmic bias.
    In addition, it only covers race and ethnicity discrimination, not other bases. So it wouldn't cover sexual harassment, age discrimination, or disability-based discrimination.
    And our Federal antidiscrimination laws contemplate that true independent contractors will have such bargaining power that they don't need to be protected against discrimination.
    But the way in which many companies are misclassifying 
independent contractors today means that many individuals do 
not truly have bargaining power and they need the protection of 
our antidiscrimination laws.
    Mr. SCOTT. Thank you. Madam Chair, I yield back.
    Chairwoman BONAMICI. Thank you, Mr. Scott. I now recognize 
Ms. Stefanik from New York for 5 minutes for your questions.
    Ms. STEFANIK. Thank you, Chairwoman Bonamici. Ms. Lander, I 
appreciate that you raised how contractual or gig arrangements 
can be beneficial to workers, as I believe this perspective 
needs to be central in our discussions on worker 
classification.
    Millennials, as you know, now comprise the largest cohort 
in the U.S. labor force and these workers place a higher value 
on the flexibility and fulfillment that can exist outside the 
rigid constraints of traditional employment.
    For years, independent contracting has sparked 
entrepreneurship and provided an important source of income and 
flexibility to millions of Americans including students, 
veterans, and single parents.
    In your testimony you mentioned the various legal tests courts and government agencies apply to distinguish employees from independent contractors.
    I have heard from employers, particularly small business 
owners, that this inconsistency between various agencies has 
muddied the line on worker classification and really created 
compliance challenges.
    Do you believe that harmonizing the legal test across 
Federal agencies would help draw a clearer bright line on the 
issue of worker classification and help workers as well as 
business owners know when misclassification has indeed 
occurred?
    Ms. LANDER. Yes, I do think that would actually make life a lot easier for employers. However, the problem is that the definition of employee differs from statute to statute, and so, unfortunately, when courts are interpreting whether a particular civil rights or labor law applies, they have to look at the statutory text and apply it. So, as easy as it would be to have a uniform definition, if you're going to be honest to the statutes that involve workers, you can't have a uniform definition across all of the agencies.
    Ms. STEFANIK. So how would you address that, then? If there is uniform definition legislation, which I have worked on, what would we need to do in addition to that?
    Ms. LANDER. Well, I'm not a lawmaker so I can't really 
answer that question.
    Ms. STEFANIK. Great. Well, your perspective is important on 
that. I would like to follow up on that issue to make sure that 
we get this right.
    And very briefly, what would happen, what would be the 
impact of bringing California's ABC test nationwide and would 
it allow workers who value freedom and flexibility the choice 
to maintain their status as independent contractors?
    Ms. LANDER. My understanding of the California test is that it moves away from the traditional right-to-control standard, which is a critical element in all of the independent contractor analyses under the various laws, and instead looks to the essence of the business.
    And so if a worker is performing services that are the essence of the company's business, and I don't know if I'm wording that exactly right, then he or she can't be an independent contractor.
    And that would essentially change the entire working dynamic, not just for the gig economy, which has been a tremendous boon, and not only for companies that have been able to expand their reach where they otherwise couldn't, but also, as you described in your opening remarks, for individuals who need the flexibility to work different schedules and seasonally and things of that sort.
    Ms. STEFANIK. Thank you. You know, as we discuss this issue, I think it is really important that we channel these technological and entrepreneurial opportunities for young people and members of the nontraditional workforce: people who maybe are augmenting their full-time job, people who, as they are aging, want to earn some money on the side. There are lots of benefits to this gig economy, and we have to remember, it is totally voluntary for the individuals who seek out those opportunities. And with that I yield back.
    Chairwoman BONAMICI. Thank you very much and I now 
recognize Dr. Schrier for 5 minutes for your questions.
    Ms. SCHRIER. Thank you so much to all of you for being here. I really enjoyed reading your testimonies and hearing you today, and it is so interesting to think about the rabbit hole you can go down when you start thinking about how every question you ask, or every parameter you put in an algorithm, can lead down the line to some sort of discrimination.
    And I think this was all developed for efficiency and to cast a broader but more specific net, but in doing that, so many of the things you talked about, like age or even lookalike audiences modeled on a current workforce, have led to discrimination inadvertently.
    I represent part of Washington State, which just had a future of work task force that released a report in December talking about automation in the workplace and how AI will change the way we work, but it barely touched on this topic of algorithmic discrimination and how that affects people even finding out about jobs or being eligible for jobs. And so, Ms. Lander, you talked about kind of a look back: once a system is in place, how do we look at it and see if it is discriminating.
    And I am wondering if there is a way to look forward. So this is a question for Ms. Yang and Dr. Ajunwa: are there things that we can do to fundamentally change the way these algorithms work, or should we look in another place and change privacy laws so that the algorithms can't even obtain some of that information, and how might you balance those two?
    Ms. YANG. Thank you for that question. I think we have a lot of opportunity to make algorithms work more fairly than they do right now. And it starts with ensuring that the information you're considering is truly job related.
    And we talked about the training data. Is it diverse and representative of the full spectrum of people who can perform the job? And then, what are the criteria that you're building into the variables? Are you using abstract personality characteristics that maybe have some correlation, when a heck of a lot of people would also be able to perform the job even if that weren't their top personality characteristic?
    And so it comes back to ensuring that we are really being 
rigorous about the screen being job related. And the closer you 
can tailor what you're selecting for to behaviors on the job, 
the more you can minimize the risk of screening people out who 
could perform the job.
    And I do think many advances in technology will now allow us, if you design a system up front, to document the decisions so that you can explain how they were made, which is necessary under our current laws to ensure accountability. Employers themselves, even if they say, I didn't design it, I didn't know what was in the algorithm, are nonetheless responsible. And my view is that they absolutely need to understand how these decisions are made. They need to be able to explain them when the government comes in and asks about their system, or in litigation. And I do think we need new laws, both to protect privacy and to create the right incentives, because these cases are very expensive to litigate.
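    [Editor's note: one way to read Ms. Yang's point about designing a system up front to document decisions is an audit log written at decision time. A minimal, hypothetical sketch; the function names and record fields are invented, not any vendor's API.]

    import json
    import time

    def logged_screen(score_fn, threshold, log_path="decisions.jsonl"):
        """Wrap a scoring function so each decision leaves a record."""
        def screen(applicant: dict) -> bool:
            score = score_fn(applicant)
            advanced = score >= threshold
            record = {
                "timestamp": time.time(),
                "inputs": applicant,     # what the model saw
                "score": score,          # what it produced
                "threshold": threshold,  # how the cutoff was set
                "advanced": advanced,
            }
            with open(log_path, "a") as f:
                f.write(json.dumps(record) + "\n")
            return advanced
        return screen

    # Hypothetical usage with a toy, transparent scoring rule.
    screen = logged_screen(lambda a: 0.1 * a["years_experience"],
                           threshold=0.5)
    print(screen({"applicant_id": 17, "years_experience": 7}))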
    Ms. SCHRIER. I'll get back to that with a follow-up question. I wanted to give you a chance, Dr. Ajunwa, to give your answer, and then I have a follow-up question about it.
    Ms. AJUNWA. Thank you very much for your question. I do strongly agree that we have to be forward looking, because being backward looking is basically taking action after the harm has already been done, and I think we can actually prevent a lot of harm from the onset.
    That includes, for example, mandates for the design of these automated hiring systems, which we don't yet have. And you're very right to pinpoint that part of the problem is the way that we handle privacy, especially the privacy of workers in the United States, and that part of the problem is thinking through what sort of information is actually being pulled into the system of automated hiring--
    Ms. SCHRIER. And it is all out there.
    Ms. AJUNWA. Right.
    Ms. SCHRIER. Can I just, quickly, in the interest of time: my next part was about what you had said, whether these issues are pertinent to the job.
    A few years ago, Google had a project called Aristotle, and they found out that what really mattered was not so much your engineering degree but how well you worked with others.
    Ms. AJUNWA. Right.
    Ms. SCHRIER. And so they kind of lifted up characteristics like being a team leader or a club leader or being on a sports team. But even that then selects for perhaps competitive people, or people who always want to be the star of the show, and might not really lead to the best workplace.
    I wondered if you could just comment on that, because it is job related but it could have inadvertent outcomes.
    Ms. YANG. Part of the challenge is that you may be testing only on your current workforce, right? So you will be replicating that current model.
    I think algorithmic systems can also help us identify bias within broader systems. We might think confidence, expressed in resumes through words like ``executed,'' will mean you're going to perform well. In fact, more men use those words, and using them might not mean you can perform well. Confidence doesn't always equal competence.
    And I think the more we can use these kinds of technology systems to help identify where some of the bias is within processes, the more we can actually start to break down some of the historic bias.
    Ms. SCHRIER. Thank you.
    Ms. AJUNWA. And I would add that having actual record-keeping mandates would aid in this endeavor: being able to see who is actually applying and who is getting selected, but then also checking that against the wider pool that's out there.
    Somebody mentioned the nontraditional workforce. For example, people who have gaps in employment are oftentimes excluded algorithmically by automated hiring systems. This can negatively impact women, who are often called upon to be caregivers.
    It can also impact formerly incarcerated citizens who have been rehabilitated and who are trying to reenter the workforce. It can impact veterans.
    So I really think that a proactive approach to ensure proper record-keeping for automated hiring systems, and also proper auditing of automated hiring systems, will really be a boon for employers, not just employees.
    Ms. SCHRIER. Thank you.
    Chairwoman BONAMICI. We are going to move on to the Ranking 
Member of the Subcommittee, Mr. Comer from Kentucky, for his 
questions.
    Mr. COMER. Thank you, Madam Chair, and I appreciate all the witnesses being here today. Ms. Lander, in the modern economy, job recruitment is migrating online.
    Based on your experience, what do job seekers and employers gain from the use of online platforms when it comes to finding and filling jobs?
    Ms. LANDER. The ability to scan the internet to find opportunities for work is a tremendous gain for workers. I can remember back when I was job hunting and had to look in the newspaper at classified ads, so it's a completely different world that we live in today.
    Mr. COMER. Ms. Lander, under current law, if an employer is using technology to screen job applicants that has a negative impact on a protected class, the employer may need to demonstrate that the screening criteria are job related and consistent with business necessity.
    What goes into conducting this analysis, and would the employer have to demonstrate a strong connection between the screening criteria and the job that the employer is trying to fill?
    Ms. LANDER. Yes. The Uniform Guidelines process for validating a selection device is extremely rigorous. The two most common approaches are content validity and criterion validity.
    Content validity is less likely to apply to the kinds of tools that we're discussing, because content validity typically means the content of the test or selection device matches the content of the job, like a pilot simulator or a typing test. Here, we are talking about devices that screen for either minimum qualifications or particular personality characteristics, and those are typically justified by criterion validity, which is a rigorous statistical process of matching performance data with performance on the selection device.
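    [Editor's note: a minimal sketch of the criterion validity idea Ms. Lander describes: correlate scores on the selection device with later job performance. The numbers are invented; real validation studies are far larger and more careful.]

    import numpy as np

    device_scores = np.array([55, 62, 70, 71, 80, 84, 90, 93])
    performance = np.array([2.1, 2.8, 2.5, 3.4, 3.1, 3.9, 4.2, 4.0])

    r = np.corrcoef(device_scores, performance)[0, 1]
    print(f"validity coefficient r = {r:.2f}")
    # A meaningful positive r is one piece of evidence the screen is
    # job related; a near-zero r suggests it measures something else.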
    Mr. COMER. So what are some best practices for an employer 
when it is considering using an online platform or a vendor 
that employs AI to find suitable job candidates?
    Ms. LANDER. As Ms. Yang said, the employer can't get off the hook simply by saying that it relied on the vendor.
    So employers are responsible for knowing how they are screening their candidates, and any employer that is thinking of using a tool that employs AI or any other sort of technology to screen candidates should insist upon seeing the vendor's adverse impact studies as well as the validation work that has been done, and should understand what kind of screening criteria are being used to screen its candidates.
    Mr. COMER. Okay. Let me ask you this one last question. If 
an employee is incorrectly classified as an independent 
contractor, wouldn't this worker retain all the legal 
protections of an employee including the protections of our 
current civil rights law?
    Ms. LANDER. Yes. Misclassified workers who are actually 
employees are protected by all of the employment laws.
    Mr. COMER. And I want to ask Mr. Romer-Friedman, you had 
mentioned Facebook in your opening testimony and what--Facebook 
gets a lot of criticism, bipartisan criticism, here in 
Congress. What can Facebook and what should Facebook do 
differently with respect to this subject we are talking about 
here today?
    Mr. ROMER-FRIEDMAN. Sure, thank you, Ranking Member Comer. We have already made a lot of progress lately with Facebook. They've created a special portal for job, housing, and credit ads where, at this point in time, you don't have most of these demographics as selection options to target or exclude, and that's great, and we applaud them for doing that. At the same time, as I mentioned in my testimony, Facebook has to decide who will see what ad. So let's say I want to send an ad to everyone here in the District of Columbia, but I'll only buy 10,000 impressions; 10,000 people are going to see the ad.
    Facebook has got to decide who is going to see those ads. We allege, and we hope to establish this in discovery and litigation, that age and gender are being used, and a group called Upturn has done a study showing that there are racial and gender impacts. So even if the employer doesn't want to discriminate, by relying on Facebook's ad delivery algorithm it may be doing just that, and even worse than what was going on for years in the first place, where the employer was expressly excluding certain groups.
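    [Editor's note: a minimal, hypothetical simulation of the delivery-skew concern Mr. Romer-Friedman raises: even with neutral targeting, a platform that ranks users by predicted clicks can skew who sees a job ad. All audience sizes and click rates below are invented.]

    import random
    random.seed(1)

    # Neutral audience: 10,000 users, half under 40 and half 40+.
    audience = [{"age": "under 40"}] * 5000 + [{"age": "40+"}] * 5000

    # Suppose the platform's click model, trained on past behavior,
    # predicts higher engagement for younger users on this ad.
    predicted_ctr = {"under 40": 0.05, "40+": 0.02}

    # The advertiser buys only 2,000 impressions; the platform fills
    # them with the users it scores highest.
    ranked = sorted(audience,
                    key=lambda u: predicted_ctr[u["age"]]
                    + random.gauss(0, 0.01),
                    reverse=True)
    shown = ranked[:2000]

    for group in ("under 40", "40+"):
        n = sum(1 for u in shown if u["age"] == group)
        print(f"{group}: {n} of 2000 impressions")
    # Older users are largely excluded even though the advertiser
    # never selected age as a targeting criterion.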
    And as you said, and I completely agree with this, most businesses want to follow the law and they want to comply.
    Mr. COMER. Right.
    Mr. ROMER-FRIEDMAN. And that's where I disagree with Ms. Lander: creating greater clarity in the law always helps compliance and reduces litigation, and I think everyone can agree that those are good things.
    Oftentimes you do that in regulations that the EEOC can issue, but Congress can do that too, and I think the laws that Congress enacts express the values of this Congress.
    So, for example, one very basic thing: Amazon says it has a right to send job ads to younger people and not send them to elderly people, as long as they put the job ad on their website.
    That's somewhere they say there's an ambiguity in the law. Congress could step right in there and make it clear that you can't use race, gender, age, disability, veteran status, or political status, for example, to exclude people from getting recruited or getting job ads. Simply put.
    Mr. COMER. Madam Chair, I have to throw this statistic in here. A recent poll measured Facebook usage across all the congressional districts. Mine was the second highest in Congress: 84 percent of the adults in my congressional district get on Facebook at least once a day. So I represent a Facebook district.
    Chairwoman BONAMICI. That is fascinating, Mr. Comer.
    Mr. COMER. We are also trying to--
    Chairwoman BONAMICI. You made me want to--
    Mr. COMER. That is right.
    Chairwoman BONAMICI.--look at where mine is and everybody 
else's is. It is really interesting. And next we recognize Ms. 
Lee from Nevada for 5 minutes for your questions.
    Ms. LEE. Thank you. Thank you all for being here. This has been a really interesting topic: thinking about all the iterations of how what we view as helping us in this modern day can actually promote discrimination in ways we have not thought about.
    I am going to turn to older Americans, because this body passed the Protecting Older Americans Against Discrimination Act last month with bipartisan support, which restores the ability of older Americans to apply to claims of age discrimination the so-called mixed-motive framework afforded to other protected classes under Title VII of the Civil Rights Act.
    So in light of the new challenges that we are facing in the 
digital age, I would like to ask you, Mr. Romer-Friedman, you 
touched upon this a little bit in your last answer.
    Are there other actions that we should be taking to ensure 
that older workers have the same protections as other protected 
classes?
    Mr. ROMER-FRIEDMAN. That's a great question. You know, I think my former colleague David Lopez, who was the general counsel of the EEOC during the Obama Administration, has pointed out to me many times that age discrimination has become so normative, so baked into our society, that people don't even think it's illegal. So I think we have to treat it very seriously.
    To that end, mixed motive is so important to protect because, in this algorithmic bias and digital discrimination discussion, companies will say, well, age was just one of hundreds of factors that could have influenced that decision.
    Of course, it's then very difficult to piece together how age was used. You shouldn't have to show that age was more determinative in a decision than you would have to show for gender or race, but that's the case right now.
    I think one thing that could be done is making clear that the Age Discrimination in Employment Act applies to applicants for disparate impact claims.
    Two courts of appeals have held that if you want to bring a disparate impact claim under the ADEA, you can only do it as an employee; you can't do it as an applicant. But the whole purpose of the ADEA is to allow older people to get hired. And at the end of the day, we need to make sure that companies are not able to screen people out when they're recruiting, based on date of graduation or years of experience, and just completely take a person out of the picture digitally. Those are the kinds of protections that need to be implemented right away.
    Ms. LEE. Thank you. Thank you. As we have talked about, and we deal with this a lot in Congress, new technologies are far outpacing our ability to regulate them, and certainly that is what we are seeing here today.
    So I would like to just open this up to all of you. Looking down the road, are there potential future developments in workplace or hiring technologies, beyond the ones we have talked about today, that particularly concern you when it comes to protecting workers' rights from employment discrimination? I will start with you, Ms. Yang.
    Ms. YANG. Thank you for that question. I am concerned about the increasing worker surveillance and monitoring. Many workers now are tracked all throughout the day. There are productivity metrics that can sometimes be so aggressive that they interfere with a pregnant woman's ability to go to the bathroom, or with prayer time; all kinds of civil rights concerns.
    But also, just the simple tracking of people throughout the day may really deter workers from coming together and raising concerns, so I do have concerns about that.
    And I did want to add one other point about age discrimination. Older workers are disproportionately represented in independent contractor positions, and so it's especially important that even properly classified independent contractors have antidiscrimination protections.
    If somebody says, well, I'm not going to hire you just because you're old, right now you have no protections against that. And I think that's something that needs to change as well.
    Ms. LEE. Right. Thank you.
    Ms. AJUNWA. Thank you very much for your question. First, in response to your concern for older workers, I do want to note that in my research I have seen more discrimination against older workers in terms of their ability to participate in the digital workplace. People will use terms like ``digital native'' to effectively exclude older workers, so that's something of concern.
    I also wanted to point out that workplace surveillance is on the rise. As I mentioned, there are the microchips being embedded under the skin, but I also see workplace wellness programs as a site of workplace surveillance.
    For example, there is now a trend toward introducing genetic testing as part of workplace wellness programs. That really raises the prospect of increased health discrimination, or increased discrimination against people with disabilities, whether real or perceived.
    Because genetic testing actually just tells you the propensity for disease, but employers might look at it as determinative when it's really not.
    So I think that's a huge concern and something we should really act against.
    Ms. LEE. Right, thank you. All right. My, whoops, my time 
has expired. Thank you.
    Chairwoman BONAMICI. We now recognize Mr. Takano, a Member 
of the Full Committee from California for 5 minutes for your 
questions.
    Mr. TAKANO. Thank you, Chairwoman Bonamici, for this very important hearing. As the workforce is changing and we transition to a society more dependent on technology, it is extremely important that we understand how these tools will impact the workforce.
    Currently there is a lack of transparency, and without knowing the algorithm behind the program, we have no way of knowing whether these tools will remove or reinforce bias.
    Professor Ajunwa, as companies look to ensure that they remove bias and are mitigating against disparate impact, they would need to know what protected classes potential employees belong to.
    We know that the more sensitive the information is, information such as sexual orientation or disability status, the less likely a candidate is to disclose it. So my first question is: if companies are unable to obtain this sensitive information from candidates, how can and should they mitigate bias?
    Ms. AJUNWA. So that's an excellent question. Of course, you can't compel applicants to release protected information. However, employers can do analysis after the fact, not during the employment decision, to see if there is indeed a disparate impact: for example, by looking at the categories of people who applied versus the categories of people who were hired.
    This can then help them take steps in the future, perhaps to broaden their advertising pool to attract more people from protected categories if they are lacking those applicants.
    Mr. TAKANO. A post-hiring review. Ms. Yang, while many companies or vendors will claim that they are complying with the EEOC regulations, we know that many do not, because they currently operate in a gray area. Does the EEOC have the ability to regulate the companies and vendors that are contracted by employers to conduct hiring?
    Ms. YANG. That's a very important question. The EEOC does have an important role to play. The agency has sub-regulatory authority under statutes like Title VII and can provide guidance; the Uniform Guidelines are one form of guidance on how the agency believes vendors should validate hiring screens.
    So certainly, the agency could provide more up-to-date guidance on some of the difficult issues where there are gray areas.
    A lot of people ask whether correlations are sufficient to demonstrate validity. I don't believe they are. It would be helpful for the agency to make that clear and explain why.
    Mr. TAKANO. Well, what recourse if any does the EEOC have 
to hold these companies accountable in the gray area?
    Ms. YANG. Well, as I mentioned earlier, the EEOC has the authority to open its own investigation, through a commissioner's charge or a directed investigation, depending on the statute.
    So even if an individual doesn't have enough information to come forward, if the EEOC learns of a problem, it can open an investigation. And if it finds a problem, it can actually litigate to enforce the law.
    But the challenge is having enough information to know 
where the problems are because as you mentioned there is a very 
big gap in knowledge because of the lack of transparency about 
how many of these systems work.
    Mr. TAKANO. Well, thank you. Thank you. Professor Ajunwa, 
we know that auditing the algorithm and the code can help us 
understand if the code is biased. But what kind of auditing 
should be done and should it be the responsibility of the EEOC 
to do this?
    Ms. AJUNWA. Thank you very much for your question. The question of how audits of automated hiring systems should be performed is one that I address in my two law review articles, ``The Paradox of Automation as an Anti-Bias Intervention'' and ``Automated Employment Discrimination.''
    I don't come down on one side as to whether it has to be a governmental agency or whether it can be a third party, similar to, for example, LEED, which certifies green buildings. Of course, there is some utility in having it be a governmental agency, but there is also the recognition of scarce resources.
    Mr. TAKANO. Well, so maybe, maybe not the government but 
what kind of auditing should be done?
    Ms. AJUNWA. The kind of auditing that should be done is essentially auditing by an interdisciplinary team. It should include lawyers, labor and employment lawyers.
    It should include data scientists who are trained to write code and to understand how machine-learning code works. It should include people who are versed in diversity research, in terms of creating a diverse workforce. So it should be an interdisciplinary team.
    Mr. TAKANO. Well, thank you. Madam Chair, while we should not fear technology and the wonders of using it to increase productivity and efficiency, we cannot move toward a society where everything from employment to housing is guided by systems that are largely unchecked. Thank you, and I yield back the balance of my time.
    Chairwoman BONAMICI. Thank you, Mr. Takano. And finally, 
last but not least, we will recognize Ms. Blunt Rochester from 
Delaware for 5 minutes for your questions.
    Ms. BLUNT ROCHESTER. Yeah. Thank you, Madam Chairwoman, for this very important hearing, and thank you to the panelists.
    I had the opportunity about two months ago to start a Future of Work Caucus here in Congress, a bipartisan caucus, and what you have shared today really highlights the clarion call for all Members of Congress to be engaged in these discussions.
    To me it appears that technology has really outpaced policy and the people, and so your participation here today is really important. And there are so many topics; I wish I could have had the time of everybody who is not sitting here, because there are issues we haven't talked about, like language barriers, the diversity of the people doing the design work and making sure that those algorithms are working, and even returning citizens.
    I have a criminal justice bill called Clean Slate that deals with people who are coming out of prison and therefore are having challenges getting work.
    And I want to start off by finalizing Ms. Lee's question, because you two, Ms. Lander and Mr. Romer-Friedman, didn't get a chance to answer the question about your one big concern.
    And then I want to ask the whole panel: if there were one thing Congress could do right now, what would it be? So if I could start with Ms. Lander; I have 3 minutes and 40 seconds left.
    Ms. LANDER. So I'll be quick. The thing that occurred to me is that we are moving in a really positive direction in terms of teleworking and worker flexibility.
    However, there are a lot of laws, for example in California and even under the FLSA, that put such restrictions on the employer, having to monitor very carefully--
    Ms. BLUNT ROCHESTER. Yeah. So laws focusing on--
    Ms. LANDER.--individual worktime, that employers are reluctant to allow people the flexibility to telework. And so I think that's a growing area, because the generation that's coming up after me really enjoys working from Starbucks.
    Ms. BLUNT ROCHESTER. Yeah. So telework. Thank you. Mr. 
Romer-Friedman.
    Mr. ROMER-FRIEDMAN. Thanks. If I could just say, this whole line of argument that independent contracting means flexibility and an employer-employee relationship does not is a farce.
    You can have all the flexibility you want through the traditional employment relationship and get all the protections that the New Deal, the Great Society, and subsequent laws created.
    To your question, Congresswoman: we saw a scandal a couple of weeks ago with Clearview, where a company essentially collected photographs of pretty much everyone on the internet who was on social media, created facial recognition from them, and gave that mostly to law enforcement.
    It concerns me that the millennials and the next generation are growing up in a time of social media.
    I think at some point employers will be able to literally press a button and get every piece of information about someone that has ever been on the internet, which not only could be embarrassing to people but won't be representative.
    And as you point out, someone who is returning from prison, who has paid their debt to society with time, may have all that stuff come up, in the same way that right now you don't want a criminal record, even if it's not a conviction, to be used for employment. So that's going to be a big issue.
    Ms. BLUNT ROCHESTER. Yeah. A big issue, thank you. And Dr. 
Ajunwa, in terms of the thing that the Congresswoman--
    Ms. AJUNWA. Yeah, thank you so much for your question. I think what is really urgent for government to do right now is to ensure that automated hiring is regulated. As it stands, there are just no regulations as to what information is collected, how that information is evaluated, and what even happens to that information whether the applicant is hired or not.
    And so governmental action is definitely needed both to 
audit and certify the automated hiring system but also to 
ensure that all the data that is being collected on applicants 
is not something that's then used against the applicant in the 
future.
    Ms. BLUNT ROCHESTER. Yeah. Excellent, thank you. And Dr. Yang, Ms. Yang.
    Ms. YANG. Thank you. I believe a workers' bill of rights is needed so that workers understand how algorithms are making decisions and how that might impact them.
    Because people like Kyle Behm only knew they were screened out because a friend who worked at the company told them. Most people don't know this information, and then the systems don't get to improve from feedback loops about why people were excluded.
    So if people know, you know what, I don't think you're 
going to accurately transcribe my accent with the type of 
screen you're using, they can raise that concern and try to 
make systems better.
    Ms. BLUNT ROCHESTER. Yeah. Thank you. And I have so many 
more questions which I will follow up with many of you 
afterwards.
    One question I did have for Dr. Ajunwa, you mentioned 
microchips. I was just curious. I was looking for that in the 
testimony. Could you speak a little bit more on that?
    Ms. AJUNWA. Sure, thank you for your question. So your 
question pertained to the use of microchips embedded under the 
skin.
    Ms. BLUNT ROCHESTER. Yeah.
    Ms. AJUNWA. As a new surveillance tool, really, I think. Many corporations are marketing this as a convenience for employees, in terms of helping them to open doors or access sensitive areas.
    But in my opinion, because these chips are permanently with 
the employee, they can track the employee wherever that 
employee is, even off the job. So I do see them as surveillance 
devices.
    Ms. BLUNT ROCHESTER. Yeah. Well, my time has run out, but I thank you all for your testimony, and we will be following up with you.
    I know data privacy is something that we are working on in my other Committee, Energy and Commerce, so I look forward to working with you. Thank you, and thank you, Madam Chairwoman.
    Chairwoman BONAMICI. Thank you so much and I want to remind 
my colleagues that pursuant to Committee practice, materials 
for submission for the hearing record must be submitted to the 
Committee Clerk within 14 days following the last day of the 
hearing, preferably in Microsoft Word format.
    The materials submitted must address the subject matter of 
the hearing. Only a Member of the Committee or an invited 
witness may submit materials for inclusion in the hearing 
record. Documents are limited to 50 pages each.
    Documents longer than 50 pages will be incorporated into 
the record via an internet link that you must provide to the 
Committee Clerk within the required timeframe, but please 
recognize that years from now that link may no longer work. I 
always like that part.
    So again, I want to thank the witnesses for their participation today. What we have heard is very valuable, and I know many of us have a lot more questions, and Members of the Committee may submit those questions.
    We ask that you please respond in writing. The hearing 
record will be held open for 14 days to receive those 
responses. And I remind my colleagues that pursuant to 
Committee practice, witness questions for the hearing record 
must be submitted to the Majority Committee staff or Committee 
Clerk within 7 days and they must address the subject matter of 
the hearing.
    And I now recognize the distinguished Ranking Member for 
his closing statement.
    Mr. COMER. Thank you. Madam Chair, to begin with, I ask 
unanimous consent to place in the record a statement from HR 
Policy Association providing views on today's hearing topic.
    Chairwoman BONAMICI. Without objection.
    Mr. COMER. And I just again want to thank the witnesses who 
came here to testify. This is an issue we are going to hear a 
lot more about and we certainly want to be on top of this.
    I appreciate all of the suggestions, and as I often like to remind this Committee, we have a lot of laws already on the books that address most of the subjects and topics we discuss here. But it is always good to review the issues as they emerge and make sure that if there is anything we can do in a bipartisan way in Congress to improve the civil rights of workers, then we certainly do it; that is certainly a bipartisan issue.
    But again, thank you all for being here today and, Madam 
Chair, I yield back.
    Chairwoman BONAMICI. Thank you very much, Mr. Comer, and I recognize myself for the purpose of making a closing statement. And thank you again to the witnesses for your expertise, which I very much appreciate.
    I just want to reiterate what the Ranking Member said: we often work on a bipartisan basis, and I certainly think that this is an issue where we could do that.
    I know, Ms. Lander, you a couple of times said things like ``done correctly'' or ``used properly,'' and I think those are the key questions about the use of this technology, whether it is done correctly and used properly, and that is the ``if'' that we are going to be working on solving.
    Today's hearing exposed how digital hiring, evaluation, and management tools can threaten civil rights protections. Left unchecked, these largely non-transparent technologies can amplify and perpetuate existing biases that intentionally or unintentionally discriminate against workers. Our civil rights enforcement institutions and the laws they enforce have not kept pace with the technologies that employers are using to recruit, screen, interview, and manage workers.
    And as our modern workplaces continue to change and 
employers increasingly rely on independent contractors, whether 
misclassified or not, accountability for violations of workers' 
basic civil rights can be diffused, and far too often many 
workers will be excluded from key antidiscrimination 
protections.
    So Congress must fulfill its responsibility to preserve and expand workers' civil rights by requiring transparency in the algorithms that are used to recruit, hire, and evaluate workers; by preventing employers from stripping workers of their antidiscrimination protections through misclassification; and by clarifying, updating, and better enforcing our landmark civil rights laws to meet the challenges workers face in the digital age.
    Technology has a tremendous amount of promise, but it comes with that ``if'': if used properly, if used correctly.
    Congress has an opportunity to incentivize innovation in 
workplace technologies that will put workers first and protect 
and uphold equal employment opportunities.
    And if we work together, we can shape a future in which businesses can and will continue to innovate, and workers can and will enjoy strong antidiscrimination protections. Simply put: the future of work will be what we make it.
    So thank you again. If there is no further business, 
without objection, the Committee stands adjourned.
    [Additional submissions by Chairwoman Bonamici follow:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
       
    Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias: https://www.upturn.org/reports/2018/hiring-algorithms/
    [Additional submission by Mr. Comer follows:]
    
[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]    
    
    
    [Questions submitted for the record and their responses 
follow:]

[GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT]


    [Whereupon, at 3:34 p.m., the subcommittee was adjourned.]