[House Hearing, 117 Congress]
[From the U.S. Government Publishing Office]





                                 


 
  HOLDING BIG TECH ACCOUNTABLE: LEGISLATION TO BUILD A SAFER INTERNET

=======================================================================

                             HYBRID HEARING

                               BEFORE THE

            SUBCOMMITTEE ON CONSUMER PROTECTION AND COMMERCE

                                 OF THE

                    COMMITTEE ON ENERGY AND COMMERCE
                        HOUSE OF REPRESENTATIVES

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                               __________

                            DECEMBER 9, 2021

                               __________

                           Serial No. 117-61
                           
                           


     Published for the use of the Committee on Energy and Commerce

                   govinfo.gov/committee/house-energy
                        energycommerce.house.gov
                        
                            _______
        
             U.S. GOVERNMENT PUBLISHING OFFICE 
56-985 PDF          WASHINGTON : 2024
                  
                        

                    COMMITTEE ON ENERGY AND COMMERCE

                     FRANK PALLONE, Jr., New Jersey
                                 Chairman
BOBBY L. RUSH, Illinois              CATHY McMORRIS RODGERS, Washington
ANNA G. ESHOO, California              Ranking Member
DIANA DeGETTE, Colorado              FRED UPTON, Michigan
MIKE DOYLE, Pennsylvania             MICHAEL C. BURGESS, Texas
JAN SCHAKOWSKY, Illinois             STEVE SCALISE, Louisiana
G. K. BUTTERFIELD, North Carolina    ROBERT E. LATTA, Ohio
DORIS O. MATSUI, California          BRETT GUTHRIE, Kentucky
KATHY CASTOR, Florida                DAVID B. McKINLEY, West Virginia
JOHN P. SARBANES, Maryland           ADAM KINZINGER, Illinois
JERRY McNERNEY, California           H. MORGAN GRIFFITH, Virginia
PETER WELCH, Vermont                 GUS M. BILIRAKIS, Florida
PAUL TONKO, New York                 BILL JOHNSON, Ohio
YVETTE D. CLARKE, New York           BILLY LONG, Missouri
KURT SCHRADER, Oregon                LARRY BUCSHON, Indiana
TONY CARDENAS, California            MARKWAYNE MULLIN, Oklahoma
RAUL RUIZ, California                RICHARD HUDSON, North Carolina
SCOTT H. PETERS, California          TIM WALBERG, Michigan
DEBBIE DINGELL, Michigan             EARL L. ``BUDDY'' CARTER, Georgia
MARC A. VEASEY, Texas                JEFF DUNCAN, South Carolina
ANN M. KUSTER, New Hampshire         GARY J. PALMER, Alabama
ROBIN L. KELLY, Illinois, Vice       NEAL P. DUNN, Florida
    Chair                            JOHN R. CURTIS, Utah
NANETTE DIAZ BARRAGAN, California    DEBBIE LESKO, Arizona
A. DONALD McEACHIN, Virginia         GREG PENCE, Indiana
LISA BLUNT ROCHESTER, Delaware       DAN CRENSHAW, Texas
DARREN SOTO, Florida                 JOHN JOYCE, Pennsylvania
TOM O'HALLERAN, Arizona              KELLY ARMSTRONG, North Dakota
KATHLEEN M. RICE, New York
ANGIE CRAIG, Minnesota
KIM SCHRIER, Washington
LORI TRAHAN, Massachusetts
LIZZIE FLETCHER, Texas

                           Professional Staff

                   JEFFERY C. CARROLL, Staff Director
                TIFFANY GUARASCIO, Deputy Staff Director
                  NATE HODSON, Minority Staff Director

            Subcommittee on Consumer Protection and Commerce

                        JAN SCHAKOWSKY, Illinois
                                  Chair
BOBBY L. RUSH, Illinois              GUS M. BILIRAKIS, Florida
KATHY CASTOR, Florida                  Ranking Member
LORI TRAHAN, Massachusetts           FRED UPTON, Michigan
JERRY McNERNEY, California           ROBERT E. LATTA, Ohio
YVETTE D. CLARKE, New York           BRETT GUTHRIE, Kentucky
TONY CARDENAS, California, Vice      LARRY BUCSHON, Indiana
    Chair                            NEAL P. DUNN, Florida
DEBBIE DINGELL, Michigan             GREG PENCE, Indiana
ROBIN L. KELLY, Illinois             DEBBIE LESKO, Arizona
DARREN SOTO, Florida                 KELLY ARMSTRONG, North Dakota
KATHLEEN M. RICE, New York           CATHY McMORRIS RODGERS, Washington 
ANGIE CRAIG, Minnesota                   (ex officio)
LIZZIE FLETCHER, Texas
FRANK PALLONE, Jr., New Jersey (ex 
    officio)

                             C O N T E N T S

                              ----------                              
                                                                   Page
Hon. Jan Schakowsky, a Representative in Congress from the State 
  of Illinois, opening statement.................................     2
    Prepared statement...........................................     4
Hon. Gus Bilirakis, a Representative in Congress from the State 
  of Florida, opening statement..................................     6
    Prepared statement...........................................     8
Hon. Frank Pallone, a Representative in Congress from the State 
  of New Jersey, opening statement...............................    10
    Prepared statement...........................................    12
Hon. Cathy McMorris Rodgers, a Representative in Congress from 
  the State of Washington, opening statement.....................    14
    Prepared statement...........................................    16

                               Witnesses

Jonathan Greenblatt, CEO and National Director, Anti-Defamation 
  League.........................................................    19
    Prepared statement...........................................    22
    Answer to submitted questions \3\
Nathalie Marechal, Ph.D., Senior Policy and Partnerships Manager, 
  Ranking Digital Rights.........................................    45
    Prepared statement...........................................    47
    Answer to submitted questions \3\
Rick Lane, CEO, Iggy Ventures LLC................................    52
    Prepared statement...........................................    54
    Answer to submitted questions................................   163
Josh Golin, Executive Director, Fairplay.........................    81
    Prepared statement...........................................    83
    Answer to submitted questions \3\
Jessica Rich, of Counsel, Kelley Drye, Former Director, Bureau of 
  Consumer Protection, Federal Trade Commission..................    94
    Prepared statement...........................................    96
    Answer to submitted questions................................   168
Imran Ahmed, CEO, Center for Countering Digital Hate.............   103
    Prepared statement...........................................   105
    Answer to submitted questions \3\

                           Submitted Material

H.R. 3451, the Social Media Disclosure and Transparency of 
  Advertisements Act of 2021 \1\
H.R. 3611, the Algorithmic Justice and Online Platform 
  Transparency Act \1\
H.R. 3991, the Telling Everyone the Location of data Leaving the 
  U.S. Act \1\
H.R. 4000, the Internet Application Integrity and Disclosure Act 
  \1\
H.R. 5439, the Kids Internet Design and Safety Act \1\
H.R. 6083, the Deceptive Experiences to Online Users Reduction 
  Act \1\
H.R. 6093, the FTC Whistleblower Act of 2021 \1\
Letter of July 16, 2020, by Raymond Kowacic, Assistant Director, 
  Office of Congressional Relations, U.S. Immigration and Customs 
  Enforcement, to Rep. Latta, submitted by Mr. Latta.............   150

----------
\1\ Legislation has been retained in committee files and also is 
  available at https://docs.house.gov/Committee/Calendar/
  ByEvent.aspx?EventID=114299.
Letter of December 8, 2021, from Cathy McMorris Rodgers, 
  Republican Leader, and Gus Bilirakis, Republican Leader, 
  Subcommittee on Consumer Protection and Commerce, to Lina Khan, 
  submitted by Mr. Latta.........................................   152
Letter of July 30, 2020, by Joseph J. Simons, Chairman, Federal 
  Trade Commission, to Mr. Latta, submitted by Mr. Latta.........   157
Letter of August 13, 2020, by Karas Gross, Associate Commissioner 
  for Legislative Affairs, U.S. Food and Drug Administration, to 
  Mr. Latta, submitted by Mr. Latta..............................   159
Article of May 2019, ``Online Tracking Study,'' submitted by Mr. 
  Armstrong \2\

----------
\2\ The information has been retained in committee files and also 
  is available at https://docs.house.gov/meetings/IF/IF17/
  20211209/114299/HHRG-117-IF17-20211209-SD008.pdf.
\3\ The witnesses did not answer submitted questions for the 
  record by the time of publication.


  HOLDING BIG TECH ACCOUNTABLE: LEGISLATION TO BUILD A SAFER INTERNET

                              ----------                              


                       THURSDAY, DECEMBER 9, 2021

                  House of Representatives,
  Subcommittee on Consumer Protection and Commerce,
                          Committee on Energy and Commerce,
                                                    Washington, DC.
    The subcommittee met, pursuant to call, at 11:34 a.m. in 
the John D. Dingell Room 2123 of the Rayburn House Office 
Building, and remotely via Cisco Webex online video 
conferencing, Hon. Jan Schakowsky (chairwoman of the 
subcommittee) presiding.
    Members present: Representatives Schakowsky, Rush, Castor, 
Trahan, McNerney, Clarke, Cardenas, Dingell, Kelly, Soto, Rice, 
Craig, Fletcher, Pallone (ex officio); Bilirakis (subcommittee 
ranking member), Latta, Bucshon, Dunn, Pence, Lesko, Armstrong, 
and Rodgers (ex officio).
    Also present: Representatives Burgess, Carter, Doyle, 
Duncan, Rochester, and Walberg.
    Staff Present: Parul Desai, FCC Detailee; Katherine Durkin, 
Policy Coordinator; Waverly Gordon, Deputy Staff Director and 
General Counsel; Jessica Grandberry, Staff Assistant; Tiffany 
Guarascio, Staff Director; Ed Kaczmarski, Policy Analyst; Zach 
Kahan, Deputy Director, Outreach and Member Services; Hank 
Kilgore, Policy Coordinator; Mackenzie Kuhl, Press Assistant; 
Jerry Leverich, Senior Counsel; David Miller, Counsel; Kaitlyn 
Peel, Digital Director; Chloe Rodriguez, Clerk; Andrew Souvall, 
Director of Communications, Outreach, and Member Services; 
Michele Viterise, Counsel; Michael Cameron, Minority Policy 
Analyst, Consumer Protection and Commerce, Energy, Environment; 
Emily King, Minority Member Services Director; Bijan 
Koohmaraie, Minority Chief Counsel; Tim Kurth, Minority Chief 
Counsel, Consumer Protection and Commerce; Brannon Rains, 
Minority Professional Staff Member, Consumer Protection and 
Commerce; and Michael Taggart, Minority Policy Director.
    Ms. Schakowsky. The Subcommittee on Consumer Protection and 
Commerce will now come to order.
    Today we will be holding a hearing entitled, ``Holding Big 
Tech Accountable: Legislation to Build a Safer Internet.''
    Due to the COVID-19 pandemic, members can participate in 
today's hearing either in person or remotely, via online video 
conference.
    Members participating in person must wear masks. Such 
members may remove their masks when they are recognized and 
speaking into a microphone.
    Staff and press who are present in the committee room must 
wear a mask at all times.
    And for members who are participating remotely, your 
microphones will be set on mute for the purpose of eliminating 
inadvertent background noise. Members participating remotely 
will need to unmute your microphones each time that you wish to 
speak. Please note that, once you are unmuted, anything that 
you say will be available in Webex, could be heard over the 
loudspeaker in the committee room, and is subject to being 
heard on the livestream and C-SPAN.
    Since members are participating from different locations, 
the way we are going to order the members will be by seniority 
within the subcommittee.
    Documents for the record can be sent to Mr. Kaczmarski at 
the email address that we have provided to the staff. And all 
documents will be entered into the record at the conclusion of 
the meeting.
    We will begin at this point with opening statements of 5 
minutes by the members, and the Chair now recognizes herself 
for 5 minutes.

 OPENING STATEMENT OF HON. JAN SCHAKOWSKY, A REPRESENTATIVE IN 
              CONGRESS FROM THE STATE OF ILLINOIS

    Bottom line, the Internet is not living up to its promises.
    At its birth in the previous century, the Internet promised 
more social connection, new communities and experiences, and 
more economic opportunity. But these benefits have come with 
very steep consequences and costs.
    Today's Internet is harming our children, our society, and 
our democracy. Five years ago, at the age of 13, Anastasia 
Vlasova joined Instagram, which quickly flooded her accounts 
with images of perfect bodies and perfect lives. She soon was 
spending 3 hours a day on the app, and developed an eating 
disorder. Despite public outcry, reporting as recently as 
yesterday confirmed that Instagram is still promoting pro-
anorexia accounts to teens. Ms. Vlasova 
actually did eventually quit using Instagram, but millions of 
children and teens remain powerless against the addictive and 
manipulative algorithms and ads.
    On January 6th, DC police officer Michael Fanone was 
grabbed, beaten, and tased, all the while being called a 
traitor to his country. The deadly insurrection was, at least 
in part, coordinated on platforms like Facebook, and 
exacerbated by algorithms that elevated and amplified election 
disinformation.
    For too long, Big Tech has acted without any real 
accountability. Instead, they give us excuses and apologies. 
The time for self-regulation is over. Today we will be 
discussing a number of pieces of legislation that will build a 
safer Internet.
    Last week I introduced the FTC Whistleblower Act with my 
colleague, Representative Trahan. This bill protects current 
and former employees who blow the whistle to the Federal Trade 
Commission from retaliation, and it incentivizes the disclosure 
of unlawful activity. It is a critical step toward a safer 
Internet.
    The Algorithmic Justice and Online Platform Transparency 
Act, from Representative Matsui, prohibits algorithms from 
discriminating against certain consumers.
    The KIDS Act, from Representatives Castor, Clarke, Trahan, 
and Wexton, bans online practices that exploit young people.
    The Social Media Data Act, from Representatives Trahan and 
Castor, provides transparency into how digital ads target 
consumers.
    The bipartisan DETOUR Act, from Representatives Blunt 
Rochester and Gonzalez, prohibits large online platforms from 
using ``dark patterns'' to trick consumers.
    So this subcommittee can create an Internet that is better 
and safer, one that protects consumers and our children, is 
transparent, and holds bad actors accountable.
    And with that I want to give a hearty welcome and a thank 
you to this wonderful panel that is here, including one, I 
guess, that is here remotely with us.
    Thank you very much.
    [The prepared statement of Ms. Schakowsky follows:]

               Prepared Statement of Hon. Jan Schakowsky
GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 

    Ms. Schakowsky. And the Chair now recognizes the ranking 
member of the subcommittee, my friend, Mr. Bilirakis, for his 5 
minutes of an opening statement.

 OPENING STATEMENT OF HON. GUS BILIRAKIS, A REPRESENTATIVE IN 
               CONGRESS FROM THE STATE OF FLORIDA

    Mr. Bilirakis. Thank you, Madam Chair. I appreciate it so 
very much. Good morning to everyone.
    Ms. Schakowsky. Are you on?
    Mr. Bilirakis. Yes, yes, I am.
    Ms. Schakowsky. Pull it close.
    Mr. Bilirakis. Yes, yes. I want to thank my colleagues for 
their interest in improving transparency and increasing 
protections online.
    There are a lot of initiatives under consideration today, 
and all of them raise issues that deserve our attention.
    Legislation brought forth by my friends in the majority 
would require the FTC to issue new rules and regulations, and 
would grant the FTC additional enforcement tools to reduce dark 
patterns, discriminatory algorithms, and, as you said, Madam 
Chair, harmful content directed at children. It would also 
grant new rights for consumers to take control of their data. I 
hope that means this is a precursor, and not a substitute--and 
we have discussed this with the chairperson--for passing a 
national privacy and data security law. That is the best and 
most comprehensive way Congress can protect our constituents 
through these means. That is my opinion.
    I think many of the issues we will be discussing today can 
and should be a part of that larger privacy and data security 
discussion, and I sincerely hope my colleagues will join me in 
that effort. I will say to my fellow colleagues that my door is 
always open, and we have a great relationship with the 
chairperson. Please don't hesitate to come and talk to me, and 
give us some input on this particular issue.
    Earlier last month, Republican Leader Rodgers released 
draft legislative language for the Control Our Data Act, or 
CODA, which would create one national standard for privacy and 
data security, establish clear rules of the road for businesses 
to be able to comply, and give every American equal data 
protections, regardless of the location of their home. I, for 
one, certainly want to see rules that are clear and easy to 
understand for my constituents, and I am sure you do, too.
    I also want to ensure that the FTC Bureau of Privacy that 
was included in our proposal has the appropriate staff and 
resources to enforce the national law.
    I hope the panel agrees today that there are elements of 
all these bills that can be incorporated in some fashion in 
this framework to ensure we leave behind a legacy that will 
benefit every American. That is the goal.
    We must also take seriously the threat from China, and 
moving forward on these two bills today is an important step 
towards holding them accountable.
    The legislation before us will provide Americans with 
greater transparency into the application and websites they use 
online.
    H.R. 3991, the TELL Act, led by Representative Duncan, 
would inform users if their information is stored in China, and 
whether the information is accessible by the CCP or a Chinese 
state-owned entity.
    H.R. 4000, the Internet Application ID Act, led by 
Representative Kinzinger, would require websites and owners or 
distributors of mobile applications that are located in China 
or owned by the CCP to disclose that location or 
ownership to users.
    Both bills are very reasonable, as far as I am concerned.
    For those asking why we didn't invite a witness with ties 
to China to share their views at today's hearing, you should 
know we absolutely did. We used one of our witness slots to 
invite TikTok to testify. But, unfortunately, they declined the 
invitation.
    Madam Chair, I hope we can work together to invite them 
before the subcommittee in the near future, just as Senators 
Blumenthal and Blackburn did in the Senate. There were many 
questions left unanswered in that hearing in the Senate last 
month on the stewardship of their platform. And I am confident 
that the panel today could shed light on our shared concerns.
    Thank you so very much for being here. There are very 
important matters our subcommittee is examining today, so I 
again thank the Chair for holding this hearing, and I thank the 
full committee ranking member and the witnesses for being here 
today. We really appreciate it.
    I look forward to your testimony on these bills, and other 
proposals we have publicly circulated for this committee's 
review, and I yield back. Thank you.
    [The prepared statement of Mr. Bilirakis follows:]

                 Prepared Statement of Hon. Gus Bilirakis
GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    Ms. Schakowsky. Thank you, Mr. Bilirakis. And before I 
invite the chairman and ranking member of the full committee 
for their opening statements, let me just say I am very excited 
and optimistic. We have had a real good history of working 
together in this subcommittee to get legislation not only 
introduced but passed.
    And I know last week we also sent you a proffer on a 
privacy bill. Again, I am very confident that we are going to 
be able to work together and get that done.
    And I agree with the urgency that you are projecting today, 
and share it with you, and look forward to moving ahead 
rapidly.
    And now let me recognize the great Chair of this full 
committee, Frank Pallone, for his opening statement.

OPENING STATEMENT OF HON. FRANK PALLONE, Jr., A REPRESENTATIVE 
            IN CONGRESS FROM THE STATE OF NEW JERSEY

    Mr. Pallone. Thank you, Chairwoman Schakowsky. Today's 
hearing is the second of two hearings on legislative reforms to 
hold social media companies accountable.
    And following last week's hearing examining possible 
reforms of Section 230 of the Communications Decency Act, 
today's panel will discuss consumer protection-focused 
legislation that aims to hold these companies accountable by 
enhancing transparency and promoting online safety.
    So these legislative hearings come after years of repeated 
bipartisan calls for online platforms to change their ways. 
Unfortunately, instead of meaningfully addressing the serious 
harms that these platforms can inflict on the American people 
and our children, social media companies continue to make minor 
changes only after negative press coverage, or in preparation 
for an executive testifying before Congress, and they also 
refuse to become more transparent.
    In fact, we only actually learn what is really going on 
inside these massive corporations when a whistleblower steps 
forward, and those courageous actions are becoming exceedingly 
difficult. And even more disturbing, we are now seeing 
instances where these platforms are publicly shutting down 
efforts at transparency.
    So since these companies are clearly not going to change on 
their own, Congress has to act. And today we will discuss seven 
bills that target different parts of the social media ecosystem 
to make platforms safer for users.
    And one of the best ways to make these companies more 
accountable is to make them more transparent. We will discuss 
legislation that grants academic researchers and the Federal 
Trade Commission access to ad libraries, which will help to get 
us the data we need on how these companies are targeting users.
    Another bill will prohibit the use of algorithms that 
discriminate based on race, age, gender, ability, and other 
protected characteristics, or methods that manipulate users 
into providing consent when they wouldn't otherwise. And this 
legislation will help prevent people using social media from 
losing rights protected under the law.
    We are considering a bill that will protect whistleblowers 
like former Facebook employee Frances Haugen, who testified at 
last week's legislative hearing. Whistleblowers help bring 
truth to light, and are another way of helping ensure that 
companies are held accountable.
    And finally, we will examine how to better protect our 
children online by banning certain design features directed at 
children, and prohibiting the amplification of harmful content 
that is targeted at them. Legislative measures that protect our 
children are critically important, and have bipartisan support 
on this committee.
    Now, Republicans and Democrats also agree that we do not 
want to see our data or our children's data surveilled or used 
in a manner that could risk their safety. And that is why we 
are also discussing bills that attempt to force websites and 
apps to be transparent about their interactions with China. We 
all understand the danger the Chinese Government poses to the 
United States economy and national security, and we must take 
meaningful steps to address that danger from China.
    After multiple hearings, letters, and discussions with 
stakeholders, the members of this committee have developed 
legislation to address the harms caused by Big Tech. There is 
no silver bullet to fix the Internet. The proposals that we are 
discussing today are important steps to improving the online 
ecosystem.
    Another part of tech accountability is protecting people's 
privacy, and the chairwoman already mentioned that, 
significantly, because she is so much involved with it. But I 
think every member of this committee agrees that more must be 
done on privacy. And that is why we have been working since 
last Congress on a bipartisan staff discussion draft. Updates 
to that draft were made last week to address stakeholder 
feedback, and have been shared with the minority.
    I continue to believe that there is a bipartisan path 
forward on privacy, and our work continues to get there. But 
today we are focused on proposals to make these platforms more 
transparent and safer.
    So I just thank the witnesses, and thank Chairwoman 
Schakowsky for being out front on so many of these issues, 
particularly the privacy issue, which I know is not an easy 
one, but you are determined. And I yield back.
    [The prepared statement of Mr. Pallone follows:]

             Prepared Statement of Hon. Frank Pallone, Jr.
GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 

    Ms. Schakowsky. The gentleman yields back. And now the 
Chair recognizes Mrs. Rodgers, the ranking member of the full 
committee, for 5 minutes for her opening statement.

      OPENING STATEMENT OF HON. CATHY McMORRIS RODGERS, A 
    REPRESENTATIVE IN CONGRESS FROM THE STATE OF WASHINGTON

    Mrs. Rodgers. Thank you, Madam Chair. And to our witnesses, 
thank you for being here.
    Last week we discussed many examples of Big Tech companies 
failing to be good stewards of their platforms. Big Tech has 
used its power to censor Americans, control what we see, and 
manipulate us through the use of harmful algorithms. Big Tech 
must be held accountable, and that is why, from day one of this 
Congress, Republicans have been exploring legislative solutions 
through our Big Tech accountability platform.
    As a part of our platform, we released a number of 
proposals to focus on content moderation, transparency, and 
protecting our kids online, all issues that are relevant to 
today's hearing.
    My proposal, which I am leading alongside my good friend, 
Congressman Jim Jordan, narrowly amends Section 230 to protect 
free speech. Under our proposal, Big Tech will be held 
accountable for censoring constitutionally-protected speech. 
Big Tech will no longer be able to exploit the ambiguity and 
the discretion we see in the current law. Big Tech will be more 
responsible for content they choose to amplify, promote, or 
suggest. Big Tech will be forced to be transparent about their 
content decisions, and conservatives will be empowered to 
challenge Big Tech's censorship decisions.
    Republican policies would hold Big Tech accountable for 
their content moderation practices, and encourage transparency 
on enforcement decisions, especially when it comes to illegal 
drugs, counterfeit and stolen products, terrorism, doxing, 
child pornography and trafficking, cyberbullying, and revenge 
porn.
    We are also looking for new ways to improve cooperation 
with law enforcement, while upholding our civil liberties.
    I am pleased to see some of these ideas presented today in 
the package that the Democrats are leading on. It is 
unfortunate that the majority decided not to use this hearing 
to discuss privacy, given many of these bills include 
provisions directly related to the collection and use of data, 
and would best be addressed in the context of a comprehensive 
privacy and data security framework.
    The proposals also include language on protecting data from 
wrongful purposes, other references to the Children's Online 
Privacy Protection Act, COPPA, and a data portability provision.
    Despite our interest in continuing our work from last 
Congress on a bipartisan privacy framework, we have yet to have 
a hearing, let alone a markup. And Americans are desperate for 
our privacy and data security bill. It is difficult to address 
the goals discussed today without that national privacy 
framework and the data security bill. We can continue to talk, 
but we need a national privacy and data security bill.
    Worse yet, the Democrats' tax-and-spending spree, the 
reconciliation package before the Senate right now, includes 
dramatic increases in funding and authority for the Federal 
Trade Commission, the FTC, that never received bipartisan 
consensus. The majority suggested that this is a way to protect 
Americans' personal data. It couldn't be further from the truth. 
It includes no privacy and data security framework to implement 
or enforce.
    These bills will add to the confusion in the marketplace by 
creating conflicting rules on how data is used, collected, and 
shared. This confusion only allows Big Tech to become more 
powerful, and it harms small businesses.
    The question I have today is how do these bills fit into a 
comprehensive privacy and data security framework, like some of 
the proposals that the Republicans have released publicly?
    Let me also share another reason that I am concerned, which 
I think we all agree on, and that is the need for a national 
standard because of Big Tech's troubling relationship with the 
Chinese Communist Party, which is increasingly being exposed. 
Big Tech 
has not been responsible with the data that they have 
collected, or who they share it with.
    I am pleased and I am grateful that the majority included 
two bills, related bills, in the hearing today to help address 
that threat, one by Mr. Duncan and one by Mr. Kinzinger.
    Big Tech companies like TikTok have an incredible amount of 
access and control over our data and information supply chain. 
Americans deserve to know if their personal information is 
safe, and to what extent it is being accessed by the CCP. It is 
our duty to uphold American values like free speech, and ensure 
that the United States of America continues to lead in 
cutting-edge technology to beat China. That starts by 
establishing a national privacy and data security framework and 
holding Big Tech accountable.
    I look forward to hearing from the witnesses today.
    [The prepared statement of Mrs. McMorris Rodgers follows:]

           Prepared Statement of Hon. Cathy McMorris Rodgers
GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 

    Mrs. Rodgers. I yield back, Madam Chair.
    Ms. Schakowsky. The gentle lady yields back.
    And I want to remind all members of the subcommittee that, 
pursuant to committee rules, all members' written opening 
statements shall be included and made part of the record.
    And now I would like to introduce our witnesses for today's 
hearing.
    Jonathan Greenblatt is the CEO and national director for 
the Anti-Defamation League.
    Nathalie Marechal is the senior policy and partnerships 
manager at Ranking Digital Rights.
    Rick Lane is the CEO of Iggy Ventures.
    Josh Golin is the executive director of Fairplay.
    Jessica Rich is of counsel at Kelley Drye.
    And Imran Ahmed is the CEO of the Center for Countering 
Digital Hate.
    I will recognize each of you for 5 minutes, but first I 
want to explain the lights that are in front of you, just to 
make sure that you know.
    When your time begins, the light will be green. When there 
is 1 minute left, there will be a yellow light. And I hope at 
that point you will start wrapping up, so that we can keep as 
close as we can to 5 minutes.
    And we will begin now with Mr. Greenblatt.
    You are now recognized for 5 minutes.

 STATEMENT OF JONATHAN GREENBLATT, CEO AND NATIONAL DIRECTOR, 
ANTI-DEFAMATION LEAGUE; NATHALIE MARECHAL, PH.D., SENIOR POLICY 
 AND PARTNERSHIPS MANAGER, RANKING DIGITAL RIGHTS; RICK LANE, 
    CEO, IGGY VENTURES LLC; JOSH GOLIN, EXECUTIVE DIRECTOR, 
    FAIRPLAY; JESSICA RICH, OF COUNSEL, KELLEY DRYE, FORMER 
    DIRECTOR, BUREAU OF CONSUMER PROTECTION, FEDERAL TRADE 
COMMISSION; AND IMRAN AHMED, CEO, CENTER FOR COUNTERING DIGITAL 
                              HATE

                STATEMENT OF JONATHAN GREENBLATT

    Mr. Greenblatt. Thank you, Madam Chair Schakowsky, Ranking 
Member Bilirakis, and members of the subcommittee. Good 
morning. It is a privilege and an honor for me to be here 
today.
    ADL is the oldest anti-hate group in America. We have been 
fighting anti-Semitism and all forms of bigotry for more than 
100 years, and we have been tracking online hate since the days 
of dial-up. This work includes partnering with law enforcement 
to help prevent online threats from mutating into offline 
incidents. We work with authorities at all levels. In the past 
11 months, we have provided the FBI with more than 1,000 
actionable tips. Our 25 offices across the country engage 
directly with individuals and institutions affected by hate.
    In 2017 ADL launched the Center for Technology and Society 
to double down on our efforts to fight online hate. We were the 
first civil rights group with an operation right in the heart 
of Silicon Valley, and it is staffed not by longtime non-profit 
professionals, but by software engineers, product managers, 
data scientists, and computer experts, all hired from industry. 
We conduct analysis, publish research, build technology, and 
provide recommendations to policymakers like yourselves and 
industry leaders.
    Today there is no distinction between online and offline 
lives. When we say that Facebook is the front line in fighting 
hate, I mean that literally. We have seen over and over again 
the way that hateful content online leads to violence in our 
communities offline. Poway, El Paso, Pittsburgh, these targeted 
mass shootings were motivated by extremist conspiracy theories 
that were spawned and spread on social media.
    In addition to these tragedies, online hate affects the 
everyday lives of millions of Americans. Our research has found 
that 41 percent of users report experiencing online hate and 
harassment. According to ADL's most recent analysis, 75 percent 
of those harassed report that it happened to them on Facebook. 
That is nearly three times the percentage on any other 
platform.
    And make no mistake, all of them are highly profitable 
companies. So this isn't a resource problem, it is a 
responsibility problem.
    Just today, ADL released new research demonstrating how 
easy it is to find White supremacist, accelerationist content 
on Instagram, less than 24 hours after the CEO sat at another 
table just like this, and said they were cleaning up their 
mess.
    But these platforms neglect safety because, first 
and foremost, they are exempt from liability, due to the 
loophole of Section 230. Now, I know that isn't the topic of 
today's hearing, but make no mistake, Section 230 must be 
changed to force the companies to play by the same rules that 
every other media company on the landscape operates by today.
    It is just not a matter of free speech. It is simply being 
held accountable in courts of law, when the platforms aid and 
abet unlawful, even lethal conduct in service of their growth 
and revenue.
    Tech companies are complicit in the hate and violence on 
their platforms because, if it bleeds, it leads, and it feeds 
their business model and their bottom line. Hate speech, 
conspiracy theories, they are amplified by the algorithms, 
nudged to the top of their news feeds, and they addict users 
like a narcotic driving engagement, which, in turn, increases 
their profits.
    With no oversight and no incentives beyond increasing 
revenue, tech companies will continue to do whatever they can, 
whatever it takes to optimize engagement, regardless of the 
consequences. This just can't continue.
    If not for courageous whistleblowers like Frances Haugen, 
we wouldn't have the hard evidence to prove that Facebook 
knowingly--knowingly--is mainstreaming extremism, inciting 
violence through its algorithms and fracturing societies around 
the world.
    What if other tech employees felt empowered and protected 
to expose wrongdoing when they saw it? That is 
why the protections, Congresswoman Schakowsky, in your FTC 
Whistleblower Act are so crucial.
    If platforms have no meaningful motivation to fix the 
harmful algorithms that amplify hate, they won't do it. That is 
why the Algorithmic Justice and Online Platform Transparency 
Act, which would protect consumers from harmful and 
discriminatory AI systems, is really long overdue, so we 
applaud that legislation as well.
    Finally, to stay ahead of the curve, we have got to 
prioritize research. In August, ADL Belfer fellow and NYU 
Professor Laura Edelson was de-platformed on Facebook hours 
after the company realized that she and her team were studying 
the role that Facebook may have played in the leadup to the 
January 6th insurrection. Platforms should not be able to 
thwart important third-party research at their whim. Bills like 
the Social Media Data Act would ensure that academics can study 
platforms to better inform the public.
    Look, there are no silver bullets. There is no one-size-
fits-all solution to repairing our internet, but there is a lot 
you can do right now to take action. I have highlighted three 
bills, and I am happy to talk about them and others in the Q 
and A.
    But members of the committee, let me conclude by urging you 
to remember that what happens online has a real impact on our 
lives. The status quo directly threatens our kids, our 
communities, and our country. Now is the time for you to 
legislate and act.
    Thank you. I look forward to your questions.
    [The prepared statement of Mr. Greenblatt follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    
    Ms. Schakowsky. I thank the gentleman. And now we have, 
remotely with us today, Dr. Marechal.
    And you are recognized now for 5 minutes.

                 STATEMENT OF NATHALIE MARECHAL

    Dr. Marechal. Thank you, Congresswoman. Good morning, and 
thank you to all of you for inviting me to testify today.
    I am Nathalie Marechal, senior policy and partnerships 
manager at Ranking Digital Rights.
    As Congress crafts legislation to hold Big Tech accountable 
for its negative impacts on society, I urge you to focus on 
upstream structural reforms by regulating online advertising, 
mandating transparency and research access to data, and 
encouraging the Securities and Exchange Commission to use its 
existing regulatory authority to do what their shareholders are 
unable to do: get Big Tech to comply with the same laws as all 
other public companies, and to improve their corporate 
governance.
    The tenor and substance of congressional hearings on the 
tech industry have come a long way in the past few years, thanks 
to a growing recognition that the harms users experience 
through social media platforms are connected to business models 
centered on maximizing revenue from targeted advertising. This 
business model incentivizes rapid growth; anti-competitive 
behavior like predatory acquisitions of would-be competitors 
and vertical integration across the ad tech value chain; mass 
commercial surveillance and data collection without our 
knowledge or consent; reliance on automation to perform tasks 
that actually require human nuance and contextual judgment to 
be done correctly; and consolidation of corporate power that 
thwarts any internal attempt at reform.
    The company now known as Meta is the most brazen example of 
these dynamics. But the basic point that how a company makes 
money plays a determinative role in its products and its behavior 
is true across the tech sector and beyond. A business model 
that relies on the violation of rights will necessarily lead to 
products that create and amplify harms.
    So what should Congress do about it? First, regulate the 
online advertising industry. Transpose the basic 
principles that govern offline advertising to the online world, 
and pursue antitrust enforcement in the ad tech sector. These 
measures will directly address consumer and civil rights harms 
related to privacy, discrimination, and fraud in online 
advertising. They will also shift the incentive structures that 
contribute to product design and corporate decisions that harm 
consumers and destabilize democracies around the world.
    Further, increased competition in the ad tech market will 
undercut the Alphabet and Meta duopoly, and enable greater 
accountability for these two mega-corporations that often 
behave as though they are above the law.
    Second, create the conditions for evidence-based policy-
making by mandating specific types of transparency for 
information that can safely be made public, and by creating 
mechanisms for qualified, trustworthy, industry-independent 
researchers to verify companies' claims about users' 
experiences, and expand knowledge and understanding about how 
these platforms impact societies and democracy around the 
world.
    The RDR methodology and the Santa Clara Principles on 
Transparency and Accountability in Content Moderation both 
provide granular recommendations for the data that companies 
should disclose publicly.
    And third, Congress should encourage the SEC to use its 
authority to do what shareholders have been trying to do, and 
have been unable to do for reasons I will explain: get Big Tech 
to comply with the same laws as all other publicly-traded 
companies. Numerous whistleblower disclosures to the SEC 
indicate that several Big Tech companies are violating 
securities laws. But because of their dual-class share 
structure, shareholders are unable to hold corporate management 
accountable. When the CEO is also the chair of the board of 
directors, this means that person is accountable to no one.
    I am talking about Mark Zuckerberg. No one should have this 
much power.
    The SEC must address the private market exemptions that 
have allowed Big Tech companies to become so large, with such 
concentrated governance. Because Meta was able to obtain 
significant private market funding before going public, the 
company was able to impose this dual-class share structure, and 
a governance structure that allows Mark Zuckerberg to 
unilaterally make decisions that impact billions of people 
without any accountability. This loophole must be closed so 
that shareholder democracy of the future Facebooks can take 
hold.
    To address the excesses of today's Big Tech firms, the SEC 
should issue an enforcement policy declaring that it will not 
grant bad actor waivers to, and will seek increased enforcement 
penalties for, companies with class B shares, or those in which 
a single person serves as CEO and chair of the company's board 
of directors.
    The bills under consideration today all seek to shine a 
light on Big Tech's secretive business practices, and hold them 
accountable when they harm their users, their competitors, or 
society more broadly, whether through deliberate action or 
through their failure to proactively identify and mitigate 
potential harms ahead of time.
    The Republican Big Tech Accountability Platform also 
contains many provisions that Ranking Digital Rights has long 
called for: transparency into how Big Tech develops its content 
policies and regular, periodic disclosures about content policy 
enforcement, including the types of content taken down, and 
why, and clearly understood appeals processes.
    Big Tech accountability is not a partisan issue. Americans 
may disagree about how social media companies should govern 
content on their platforms, but there is strong bipartisan 
agreement that Big Tech is not above the law and that, whatever 
companies do, they should be transparent about it, and they 
should be accountable to their users, their shareholders, and 
the American people. Legislation should start there.
    Thank you again for the opportunity to testify today, and I 
look forward to your questions.
    [The prepared statement of Dr. Marechal follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    
    Ms. Schakowsky. Thank you so much. And now let me recognize 
Mr. Lane.
    You are recognized for 5 minutes.

                     STATEMENT OF RICK LANE

    Mr. Lane. Chair Schakowsky, Ranking Member Bilirakis, 
Chairman Pallone, Ranking Member McMorris Rodgers, and members 
of the subcommittee, thank you for inviting me to testify. My 
name is Rick Lane, and I am the CEO of a strategic advisory 
firm, Iggy Ventures. I also volunteer my time to help child 
safety organizations combat sex trafficking and other online 
threats to children.
    Over the past 30 years I have had the opportunity to work on 
almost every major piece of technology-related, consumer 
protection, privacy, and cybersecurity legislation that has 
moved through Congress. I testify today in my personal 
capacity.
    Building a safer, more secure, and sustainable internet 
require Congress to focus on four main issues: one, reforming 
Section 230; two, creating more transparency in the way 
internet platforms operate, while protecting internet users' 
privacy; three, restoring access to the WHOIS data; and four, 
updating the Children's Online Privacy Protection Act. These 
issues do not necessarily need to be addressed in a single 
comprehensive piece of legislation, but they should be 
discussed in a comprehensive fashion. All the pieces must fit 
together.
    I recognize that Section 230 reform is the province of 
another subcommittee, and was the focus of last week's hearing. 
I would be remiss, however, if I didn't take this opportunity 
to make a few observations on the topic.
    I believe we need to restore to platforms the ordinary duty 
of care that would apply but for courts' current and overbroad 
application of Section 230. Social media companies are rife 
with offers to sell illegal drugs, yet the former CEO of TikTok 
stated at a 2020 technology event that he had never been told 
of illicit drug transactions on the platform, and doubted their 
very existence. That was a surprising statement, since others 
knew, including the drug dealers that were using TikTok's 
platform.
    TikTok could also increase the threat of espionage and 
cyber attacks, in light of the influence the Chinese Government 
has over both it and ByteDance, the Chinese company that owns 
TikTok. Indeed, we are confronted with a social networking site 
that is, A, susceptible to manipulation by a Communist regime 
with a record of human rights abuses; B, growing more rapidly 
than any U.S. competitor; and C, collecting massive amounts of 
data on our youngest and most easily influenced demographic, in 
an arms race to develop more sophisticated artificial 
intelligence.
    It is for these reasons that both H.R. 3991, the Telling 
Everyone the Location of data Leaving the U.S. Act, introduced 
by Rep. Duncan, and H.R. 4000, the Internet Application ID Act, 
introduced by Rep. Kinzinger, are so important. These two 
bills, together, will provide the American people with the 
information they need to know exactly where these types of 
companies are headquartered, where their data is being stored, 
and to fully understand the risks they and their children are 
taking when using these apps, apps that can be used to 
undermine our democracy.
    Another transparency issue that Congress needs to address 
is access to accurate WHOIS domain name registration data, which 
contains basic contact details for holders of internet domains, 
and is fundamental to protecting consumer privacy, promoting 
lawful commerce, ensuring public safety, and protecting our 
national security. Indeed, a Department of Justice report 
states that the first step in online reconnaissance often 
involves use of ICANN's WHOIS database.
    In 2018, registries and registrars like GoDaddy, VeriSign, 
and Namecheap increasingly began restricting access to WHOIS 
data, based on an overbroad application of the European Union's 
GDPR. Yet after almost five years of ``trying to fix the WHOIS 
GDPR 
problem,'' ICANN has failed. The time has, therefore, come for 
this committee and Congress to pass legislation requiring 
domain name registries and registrars to once again make WHOIS 
information available, at zero cost to consumers.
    No other area of consumer protection is more important than 
establishing reasonable policies to protect children in the 
marketplace. This is especially true in the area of online 
privacy and market-dominant digital payment apps and debit 
cards that target children, and collect and exploit a shocking 
amount of their data. COPPA, enacted in 1998, creates an opt-in 
parental consent privacy regime for websites directed at 
children under 13.
    By contrast, Gramm-Leach-Bliley, enacted in 1999, created 
an opt-out privacy regime for financial institutions. That 
privacy space between COPPA and GLBA creates a FinTech child 
privacy protection gap in existing law. This gap is 
especially harmful as we move toward a cashless society, a 
trend accelerated by the pandemic.
    The good news is that one FinTech company with which I am 
involved, Rego Payment, offers the only COPPA-compliant digital 
wallet.
    Thank you again for giving me this opportunity to 
participate today. I look forward to your questions, and to 
continuing to work with you and your staff. We must all work 
together to fix these important problems because, at the end of 
the day, it is the right thing to do.
    Thank you.
    [The prepared statement of Mr. Lane follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    
    Ms. Schakowsky. Thank you.
    And now, Mr. Golin, the floor is yours for 5 minutes.

                    STATEMENT OF JOSH GOLIN

    Mr. Golin. Thank you, Chair Schakowsky, Ranking Member 
Bilirakis, and distinguished members of the subcommittee for 
holding this important hearing. My name is Josh Golin, and I am 
executive director of Fairplay, the leading independent 
watchdog of the children's media and marketing industries.
    Through corporate campaigns and strategic regulatory 
filings, we have changed the marketing and data collection 
practices of some of the world's biggest companies. Currently, 
we are leading a campaign to stop Facebook from launching a 
children's version of Instagram. And last week, with other 
leading advocates, we launched Design with Kids in Mind, a 
campaign to demand regulations that require online operators 
put kids' interests first when designing their platforms.
    Frances Haugen has shone a critical spotlight on 
Instagram's harmful impacts on teens, and Facebook's callous 
disregard for children's well-being. But it would be a mistake 
to view her revelations as problems limited to Facebook and 
Instagram. Compulsive overuse, exposure to harmful content, 
cyberbullying, harms to mental health, and the sexual 
exploitation of children are industry-wide issues that demand 
systemic solutions from Congress.
    To put it plainly, the unregulated business model for 
digital media is fundamentally at odds with children's well-
being.
    Digital platforms are designed to maximize revenue and, 
therefore, engagement because the longer they can capture a 
user's attention, the more money they make by collecting data 
and serving ads. As a result, children are subject to 
relentless pressure and manipulative design that pushes them to 
use and check platforms as often as possible. The harms young 
people--this harms young people in several ways, including 
encouraging the overuse of social media and displacing critical 
online activities like sleep, exercise, and face-to-face 
interactions. Overuse can also lead to isolation from secure 
family relationships, and reduced interest in academic 
achievement and extracurricular activities, allowing for-profit 
tech companies to shape children's character, habits, and 
future.
    Design choices used to maximize engagement are also 
harmful, because they exploit young people's desire for social 
approval, and their natural tendency towards risk-taking. 
Displays of likes and follower counts provide an instant 
snapshot of whose profiles and posts are popular. Children 
quickly learn that the way to improve these metrics is to post 
risque and provocative content, creating a permanent record of 
their youthful indiscretions, and increasing their risk of 
cyberbullying and sexual exploitation.
    Platforms also harm young people by personalizing and 
recommending content most likely to keep them engaged. One 
former YouTube engineer observed that recommendation algorithms 
are 
designed to optimize watch time, not to show content that is 
actually good for kids. This means that, on platforms like 
Instagram and TikTok, teens interested in dieting will be 
barraged with content promoting eating disorders, and a 
depressed user will be shown content promoting self-harm.
    Nearly every concern that parents, public health 
professionals, and children themselves have about digital media 
platforms can be traced to deliberate design choices. It 
doesn't have to be this way. Apps and online platforms could be 
built, instead, to reduce risk and increase safeguards for 
children. But that won't happen without significant action from 
Congress.
    The only Federal law that protects children online was 
passed 23 years ago, long before smartphones, Instagram, and 
YouTube even existed. Congress's continued inaction, combined 
with a lack of enforcement at the FTC, has emboldened Big Tech 
to develop an exploitative business model without considering 
or mitigating its harmful effects on children and teens. It is 
no wonder that polls consistently show that parents want 
Congress to do more to protect children online.
    We know the key legislative solutions. The KIDS Act, which 
we will discuss today, would prohibit companies from deploying 
design techniques like autoplay, displays of quantified 
popularity, and algorithmic recommendations that put children 
and teens at risk. The Privacy Act would expand privacy 
protections to teens, ban harmful uses of data, like 
surveillance advertising, and require platforms to make the 
best interests of children a primary design consideration. 
Together, these bills would create the safeguards children 
need, and transform the online experience for young people.
    Over the last year I have watched several hearings like 
this one, and was heartened to hear Members of Congress speak, 
first and foremost, not as Republicans and Democrats, but as 
parents and grandparents with firsthand knowledge of what is at 
stake.
    But the American people need more than your understanding 
and justified anger at companies like Facebook. Big Tech is 
banking on the fact that partisan divisions will keep you from 
taking action. I hope you will prove them wrong, and advance 
legislative solutions that better protect children while they 
are online, and make it easier for them to disconnect and 
engage in the offline activities they need to thrive.
    There is simply too much at stake for children and their 
futures to allow the status quo to continue.
    Thank you for having me here today, and I look forward to 
your questions.
    [The prepared statement of Mr. Golin follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    
    Ms. Schakowsky. Well, thank you.
    And now, Ms. Rich, you are recognized for 5 minutes.

                   STATEMENT OF JESSICA RICH

    Ms. Rich. Chair Schakowsky, Ranking Member Bilirakis, and 
members of this subcommittee, I am Jessica Rich, of counsel at 
Kelley Drye, and a distinguished fellow at Georgetown 
University. I am pleased to be here today testifying on holding 
Big Tech accountable, and building a safer internet. My remarks 
today are my own, based on my years of government service.
    My background is as a law enforcement attorney and 
official. I worked for over 26 years at the Federal Trade 
Commission, the last 4 as director of its Bureau of Consumer 
Protection. Before becoming director, I was the first and 
longtime manager of the FTC's privacy program. I have supported 
stronger data privacy and security laws for over 20 years. The 
focus of my testimony today is on that very issue: privacy.
    While I understand that privacy is not the chief focus of 
this hearing, I am highlighting it today because the need for 
privacy legislation, Federal privacy legislation, has never 
been stronger. This hearing is addressing many important 
issues, some of which are closely related to privacy. But 
passing a strong and comprehensive Federal privacy law 
is one of the most important things Congress can do to hold Big 
Tech accountable, and build a safer internet.
    Consumers, businesses, regulators, and the marketplace as a 
whole, we all need a Federal privacy law.
    First, survey upon survey shows that consumers are 
concerned about their privacy, and believe they have little 
control over how companies collect, use, and share their 
personal information. They continue to be the victims of 
massive data breaches. Data collection and abuses are 
everywhere. And companies make decisions affecting them every 
day using algorithms and profiles with built-in assumptions 
and biases.
    You can't educate consumers about their rights, because 
those rights depend on the market sector, the state they are 
in, and the type of company and the data involved. Often, 
consumers have no
rights at all. And consumers can't be expected to read hundreds 
of privacy policies a day from companies they have never heard 
of. Consumers need a clear and consistent privacy law that they 
can understand and rely on every day, no matter where they are 
or what they are doing.
    Businesses are similarly confused about privacy laws in 
this country. At the Federal level, we have the FTC Act, as 
well as dozens of sector-specific laws like COPPA, HIPAA, and 
the Fair Credit Reporting Act. We also now have three 
comprehensive state laws, with more on the way.
    Honest companies spend enormous time and money to navigate 
all these laws, while the unscrupulous exploit the gaps and the 
loopholes. Meanwhile, large companies have benefited. That 
includes the platforms, because they can afford the cost of 
compliance, and because many existing laws favor large entities 
that can keep their operations in house, and not share data 
with third parties.
    In sum, businesses too need a clear and consistent Federal 
privacy law to help them navigate a difficult regulatory 
environment, and create a more level playing field.
    But there is more. For over 20 years, the FTC, my former 
agency, has overseen privacy using a law that is just not up to 
the task: the FTC Act. While the FTC has accomplished a lot, 
this law does not establish clear standards for everyone to 
follow before problems occur, and there are big gaps in its 
protections, creating uncertainty for the marketplace.
    Many in Congress on both sides of the aisle have criticized 
the FTC for these problems: too strong, too weak, too much, too 
little. But, with respect, it is Congress that needs to fix the 
problems by passing a law with clear standards for the FTC and 
the public.
    Finally, we now, all of us, understand that concerns 
surrounding the use of personal data reach well beyond 
traditional notions of privacy to issues like discrimination, 
algorithmic fairness, accountability, whistleblower 
protections, dark patterns, protecting our kids, data 
portability, and even, with respect to data security, our 
critical infrastructure. A privacy law could address many of 
these issues, at least in part, achieving far more than could 
be achieved by adding yet more sectoral requirements to the 
confusing mix of laws we now have in the United States.
    Thank you so much for inviting me here today. I stand ready 
to assist the subcommittee and its members and staff with 
ongoing work related to consumer protection and privacy.
    [The prepared statement of Ms. Rich follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    Ms. Schakowsky. Thank you very much.
    And last, but certainly not least, Mr. Ahmed, you are 
recognized now for 5 minutes.

                    STATEMENT OF IMRAN AHMED

    Mr. Ahmed. Chairs Schakowsky and Pallone, Ranking Members 
Bilirakis and McMorris Rodgers, members of the committee, thank 
you for this opportunity to appear before you today.
    The Center for Countering Digital Hate, CCDH, is a 
nonprofit researching the dynamics of misinformation and hate 
on social media: how it undermines democracy, the rule of law, 
child safety, and our ability to deal with life-threatening 
crises such as COVID.
    So why is this happening? Why are we here? The ugly truth 
is that social media companies discovered that prioritizing 
hate, misinformation, conflict, and anger is highly profitable. 
It keeps users addicted, so the platforms can serve them ads.
    CCDH's research has documented bad actors causing harm, but 
also bad platforms encouraging, amplifying, and profiting from 
that harm. The platforms have managed to successfully stop any 
credible action by deploying a well-worn playbook: one, 
initially deny there is a problem; two, admit there is a 
problem, but deflect responsibility; three, finally, 
acknowledge responsibility, but delay any action. Deny, 
deflect, delay. I can show you how that works in practice.
    On March the 24th we released a report showing that up to 
65 percent of anti-vax content circulating on Facebook and 
Twitter, 65 percent, originates with sites and accounts 
operated by just 12 anti-vaxxers, the Disinformation Dozen. 
Now, this committee asked Mark Zuckerberg about the report in a 
hearing the next day, on March the 25th. He promised to do 
something about it. He did not.
    Six months later, after the surgeon general and the 
President weighed in--again, citing our report--Facebook 
responded, claiming our report had a faulty narrative. However, 
Facebook whistleblower Frances Haugen revealed that, on the 
very same day we released our report, March the 24th, Facebook 
produced an internal study confirming that a tiny number of 
accounts were responsible for more than half of anti-vaccine 
content on their platform. So they were lying, while the 
American public were suffering under COVID, and people were 
dying.
    The members of this committee have seen the same tactics 
from social media executives time and time again. You have 
correctly determined, as have legislators in the UK, Australia, 
Germany, and other allied nations, that social media companies 
cannot self-regulate, and that we need new legislation.
    There is no silver bullet. That is right. Section 230 
shows the limitations of a single solution based on one core 
principle. It did not predict, nor deal with, the harms we are 
now seeing emanating from social media. There will need to be a
range of approaches to transparency and accountability to nudge 
social media into a place that balances dialogue, privacy, 
safety, and prosperity.
    The bills being considered today would collectively 
represent a big step forward in protecting children, families, 
society, and our democracies. The KIDS Act would put real 
protections in place for our children.
    Transparency is an essential tool in countering online hate 
and lies. The Social Media Data Act, therefore, would give 
independent researchers the access needed to detect dangerous 
trends.
    Whistleblowers have leaked internal documents illuminating 
wrongdoing by Big Tech, providing new urgency to the reform 
debate. But whistleblowing is still profoundly risky for the 
whistleblower, which is why the incentives and protections 
provided by the FTC Whistleblower Act are critical.
    Social media apps very often trick users into giving up 
their personal data, their thoughts, their fears, their likes, 
their dislikes, which they then sell to advertisers. Big Tech's 
big data is designed to exploit people, not to serve them 
better. The DETOUR Act puts a stop to that destructive spiral.
    There are also two much-needed bills to address the growing 
threat of hostile foreign actors who revel in the divisions 
that social media creates and exacerbates in democratic 
societies. In approving these bills, the committee would take a 
huge step forward towards better regulation, and give us hope 
that an internet that brings out the best in people is 
possible.
    Thank you very much.
    [The prepared statement of Mr. Ahmed follows:]
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT
 
    
    Ms. Schakowsky. Thank you very much. We have now concluded 
the incredible witness testimony--and I am so grateful to our 
witnesses--and their opening statements are finished.
    And at this time we will move to member questions. Each 
member will have 5 minutes to question our witnesses. I will 
start by recognizing myself for 5 minutes.
    Let me begin by saying the Federal Trade Commission is the 
top regulatory agency tasked with keeping Americans safe online 
by preventing unfair and deceptive practices. But the FTC 
stands out from many other regulatory agencies because 
whistleblowers are not protected by Federal law.
    Recent events, as we have seen with Frances Haugen, have 
made it clear how important whistleblower protection really is, 
and that is why I introduced the FTC Whistleblower Act along 
with my colleague Lori Trahan. This legislation protects 
whistleblowers from retaliation for coming forward.
    And I wanted to get the opinion of some of our witnesses.
    It also incentivizes whistleblowers--and Mr. Ahmed, you 
mentioned incentivization--to make sure that these harms do 
not persist. And I wondered if you could comment a little bit 
more on whether and why you believe that the FTC Whistleblower 
Act would actually help deter social media companies from 
making business decisions that could be harmful for consumers.
    Mr. Ahmed. Well, thank you. Yes, I mean, Frances Haugen 
turned on the floodlights, so to speak, within Facebook. But 
what she did can't easily be replicated.
    For one thing, it is incredibly expensive. She had lawyers. 
You know, there is government affairs, there is the loss of 
income. And her real value, the reason it is so important, is 
that she really exposed deception, active deception by social 
media companies, something that can't easily be replicated with 
any other mechanism beyond whistleblowing. So, you know, the 
only way to cast a light on that deception is for moral people 
to shed light on immorality from within.
    But the window of a whistleblower like Frances Haugen is 
limited. Consider that, since she took all these documents, 
Facebook has evolved into Meta and moved into the metaverse. 
Most of the anti-vax crisis has happened since then. And we 
need disclosure of deceit not every decade, but every time that 
there is active deceit on something of great public interest.
    So this bill is incredibly important in bringing forward 
more moral characters when we need them.
    Ms. Schakowsky. Thank you.
    Mr. Greenblatt, in your view, would this legislation work 
in favor of protecting consumers and ending some of the spread 
of the harms being done?
    Mr. Greenblatt. Yes, Madam Chairman. I think there is no 
question that the Whistleblower Act is necessary.
    I mean, to build upon what Mr. Ahmed just said, what we 
know is--I mean, I have had direct conversations with Mark 
Zuckerberg and other Facebook executives, and they have lied to 
my face. They have lied to you, they have lied to their 
advertisers, they have lied to the public.
    But let's be clear. Silicon Valley is a cliquey place. It is 
not easy. And so we need to give these people the protections 
that they need, so they don't risk being in violation of their 
NDAs, they don't risk future opportunities for employment.
    But I think, again, if we are playing the long game here, 
we need to realize the moral leadership and the courage 
displayed by people like, again, Frances Haugen--but think 
about it. We learned, because of her bravery, that Facebook is 
only tackling three to five percent of the hate speech on their 
platform, despite their protestations. We learned that their 
AI catches less than--wait for it--one percent of the 
incitements to violence on their platform. The reason why this 
has prevailed for so long is they are exempt from liability, 
and lack the incentives.
    So, Madam Chairman, unless we have the means to protect the 
people who have access to this information, it is clear the 
companies will not volunteer it to us. So I think it is vital 
that your Act, the whistleblower--FTC Whistleblower Act is 
passed.
    Ms. Schakowsky. Thank you. I wanted to ask Dr. Marechal 
how this legislation would actually help regulators and law 
enforcement to better understand the economic incentives 
behind the decisions that internet platforms make.
    Dr. Marechal. I agree wholeheartedly with the points that 
my esteemed colleagues on the panel have made.
    Again, Federal whistleblower protections make it easier for 
Big Tech workers who want to do the right thing to do that.
    Again, Ms. Haugen benefited from the SEC whistleblower 
statute, which is why so many of her disclosures directly 
relate to matters within the SEC's jurisdiction. I would--I am 
confident that, if there were an equivalent for the FTC, we 
would have seen additional disclosures from her, additional 
whistleblower complaints related to matters under the FTC's 
jurisdiction, which includes economic decision-making and the 
economic factors that go into companies' decision-making.
    Ms. Schakowsky. OK, thank you so much. My time has 
expired, and now I welcome the questioning by my ranking 
member, Mr. Bilirakis, for 5 minutes.
    Mr. Bilirakis. Thank you, Madam Chair. I appreciate it very 
much.
    And I want to thank all of you for your testimony today. 
Very informative.
    There are reasonable proposals on and off the list of 
bills being considered today and in the future. However, I am 
concerned by the unintended consequences that will arise if 
Congress decides to legislate on privacy and data security in 
multiple bills, without establishing a comprehensive 
framework.
    Ms. Rich, a question for you. Can you elaborate on any 
potential consequences that businesses and our constituents may 
face as a result of enacting several individual one-off bills 
on privacy, as opposed to one comprehensive bill?
    I know you touched on it. If you could elaborate, I would 
really appreciate it very much.
    Ms. Rich. Right now, it is a highly confusing environment 
for both businesses and consumers. There are so many sectoral 
laws that pertain to privacy, to technology, to many related 
issues, and no one really knows what the rules are.
    So one of the chief benefits of enacting a comprehensive 
privacy law, which could include many of the issues we have 
talked about today, is to bring it all together. It is not 
going to repeal all the sectoral laws, it is not going to roll 
back everything that people are dealing with now, but it could 
bring it together and create a comprehensive enforcement 
scheme.
    And so getting rid of that confusion and bringing greater 
clarity to the marketplace is one of the reasons it is so vital 
that we pass that kind of law.
    Mr. Bilirakis. Thank you so much. My next question 
ultimately will be for Mr. Lane, but I do have some comments 
first.
    In addition to privacy and data security, one central theme 
of today's conversation is the Big Tech Accountability 
Platform, which is sponsored by Leader Rodgers and which we 
released earlier this year.
    One issue that is very near to my constituents is the 
growing rise of illegal activity, like the sale of deadly 
fentanyl products, that is plaguing social media platforms. In 
fact, I was able to question the DEA about this issue just last 
week, and I am holding a roundtable in my particular district 
in Florida, the 12th congressional district of Florida, in the 
Tampa Bay Area, to discuss the fentanyl crisis with local 
leaders and law enforcement. We are doing that on Monday at 
noon.
    To curb the tide of this activity, I also authored draft 
legislation that would direct the GAO to conduct a study on how 
online platforms can better work with law enforcement to 
address illegal content and crimes on their platforms.
    So the question is for Mr. Lane.
    What do you believe, Mr. Lane, is important for us to 
consider as part of this particular discussion?
    Mr. Lane. Well, as you know, I have been working with 
families who have had children die from fentanyl poisoning, and 
it is a very sad situation that we are facing.
    I do believe that, working with the FDA and others, they 
are taking some important steps. There is a lot of groups out 
there that are focusing on this. But there are two things that 
have to occur.
    One, I know that groups have asked expressly to have an 
open and accessible and accurate WHOIS database, because that 
is how they are finding websites that are engaged in selling 
these drugs. And right now it is dark, and the FDA itself has 
asked for an open, accessible, and accurate WHOIS database. So 
that is a very important step in moving forward.
    The other important step is that everyone talks about how 
these social networking sites are rabbit holes. Rabbit holes 
were 1996, when you had bulletin board systems, and you had to 
find the rabbit hole. These social networking sites are more 
like black holes. They have a gravitational force of sucking 
people in to the darkness, and it is very hard for them to see 
the light again.
    And those are the issues that we have to address: What are 
the algorithms? How are these black-hole social networking 
sites sucking these young people in and exposing them to drugs 
that maybe they would never otherwise have had access to, and 
how do we stop that?
    Mr. Bilirakis. All right, thank you very much. I appreciate 
it. And I want to discuss that even further with you, but I 
appreciate your response.
    One last question. During the Senate Commerce Committee 
nomination hearing of Gigi Sohn and Alan Davidson, both nominees 
discussed the harms that are occurring regarding the misuse of 
consumer personal information, and ultimately expressed support 
for passing a comprehensive privacy bill. I think this 
highlights how important it is for Congress to pass a national 
law on privacy and data security.
    To the entire panel, a yes or no answer would be fine. 
Would you support this committee passing a comprehensive, 
national privacy and data security bill that sets one national 
standard, provides new rights to consumers, and sets clear 
guidelines for businesses to comply with?
    Again, a yes or no. Ms. Rich, please. I know what your 
answer is going to be.
    Ms. Rich. Yes.
    Mr. Bilirakis. Yes. Mr. Golin, please.
    Mr. Golin. Yes.
    Mr. Bilirakis. Thank you.
    Mr. Lane, please.
    Mr. Lane. Yes.
    Mr. Bilirakis. Thank you.
    Ms. Marechal--Dr. Marechal, excuse me.
    Dr. Marechal. Yes, but it must be a strong standard, and it 
must----
    Mr. Bilirakis. OK.
    Dr. Marechal [continuing]. With appropriate enforcement 
mechanisms.
    Mr. Bilirakis. Thank you.
    Mr. Ahmed?
    Mr. Ahmed. Yes.
    Mr. Bilirakis. OK. And Mr. Greenblatt?
    Mr. Greenblatt. Yes, but I would want more information.
    Mr. Bilirakis. Thank you. Thank you so very much.
    And I yield back, Madam Chair. Thanks for the extra time.
    Ms. Schakowsky. Absolutely. I would say yes also.
    Mr. Bilirakis. Yes, I was going to ask you, but I knew your 
answer, as well.
    Ms. Schakowsky. Yes, absolutely. And now I recognize the 
chairman of the full committee for 5 minutes for questions, Mr. 
Pallone.
    Mr. Pallone. OK. Thank you, Chairman Schakowsky.
    As I mentioned in my opening statement, we have held 
several hearings in the committee examining the real harms some 
social media companies have caused. And obviously, we are here 
today to discuss meaningful solutions. But I wanted to start 
out with Mr. Greenblatt.
    The Anti-Defamation League has done important work showing 
the role social media companies play in amplifying racist, 
extreme, and divisive content. And you have also shown how 
those actions disproportionately affect marginalized 
communities. So can you talk about the real harms you have seen 
social media companies cause through the use of their 
algorithms in that respect?
    Mr. Greenblatt. Sure. Thank you for the question, Mr. 
Chairman.
    Yes, and I would say right off the bat, you know, the 
companies often use the smokescreen of freedom of speech to 
explain why this shouldn't be regulated. But the founding 
fathers wrote the Constitution for Americans, not algorithms, 
right? Products aren't people, and they don't deserve to be 
protected. But citizens do.
    And we, indeed, have a situation where hate crimes are on 
the rise in this country. You know, the FBI reported a 13 
percent increase in 2020, and the largest total since 2001. And 
ADL indeed has been studying online hate and harassment, and we 
find that one out of three users who report being harassed 
online relate it back to a characteristic like race, religion, 
gender, sexual orientation. And we have seen real examples.
    I think about Taylor Dumpson, the young woman who was the 
first African American female president of the student 
government at American University. I think she may have 
testified before you a year or two ago. After she was elected 
president, she was mercilessly attacked with a campaign that 
was conducted all online. It originated on a disgusting 
neo-Nazi blog, and was perpetrated through Facebook and other 
platforms. It started with the hate online, Mr. Chairman, and 
then you had nooses being placed all over campus. ADL worked 
very closely with Ms. Dumpson, and she is in a much better 
place today.
    I think about a woman named Tanya Gersh, a Jewish woman 
from Whitefish, Montana, who had the misfortune of being from 
the same town that Richard Spencer, the notorious leader of the 
alt-right, was from. And when Ms. Gersh was identified and then 
doxed by the alt-right and neo-Nazis, she and her family were 
so mercilessly attacked that they had to not only change all of 
their information, like their phone numbers, but move to a 
different home. They had to get 24/7 protection. Literally, 
again, death threats happened offline because of what started 
online.
    So algorithms, we need much more transparency around them 
to ensure that they don't discriminate against marginalized 
communities. We need to realize that, as we were saying 
earlier, Facebook's AI, their vaunted machine learning, 
literally misses 95 to 97 percent of the hate speech.
    You know, I used to be an executive at Starbucks, Mr. 
Pallone. I didn't get to say to my customers, ``Well, three to 
five percent of our coffees don't have poison, so we think they 
are pretty good.''
    Mr. Pallone. That----
    Mr. Greenblatt. You have to have a success rate of 100 
percent, and I don't think it is too much to ask of, literally, 
one of the most well-capitalized and profitable companies in 
America to ensure that their products simply work, and don't 
harm their customers or the public.
    Mr. Pallone. Thank you. I wanted to ask you another 
question, though, about transparency, because, in the case of 
holding Big Tech accountable, increased transparency, I think, 
would go a long way to making it a safer place.
    So how would the bills before us today bring greater 
transparency and, with it, greater accountability to the Big 
Tech platforms, if you----
    Mr. Greenblatt. Well, first and foremost, making the 
companies simply share their data about how the algorithms 
perform for the benefit of researchers and watchdogs. Think 
about it. These are public companies who have the privilege of 
getting resources from the public, right? Selling shares. But 
they don't disclose their information. Forget the risk to the 
companies, it is a risk to the general public.
    The right analogy here is really Big Tobacco or Big Oil. We 
learned later that Big Tobacco knew the damage that their 
products were doing to their consumers, but suppressed the 
research. And we didn't have insight until it became revealed. 
And we learned that Big Oil knew the damage that fossil fuels 
were doing to the environment, but they denied it, and lied, 
until it was revealed. Well, now we know the damage that Big 
Tech is doing to our children, and to our communities. So we 
are asking them to simply be transparent, to simply make the 
information available.
    The last thing I will just say to keep in mind is--what is 
the information we are asking for? It is user data. You know, 
there is an expression: If the product is free, 
you are the product. The information that we want is 
information about us. That shouldn't be too much to ask.
    Mr. Pallone. Thank you. Thank you, Madam Chair.
    Ms. Schakowsky. Mr. Latta, you are recognized for 5 
minutes.
    Mr. Latta. Well, I thank the Chair, my good friend, for 
yielding, and thanks for the hearing today, very, very 
informational. And I want to thank our witnesses for all being 
with us today.
    Ms. Rich, if I can start my questions with you: my good 
friend, the ranking member of the subcommittee, was getting 
into some privacy questions, and that is one of the issues 
being struggled with today. Looking at the testimony that you 
submitted, you say that survey upon survey shows that 
consumers are concerned or confused about their privacy. Then 
it says consumers need a clear and consistent privacy law. 
Businesses, they are confused. Then we look at the enforcers.
    And this was kind of also interesting. It says the lack of 
clear privacy standards has undermined the FTC, too. And you 
state, among other things, that the law does not establish 
clear standards for everyone to follow before problems occur, 
and that it is largely reactive.
    So, even though the FTC has been trying to do what it is 
supposed to be doing in enforcement, what are some of the 
standards it needs to have right now, to go forward and be 
clearer for the public?
    Ms. Rich. Well, some of the basic building blocks that we 
see in every privacy law aren't required by the FTC Act: basic 
transparency, choices, accountability. There isn't a data 
security law that applies across the country.
    And, you know, you may not want all of this in a law, but 
access, correction, deletion, all of those types of rights that 
you see in law after law, anti-discrimination provisions--for 
all of that, the FTC has to examine a specific company and 
decide after the fact, using its authority to police unfair or 
deceptive practices, whether a practice was unfair or 
deceptive. There aren't clear requirements. None of those 
elements is clearly required in any nationwide law that applies 
across different situations.
    And so, as I think I said in my testimony, the FTC has been 
able to do a lot with its authority under the FTC Act. But it 
would be so much better for the public, for consumers, for 
businesses, for everybody in the marketplace, to have rules 
that everyone knows, and known consequences for violating 
them.
    Mr. Latta. Well, thank you very much.
    Mr. Lane, you know, I am very glad we are holding today's 
hearing, where we can consider legislative proposals like 
the Big Tech discussion draft that I authored that would 
require companies to disclose their content enforcement 
decisions. This is intended to cover illegal activity and harms 
that are happening online, such as fraud, illegal drug sales, 
and human trafficking.
    I think complementary to this goal is the ability to have 
access to accurate WHOIS data. This would go a long way in 
helping to solve these problems.
    As you mentioned in your testimony, WHOIS information can 
play a vital role in combating fraud and facilitating better 
cybersecurity. In 2020 I sent letters to several executive 
branch agencies to ask them about the importance of WHOIS in 
conducting their investigative and prosecutorial obligations. 
In responses from the FDA, FTC, and DHS, they emphasized the 
importance of this information in identifying bad actors, 
connecting criminal networks, and protecting consumers and our 
cyber assets.
    You know, would restored access to WHOIS complement my 
discussion draft to make the internet safer?
    Mr. Lane. Yes, absolutely. First of all, I want to thank 
you, Mr. Latta, and your staff for taking a leading role in the 
WHOIS issue. Your letters have been critically important to 
show and highlight the real concerns and cybersecurity threats 
that our nation is facing because of a dark WHOIS, based on the 
decision from the European Union and the GDPR, and a very broad 
interpretation of having it go dark.
    I just also wanted to add one thing, and it is not just me 
saying it. In 2021, a survey by the two leading cybersecurity 
working groups found that restricted access to WHOIS data 
impeded investigations of cyber attacks. Two-thirds of the 277 
respondents said their ability to detect malicious domains has 
decreased, and 70 percent indicated they can no longer address 
threats in a timely manner. And more than 80 percent reported 
that the time it takes to address abuse has increased, which 
means that the harm cyber attacks do to victims lasts longer.
    The group basically said this: Changes to WHOIS access 
following ICANN's implementation of the EU GDPR continue to 
significantly impede cyber applications and forensic 
investigations, and thus cause harm to victims of phishing, 
malware, and other cyber attacks.
    The Federal Trade Commission, as well as ICANN, is trying 
to fix this problem. What you are pushing in your legislation 
and your letters--and, hopefully, this Congress will enact 
legislation--is critical. We can no longer put the multi-
stakeholder process of ICANN ahead of the American people and 
their safety and security; our national security needs to be 
protected by this Congress. And we should not be kowtowing to a 
law and a regulation that is from another country.
    And I just want to end on this. ICANN itself--the chairman, 
the CEO of ICANN--has said that they are limited in their 
actions because of the GDPR: not because of U.S. law, not 
because of the California privacy laws, but because of the 
GDPR. So our own security is put at risk by a foreign entity's 
legislation and regulation.
    And thank you so much for everything you are doing in this 
space.
    Mr. Latta. Well, thank you very much.
    Madam Chair, before I yield back, I would like to ask 
unanimous consent to enter into the record the documents from 
DHS, the FTC, and the FDA, and a report from the ICANN, GDPR, 
and WHOIS user survey.
    Ms. Schakowsky. Without objection.
    [The information appears at the conclusion of the hearing.]
    Mr. Latta. Thank you very much for your indulgence. I yield 
back.
    Ms. Schakowsky. Now I recognize Mr. Rush for 5 minutes for 
his questions.
    Mr. Rush. I want to thank you, Madam Chair, for convening 
this important hearing.
    Like my colleagues, I am also a strong advocate for 
comprehensive Federal privacy legislation. In fact, when I 
served as chair of this subcommittee, we passed a strong, 
bipartisan bill that, ultimately and unfortunately, died in the 
Senate.
    While I continue to advocate for privacy legislation, Madam 
Chair, I am also cognizant of the fact that privacy is not a 
panacea that would solve all of the internet-connected problems 
that our nation currently faces.
    Today, in addition to privacy issues, we also face very 
real and very pressing threats from issues like misinformation, 
disinformation, and algorithmic biases. With that in mind, and 
while I look forward to working on comprehensive privacy 
legislation, I am pleased that we are addressing these other 
equally important issues, as well.
    That said, Mr. Golin, in your testimony you state that--
and I quote--``children in lower-income households spent nearly 
two hours more on screens than children from higher-income 
households, and Black and Hispanic children spend significantly 
more time on screens than their White peers.''
    You also described how increased exposure to screen time is 
linked to increases in mental health issues, such as 
depression. It is too often the case that when White America 
catches a cold, Black America catches pneumonia. And I feel 
that this is true when it comes to screen time, also.
    To that point, what type of impact is this increased screen 
time having in lower-income households, and particularly for 
Black and Hispanic children?
    Has there been any data that shows how these outcomes 
compare to White or children in higher-income households?
    Mr. Golin. Thank you so much for that question. Yes, so, as 
you referenced, the data shows that low-income and Black and 
Hispanic children have more screen time and spend more time 
playing games online than their higher-income and their White 
peers. And you know, the data also shows that screen time-
linked problems, like childhood obesity, occur at much higher 
rates for low-income children and Black and Hispanic 
children.
    So I think that, you know, given what we know about the 
severity of the problems linked to excessive screen time, and 
that these children from these communities are having even 
higher rates, it is absolutely essential that we pass policies 
to protect them.
    You know, this affects all children. But, like every 
issue, children from marginalized communities, 
children from more vulnerable communities are getting the worst 
of it. And so that is why it is so important that we create a 
new set of rules, and build a better internet for children, 
because we need to protect the most vulnerable among us.
    Mr. Rush. Does this create problems in the public education 
system?
    Also, is there any data that supports other ramifications 
of this particular phenomenon?
    [No response.]
    Mr. Rush. Hello.
    Mr. Golin. I am sorry, I don't think I heard the question. 
Was that a question for me? I am not sure if I heard it 
correctly.
    Mr. Rush. Yes, this is you, this is the second question.
Is there any data that says that this particular phenomenon 
affects the public education system, students in the public 
education system?
    Is there an effect on--the increase in screen time--on 
children in school?
    Mr. Golin. Yes. Well, there is data showing that the time 
kids spend online for entertainment is correlated with lower 
academic achievement.
    There has also been a rush to use EdTech in our schools, 
and to see EdTech as a panacea for fixing educational 
inequality when, in fact, what the data is showing is that the 
more hands-on learning kids get, the better it actually is for 
their academic achievement.
    So I think one of the things that is really worrisome is 
this, you know, this idea that, if schools invest heavily in 
EdTech platforms, that that is going to fix educational 
inequality. And, in fact, I think there is a real danger that it 
is going to worsen it, because what kids need is quality 
teachers. They need smaller class sizes. They need to interact 
with each other. And the more time that kids are spending on 
screens for their learning, it is taking away from those 
things.
    Mr. Rush. Thank you.
    I yield back, Madam Chair. Thank you for your indulgence.
    Ms. Schakowsky. The gentleman yields back, and now Mrs. 
Rodgers is recognized for 5 minutes.
    Mrs. Rodgers. Thank you, Madam Chair.
    Ms. Rich, thank you for your decades of service. Your 
experience at the FTC was under a Democratic chair, yet I 
appreciate your dedication to bipartisan consensus when 
possible, which has been the Commission's tradition.
    Yesterday, Mr. Bilirakis and I sent a letter to FTC 
Chairwoman Khan regarding the FTC's current direction. It 
expresses concern with the Commission's use of zombie voting to 
pass rules, and the recent decision to delete legitimate 
business activity from the FTC mission statement.
    Given the number of bills before us, I think it is 
essential that we find a realistic enforcement balance. We need 
to know how the Commission would manage all these competing 
priorities, without hurting legitimate business activity.
    This alarming mission statement change happened while the 
Build Back Better Act was pending in the Senate. That 
legislation includes an amendment to the FTC Act, which would 
give the Commission broad, first-offense penalty authority.
    How expansive is this proposed authority?
    Is there any commercial activity or sector of the economy 
that it wouldn't apply to?
    Ms. Rich. The civil penalty provision in the Build Back 
Better Act, as I read it, would apply to anything covered by 
the FTC Act: unfair or deceptive practices under the FTC Act.
    So the FTC does lack jurisdiction over certain sectors of 
the marketplace: banks, non-profits, certain functions of 
common carriers. But otherwise, as I understand the provision, 
if it were to pass, it would apply across wide swaths of the 
marketplace.
    Mrs. Rodgers. Thank you. Regarding the proposed new 
authorities, am I correct this only deals with civil penalties, 
and not remedies, like judgment or restitution?
    Ms. Rich. That is right. Civil penalties only.
    Mrs. Rodgers. During your FTC service, was the Commission 
able to predict how many violations would occur each year?
    Ms. Rich. No.
    Mrs. Rodgers. That is in line with our experience. The FTC 
cannot predict who is going to break the law.
    I would note we supported and enacted such civil penalty 
authority targeting COVID-19 scams, and the Congressional 
Budget Office reported back that such revenues were 
insignificant over the 2021 to 2030 period.
    This might be a basic question, but if all companies are 
following the law, there is no violation of the FTC Act. And 
thus, revenue is not generated via enforcement actions. 
Correct?
    Ms. Rich. Yes, although I have never seen a situation where 
all companies are----
    Mrs. Rodgers [continuing]. See changes in actions. I worry 
about the lack of regulatory certainty for small businesses. 
They, after all, are not experts, like you, on what protections 
they may have under the FTC Act.
    Is it fair to say that they may not have the resources or 
the sophistication to manage a review by the FTC of their 
operations?
    Ms. Rich. Yes, but I am--not to be a broken record, but I 
think Congress can fix this problem by passing a privacy law 
that does provide standards.
    Mrs. Rodgers. OK, well, I appreciate you answering those 
questions and providing the insight. And I do thank all the 
witnesses for being here.
    I want to note that we have incorporated first-offense 
penalty authority in our comprehensive privacy and data 
security legislation, the Comptroller Data Act, as a means of 
privacy enforcement, and I urge this committee to take action.
    I yield back. Thank you.
    Ms. Schakowsky. The gentlewoman yields back, and now I 
recognize Congresswoman Castor for her 5 minutes of questions.
    Ms. Castor. Well, thank you very much, Chair Schakowsky, 
for holding this very important hearing, and for including my 
Kids Internet Design and Safety Act that I am leading with 
Representatives Clarke, Trahan, and Wexton and, of course, 
Senators Markey and Blumenthal, and for including the Social 
Media Data Act that Rep. Trahan and I are leading, as well.
    We really do come to this hearing, more so than other 
hearings, as parents and as grandparents. We know, as Mr. 
Greenblatt said, these Big Tech companies are complicit in the 
harm that is being caused by online operations and, as Mr. 
Ahmed pointed out, profiting from the harm. So we clearly have 
to take action now on 230, on children's privacy, everyone's 
privacy, and especially the design of these platforms.
    So I want to focus in on the KIDS Act. Mr. Golin, thank you 
very much for your years of work on this. So your testimony is 
that these Big Tech platforms, like Instagram and YouTube and 
others, intentionally design the way children interact online 
to kind of keep them addicted. Will you go into a little 
more detail on that?
    Mr. Golin. Sure. And, first of all, Representative Castor, 
thank you for your tireless work to see that children get the 
online protections that they deserve.
    So the business model for all of this media is to maximize 
engagement, because the more time a kid is on a platform, the 
more money they are worth to the platform. And so they design 
their platforms intentionally in ways to keep kids on those 
platforms, and to keep them checking those platforms as often 
as possible.
    Just a few examples of that, they use things like rewards, 
and nudges, and push notifications. So things like Snap 
streaks. So on Snapchat, kids are incentivized to communicate 
through Snapchat every day with a friend, and then keep a 
streak going, and that becomes a very powerful motivation. It 
gamifies the relationship, and kids really want to keep that 
going.
    They use things like autoplay and infinite scrolls on 
TikTok to make it really, really, really easy to keep using a 
platform, and really, really hard to disconnect.
    They use things like likes and the follower counts, and so 
everybody can see who is popular, and whose posts are 
popular at any given moment. And this is a really powerful 
incentive for kids to create content. And not only just create 
content, but to create provocative content, and risque content, 
because they know that is what is most likely to get them 
attention.
    And then, of course, there is the algorithmic 
recommendations, which personalize everything to kids to show 
them the content that is most likely to keep them engaged and 
keep going on a platform, regardless of whether that content is 
good for them. And in fact, as we have been talking a lot about 
lately, very often that content is terrible for them.
    Ms. Castor. And, you know, when I am out and about, I see 
very young children now on tablets and 
iPhones. I mean, we are talking toddlers. And what does the 
latest research tell us about how young children are when they 
are first interacting with online platforms?
    Mr. Golin. Well, I mean, I think one of the things that is 
really disturbing is we all know that the age for social media, 
when you are supposed to go on social media, is 13. Forty 
percent of nine- to twelve-year-olds report using TikTok every 
day. And the numbers are just about identical for Instagram and 
Snapchat.
    Ms. Castor. And do they have the ability to kind of self-
regulate at that age?
    Mr. Golin. No, absolutely not. Executive functioning is 
still developing. It is very--you know, I mean, these are 
platforms that adults get lost in. These are platforms that, 
you know, we are all struggling with, as adults. And to think 
that developing children, who are still developing their 
executive function, and whose habits are being formed are using 
these platforms----
    Ms. Castor. So how will the KIDS Act then help parents, and 
help address these harms that these online platforms are 
peddling and profiting off of?
    Mr. Golin. So I think the KIDS Act does a number of really 
important things.
    So, first of all, it prohibits those design choices, 
directed at children, that are there to maximize engagement: 
things like autoplay, things like rewards, things like 
quantified popularity.
    It prohibits platforms from using algorithms 
to amplify harmful content to children, something that we have 
all been talking about a lot lately.
    It also bans influencer marketing to children, which is one 
of the most manipulative forms of advertising there is.
    So it really would do a huge amount to start creating that 
online environment that kids----
    Ms. Castor. And then we have to pair it with privacy 
protections, right? And I have worked with you on the Kids 
Online Privacy Act. Do you agree that those need to 
work together, and be passed together?
    Mr. Golin. If we could pass both of those bills, we would 
really go so far towards creating the internet kids deserve.
    Ms. Castor. Thank you very much. I yield back.
    Ms. Schakowsky. The gentlewoman yields back.
    Mr. Dunn, you are recognized for 5 minutes.
    Mr. Dunn. Thank you very much, Madam Chair. I appreciate 
the opportunity to discuss these important issues.
    You know, the Chinese Communist Party is probably the 
single greatest threat to the free world since the Cold War, 
and they seek to sabotage freedom and democracy everywhere 
they exist. Their malign influence permeates all of their 
corporations, including those that operate in the United 
States. They have CCP members in key board positions and, in 
many of those organizations, direct control over decision-
making.
    Despite that, American tech companies still continue to 
operate within China, and we allow them--or companies with 
those ties--to operate quite freely here, in the United States, 
as well. Just this year, Microsoft was the victim of a Chinese 
state-sponsored cyber attack. Yet, if you look at the number of 
job postings for Microsoft in China, you get the feeling they 
are expanding rapidly in China.
    So I think what these U.S. tech companies are doing within 
China, and what those Chinese companies are doing here, is the 
concern of this committee. For purposes of this hearing, I want 
to focus on what the CCP-affiliated companies might be doing 
here, in the United States.
    The CCP doesn't respect the rights of their own citizens. 
Why should they respect ours?
    Congress has a responsibility to ensure that American 
consumers are protected from these evolving threats. And I 
think this can be accomplished--a number of you have said that 
today--if we can get a comprehensive data security bill through 
that protects our citizens, without sacrificing innovation and 
competitiveness on our nation's technological fronts.
    Mr. Lane, I, like many of my constituents, am very 
concerned about the amount of personal information that is 
currently collected without any basic level of protection. A 
specific example is BGI--that is the Chinese genomics giant--
and the activities that they instituted during the COVID 
pandemic. They sold millions of test kits to U.S. labs, and 
offered their own sequencing services to the government and 
individual states.
    The lack of privacy standards attached to that does pose a 
national security risk, and I would like to know what concerns 
you most when it comes to protecting Americans' consumer data 
from foreign adversaries. What keeps you awake at night?
    Mr. Lane. Thank you for the question, Congressman. What 
keeps me awake at night is that most people don't realize that 
the driver in this artificial intelligence race and machine 
learning is human interaction and data. And those who collect 
it the most will win in that fight.
    And I do have strong concerns that we don't know how data 
is being collected and used. There is some great legislation. 
The Duncan bill and the Kinzinger bill are great examples of 
how we can try to know that.
    But we also have to be concerned, because the head of 
government affairs for TikTok, over in the Senate, basically 
talked about how the data is stored in Singapore. Well, my 
pictures are stored I don't know where, somewhere in the cloud. 
But I can manipulate them, I can access them, I can even print 
them. So we need to make sure that we know, not just where the 
data is stored, but how they are getting access to it.
    And one of the things that has always bothered me about one 
of the TikTok statements is the claim that they will never hand 
over American citizens' information to China. And maybe they 
believe that. But if a family member who is still living in 
China gets a knock on their door from the Chinese Communist 
Party and is told, ``We would like your relative to hand over 
the data,'' I know what I would do. Just as a person, if it was 
my family being threatened, would I hand that data over? 
Probably. And so those assurances cannot be taken seriously.
    Mr. Dunn. So physical location of the data, which is real, 
even in the cloud, right, is something that is important. And 
of course, the jurisdiction over that data is important.
    Ms. Rich, in the remaining seconds we have, I would like 
you to address what help you would like from Congress to give 
to the FTC to improve the security of our data.
    Ms. Rich. Specific data security requirements, which do not 
apply across the market right now; there is no general data 
security law that applies to the U.S. marketplace. That would 
include process requirements, such as doing a risk assessment, 
accountability among officers in the company, oversight of 
service providers, contracts with service providers. There are 
many elements.
    Mr. Dunn. A reliable audit on these companies, perhaps, as 
well.
    Ms. Rich. Yes.
    Mr. Dunn. Thank you very much for your time. All of you 
have been excellent witnesses.
    Madam Chair, I yield back.
    Ms. Schakowsky. Thank you, Mr. Dunn. Now I recognize 
Congresswoman Trahan for 5 minutes.
    Mrs. Trahan. Thank you. Chairwoman Schakowsky and Ranking 
Member Bilirakis, thank you for convening this important 
hearing, and thank you to the witnesses. Many of you have 
offered invaluable expertise to my team and me when we 
introduced the Social Media Data Act in May, and now, as we 
draft text to create a new bureau at the FTC focused on 
platform transparency and safety.
    Mr. Golin, Fairplay, formerly the Campaign for Commercial-
Free Childhood, has been studying the impact of advertising on 
children for decades. Can you explain why 
surveillance advertising, the method used by Instagram and 
YouTube, is particularly harmful for our teens?
    Mr. Golin. Sure. There is a couple of reasons it is so 
harmful.
    And first of all, thank you so much for all of your work to 
protect children online.
    So it is harmful because it allows companies to target 
teens' vulnerabilities. In fact, Facebook, a couple of years 
ago, bragged to their advertisers that they were able to target 
a teen at the exact moment that they were feeling bad about 
themselves, including when they feel bad about their bodies. So 
this leads to things like, you know, girls who express interest 
in dieting getting targeted with ads for flat tummy teas and 
dangerous exercise routines.
    So again, being able to target those things that people are 
very vulnerable to, and try and encourage consumption of 
products that will make those things worse.
    The other thing is that there is a complete asymmetry of 
information. It is just completely unfair. The only thing that 
teens may know about surveillance advertising is that there is 
some creepy ad that keeps following them around, and they do 
use the word ``creepy'' to describe the advertising. But the 
advertisers know everything about that child. They know every 
website they have ever visited, every video they have ever 
liked, every comment they have ever made online, how much money 
their parents make, where they live, all the places they go. So 
it is just--it is completely unfair. The advertiser knows 
everything about the child, and the child knows very little 
about how the advertising works.
    And then the last thing I will just say is, of course, it 
leads to a tremendous amount of data collection, and that data 
can be misused in all sorts of ways.
    Mrs. Trahan. Well, certainly. I thank you for that. I mean, 
as Congresswoman Castor pointed out, many of us are mothers. I 
am the mother of two young girls. I am very concerned that they 
could be watching an online video of their favorite athlete, 
only to be targeted with a dangerous weight loss supplement. 
And we certainly need more transparency into how these ads are 
targeted.
    Dr. Marechal, can you speak to why it is important for 
researchers to be able to study all digital advertisements, as 
opposed to just a subset, like political ads?
    Dr. Marechal. First, it is very difficult to draw a clear 
line around what ads are political or not. For example, when an 
oil company runs ads advertising its commitment to green 
energy, is that political?
    How about when Facebook runs ads claiming to support 
updated internet regulation, while lobbying against it behind 
closed doors?
    What about these diet ads that we were just talking about, 
is that political?
    Moreover, even if we agree where to draw the line, can we 
trust platforms to enforce it accurately? I think it is clear 
that the answer there is no.
    But more importantly, ads can be dangerous or 
discriminatory, even if they are not political. The diet ads 
here are a great example, again.
    And many people would say that a housing ad is not 
political. But if it is targeted in such a way that Black users 
can't see it, that is discriminatory and harmful. And that is 
exactly what----
    Mrs. Trahan. That is----
    Dr. Marechal [continuing]. What targeted advertising enables.
    [Audio malfunction.]
    Mrs. Trahan [continuing]. You can speak to why researchers 
need to have details regarding not just the aggregated 
description of the audience that is targeted, but also a 
description of the aggregate users who saw or engaged with an 
ad.
    Dr. Marechal. Right. So the targeting parameters only tell 
you who the advertiser was trying to reach. They don't tell you 
who saw the ads. Many times those two groups are the same. But 
if they are not, there is one of two things that is likely 
happening: either the platform is defrauding the advertiser by 
charging for a service that they didn't deliver, or it is 
optimizing the targeting beyond what the advertiser asked for, 
often in ways that are discriminatory. Either way, this is 
something that we should know, so that we can put an end to it.
    Mrs. Trahan. Thank you for that. I do want to emphasize I 
think political ad transparency is important. I know the lines 
are blurred more and more.
    And on the resource page of my website, I have started a 
digital ad library, where I am posting all of my political ads. 
I have included all the data outlined in the Social Media Data 
Act. I am happy to chat with my fellow members, if they would 
like to join me in that.
    But I think, just in my close--and I do have a few more 
questions I will submit for the record.
    [The information appears at the conclusion of the hearing.]
    Mrs. Trahan. But Frances Haugen told us just last week that 
researchers have begged and begged and begged for very basic 
data, data that they will never get unless Congress acts. And 
the Social Media Data Act begins to address this issue. And I 
look forward to continuing to work with all of you on the 
transparency issues that will pave the way for us to legislate.
    Thank you.
    Ms. Schakowsky. Thank you. The gentlewoman yields back, and 
I recognize Mr. Pence for his 5 minutes of questions.
    Mr. Pence. Thank you, Chairwoman Schakowsky and Ranking 
Member Bilirakis, for holding this hearing. And thank you to 
the witnesses for appearing here today.
    This hearing is imperative to exploring the parts of Big 
Tech that could be negatively impacting the social fabric of 
our country, and harm the--harming the well-being of Hoosiers 
and all Americans.
    I am increasingly concerned with the growth-at-any-cost 
mindset of Silicon Valley, which has been around for a long 
time, as we heard last week. Social media platforms monetize 
inflammatory content using opaque algorithms and tactics 
intended to manipulate the tendencies of their users. This 
information allows Big Tech platforms to sell highly-valued 
advertising space with precisely placed ads at the most optimal 
times.
    If profit is the ultimate goal, and there is nothing wrong 
with making money, one way to get there is to gin up users by 
promoting content that elicits the strongest responses. This 
creates a feedback loop of more clicks that lead to more data, 
which leads to smarter algorithms that can collect even more 
data. These efforts seem to work in conjunction with the 
expansive shield of Section 230 to evade accountability.
    For Big Tobacco, warning labels plastered on the side of a 
pack of cigarettes served as a long-time immunity defense. For 
Big Tech it is Section 230. And much like Big Tobacco, tech 
companies use these same tactics on our youth to bring in 
lifelong customers--if some of you remember Joe Camel.
    Unfortunately, for my constituents, there is a little 
insight--there is little insight into algorithms Big Tech 
employs to take advantage of their sweeping access into our 
everyday lives, nor do Hoosiers have adequate control over the 
amount of information collected, or how it is used to tailor 
personal and curated content.
    You know, we had truth in lending. We had to take care of 
that many years ago.
    Building off the Communications and Technology Subcommittee 
hearing last week, which many of my colleagues here attended, 
it is clear this committee needs to get serious with our 
efforts to rein in Big Tech.
    Mr. Greenblatt, I think you would agree that there are 
positive aspects of social media. Whether it is checking in 
with family or friends, or for small businesses to expand their 
reach, there are healthy uses of social media. But it seems to 
me these tech companies realized early on that they sit on top 
of a gold mine of user information with virtually no guardrails 
to protect consumers. And, as you detailed in your testimony, 
incendiary and controversial content is good for business.
    Throughout this hearing, we have acknowledged the harmful 
aspects of overexposure to hateful content. This is--this has 
become a--very much a bipartisan issue. We--in my opinion, we 
ought to consider proposals that stop a platform's ability to 
generate revenue off content that has been adjudicated to have 
harmed the well-being of its users.
    If platforms--Mr. Greenblatt, if platforms were 
eliminated--or limited in their ability to use algorithms to 
curate content for users, what would happen to social media 
companies, would they still be profitable enough to stay in 
business?
    Mr. Greenblatt. Well, first of all, I would just say, 
Representative Pence, I agree with the analogy that you drew to 
Big Tobacco. I mean, speech may be different than cigarettes, 
but with addictive products that the companies fail to manage--about 
which they obfuscate and lie to elected officials and to 
watchdogs--there is clearly a problem that requires government 
intervention. I wish it were different. Unfortunately, it is 
not the case.
    And I also agree that, like tobacco, you know, social media 
can be used in moderation for fun. And Facebook and other 
services have connected people across cultures, across 
countries. There is a lot of value to that. But the way they 
have been exploited by extremists, the way they have been used 
to abuse children and manipulate them in ways that have 
been described is indefensible.
    Now, the reality is these companies, indeed, are so big, 
and are so profitable, I actually believe they could fix this 
problem today, if they wanted to. Sure, it might hurt their 
margins a little bit as they made some capital investments. But 
if they have the resources--think about Facebook. It is 16 
years old, and yet it has 3 billion users across the Planet 
Earth. It has the most sophisticated advertising----
    Mr. Pence. So, in the interest of time, you think that they 
could be profitable, they wouldn't necessarily go out of 
business?
    Mr. Greenblatt. Absolutely.
    Mr. Pence. Thank you.
    Mr. Greenblatt. Yes.
    Mr. Pence. Madam Chair, I yield back.
    Ms. Schakowsky. I thank the gentleman, and now Mr.--no, Mr. 
McNerney, sorry.
    Mr. McNerney, you are recognized for 5 minutes.
    Mr. McNerney. I thank the Chair for correcting that 
observation, and I thank the witnesses. Your testimony is very 
stark and important.
    Mr. Golin, I just first want to say I appreciate your 
observation that Big Tech is counting on partisan division to 
prevent meaningful reform. And so we have to take that upon 
ourselves to make sure that that isn't the case.
    Dr. Marechal, AI and machine learning are significantly 
more efficient for targeting specific consumers and for 
moderating content. They also amplify and shape content in ways 
that create entirely new harms, which we are hearing about 
this morning. So how does the use of AI and machine learning 
accelerate the spread of harmful content online, when employed 
to prioritize engagement for profit?
    Dr. Marechal. Thank you for that question.
    I want to be really clear that we are talking about two 
different types of algorithms here.
    On one hand, we have the algorithms that boost content, 
including recommendation algorithms, the algorithms that tell 
you what groups to join, what people to add as friends, what 
accounts to follow--and order the content on your timeline. That is 
based primarily on correlation, and on predictions based on 
engagement. What are you most likely to click on, watch, 
comment on, like, et cetera.
    On the other hand, we have algorithms that are meant to 
perform content moderation. That is to say, to identify the 
types of content that are illegal, or that are against the 
platform's own rules because they are judged to be 
harmful to users and to society.
    AI is not good at this latter part. This is one of the big 
lies that the tech industry has been selling us, that we are 
just around the corner from a big achievement in AI that will 
suddenly make it possible for them to have these huge and 
profitable platforms, where their goal is to have as much of 
human economic activity and human life filter through these 
platforms, so that they can make money off of it. They want us 
to believe that they are just around the corner from being able 
to identify and moderate away all the direct sales, all the 
incitement to violence, all the hate speech, all the content 
that we are rightly concerned about today. Again, that is not 
true. Only human judgment can do that.
    Mr. McNerney. Well, thank you for that clarification. So 
could increased transparency into the artificial intelligence and 
machine learning used by internet platforms help to improve online 
safety?
    Dr. Marechal. Absolutely. On the content moderation front, 
we need to know much more about the state of the art, as it is 
today, and what technology can and cannot do.
    We have learned from Ms. Haugen's revelations, as well as 
from other whistleblowers previously, that Facebook in 
particular basically does not moderate content in languages 
other than English. I am exaggerating slightly here, but if you 
look at--again, at Ms. Haugen's testimonies before Congress and 
in other places, it is really clear that, as bad as things are 
for us in the U.S., and for other English speakers around the 
world, it is orders of magnitude worse elsewhere.
    When it comes to content recommendation, you know, 
recommendation systems, likewise, we really need to understand 
what recommendations we are getting, what other people are 
getting, right? I have a sense of what is being recommended to 
me; I have no idea what is being recommended to you, or to 
other people in society.
    And again, policymaking in this area requires evidence. The 
first step towards getting evidence is greater transparency.
    Mr. McNerney. Well, thank you. Some clarification there.
    I also want to thank you for your recommendation that we 
not allow CEOs to be both board members and majority 
shareholders. Hopefully, we can work with the committees of 
jurisdiction to get that done.
    You also recommended that we should create conditions to 
help us produce evidence-based policy. Would you expand on that 
a little bit?
    Dr. Marechal. Yes, absolutely. So that is what I was 
referring to when I was speaking to the need for transparency, 
and for researcher access to platform data.
    So much of what we believe about--or think we know about 
platforms is based on our own individual experience, on 
anecdotes, on investigative journalism, on kind of one-off 
research studies, but it is not comprehensive, right? We have 
little snapshots of a huge problem, but that does not--that is 
not enough to fully understand the nature and extent of the 
problems, because only the platforms have access to that 
information.
    So I believe that, in order to legislate effectively, we 
need a much more detailed understanding of the facts on the 
ground.
    Mr. McNerney. I yield back.
    Ms. Schakowsky. The gentleman yields back.
    Mr. Armstrong, you are recognized for 5 minutes.
    Mr. Armstrong. Thank you, Madam Chair. I appreciate 
everybody being here today.
    And I think how we get here--I have sat through a lot of 
hearings in this committee and in my former committee, and I 
think we come down to this simple truth, that, as the larger 
the platform gets, more data is collected, more sophisticated 
algorithms are developed, which further entrenches their place 
in the marketplace, and stifles competition, and continues to 
incentivize the collection and use of that data to maximize 
profit. And seven--several of you have basically said this, and 
you are not unique.
    The problem is with the business model, one that is 
designed to attract attention, collect and analyze what keeps 
that attention in place: ads. Whether the content is somehow 
detrimental to that individual, minor or adult, or to society in 
general, isn't a concern.
    Now, several tech companies have recently announced that 
they will eliminate targeted advertising on certain topics, and 
we all know contextual advertising still occurs in other media. 
But after doing this for nearly three years now, I think my 
question is basically this: Should we restrict targeted 
advertising? Should we just restrict it?
    Should we ban targeted advertising to children? I 
understand there would be significant consequences. But if the 
societal costs are as high as some of the witnesses here, and 
witnesses, indeed, that we have heard talk about today, suggest, it 
becomes a simple cost-benefit analysis.
    The business model is not a bug, it is a feature. And it 
continues to do that.
    And listen, Republicans talk about increasing competition 
in the marketplace, and how we do that, and often times--and 
these aren't unique, right? We have had members on both sides 
of the aisle agree on certain issues. We have had members 
disagree on issues. But eventually, when we are talking about 
capitalism, we are talking about profit, we are talking about some of 
the largest, most powerful companies in the history of the 
world, should we start talking about taking away the financial 
incentive for platforms----
    [Audio malfunction.]
    Mr. Armstrong [continue]. Of at least one empirical study 
from 2019 that concludes that, after accounting for other 
factors like user device information or geolocation data, 
publishers' revenue only increases by about four percent when a 
user's cookie is available. That increase corresponds to an 
average increment of just $0.00008 per advertisement.
    And as we continue to do this, and we move around, and we 
talk about how we do all of these things, I think the question 
has to become how do we disincentivize these companies from 
financially profiting off of conduct that is particularly 
harmful to adults and children? And I think we do this--and I 
have listened, I have learned more about--I have learned just 
enough about all of this to be dangerous, I think. And we 
continue to move our way through this.
    But I think it is about we, as a legislative body, and as 
people who interact in this industry, I think it is about time 
we start having the real conversations about that. And I have 
got a minute and 50 seconds.
    Yes, Mr. Lane. Question mark, question mark.
    Mr. Lane. The industry is actually moving away from 
targeted advertising. If you look at the last Interactive Advertising 
Bureau meetings, because of the GDPR and other related rules, they 
are--you know, moving away.
    The problem isn't targeted advertising, 
especially if you talk with Jonathan Greenblatt. It is 
what they are watching. And if the algorithms--you know, I 
worked for Fox, right? So it was--you know, the goal was to, 
you know, spend a lot of money to--for the Super Bowl, because 
you got a lot of people watching it. The ads weren't relevant. 
And so people are going to pay for the ads. They pay a lot of 
money for Super Bowl ads that are not targeted because of the 
crowd, the viewership.
    So the question is how the algorithms, as I mentioned 
before, create this black hole where they are trying to get people 
stuck in this system--the, you know, edge of the net, the edge 
players--and how do we deal with that issue? I don't 
think getting rid of targeted advertising is going to help as 
much for the issues around what Jonathan is talking about as 
the issue of the manipulation of people, and bringing them down 
this black hole.
    Mr. Greenblatt. I would reinforce what Rick said. It is the 
surveillance advertising that is a problem. So I don't have a 
problem with advertising to our children. It has happened on 
Saturday morning cartoons, you know, since the dawn of 
television. It happens in other media. The challenge is that we 
don't know what information they are collecting, they refuse to 
be transparent about it, and it is one--to use the term--one 
big black hole.
    So I think what we need is for these companies to submit to a 
degree of transparency, which would elucidate how their 
marketing works and, again, prevent children and others from 
being manipulated.
    Mr. Lane. And if I were going to pick one area, in talking 
with the groups I work with on child safety, it is to have the 
parental controls set to on, instead of off. That would go a 
long way toward protecting the kids, because most parents don't 
know how to turn on these parental controls. And having them 
set to on for children and younger users, both at the device 
level, as well as at the social networking level, would be very 
helpful.
    Mr. Golin. Can I just agree with you, Representative 
Armstrong, that I think getting rid of data-driven advertising 
to children is one of the most important things that we could 
do to protect them?
    Mr. Armstrong. Well, and I am 26 seconds over----
    Ms. Rich. And----
    Mr. Armstrong [continue]. But I would say the one thing--
the one point to that is if you--whatever the new financial 
incentive is, we will have to deal with that one secondly. But 
the reason I bring it up is the financial incentive to be 
there.
    And with that, I yield back.
    Ms. Schakowsky. The gentleman yields back.
    And Congresswoman Clarke, you are recognized for 5 minutes.
    Ms. Clarke. Thank you, Chairwoman Schakowsky and Ranking 
Member Bilirakis, for holding this very important hearing. And 
thank you to our witnesses for your insightful testimony today.
    Technology will always be a double-edged sword. While it is 
often a source of good and progress in the world, we must also 
take care to limit the harms and abuses that inevitably occur.
    As I mentioned during our hearing last week in the 
Communication Technology Subcommittee, the widespread use of 
algorithms by social media platforms to determine the content 
that users view has far too often resulted in discriminatory 
practices and the promotion of harmful misinformation.
    Recent whistleblower reports make it quite clear these 
platforms knowingly amplify the most dangerous, divisive 
content. Indeed, it is central to their business model. This is 
a major concern of mine when it comes to safeguarding our 
democracy and stopping the spread of online misinformation 
aimed at marginalized groups.
    After the 2016 election, a Senate Intelligence Committee 
report found that Black Americans in urban areas were 
disproportionately targeted on social media with false reports 
and conspiracy theories meant to propagate distrust in our 
democratic institutions. The report specifically notes that 
Russian operatives ``took advantage of the Facebook 
recommendation algorithm,'' an assessment Facebook officials have 
corroborated.
    Mr. Ahmed, how would legislation like Congresswoman 
Matsui's Algorithmic Justice and Online Platform Transparency 
Act help prevent the targeted flow of disinformation aimed at 
marginalized communities like we saw during the 2016 elections, 
and are now seeing again with the COVID-19 vaccine?
    Mr. Ahmed. Thank you for the question. I think there are 
two ways in which it would help, and--to abate civil rights 
concerns.
    The first is that it would help us to deal with the kinds 
of algorithms that feed racist, discriminatory material to 
people that weren't already following it. So one of our reports 
on algorithms showed how people following wellness influencers 
were fed anti-vax content. People that then followed anti-vax 
content were fed anti-Semitic content, because it knew that you 
could broaden, as well as deepen, people's extremism.
    The second thing it would do is--there is this issue 
where--misinformation is a very old thing. It has been around 
for a long time. But social media is like retrofitting a sort 
of homing package onto that misinformation, in that it turns, 
you know, a dumb weapon into a smart weapon, which can home 
in on the communities that it is most effective on. And we have 
seen that--the incredible ability of the--of content being 
produced by bad actors, such as anti-vaxxers.
    So Robert F. Kennedy, Jr. and his misinformation about 
vaccines, which is then--the algorithm drives it to the 
audiences that are most vulnerable to it. And that, of course, 
has led to--it has led to death. I mean, 49 out of the last 50 
deaths in DC were--of COVID--were of African American people. 
And that is a direct reflection of the misinformation that has 
been pumped into those--into our communities.
    Ms. Clarke. Thank you, Mr. Ahmed. The lack of 
accountability and transparency into how companies are using 
algorithmic systems is an issue I have been sounding the alarm 
on for years, and it is important we recognize that the use of 
discriminatory algorithms isn't limited to social media 
platforms. Increasingly, algorithms are being used by large 
companies to determine everything from who is eligible for 
health care coverage to whether or not a homebuyer receives a 
mortgage.
    While this may have certain benefits, the reality is that 
our current safeguards are insufficient to protect Americans 
from the harmful biases and design flaws inherent in new 
algorithms--excuse me, in many algorithms. And this is why I 
will soon be introducing an updated version of my Algorithmic 
Accountability Act, along with Senators Ron Wyden and Cory Booker, 
which requires that large companies audit their algorithms for 
bias and discrimination, and to report their findings to the 
FTC for review.
    Ms. Marechal, from a general perspective, why is it so 
important that we address the instances of algorithmic bias 
that affect critical decisions in people's lives?
    Dr. Marechal. Thank you for that question, Representative 
Clarke.
    I think you described the stakes very well and clearly, 
yourself. Algorithms make decisions based on data. That data is 
often faulty. That data, even when it is accurate, reflects 
information that should not be taken into account when making 
certain decisions, right--make decisions----
    [Audio malfunction.]
    Dr. Marechal [continue]. To make them with things like 
race, or gender, or age, or other key markers of identity in 
mind, in order to be fair.
    Algorithms can only make decisions based on data. And so, 
it is--and right now this is something that is perfectly legal 
in many cases, and----
    Ms. Clarke. Ms. Marechal, I am so sorry, I am over time. I 
didn't realize it. I thank you for your response.
    I yield back, Madam Chair. Please, pardon me.
    Ms. Schakowsky. Yes, thank you.
    Congressman Bucshon, you are next. You are recognized for 5 
minutes.
    Mr. Bucshon. Thank you, Madam Chair. In recent years there 
have been proposals for the creation of internet platforms and 
services aimed at children--some of this I know we have 
covered, I apologize for missing part of the hearing--which, I 
am thankful, have largely been put on indefinite hold, since I 
am quite certain they would become havens for predators, 
fraudsters, and cyber bullies. Our society has been seeing the 
terrible impacts of cyberbullying on our children, with far too 
many being injured, or even losing their lives as a result of 
malicious actors online.
    Mr. Lane, I applaud you for your work as a child safety 
advocate opposing these types of bad actors.
    One proposal that I have put forward would require the 
publication and annual updating of content moderation practices 
relating to cyberbullying for internet platforms. This 
transparency would be a powerful tool for parents and other 
users to know what kinds of content and actions will not be 
tolerated on a platform, and they could be used--and they could 
use this information to allow and restrict their child's 
access.
    Do you--would you agree that providing clear and consistent 
rules in this space would reduce the incidence of 
cyberbullying?
    Mr. Lane. Yes, I do. When News Corp bought Myspace--and 
people maybe remember Myspace, it was the largest social 
networking site at the time--this was one of the areas that we 
focused on, because of the concern that our CEO and others had 
when we purchased it, the harm that could be occurring through 
cyberbullying. And it was the first time that we looked. And we 
did instill a lot of practices to try to stop it, and monitor, 
and report, to try to hinder the access of folks who are 
cyberbullying one another.
    So I do think having clear processes in place would be very 
helpful, but I also think--getting back to the point I was 
making earlier about having the parental control functions on--
in this world, controlling what kids can talk to which kids, and 
making sure of that, is critically important.
    Mr. Bucshon. I mean, it is--I have got four kids. I mean, 
it is a tough nut to crack. I mean, sometimes you don't even 
know that your kids are on certain sites. They have dual sites. 
They have the one where they show their parents, and they have 
the one that they are actually communicating on.
    And, as a parent, I do think parent engagement is extremely 
important in this situation, because we, as parents, said, ``We 
have access to all of your phone information and your computer 
information, and the first time that you don't give it to us, 
you lose your phone, you lose your access to the computer.''
    Mr. Lane. Yes, this has been an area where we have been very 
active because of the harms, as kids go down a 
really bad rabbit hole in this area, and it can be so 
detrimental to their health, their safety, and their education, 
and it is something that really needs to be addressed.
    Mr. Bucshon. Yes, and we can have everything in place, in 
that if the parents aren't--or guardians are not daily, 
really--I mean, I have got four kids--daily engaged in what 
their kids are doing, we can do all we want here, and we still may 
not be able to stop it, but it is important to do it.
    Do you think the current patchwork of laws, regulations, 
and policies regulating the space to date has actually helped 
to allow cyberbullying, in many cases?
    Mr. Lane. I don't know. I mean, the hard part with 
cyberbullying that we faced even at Myspace was, you know, the 
free speech--you know, First Amendment. What is cyberbullying, 
what is bullying? That is always difficult to address.
    So the patchwork of different state laws, I mean, it is 
always hard when it is that way, and there is no national law.
    Mr. Bucshon. Yes.
    Mr. Lane. I don't know--and we tried to figure this out 
ourselves--how you draft a law that completely can stop 
cyberbullying.
    Mr. Bucshon. Do you--I am just curious. Did you have 
childhood and teenage consultants on this, when you--you know, 
I know it sounds crazy, but all of us that have kids understand 
that what we think, as parents, might be one thing. The kids 
actually have quite a bit of insight.
    And I--you know, I talk to my kids, and I am like, OK, 
like, I don't quite get this. But it would be interesting to 
know if that--you think that would be helpful, where, actually, 
companies, and maybe even Congress, hear from teenagers, hear 
from kids about what is happening out there.
    Mr. Lane. Yes, it is funny. We didn't have any teens that 
were with us. But Parry Aftab, who is one of the leaders and 
child safety advocates in the early days of the net, had this 
group called Teen Angels, and she would talk to them, and we 
would talk to her and get ideas.
    The other thing that we did is we had a direct line to the 
National Center for Missing and Exploited Children to see what 
we could do to fix it, to make it better. And we basically took 
every recommendation that they made, some may say to the 
detriment that now it is all about Facebook, and no one knows 
about Myspace.
    But we thought it was the right thing to do, and we took 
steps. We would not implement certain functionality when we 
couldn't figure out a way to protect children that made 
sense. Himanshu Nigam, who was our chief safety officer, and I 
would talk almost every day on what we could do to make Myspace 
safer. And it is tough, but you can do it.
    Mr. Bucshon. Yes, and it not only needs to make sense to 
us, it needs to be--make sense to the people who are 
potentially being cyberbullied.
    So I would suggest that we seriously consider that in the 
future, when we are talking about this subject. We might have a 
few people who--young people, who are actually in the arena, so 
to speak--give us some advice. I mean, I think that is not a 
bad idea.
    I yield back.
    Ms. Schakowsky. The gentleman yields back.
    And now, Mr. Cardenas you are recognized for 5 minutes.
    Mr. Cardenas. Thank you very much, Madam Chairwoman, and 
also Ranking Member Bilirakis, for holding this critical 
hearing. And I want to thank all the witnesses for all your 
expertise and opinions today to help educate us, so that we, 
hopefully, can make good policy to guide what is going on 
underneath our noses every single day.
    Every day Americans are forced to accept extremely complex, 
opaque, and one-sided terms of service to enjoy popular 
platforms that often market themselves as free.
    What I am holding up here is 27 pages of an agreement 
that--anybody who uses Snapchat has agreed to these 27 pages. 
There are roughly 106 million active Americans on Snapchat. How 
many of those users do you think have the time or formal legal 
education to understand and agree to a contract such as this, 
written by a team of lawyers, by the way? The average American 
doesn't have a team of lawyers, nor could they afford it.
    I predict that right around none is the number of Americans 
who have actually read every single one of these pages. And 
this goes for many, many, many of the platforms. Some of the 
platforms have reduced their agreements to two pages, probably 
much finer print and a lot more legalese. And once again, 
still, at the end of the day, same typical terms.
    Snapchat prides itself on protecting user privacy, and 
those who use the platform believe their snaps exist 
temporarily before being automatically deleted. But when you 
read the terms of service, you realize that this is not the 
case. In fact, Snapchat employees can access your private user 
data, including photos and/or videos. To go even further, 
hidden in Snapchat's terms of service, you grant Snapchat and 
its affiliates an unrestricted, worldwide, royalty-free, 
irrevocable, and perpetual right and license to use the name, 
likeness, and voice of anyone featured in your public content 
for commercial and non-commercial purposes. That is one of the 
clauses that is buried in these 27 pages.
    Folks, I said one of any--I said of anyone featured in your 
content. That is what that just meant. Anybody featured in your 
content. So if I put out content, and my colleague, Ms. Kelly, 
is next to me, all of a sudden I have wrapped her into it, and 
she hasn't agreed to anything. But it applies to what I have 
done, and I may have injured or aggrieved somebody that I care 
about. That means people who do not even sign up are subject to 
the--this agreement.
    And again, even if that person disagrees, do they have a 
team of lawyers to go ahead and fight for their rights?
    Those who read the terms would notice that platforms often 
include an arbitration clause, stripping the ability of users 
to take these companies to court. Instead, they force users to 
resolve issues in house, on the company's home turf, with their 
team of lawyers against you.
    For supposedly free services, these platforms seem to take 
a lot of their users for granted, and a lot from us.
    Mr. Greenblatt, can platforms use the terms of service to 
include a provision that harms users and put them outside the 
reach of the law?
    Mr. Greenblatt. Thank you for the question. I will preface 
my response by noting that I am not a lawyer, or a consumer 
protection lawyer, at that.
    That being said, it seems to me that the point you have 
raised is incredibly valid. Pages and pages and pages of 8-
point legalese, and expecting my, you know, 15-year-old or 12-
year-old to understand that is laughable, at best, and it is 
malicious, at worst.
    I mean, the reality is this is why we need transparency. We 
need transparency in how these algorithms work. We need 
transparency in the data they are collecting. And, Mr. 
Congressman, we need a kind of not truth in advertising, but a 
truth in terms. I mean, what you just laid out is indefensible 
when it is directed at a minor.
    Mr. Cardenas. And not just the minor--the average American 
just cannot understand it?
    Mr. Greenblatt. Absolutely.
    Mr. Cardenas. It is just not an even playing field, not at 
all.
    Yes, Mr. Lane, briefly.
    Mr. Lane. Yes, very briefly. This is why we need Section 
230 reform, because if there is a violation of the terms of 
service, we need to have the civil litigation to be able to 
find out if there is a violation, so we can get teams of 
lawyers to engage in this process. And without the Section 230 
reform that we are talking about, and the duty of care, we are 
waiting for a whistleblower, which we hope comes, but may 
never.
    Mr. Cardenas. Well----
    Dr. Marechal. Can I jump in here? I realize it is awkward, 
because I am remote, but Section 230 has absolutely nothing to 
do with this. This is about privacy.
    Mr. Cardenas. OK, thank you. I would like to ask a quick 
yes-or-no----
    Dr. Marechal. Any--can I just say any value that we care 
about shouldn't be subject to notice and choice in a--deep in a 
terms of service.
    Mr. Cardenas. Thank you. Thank you very much. And this 
issue is, obviously, important, not only to the average 
American, but especially to those of you who are deeply involved 
in this every single day, as I can see by your answers.
    Very quickly----
    Ms. Schakowsky. The gentleman's time has expired. You are 
going to have to put that in--am I right? Yes, you are going to 
have to put that in writing.
    Mr. Cardenas. I was hoping you would afford me the same 
generosity I have seen my colleagues do.
    I love you, just kidding.
    Ms. Schakowsky. OK, but----
    Mr. Cardenas. I am going to yield back.
    Ms. Schakowsky. Ask the question and then get an answer.
    Mr. Cardenas. I yield back, I yield back.
    Ms. Schakowsky. OK.
    Mr. Cardenas. I just saw everybody go a little extra, I 
thought----
    Ms. Schakowsky. I would, but I----
    Mr. Cardenas. I thought I would use my position, as well. 
Thank you.
    Ms. Schakowsky. OK. And now, Congresswoman Dingell, you are 
recognized for 5 minutes.
    Mrs. Dingell. Thank you, Madam Chair. Thanks for holding 
this hearing, and to all of you who are testifying here today.
    In our March hearing, with many of the major tech CEOs, I 
raised the fact that violative, provocative, and divisive 
content often receives more engagement on social media 
platforms, which many of you have raised in your testimony. 
Several audits, investigations, and reports continue to 
substantiate the claims that companies are aware of this fact. 
And I believe it is our duty to ensure that they are not 
prioritizing profits and engagement over the safety and the 
health of their users. I would like to move to some questions 
focused on these protections, first on prioritizing engagement.
    To the panel, if you would just answer this with a simple 
yes or no, are these companies actively making the choice to 
prioritize profits and engagement over combating 
disinformation, violent content, and negative health outcomes 
for individuals and children, yes or no?
    Dr. Marechal?
    Dr. Marechal. Yes.
    Mrs. Dingell. Mr. Greenblatt?
    Mr. Greenblatt. Yes.
    Mrs. Dingell. Mr. Ahmed?
    Mr. Ahmed. Yes.
    Mrs. Dingell. OK. Mr. Golin--Golin, sorry.
    Mr. Golin. Yes.
    Mrs. Dingell. Mr. Lane?
    Mr. Lane. Yes.
    Ms. Rich. Yes.
    Mrs. Dingell. Ms. Rich--OK, so we got that. So my next 
question is for Dr. Marechal.
    Is there significant evidence that the changes we are 
proposing today to these platform algorithms will have an 
outsized impact on user engagement on the platform?
    What is the cost benefit for consumers and companies in 
incentivizing or requiring these changes?
    Dr. Marechal. That is a great question, Congresswoman. I 
think the single most impactful thing that we could do to 
change the current incentives, which, as you say, push 
companies to prioritize engagement above all else, is to ban 
surveillance advertising. This could--this would most 
effectively be done through comprehensive privacy reform.
    Mrs. Dingell. Thank you for that. I firmly believe that 
independent researchers and the FTC should have access to data 
from these companies to ensure that features and user data are 
not being exploited in ways that push individuals and children 
towards disinformation, violence, extremism, negative health 
outcomes. And that is why I am supporting one of--the Social 
Media Data Act, introduced by my colleague, Rep. Trahan, to 
ensure that researchers have access to information on targeted 
online digital advertisements, to study their potential harms 
to consumers, and create a working group to establish guidance 
on handling this data.
    In March I asked Mark Zuckerberg if he was opposed to a law 
to enable regulators to access social media algorithms--can't 
even talk today. In his response he said that giving more 
transparency into these systems was important, but we sure 
haven't seen any progress on Facebook since--on that issue so 
far.
    So Dr. Marechal, why have companies so far resisted 
increased transparency on sharing advertising data with 
independent regulators and researchers, despite repeated 
commitments to do so, and repeated revelations that they are 
aware of the impact?
    Dr. Marechal. In short, because, as bad as they are at 
moderating and governing user content on their platforms, they 
are even worse at moderating advertising. Facebook and other 
platforms are replete with ads that are illegal in the country 
in which they are served, or that violate the platform's own 
stated rules. And they don't want to get caught doing that.
    And they know that--in the case of Facebook, 99 
percent of their revenue comes from targeted advertising; for 
Google it is in the 90 percent range, or something like that; it is 
very high for other platforms, as well--that once you start 
tugging at that string, the whole house of cards is likely 
to come down.
    This is a completely ungoverned and anti-competitive sector 
of the economy that needs to be regulated as soon as possible.
    Mrs. Dingell. So I have many other questions, which I will 
submit for the record.
    [The information appears at the conclusion of the hearing.]
    Mrs. Dingell. But I will give you my last one for Dr. 
Marechal.
    How do platforms create additional barriers or, in some 
cases, completely block independent researchers from obtaining 
data?
    And how would the Social Media Data Act alleviate some of 
these obstacles?
    Dr. Marechal. That is a great question. So, you know, the 
New York--the NYU Ad Observatory case from this summer is 
really the prime example of that.
    Companies, first of all, are constantly changing their code 
to make it harder for researchers to scrape, or to 
automatically connect--collect information that is published on 
the internet that you don't need to log in to access.
    They are--they also shut down the accounts, deplatform 
individual researchers when they start to do research that the 
companies find threatening. That is what happened to----
    Ms. Schakowsky. You are going to have to wind up your 
answer right now.
    Dr. Marechal. Thank you, ma'am. They also sue individual 
researchers, which is very, very chilling to research.
    Mrs. Dingell. Thank you, Madam Chair. I will say one thing: 
the consequences of these decisions are boldly apparent and, in 
many cases, deadly. Thank you, Madam Chair, for holding these 
hearings, and I hope our committee acts soon.
    Ms. Schakowsky. The gentle lady yields back, and now my 
colleague from Illinois, Congresswoman Kelly, for 5 minutes.
    Ms. Kelly. Thank you so much, Madam Chair, for holding this 
hearing today, building off of our productive Communications 
and Technology Subcommittee hearing last week. I want to thank 
the witnesses for testifying today, and helping us craft 
legislation to hold Big Tech accountable.
    And to Mr. Greenblatt, I just wanted to say to you, 20 
years ago, maybe more now, I got engaged with the Anti-
Defamation League, and it changed my life, because I got 
involved in A World of Difference, so you 
helped me see things through a great lens that I still have 
with me.
    One of the fastest-growing methods for acquiring customers 
online is through influencer marketing. Influencers are people 
who have a lot of followers or social influence online, and who 
then use that influence to endorse and sell products. Today 
influencer marketing is a multibillion-dollar industry in the 
U.S.
    What I find concerning is that so many of our--of today's 
top influencers are children, so-called kid influencers, with 
massive followings on social media. It is not clear online when 
content is organic or sponsored advertising. Studies show this 
problem is significantly worse for children, because children 
do not yet have the cognitive abilities to make these 
distinctions.
    Mr. Golin, can you talk about the harms that kidfluencers 
pose for children online, and why do you believe such 
advertising has become so prevalent?
    Mr. Golin. Yes. So the reason it has become so prevalent is 
because it is allowed on the internet, and it is not allowed 
on children's television.
    So on children's television we have the Children's 
Television Act, which prohibits product placement. It prohibits 
hosts from selling directly to children. And we don't have the 
same rules online, which is--which makes no sense. If a child 
is watching a video on YouTube, they certainly deserve the same 
protections as if they are watching it on Nickelodeon, or 
Disney, or another television channel.
    And the harms--you know, so children's understanding, they 
already understand advertising less than adults. But the way 
that we can get children to understand advertising better is by 
having it clearly separated from content. What research shows 
is the more that advertising is embedded, the less children 
understand about what is going on.
    So you have, on--situations like on YouTube, unboxing 
videos. You have unboxing stars like Ryan's Toys Reviews, 
literally billions of views of these videos, where kids--where 
Ryan is talking about a toy he has been paid to talk about for 
10, 15 minutes. Kids are watching infomercials. Studies have 
shown that kids who watch these videos are more likely to nag 
their parents for what is advertised, and more likely to throw 
a temper tantrum if they say no.
    These--influencer marketing is also linked to higher levels 
of materialism. And if you look at Frances Haugen's documents, 
one of the things that teens themselves are saying is that 
influencer culture is toxic, and makes them feel bad about 
themselves.
    Ms. Kelly. We also know that social media platforms often 
facilitate and certainly make a lot of money from influencer 
marketing. What responsibility do you think that these 
platforms have to protect children from this kind of marketing, 
and, in your mind, are they fulfilling these responsibilities?
    Mr. Golin. They are absolutely not fulfilling these 
responsibilities. I mean, YouTube is making so much money off 
of kids watching unboxing videos. Influencer content on TikTok 
and Instagram is making those platforms--but I don't think we 
can wait for these platforms to do the right thing. That is why 
I think we need legislation like the KIDS Act, that would ban 
these platforms from recommending influencer marketing to kids.
    Ms. Kelly. So how do you think the KIDS Act would help 
protect children in these instances, where it is hard to 
distinguish between authentic and sponsored content?
    Mr. Golin. Well, what it would do is it would prohibit the 
platforms from amplifying that content to children. And so that 
would be a mechanism where the platforms could be held 
responsible. And I think, if they were facing fines for doing 
that, that they would start cleaning up their act.
    Ms. Kelly. And because I have a little bit more time, does 
anyone else want to make a comment about that?
    No? OK, well, I will yield back. Thank you, Madam Chair.
    Ms. Schakowsky. The gentleman--the gentle lady yields back, 
and Mr. Soto is recognized for 5 minutes.
    Mr. Soto. Thank you, Madam Chair.
    Transparency, privacy, integrity of information, protecting 
our kids, all critical ideals that our committee is charged 
with helping uphold in social media. These are a challenge in 
English. It is pure chaos right now in Spanish and in other 
languages, trying to uphold these ideals. So I applaud the 
Chair and the ranking member, my fellow Floridian, for the 
bipartisan group of bills that have been put forward today that 
we are starting to review.
    We have seen lies about the vaccines, and about January 
6th, and about the 2020 election, and we have seen lies that 
breed hate and division in our nation. And so this committee 
takes this very seriously.
    Spanish language content is often less moderated 
for misinformation and violence than English content. Spanish 
language posts are often allowed to remain on social 
media pages for longer durations than English content. A 
question for Mr. Greenblatt, then Mr. Ahmed.
    How does having unregulated Spanish misinformation hurt 
minority communities and people of color?
    And how should--how do social media companies and their 
algorithms fail to address the Spanish misinformation?
    Mr. Greenblatt?
    Mr. Greenblatt. So it is a very good question, Congressman 
Soto.
    And one of the revelations of the Facebook whistleblower 
was that Facebook spends upwards of 90 percent of its resources 
on dealing with misinformation in English, despite the fact 
that less than 10 percent of its users use the platform in English. 
So there is a vast misallocation of resources, despite the fact 
that they do a pretty poor job, as has been stated already.
    ADL participates--proudly participates--in the Spanish 
Language Disinformation Coalition, and we work a great deal to 
look at these issues. I can tell you we have found examples. We 
did an analysis last year, last November, of Spanish language 
anti-Semitism on Facebook, and we found, with just a few 
keystrokes, about two dozen Spanish language accounts that were 
wildly in violation of Facebook's own terms of service, that 
they failed to take down--content coming from groups with 
hundreds of thousands of users, and getting upwards of 55,000 
views. So we know this is a big problem.
    Mr. Soto. And we have seen that published in even local 
newspapers and on--in local television in places in our state, 
so we are deeply concerned about it. And then it is repeated in 
social media.
    I want to turn to Mr. Ahmed next.
    Again, how does unregulated Spanish misinformation and 
other foreign language misinformation hurt minority communities 
and communities of color?
    And how do algorithms fail to address this misinformation?
    Mr. Ahmed. Well, this is a mixture of both algorithms, 
which are very good at targeting the right misinformation to 
the most vulnerable audiences, and bad actors, who are--who 
understand that, actually, the Spanish-speaking market is an 
easier one to sell misinformation into, because there isn't as 
much moderation of the content there. And there is 
just a lower potential of that content being removed.
    What that means, in practice, is that if you take, for 
example, vaccine misinformation, that the content that was 
being targeted to Spanish audiences by non-Spanish-speaking 
originators--so you found some of the key members of the 
Disinformation Dozen who aren't themselves Spanish speakers 
were having their content translated into Spanish at the same 
time, and pumped out into Spanish-speaking audiences. And 
we saw that being taken up, we saw people debating it, and we 
saw people deciding not to vaccinate initially because of it.
    And what did that mean? That meant that, literally, you 
know, Latinx communities in America were dying because they 
were being--A, they were more exposed to--you know, there was a 
higher prevalence of acute COVID; and second, that they were 
then being persuaded not to take the vaccine, the thing that 
would most protect them.
    Mr. Soto. Thank you, Mr. Ahmed. And just as a comparison, 
we saw vaccination rates really high in central Florida among 
both Puerto Rican and Mexican American communities. Puerto Rico 
has the highest rate in the nation, because it wasn't 
politicized in the media, in social media. But we saw in other 
areas, like in South Florida and South Texas, where 
misinformation campaigns were deliberate. And what did that 
lead to? Low rates.
    I heard crazy things said about the vaccines, when the only 
crazy thing about it is not taking them to stop this deadly 
virus.
    So thank you, gentlemen, for your input.
    And Madam Chair, I yield back.
    [Pause.]
    Ms. Schakowsky. It is to Doyle? OK. The gentleman yields 
back, and now as--we welcome a waive-on to the committee, and 
that would be the chairman of--also a chairman of the 
subcommittee, Mr. Doyle, for his 5 minutes of questions.
    Mr. Doyle. Well, thank you very much, Madam Chairwoman, and 
to both you and Chairman Pallone, for continuing this series of 
legislative hearings to move forward with common-sense 
solutions to protect consumers online, and to hold online 
platforms accountable for their actions.
    Last week, at the Communications and Technology 
Subcommittee, we heard from experts on the harms caused by 
online platforms, as well as experts on legislative solutions 
to address these significant problems. And as we have heard 
from panelists today, providing victims access to the courts is 
not enough to address the breadth of issues surrounding tech 
platforms.
    I agree that transparency and other accountability measures 
are necessary, as well. So today's hearing and the witnesses' 
testimony are very important as we move forward.
    Mr. Greenblatt, you also made comments to this effect. In 
your testimony you note that hate speech and, potentially, 
disinformation and other dangerous content is often protected 
in the First Amendment. And then you go on to say that we need 
to do more than just focus on Section 230 reform as required to 
hold platforms accountable.
    Can you first talk about how some platforms are tuned for 
disinformation?
    I would like to hear more detail on how some platforms' 
designs encourage disinformation, hate speech, and harmful 
content.
    Mr. Greenblatt. Thank you very much for the question, 
Congressman Doyle.
    So, first of all, let's just acknowledge that hate speech 
is part of living in a free society. Our First Amendment 
protects ideas, even those that we don't like. But the 
challenge is hate speech is not the same. And I am sorry, 
speech that causes direct harm is different.
    Freedom of speech is not the freedom to slander people. 
Freedom of expression is not the freedom to incite violence. So 
platforms like Facebook or Twitter, Congressman, that often 
will use anonymity, that don't take down posts that are 
directly threatening to people, that don't take down posts that 
express lies or misinformation are directly damaging to the 
public good.
    Now, the reality is that there is a reason why newspapers, 
magazines, movies, television, radio, and all other media do 
not allow such content on their services, because they would be 
subject to litigation and lawsuits if they did. Only the 
social media companies enjoy the privilege of non-
accountability, and that is because of the loophole in the law, 
Section 230, that was referenced earlier.
    Mr. Doyle. Thank you. Research has shown that, with very 
little information about a user, Facebook's algorithms can 
simply begin showing conspiracy theory and other disinformation 
to that user. Is it good policy that Federal law protects 
Facebook from any harm that comes to the user as a result of 
that information?
    Mr. Greenblatt. Absolutely, it is bad policy. It is 
unambiguously bad public policy, and it is a loophole that 
extremists have exploited to great effect.
    And again, we have seen where, out in the open, extremists 
use Facebook groups to organize actions against other 
individuals. This would be inexcusable, again, in any other 
context. People are allowed to say hateful things. The question 
is whether Facebook and the other services should privilege 
them, should amplify them, should elevate them. I say the 
answer is no.
    Mr. Doyle. So how do we pair the transparency and reporting 
requirements with other reforms, like we discussed last week, 
to protect both online users, and maintain a healthy online 
ecosystem?
    And how do we have meaningful transparency requirements 
that are not abused by those promoting hateful and other odious 
forms of speech, even if protected by the First Amendment?
    Mr. Greenblatt. Well, I think one of the things that one--
could be done right away, Mr. Congressman, would be to allow 
researchers access to this information. You don't have to 
necessarily make it available to the entire public, but 
accredited researchers who apply could be given access. And you 
would need to have real criteria, so that Facebook and the 
other companies couldn't deny credible requests.
    But you have--as public servants, you and the government, 
you are--have to be compliant with a FOIA request. There is no 
reason why we couldn't create a similar FOIA-type requirement 
of these companies, because the data they have is our data, it 
is public data, it is citizen data, and they should be 
more transparent and share it.
    Mr. Doyle. Thank you.
    Mr. Ahmed, we know, through your research, and now through 
Facebook's research, thanks to Frances Haugen, that a small 
number of users are responsible for much of the disinformation 
that we are seeing online. Clearly, the incentives are not 
aligned for these platforms to take this type of content more 
seriously, even when we know it leads to real-world harms.
    Can you tell us how the bills before us today will help 
realign the incentives?
    Mr. Ahmed. Well, I think, comprehensively, what they do is 
give us more illumination as to the underlying rationale: the 
drivers, the business decisions, the economic rationale for 
allowing this content to remain on their platforms. And they 
really have allowed it to remain.
    I mean, look, the Disinformation Dozen, of their 98 social 
media accounts, 42 are still up. They still have around 52 
percent of their audiences that they had before we wrote that 
report. So yes, some action has been taken. But for the main 
part, over half of it is still up there.
    And why is that true? What these would collectively do is 
start to create some transparency and, therefore, 
accountability for those failures.
    Mr. Doyle. Thank you, Madam Chair----
    Ms. Rich. Mr.----
    Mr. Doyle. [continue]. For holding this hearing, and I 
yield back.
    Ms. Schakowsky. Thank you, Mr. Doyle. We are honored to 
have your presence today.
    I want to now recognize Representative Lesko for your 5 
minutes.
    Mrs. Lesko. Thank you very much, Madam Chairman, and thank 
you to all of the panel members for testifying today. This is 
such an important issue.
    It has been said that false information spreads so much 
faster on social media than accurate information, and I found 
that to be true. And I think a lot of it is because people, you 
know, whether it is media outlets or whoever it is, want us to 
have salacious titles and things so that we click on it, and 
then--and use it. But my first question is for Jessica Rich.
    Jessica, the FTC recently released the draft fiscal year 
2022 through 2026 strategic plan. I understand Chairman Khan deleted 
language from the FTC mission that specifically says that the 
FTC will accomplish their mission without unduly burdening 
legitimate business activity. How concerned are you that this 
altered mission statement could lead to increased costly 
regulatory burdens on businesses?
    Ms. Rich. The deletion of that language sends a really bad 
message. And I would like to think, of my former agency, that it 
was a mistake, and that they are 
planning to put it back in.
    One thing that is important to remember is that, regardless 
of whether that language is in a mission statement, that 
concept runs throughout so much law and policy at the FTC that, 
regardless of mission statement or no mission statement, it is 
going to be very hard to ignore undue burdens on legitimate 
business activity. It is built into deception, it is built into 
unfairness, it is built into substantiation, fencing in so many 
doctrines.
    But it was very ill-advised to take it out of the mission 
statement, and it sends a terrible message.
    Mrs. Lesko. Thank you for that answer. And also to you, 
Jessica Rich, as you said, you are a former FTC director of the 
Bureau of Consumer Protection. What is your reaction to the 
language granting the FTC civil penalty authority--to 
granting them civil penalty authority?
    Ms. Rich. Under the Build Back Better Act. The FTC badly 
needs stronger remedies, especially with the rollback of 13(b) 
authority. But it would be far better for both the FTC and the 
public if this type of authority came with more direction from 
Congress regarding the situations that--where this would apply.
    One thing to note that hasn't been talked about very much 
is that, even with this new authority, the FTC will still need 
to prove that a company knew it was violating the law before 
that company can be made to pay civil penalties. So that is an 
important safeguard that would still be in there.
    Mrs. Lesko. All right, thank you very much. My next 
question is for Mr. Rick Lane.
    Areas of clear vulnerability--and you have said it in your 
testimony--that put our sensitive, personal data at risk are 
those situations where sensitive, personal information is 
stored in foreign countries known to be hostile to the United 
States--one, namely, being China. Mr. Lane, how important is it 
that any reforms to Section 230 also include reforms to 
transparency, to content moderation practices, and to the 
storage of our personal information?
    Mr. Lane. I think it is very important. We have actually 
signed treaties that say we can't require data localization, so 
under those treaties we can't dictate where people store data, 
and that should be looked at as well.
    But in terms of what is happening with TikTok and others, I 
do believe that we need to take a closer look at how this data 
is being accessed, who is accessing it.
    One of the concerns I have--if you have ever seen the 
documentary ``The Social Dilemma''--is the scene where they 
show engineers at what is supposed to be Facebook turning a 
dial to try to influence our behaviors just a little bit. You 
know, elections are won and lost by two percentage points 
sometimes. And I would hate to see the information being 
derived used by someone behind the scenes, turning that dial, 
who may be hostile to U.S. interests.
    Mrs. Lesko. Well, I agree with you, and I did watch ``The 
Social Dilemma,'' and I think it is very interesting, because 
it kind of opens your eyes to how we are being influenced 
behind the scenes.
    Thank you, Madam Chairman, and I yield back.
    Ms. Schakowsky. The gentlewoman yields back, and now I 
recognize Congresswoman Blunt Rochester for her 5 minutes of 
questions.
    Ms. Blunt Rochester. Thank you, Madam Chairwoman, for the 
recognition, and allowing me to join this very important and 
timely hearing.
    The internet's remarkable power and potential have been 
used to create, unite, and innovate. Unfortunately, it has also 
been misused by bad actors to misinform, divide, and distract, 
preying on unsuspecting Americans. This hearing today 
represents a bipartisan consensus that large tech companies 
must reform their practices to ensure the internet remains a 
place of innovation and potential. The common denominator 
underlying the horrible things that we have heard about today 
is the ability for tech companies to use design practices to 
undermine user choice for the sake of profit.
    For my part, I introduced the bipartisan and bicameral 
DETOUR Act, because tech companies have used decades' worth of 
research on compulsion and manipulation, often conducted in the 
gambling industry, to design products that trick or strong-arm 
people into giving up their data or consenting to potentially 
harmful content.
    Today we often call these ``dark patterns,'' and they exist 
on virtually every tech platform, because this data collection 
scheme fuels the algorithms and targeted ad programs we have 
decried in a bipartisan way.
    If we allow tech platforms to hamper Americans from making 
choices in their own self-interest, we will never see the 
internet reach its full potential.
    Dr. Marechal, I would like to begin with you. Can you 
provide us an example of a dark pattern that undermines user 
choice on the internet today?
    And what makes these tactics so ubiquitous online, and so 
effective in influencing user behavior?
    Dr. Marechal. Absolutely, ma'am. Since the GDPR and CCPA, 
internet users have gotten used to seeing data collection 
consent pop-ups when they visit websites. And the point of 
those is to give us a choice over whether to make it possible 
for companies to collect our data. But this is undermined by 
the type of deceptive design that you are talking about.
    You have noticed, I am sure, that many of them make it 
much, much easier to allow the website to collect whatever data 
it wants than to refuse that permission, or to specify what 
data we will and won't allow to be collected. Even someone like 
me, who is onto them, is often pressed for time, and so I click 
``accept'' rather than going through half a dozen more clicks 
to limit the data collection to what is needed for the website 
to work properly.
    Ideally, sites should only be able to collect the data that 
they actually need to do the thing you want them to do. But, at 
a minimum, it should be just as easy to protect your privacy as 
it is to give it away.
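    [Illustrative sketch: the asymmetry Dr. Marechal describes 
can be modeled simply as the number of clicks each outcome 
costs the user. The flows, labels, and click counts below are 
hypothetical, not drawn from any particular website.]

      # Hypothetical consent flows, modeled as the clicks each choice requires.
      dark_pattern_flow = {
          "accept all": ["click 'Accept all'"],
          "reject all": ["click 'Settings'", "open 'Purposes'",
                         "untick each vendor", "click 'Confirm choices'"],
      }
      symmetric_flow = {
          "accept all": ["click 'Accept all'"],
          "reject all": ["click 'Reject all'"],
      }

      def effort(flow):
          """Count the steps each choice costs the user."""
          return {choice: len(steps) for choice, steps in flow.items()}

      print(effort(dark_pattern_flow))  # {'accept all': 1, 'reject all': 4}
      print(effort(symmetric_flow))     # {'accept all': 1, 'reject all': 1}

    [Dr. Marechal's minimum standard--that it be just as easy to 
protect your privacy as to give it away--is the second flow, 
where either outcome costs exactly one click.]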
    Ms. Blunt Rochester. Great, thank you so much.
    And Mr. Golin, why is it important that we consider 
regulation of dark patterns that target children, especially 
those that cause compulsive behaviors?
    Mr. Golin. Yes. Well, we should regulate dark patterns that 
are aimed at children for three reasons.
    The first is because, as you mentioned, they are extremely 
prevalent. Most of the apps and the games that children are on 
use manipulative techniques, honed by endless A/B testing, in 
order to get kids to stay on platforms longer, in order to get 
them to watch more ads, and in order to get them to make in-
game purchases.
    The second reason that we should do it is because it is 
unfair. When the whole idea is to undermine user autonomy and 
to manipulate children, that is unfair. Just a couple of 
examples: there are preschool apps aimed at very young 
children, where the characters in the game start mocking 
children if they try to stop playing, and taunt them into 
playing even longer. And so many of the games that children 
play use virtual currencies that have no fixed exchange rate, 
and the platforms manipulate those currencies so kids don't 
understand, when they are buying things with real money, how 
much money they are actually spending.
    And finally, we should regulate them because they cause 
harm to children. There is the financial harm that I just 
mentioned, where kids are racking up hundreds and even 
thousands of dollars in in-game purchases. But dark patterns 
are also being used to drive compulsive use, to get kids to 
have more screen time, which, of course, displaces things they 
could be doing that would be of much more benefit to them.
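    [Illustrative sketch: the variable-rate currency math Mr. 
Golin describes, worked through with hypothetical bundle prices 
and a hypothetical 180-gem item.]

      # Hypothetical gem bundles: (gems received, price in U.S. dollars).
      bundles = [(100, 1.99), (550, 9.99), (1200, 19.99)]
      item_cost_gems = 180  # sticker price of an in-game item, in gems

      for gems, usd in bundles:
          real_cost = item_cost_gems * (usd / gems)
          print(f"Via the {gems}-gem bundle, the item really costs ${real_cost:.2f}")

    [The same item costs $3.58, $3.27, or $3.00 depending on 
which bundle funded it, and the smallest bundle cannot cover it 
at all, nudging the child toward a larger purchase. A fixed, 
posted dollar price would make the spending legible.]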
    Ms. Blunt Rochester. Yes, and also contribute to healthy 
child development. I think you are correct.
    And Mr. Greenblatt, you know, a lot of times when we 
discuss dark patterns, we hear about things that companies 
shouldn't do. But you mentioned the Social Pattern Library, and 
it considers some very important things: what are good design 
principles? Can you describe some of the findings and 
recommendations that ADL made as part of the Social Pattern 
Library?
    Mr. Greenblatt. Yes, thank you for the question. A few 
points.
    I mean, number one, nudges are very useful. And we have 
seen services like YouTube and Twitter implement them based on 
our recommendations, and actually decrease the prevalence of 
hate on their platforms.
    Number two, doing things like turning off the autoplay 
feature that you often see on services like YouTube, where the 
videos keep playing over and over again, and the young people, 
the children, are just fed this content without actively 
choosing it.
    Number three, another design principle is that you don't 
have to promote, let's say, controversial videos. I think you 
can have controversial videos. But videos that violate the 
policies, if you will--there is just no reason to be promoting 
them. They should be taken down. And even while they are still 
being reviewed, you don't have to put them in search.
    There are lots of little techniques that product managers 
can use in order to adjust the results slightly in a way that 
is consistent with preserving freedom of speech, but that 
doesn't----
    Ms. Blunt Rochester. Thank you.
    Mr. Greenblatt [continue]. Promote the fringes.
    Ms. Blunt Rochester. Yes, my time has run out, but I will 
follow up with a question for Mr. Ahmed.
    [The information appears at the conclusion of the hearing.]
    Ms. Blunt Rochester. And thank you so much, Madam 
Chairwoman, for this very important hearing, I yield back.
    Ms. Schakowsky. Thank you.
    And Mr. Walberg, you are now recognized for 5 minutes.
    Mr. Walberg. Thank you, Madam Chairwoman, and I appreciate 
being waived on today. This is a hearing that I think is 
important, one of multiple hearings we are doing on Big Tech 
and its impact.
    I know members of this committee on both sides have long 
supported a comprehensive national privacy and data security 
framework, and we have a record of working in a bipartisan 
manner to achieve that. For that I am grateful. While many 
worthy proposals are being considered today, I fear that, 
without a bipartisan, cohesive framework, we will continue down 
a path of patchwork laws that confuse consumers and place undue 
compliance burdens on businesses.
    We may have significant differences on issues such as 
Section 230 reform, but privacy, particularly when it comes to 
children, should be a no-brainer. Or maybe that is the wrong 
term to use. It should be a good-brainer. That is why I have 
introduced, with my good friend, Congressman Rush, a bipartisan 
bill that would update and modernize the Children's Online 
Privacy Protection Act, or COPPA. I wish that it was part of 
the hearing
today, but it isn't. But still, it can be in the future, and I 
hope it is.
    Mr. Lane, as you know, this is not the only legislation 
aimed at enhancing child privacy laws. There are Democratic 
proposals in both the House and Senate, which reemphasizes my 
point that this should be a bipartisan issue.
    However, I have concerns with some of the COPPA legislation 
that has been introduced, including language that would grant 
new authorities to the FTC that may unduly burden legitimate 
business activity--for example, by burdening good actors that 
follow FTC-approved self-regulatory guidelines. And so, Mr. 
Lane, could you speak to why eliminating self-regulatory 
guidelines would be harmful, and what might be some unintended 
consequences of doing just that?
    Mr. Lane. Sure, happy to, and thank you for the question.
    First of all, I want to say I am a big supporter of 
reforming COPPA. I actually think it should start at 17 and go 
younger, and not at 16. I think it needs to be updated. Things 
have changed since Ed Markey moved the bill back in 1998. But 
one piece of the current law that is actually important, and 
that may be left out of some of the reform bills, is the self-
regulatory environment of having COPPA safe harbor entities 
certified by the FTC.
    And the reason that we supported that in the past, and why 
we liked it, was that it helped parents. It helped parents know 
that, if their kids were going on a site that was for kids 12 
and under, there was some mechanism, like a Good Housekeeping 
Seal of Approval, because we were concerned that, given the 
lack of resources at the FTC--as Jessica knows--they can't 
investigate everybody.
    So we thought we could help put together a certification 
program that companies go through. That certification program 
can itself be approved by the Federal Trade Commission, and it 
would help assure parents that the sites they were going to let 
their kids on would be COPPA-compliant.
    Now, there have been some bad actors, and recently one of 
those bad actors got booted from the program, as they should 
have been. And I would support stronger enforcement against 
entities like that, to protect the ones that are doing a great 
job.
    But I think it would do a disservice to parents if they had 
to guess and hope and pray that the thousands of websites 
targeting kids 12 and under are COPPA-compliant. Eliminating 
the safe harbors would just be a mistake.
    Mr. Walberg. Thank you. My legislation, of course, as you 
may know, raises the age for parental consent protections for 
children online from under 13 to under 16 years of age. It just 
seems that Big Tech, in this space, has a race to the bottom 
going on.
    Mr. Lane. Yes. And if I can just add one other piece--and 
Jessica was actually one of the first individuals I reached out 
to on this--there is a FinTech child privacy protection gap. 
What has happened is that, as kids migrate into this digital 
e-commerce world and have debit cards and digital wallets, the 
privacy rules that apply are Gramm-Leach-Bliley's, which is an 
opt-out regime, and you hope that the parents would opt out. As 
Congressman Cardenas basically said, no one reads the opt-out, 
and no one opts out.
    COPPA, meanwhile, is for websites targeted at kids 12 and 
under. So the concern is that, when you combine kids' financial 
information being collected with their social networking 
information, you have the perfect storm: underage kids having a 
whole dossier on them before they hit 18. That could be 
detrimental to their future. And that gap, I think, needs to be 
filled by legislation.
    Mr. Walberg. I appreciate that. I have some more questions, 
but I don't have time. I will get them to you.
    [The information appears at the conclusion of the hearing.]
    Mr. Walberg. But I appreciate you adding that, because that 
is insightful. Thank you.
    Ms. Rich. Can I make one quick point about the COPPA safe 
harbors?
    Mr. Walberg. If the chairperson allows it.
    Ms. Rich. Can you----
    Ms. Schakowsky. I am afraid that is going to have to go 
into the--to respond in writing.
    Ms. Rich. OK.
    Ms. Schakowsky. We have to move on. And I now recognize for 
5 minutes Mr. Carter.
    Mr. Carter. Thank you, Madam Chair and Leader Bilirakis, 
for allowing me to waive on this hearing. I appreciate it very 
much.
    Ms. Rich, I will go to you, but I have another question 
here. I want to go back to the exchange that you had with 
Ranking Member Rodgers.
    We have got a lot of supply chain issues that are going on 
right now, and they can go beyond just a local retailer. Say I 
am the owner of a car dealership in Georgia, or a wine shop in 
Washington State, or even a grocer in a small town in West 
Virginia. I am paying more now than I was before to get access 
to products that aren't as available as they were before. I may 
have to charge more than I did a month ago, just simply because 
of the increased cost, obviously.
    I don't know the ins and outs of the FTC Act, so aren't the 
process changes, the new authorities that have been discussed 
today, and other actions going to cause a lot of confusion for 
me, as a retailer, just trying to responsibly run my business?
    Ms. Rich. I haven't done that analysis, but I do know that 
right now there is a lot of confusion about when the FTC will 
choose to pursue something as a deceptive or unfair practice. 
The FTC is always better off when it has direction from 
Congress as to what the standards are for particular concerns 
like content moderation, privacy, et cetera. So I think, at 
least in many circumstances, direction from Congress decreases 
confusion.
    Mr. Carter. Decreases confusion.
    Ms. Rich. Decreases confusion.
    Mr. Carter. OK.
    Ms. Rich. Now, what I think maybe you are asking about, 
though, is the issue of having multiple sectoral laws instead 
of one comprehensive law, which is what I have been advocating 
for privacy--one place where companies would be able to look 
for a lot of direction about important issues like data use.
    Mr. Carter. Right.
    Ms. Rich. And I do think having one comprehensive privacy 
law, which could include many of these elements, would be 
better than having multiple sectoral rules.
    Mr. Carter. Look, I was in business for over 32 years, and 
I can tell you, first of all, I didn't have time to do all this 
kind of research. Secondly, I mean, we are talking inside 
baseball here. But many of these business people don't know how 
to navigate all this.
    Ms. Rich. I agree that multiple sectoral laws--in the area 
where I am the greatest expert, which is privacy--have not been 
good for small companies, or even big companies. But it is 
definitely worse for small companies, who really can't figure 
out what laws apply to them.
    Mr. Carter. Right. All right, let me move on.
    Earlier this year, several Senate Democrats sent a letter 
to Chairwoman Khan at the FTC, encouraging her to begin a 
rulemaking process on privacy. I am hopeful my colleagues in 
the Senate will second-guess this approach once they know how 
complicated it truly is, because it is truly complicated, and 
we don't need it to be complicated. We need to simplify. Be 
Thoreauish: simplify, simplify, simplify.
    Ms. Rich, I am also concerned with the time that it is 
going to take to complete a rulemaking process on data. Can you 
shed some light on how long that process might take, and what 
that might mean for consumers and companies looking to 
understand all this patchwork of state laws?
    Ms. Rich. There has been a tremendous overselling of the 
potential for the FTC to issue a rule on its own, using its 
Magnuson-Moss authority. That is a very cumbersome process: for 
each mandate in a rule, the FTC has to prove the practice is 
unfair or deceptive and prevalent, and then there are all sorts 
of procedural hurdles. Many rules that have been pursued under 
this process have taken years to complete.
    And also, given the controversy and all the debates 
surrounding privacy that have happened over the course of 20 
years, the public would be best served if Congress is the one 
to make the tough choices in this area.
    Mr. Carter. Understood. But, you know, again, it is going 
to take years of work in order to get this done.
    Ms. Rich. And litigation that would----
    Mr. Carter. Absolutely.
    Ms. Rich [continue]. Likely ensue.
    Mr. Carter. Absolutely. And, you know, most business owners 
just get so frustrated, they just throw their arms up, and they 
just--and a lot of them quit.
    I have got a lot more, but I will submit it in writing, and 
thank you.
    [The information appears at the conclusion of the hearing.]
    Mr. Carter. And I will yield back, Madam Chair.
    Ms. Schakowsky. The gentleman yields back. And last, but 
not least, Mr. Duncan, you are recognized for 5 minutes.
    Mr. Duncan. Sometimes they save the best for last. I am not 
sure that is the case here. But I want to thank you, Madam 
Chair, and the ranking member, for hosting today's hearing and 
including my bill, the TELL Act. This legislation would require 
disclosure of whether China and its state-owned entities are 
storing, accessing, and transferring the personal data of 
American citizens, since today they do so without being 
transparent about it.
    TikTok, one of the most popular social media platforms for 
our children, is a subsidiary of Beijing-based ByteDance. While 
I have notable concerns about American companies doing business 
in China, and accommodations they make to the People's Republic 
of China, it is astonishing to me that there is any doubt over 
the level of access and control the Chinese Communist Party has 
over this conglomerate and similar entities.
    Mr. Lane, it is great to see you again. Thanks for being 
here. As this committee thinks about the future of the internet 
and holding Big Tech accountable, are you concerned about the 
data being collected by TikTok and companies with similar 
relationships in China, and what that might mean for our 
country's national security?
    Mr. Lane. I am concerned about that. I think we should all 
be concerned about that.
    Mr. Duncan. Thank you. What other provisions on security 
vulnerabilities do you think should be incorporated in this 
legislation to protect our economic and national security 
interests?
    Mr. Lane. Well, I think the legislation starts in the right 
place. You know, as a parent, I like to say it is a teachable 
moment: people will know where their information is being 
housed and where the companies are based. And hopefully, those 
companies will take the self-corrective action that is 
necessary.
    But I also worry about those websites and other apps that 
are not going to disclose, and how we find those. As we know, 
Russia and Iran and China, and their surrogates, are well-known 
cyber warriors. And there is going to be a lot of mischief 
underneath the ones that we see.
    And my concern is that, you know, we have this dark WHOIS 
issue, where we could otherwise find these things out. 
Combining your bill's disclosures with WHOIS--are they where 
they say they are, and headquartered where they say they are?--
we could find information like that out through an open, 
accessible WHOIS. That is what forensics does.
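    [Illustrative sketch: the open WHOIS lookup Mr. Lane invokes 
uses a very simple protocol, RFC 3912. A minimal Python client 
follows, with the .com registry's public server and example.com 
used purely as examples:]

      import socket

      def whois_query(query: str, server: str) -> str:
          """Minimal RFC 3912 WHOIS client: TCP port 43, query plus CRLF, read to EOF."""
          with socket.create_connection((server, 43), timeout=10) as sock:
              sock.sendall(query.encode("ascii") + b"\r\n")
              chunks = []
              while data := sock.recv(4096):
                  chunks.append(data)
          return b"".join(chunks).decode("utf-8", errors="replace")

      # The .com registry's WHOIS is "thin": it names the sponsoring registrar,
      # whose own WHOIS record holds the registrant organization and country
      # that investigators compare against a company's claimed headquarters.
      # Since GDPR, registrars often redact those fields--the darkness at issue.
      print(whois_query("example.com", "whois.verisign-grs.com")[:600])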
    But unfortunately, you know, the NTIA and its bureaucrats 
have, for the past five years, stonewalled Congress taking 
action in this space. Congressman Latta was talking about the 
letters he sent to Homeland Security, the FTA, and others. And 
you have companies like VeriSign and GoDaddy and Namecheap--you 
know, they will be up on the Hill, talking to you all about how 
we don't need to upset the multi-stakeholder process of ICANN. 
That process is now going on five years--five years of 
darkness. And if they did develop something tomorrow, it would 
take three more years to implement.
    Congress can act on this now. Congress has the opportunity 
to fix a cybersecurity problem at no cost to the U.S. taxpayer. 
It is in our hands. And you can ask any cybersecurity expert--I 
have reports, I have letters from, you know, the top people 
talking about this. So add your legislation on where the 
companies are and where the data is being stored on top of 
strong WHOIS legislation to fix this GDPR problem--because it 
is not a U.S. law, it is a foreign government's law.
    And I will end on this. Imagine if this law that shut down 
the WHOIS, that is threatening our national security, were a 
Chinese law or an Iranian law. Would we still stand here, as a 
U.S. Congress, and say we shouldn't upset the multi-stakeholder 
process to address these laws? The answer would be no. And I 
think it is time for the U.S. Congress to step up and try to 
fix this problem before more people get hurt.
    Mr. Duncan. You are exactly right. You know, Big Tech is 
not just Facebook or Twitter. It includes companies like 
Microsoft, and Apple, and Google, each of which has a 
significant presence in China.
    My time is going to expire. I had another question, but I 
just want to make this point, because I thought about this 
while you were speaking.
    I don't know that we truly care about all this data being 
collected from our children through platforms like TikTok and 
others. And I raise that because, for the past two Congresses, 
I have tried to get this committee and this Congress to find 
one Democrat to cosponsor a piece of legislation that would 
stop the importation of child-like sex dolls--dolls that are 
used by pedophiles.
    Images and likenesses are stolen from social media 
platforms, and a doll is created, crafted to look like the 
child of one of our constituents, so that someone can play out 
sex fantasies with a child-like sex toy. Very humanlike, very 
robotic--where even the voice is taken from the child's TikTok 
and digitally put into that child-like sex toy, so that it can 
actually talk like that child to the pervert that is enjoying 
themselves with it.
    Madam Chair, find me a Democrat that will cosponsor that, 
and let's get that over the line, and let's stop the 
importation of child-like sex dolls. When I talk to your 
colleagues--I will show them pictures of the dolls; I will be 
glad to share them with you--they say, ``Oh my God, we need to 
do something about that,'' and then nothing is done. And so we 
continue to import sex dolls into this country that look like 
the children of people in our communities, sound like the 
children of people in our communities. And it is just wrong.
    With that I yield back.
    Ms. Schakowsky. The gentleman yields back, and that 
concludes the questioning.
    And I want to thank, from the bottom of my heart, this 
wonderful panel. I thank all of you for the work that you have 
done, and I believe that it will lead to real action in the 
Congress.
    And before we adjourn, let me also just thank my ranking 
member.
    I don't know if you wanted to make any final comment for 
our witnesses. OK, you are OK?
    And I request unanimous consent to enter the following 
document into the record: an online tracking study.
    Without objection, so ordered.
    [The information appears at the conclusion of the hearing.]
    Ms. Schakowsky. And just stay for one more second, because 
I want to remind members that, pursuant to committee rules, 
they have ten business days to submit additional questions for 
the record--I know there were some unfinished questions that 
need answers--to be answered by the witnesses who have appeared 
today.
    And I ask the witnesses to respond as promptly as possible 
to any questions that may come to you.
    Once again, thank you. Thank you for the participation. 
There were five waive-ons to this subcommittee, which is a lot, 
showing the kind of interest in this hearing.
    And, at this time, the subcommittee is adjourned.
    [Whereupon, at 2:52 p.m., the subcommittee was adjourned.]
    
    GRAPHIC(S) NOT AVAILABLE IN TIFF FORMAT