[House Hearing, 117 Congress]
[From the U.S. Government Publishing Office]



                         [H.A.S.C. No. 117-25]

                       TECHNOLOGY AND INFORMATION

                        WARFARE: THE COMPETITION

                         FOR INFLUENCE AND THE

                         DEPARTMENT OF DEFENSE

                               __________

                                HEARING

                               BEFORE THE

                 SUBCOMMITTEE ON CYBER, INNOVATIVE 
                TECHNOLOGIES, AND INFORMATION SYSTEMS

                                 OF THE

                      COMMITTEE ON ARMED SERVICES

                        HOUSE OF REPRESENTATIVES

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                               __________

                              HEARING HELD

                             APRIL 30, 2021

                                     
[GRAPHIC NOT AVAILABLE IN TIFF FORMAT] 

                              __________

                    U.S. GOVERNMENT PUBLISHING OFFICE                    
44-945                     WASHINGTON : 2021                     
          
-----------------------------------------------------------------------------------


SUBCOMMITTEE ON CYBER, INNOVATIVE TECHNOLOGIES, AND INFORMATION SYSTEMS

               JAMES R. LANGEVIN, Rhode Island, Chairman

RICK LARSEN, Washington              JIM BANKS, Indiana
SETH MOULTON, Massachusetts          ELISE M. STEFANIK, New York
RO KHANNA, California                MO BROOKS, Alabama
WILLIAM R. KEATING, Massachusetts    MATT GAETZ, Florida
ANDY KIM, New Jersey                 MIKE JOHNSON, Louisiana
CHRISSY HOULAHAN, Pennsylvania,      STEPHANIE I. BICE, Oklahoma
    Vice Chair                       C. SCOTT FRANKLIN, Florida
JASON CROW, Colorado                 BLAKE D. MOORE, Utah
ELISSA SLOTKIN, Michigan             PAT FALLON, Texas
VERONICA ESCOBAR, Texas
JOSEPH D. MORELLE, New York

                         Troy Nienberg, Counsel
                Chris Vieson, Professional Staff Member
                         Caroline Kehrli, Clerk
                            
                            C O N T E N T S

                              ----------                              
                                                                   Page

              STATEMENTS PRESENTED BY MEMBERS OF CONGRESS

Langevin, Hon. James R., a Representative from Rhode Island, 
  Chairman, Subcommittee on Cyber, Innovative Technologies, and 
  Information Systems............................................     1
Stefanik, Hon. Elise M., a Representative from New York, Ranking 
  Member, Subcommittee on Cyber, Innovative Technologies, and 
  Information Systems............................................     4

                               WITNESSES

Gerstell, Glenn S., Senior Adviser, International Security 
  Program, Center for Strategic and International Studies........     5
Jankowicz, Nina, Disinformation Fellow, Wilson Center............     7
Kirschbaum, Joseph W., Director, Defense Capabilities and 
  Management Team, Government Accountability Office..............    11
Lin, Herbert, Senior Research Scholar, Center for International 
  Security and Cooperation, Stanford University..................     9

                                APPENDIX

Prepared Statements:

    Gerstell, Glenn S............................................    34
    Jankowicz, Nina..............................................    47
    Kirschbaum, Joseph W.........................................    80
    Langevin, Hon. James R.......................................    31
    Lin, Herbert.................................................    60

Documents Submitted for the Record:

    [There were no Documents submitted.]

Witness Responses to Questions Asked During the Hearing:

    [There were no Questions submitted during the hearing.]

Questions Submitted by Members Post Hearing:

    Mr. Moulton..................................................   107
    
    
                  TECHNOLOGY AND INFORMATION WARFARE:

      THE COMPETITION FOR INFLUENCE AND THE DEPARTMENT OF DEFENSE

                              ----------                              

                  House of Representatives,
                       Committee on Armed Services,
       Subcommittee on Cyber, Innovative Technologies, and 
                                       Information Systems,
                            Washington, DC, Friday, April 30, 2021.
    The subcommittee met, pursuant to call, at 3:04 p.m., via 
Webex, Hon. James R. Langevin (chairman of the subcommittee) 
presiding.

 OPENING STATEMENT OF HON. JAMES R. LANGEVIN, A REPRESENTATIVE 
FROM RHODE ISLAND, CHAIRMAN, SUBCOMMITTEE ON CYBER, INNOVATIVE 
             TECHNOLOGIES, AND INFORMATION SYSTEMS

    Mr. Langevin. Good afternoon, everyone. The subcommittee 
will come to order. First of all, just some housekeeping 
business that I need to take care of, since this is a remote 
hearing.
    I would like to welcome the members who are joining today's 
remote hearing, which, I believe, is just about everybody.
    Members who are joining must be visible onscreen for the 
purposes of identity verification, establishing and maintaining 
a quorum, participating in the proceeding, and voting. Those 
members must continue to use the software platform's video 
function while in attendance unless they experience 
connectivity issues or other technical problems that render 
them unable to participate on camera.
    If a member experiences technical difficulties, they should 
contact the committee staff for assistance.
    A video of members' participation will be broadcast via the 
television and internet feeds.
    Members participating remotely must seek recognition 
verbally, and they are asked to mute their microphones when 
they are not speaking.
    Members who are participating remotely are reminded to keep 
the software platform's video function on the entire time they 
attend the proceeding.
    Members may leave and rejoin the proceeding. If members 
depart for a short while for reasons other than joining a 
different proceeding, they should leave the video function on.
    If members will be absent for a significant period or 
depart to join a different proceeding, they should exit the 
software platform entirely, and then rejoin if they return.
    Members may use the software platform's chat feature to 
communicate with staff regarding technical or logistical 
support issues only.
    Finally, I have designated a committee staff member to, if 
necessary, mute unrecognized members' microphones to cancel any 
inadvertent background noise that may disrupt the proceeding.
    So with the technical announcements out of the way, I am 
just going to now give my opening statement.
    First of all, I want to say welcome to our hearing today on 
Technology and Information Warfare: The Competition for 
Influence and the Department of Defense. I want to thank 
Ranking Member Stefanik for joining me in holding the hearing 
today.
    I would also like to thank our witnesses for appearing 
today. To discuss technology-enabled information warfare as a 
national security threat, we welcome Mr. Glenn Gerstell, senior 
adviser at the Center for Strategic and International Studies, 
and Ms. Nina Jankowicz, disinformation fellow at the Wilson 
Center. And to provide insight on the Pentagon's information 
operation strategy and leadership, we are joined by Dr. Herb 
Lin, senior research scholar at Stanford University. And 
finally, Dr. Joseph ``Joe'' Kirschbaum, Director, Defense 
Capabilities and Management Team at the Government 
Accountability Office.
    First of all, I want to say, Dr. Kirschbaum, welcome back, 
and I want to thank you all for appearing today. It is an honor 
to have you here, and truly it is an esteemed panel.
    So, the United States is challenged in the information 
environment daily. Competitors like China, Russia, and violent 
extremist organizations use information warfare to achieve 
their objectives while operating below the threshold of armed 
conflict, as they seek to avoid traditional U.S. military 
advantages and undermine the free international order and 
democratic values.
    The recently released Annual Threat Assessment of the U.S. 
intelligence community makes clear that a variety of state and 
non-state actors weaponize information to undermine the United 
States by sowing discord among our citizens, influencing 
decision makers, and reversing what had once been a strength of 
our Nation--our historical information advantage.
    So, I often focus on what lies ahead in defense, but it is 
worth noting that the United States and the military are facing 
momentous challenges in the information environment right now, 
which can undermine the very fabric of our democracy.
    And what makes these threats particularly powerful is that 
foreign adversaries can target U.S. and allied citizens almost 
instantly without crossing physical boundaries or borders. 
These threats will only grow as artificial intelligence, 
machine learning, and other technology-enabled information 
operations exponentially increase the speed and the scope of 
the danger.
    So according to the National Security Commission on 
Artificial Intelligence, state adversaries are employing 
artificial intelligence-enabled disinformation attacks to sow 
division in democracies and disrupt the public's sense of 
reality.
    But how to confront these national security challenges is a 
difficult question. So I believe the Nation must respond 
forcefully to deter bad actors in the information domain, 
invest in robust U.S. public diplomacy, and educate the public 
and our service members about these dangers.
    We must also articulate a vision for the information 
environment and delineate thresholds of behavior that will 
trigger a response.
    So I was sort of encouraged when the National Security 
Commission on Artificial Intelligence recommended that the 
United States develop a new strategy to counter disinformation 
while investing in technology to counter artificial 
intelligence-enabled information warfare.
    And I am also looking forward to the insight our witnesses 
will provide on how to address these threats.
    Likewise, we will explore how the Department of Defense is 
organized to compete in the information environment, including 
cyber, electromagnetic spectrum, military information support 
operations, deception, and operational security.
    The military is challenged, in the information environment, 
by capable adversaries--make no mistake about it--and 
Department of Defense priorities must reflect this reality. The 
Pentagon has a critical role in protecting the Nation, our 
partners, and our allies from threats in the information 
environment, and in advancing our national interests in this 
sphere.
    Recognizing this, Congress and this committee have 
continuously pushed the Department to prioritize adapting to 
the weaponized information environment, including by creating 
the principal information operations adviser.
    Yet, I am concerned the Department leadership has been slow 
to adapt to the changing nature of warfare in this domain. To 
give an example, in 2020, 9 of the then 11 four-star combatant 
commanders wrote a memorandum asking for additional support for 
their information operations.
    They wrote, and I quote, ``We continue to miss 
opportunities to clarify truth, counter distortions, puncture 
false narratives, and influence events in time to make a 
difference,'' close quote.
    I couldn't agree more. Too often, it appears, the 
Department's information-related capabilities are stovepiped 
centers of excellence with varied management and leadership 
structures which makes critical coordination more difficult.
    Further, the Pentagon has made limited progress 
implementing the 2016 Operations in the Information Environment 
Strategy, which raises questions about the Department's 
information operations leadership structure.
    So with that, these are challenging questions without easy 
answers, I know that. But I hope my colleagues will take 
advantage of the impressive array of witnesses that we have 
before us to get a little clarity and a clear path forward 
after this hearing.
    So with that, I will now turn to Ranking Member Stefanik 
for her opening remarks. Elise, you are recognized.
    [The prepared statement of Mr. Langevin can be found in the 
Appendix on page 31.]

STATEMENT OF HON. ELISE M. STEFANIK, A REPRESENTATIVE FROM NEW 
    YORK, RANKING MEMBER, SUBCOMMITTEE ON CYBER, INNOVATIVE 
             TECHNOLOGIES, AND INFORMATION SYSTEMS

    Ms. Stefanik. Thank you, Chairman Langevin, and thank you 
to our witnesses for testifying today. Information warfare is 
one of the most complex and important missions undertaken by 
the Department of Defense, especially in the 21st century 
information age.
    From large-scale, conventional conflicts of the past to the 
modern-day, gray-zone conflicts of today, information 
operations have been critical to shaping the operating 
environment and weakening our adversaries' strategic position.
    Eroding the resilience of our target adversaries, while 
also winning the hearts and minds, remains the ultimate 
objective of information operations. As a former senior adviser 
to the Secretary of Defense, Robert Riley, said, quote, 
``Ultimate victory comes when the enemy speaks your language, 
and embraces your idea,'' end quote.
    Unfortunately, we know our adversaries are not embracing 
our ideas. Instead, China, Russia, Iran, and non-state actors 
alike, are weaponizing information to undermine the United 
States and our interests, employing asymmetric information 
capabilities, rather than engaging us in traditional military 
means.
    Therefore, we must be prepared to not just resist 
information operations and defend our interests, but also 
project our own capabilities to exploit and shape the 
information environment.
    Today's information and media ecosystem is significantly 
different than the past, with exponential advancements in 
technology allowing words and ideas to spread faster and wider 
than ever before.
    In the last decade, we have seen how a short video, photo, 
or social media post, can have a profound impact on the 
geopolitical landscape.
    Going forward, international competition, diplomacy, and 
military operations will be increasingly based on human-centric 
networks and patterns. Fortunately, our military and 
intelligence community recognize this, and both are adapting to 
this landscape and the information environment in which we live.
    Congress has given clear authorities to DOD [Department of 
Defense] to conduct information operations, and we expect the 
Department to use those authorities effectively. As such, we 
can no longer rely solely on our special operations forces 
to conduct these operations. This must be a comprehensive 
approach by the DOD, the services, and combatant commands, to 
ensure our messages are effective in achieving our objective to 
positively shape the operating environment.
    Two years ago, Congress required the Department to conduct 
a review of its information operation strategy. However, we are 
still awaiting this review and briefing.
    This subcommittee, in particular, with jurisdiction over 
cyber and artificial intelligence, is uniquely suited to 
support the Department's information operations. Yet without 
the proper review and information from DOD, it is difficult to 
appropriately support this priority.
    Congress has also created the position of the principal 
information operations adviser, so the Department would have a 
single person overseeing military information support 
operations, or MISO, efforts.
    Unfortunately, this position was layered below the Under 
Secretary of Defense for Policy, contrary to congressional 
intent. This position was not created as another bureaucratic 
layer, but as an agile single role with the mandate to guide 
each service's efforts.
    We must also act on the recommendations from the AI 
[artificial intelligence] commission and invest in technologies 
to combat AI-enabled information threats, as well as increase 
coordination with the State Department's Global Engagement 
Center to counter foreign propaganda targeted towards the 
United States.
    I look forward to hearing from our witnesses on how DOD can 
organize information operations to be more coherent, nimble, 
agile, and effective, and how the Department and the IC 
[intelligence community] can work together to enhance MISO 
efforts.
    Likewise, we must continue to discuss the critical 
defensive roles DOD can play to protect the information 
environment as our adversaries continue to wage a persistent 
information war on our interests abroad, and our citizens here 
at home.
    Thank you, Mr. Chairman, and I yield back.
    Mr. Langevin. Thank you, Ranking Member Stefanik.
    With that, I will now turn to our witnesses. We will now 
hear from Mr. Glenn Gerstell. Mr. Gerstell served as the 
National Security Agency general counsel from 2015 to 2020, and is 
now a senior adviser at the Center for Strategic and 
International Studies.
    Mr. Gerstell, you are now recognized to summarize your 
testimony for 5 minutes, and thank you for appearing today.

 STATEMENT OF GLENN S. GERSTELL, SENIOR ADVISER, INTERNATIONAL 
   SECURITY PROGRAM, CENTER FOR STRATEGIC AND INTERNATIONAL 
                            STUDIES

    Mr. Gerstell. Chairman Langevin, Ranking Member Stefanik, 
and members of the subcommittee, thank you for the opportunity 
to appear before you today along with such distinguished 
experts.
    Over the past few months, social media platforms have been 
awash in falsehoods on political topics ranging from election 
fraud, to the Capitol insurrection, to climate change and 
Antifa protestors.
    Even the seemingly non-partisan sphere of public health has 
been politicized and damaged by cyber falsehoods about the 
efficacy of face masks and vaccinations.
    As a former national security official and a lawyer 
concerned with our civil liberties, I would offer three 
observations relevant to the subcommittee's work.
    First, perhaps the most pernicious aspect of the digital 
revolution--disinformation, intentionally misleading, erroneous 
information--threatens our very democracy, leading to mistrust 
of institutions, cynicism about our leaders, and skepticism 
about our ability to solve social problems.
    Second, the problem of foreign disinformation is almost 
surely going to get worse, and will pose serious national 
security threats against which our military prowess will be 
largely ineffective.
    Third, while it may be difficult, there are indeed steps we 
can take to counter these threats.
    Returning to my first point, with three out of four 
Americans getting some or all of their news from social media 
platforms, disinformation could specifically affect our 
military in concerning ways.
    At the most basic level, the resulting cynicism, or lack of 
trust in our military, as was revealed in the recent Reagan 
Institute survey, might well erode the national consensus 
underpinning congressional appropriations for weapons systems 
or veterans affairs, and more directly, recruiting for our all-
volunteer military forces.
    Broader threats to our military arise from our foreign 
adversaries' use of disinformation as a tool of their 
statecraft. For example, China's concerted online campaign to 
deflect investigations into the cause of the COVID-19 outbreak, 
to paint themselves as successful in curtailing the virus when 
Western democracies have been floundering, and to deny their 
militarization of the South China Sea, all complicate, if not 
undermine, our foreign relations and heighten the chance for 
conflict.
    The second point is that foreign cyber-propelled 
disinformation is likely to get much worse, to the extent that 
we would have difficulty in fending off weaponized 
disinformation coming from a sophisticated foe.
    Indeed, the recent final report of the National Security 
Commission on Artificial Intelligence cited a, quote, 
``gathering storm of foreign influence and interference,'' and 
asserted that our foreign foes will use artificial intelligence 
systems to enhance their disinformation campaigns, including by 
creating undetectable, deep-fake videos and audio recordings.
    The resulting skepticism, treating official and counterfeit 
news sources equally, would yield a chaotic and unreliable 
reality in which truth and genuine information are elusive.
    The seemingly inexorable trajectory of ever-worsening 
foreign cyber attacks from Russia, China, Iran, and North 
Korea, shows us what online disinformation will look like from 
those adversaries.
    The same factors that shield them in cyber malevolence, the 
uncertainty of provable attribution, and the absence of 
directly caused actual injury or physical damage, will also 
work even more effectively to insulate them as they inevitably 
step up their disinformation campaigns.
    What if next time Russia or Iran seizes on a natural 
disaster, say, a hurricane or flood, and weaponizes the crisis 
with false information online about the hurricane's path or 
expected river crestings, or even wrong instructions about 
escape routes?
    We don't need to wait for such a crisis or disaster to act. 
The very fact that there are many sources contributing to 
disinformation means that we have multiple ways to stem it.
    I would be happy to respond to your questions about 
specific solutions, but I will concede that responding to the 
challenges of disinformation will not be easy, since it will 
require making difficult and controversial decisions about the 
responsibility of the private sector for our national well-
being, and about restrictions on speech.
    But it isn't impossible, and Congress, in concert with the 
private sector, should lead the way. Our national well-being 
depends on nothing less. Thank you for the opportunity to 
present my views to the subcommittee.
    [The prepared statement of Mr. Gerstell can be found in the 
Appendix on page 34.]
    Mr. Langevin. Thank you very much, Mr. Gerstell. Thank you 
for your testimony, and we appreciate having you here.
    We will now receive testimony from Ms. Nina Jankowicz. Ms. 
Jankowicz is a disinformation fellow at the Wilson Center, and 
is the author of ``How to Lose 
the Information War: Russia, Fake News, and the Future of 
Conflict.''
    Ms. Jankowicz, thank you for being here. You are now 
recognized to summarize your testimony for 5 minutes.

  STATEMENT OF NINA JANKOWICZ, DISINFORMATION FELLOW, WILSON 
                             CENTER

    Ms. Jankowicz. Thank you Chairman Langevin, Ranking Member 
Stefanik, distinguished members of the subcommittee, it is an 
honor to testify before you today.
    I am the daughter of a veteran. My father, an aerial 
reconnaissance officer in Vietnam, died in 2010 from 
complications from multiple myeloma which he contracted as a 
result of his exposure to Agent Orange during his service. I 
know he would be thrilled to see me testifying before you today 
in the service of truth.
    I spent my career on the front lines of the information 
war. We all now seem to recognize that the threat exists, but 
as I told your colleagues on the Appropriations Committee in 
2019, the United States has been a tardy, timid, or tertiary 
player, stymied by domestic politicization.
    Unfortunately, nearly 2 years later, we are in the same 
place. So it bears repeating. Disinformation is not a partisan 
issue. As we witnessed throughout the COVID-19 pandemic, and on 
January 6th, it affects public health, safety, and our 
democratic process. It is crucial that Congress understand 
this. Otherwise, we remain vulnerable.
    How did we get here? In part, we haven't understood the 
scope of the problem. The U.S. thinks of disinformation as a 
string of one-off occurrences that warrant attention only in 
the moment. We haven't created a comprehensive, long-term 
defense plan, and there is too little recognition of the need 
to shore up domestic vulnerabilities.
    Russia, China, and other authoritarian states know how to 
exploit this. They take advantage of American inaction, 
engaging in perpetual information competition, which has three 
characteristics.
    First, adversaries understand information competition is 
the new normal, and they are constantly probing for societal 
fissures to exploit. We have seen this with conspiracy theories 
about the origins of COVID-19 and the efficacies of Western 
vaccines. And Russia, of course, has an ongoing campaign to 
exacerbate racial tensions in the U.S.
    Second, they use all channels available--government and 
nongovernment, online and offline. China, for example, uses a 
wide range of state bodies, not just traditional national 
security bodies, to influence Western opinions about protests 
in Hong Kong, and more recently, to paint a positive picture of 
life in Xinjiang.
    Third and finally, they use perpetual information 
competition to target alliances and international 
organizations. For instance, Russia waged a campaign to prevent 
Ukraine from signing an association agreement with the European 
Union in 2016.
    In short, hostile state information operations increase 
domestic tension, and decrease American resilience. To meet the 
challenge of perpetual information competition, the Department 
of Defense should organize itself around a posture of enduring 
information vigilance, a concept I developed with my colleague 
in the U.K. Cabinet Office, Henry Collis.
    It is composed of the three Cs. The first is capability. We 
should remember the old military adage: Don't operate the 
equipment, equip the operator. The DOD workforce should be able 
to proactively monitor and identify informational 
vulnerabilities.
    Section 589E of the 2021 NDAA [National Defense 
Authorization Act], which trains Active Duty personnel, their 
families, and civilian DOD employees in detecting information 
operations, is an excellent starting point. Such a training 
program could also be rolled out to all civil servants across 
the Federal Government.
    The second C is interagency coordination. DOD and the wider 
USG [United States Government] must break out of our siloed 
national security thinking. To remedy this, the National 
Security Commission on AI recommends the creation of a joint 
interagency task force to coordinate intelligence and 
information-sharing around IO [information operations].
    I agree that the Federal Government requires a central node 
for monitoring disinformation and coordinating policy, ideally 
in the White House, but my research across Europe suggests we 
also need the involvement of nontraditional security 
departments.
    In the long term, the key to combating disinformation lies 
with departments focusing on education, arts, and health, at 
Federal and local levels, as well as building a thriving, 
pluralistic media environment and teaching civics.
    The third C is international cooperation. This includes 
better sharing of information to identify threats and 
formulation of effective responses with allies.
    Toward this goal, the NSCAI [National Security Commission 
on Artificial Intelligence] suggests an international task 
force, led by the Global Engagement Center [GEC] at the State 
Department. However, the GEC's remit is too large, its 
budget too small, and its reputation within the interagency and 
international communities too uncertain to add such a task to 
its portfolio.
    It currently produces open-source intelligence analysis, in 
addition to its coordination, policymaking, and analytic roles. 
And I recommend that intelligence-gathering rest with 
analytic bodies, not policy bodies.
    The GEC's limited resources are better allocated to 
coordinating with embassies and other agencies and to establishing 
and implementing policy and program priorities.
    Finally, while the idea of a task force for international 
coordination is a noble one, the U.S. must recognize that we 
are arriving late to this party. We should augment efforts that 
are already underway by close allies such as the U.K.'s 
international partnership for countering state-sponsored 
disinformation, and the G7 Rapid Response Mechanism.
    Enduring information vigilance cannot be built overnight. 
It requires a long-term commitment that will likely outlast the 
current political class, but the result will be a more 
resilient society.
    The United States must act not only as the staunchest 
defender and guarantor of democratic values among our allies 
abroad, but actively lead by example, underlining that 
disinformation knows no political party, and that America is 
committed to reversing the normalization of disinformation in 
our own political discourse.
    Once again, thank you for this opportunity, and I look 
forward to your questions.
    [The prepared statement of Ms. Jankowicz can be found in 
the Appendix on page 47.]
    Mr. Langevin. Very good. Thank you, Ms. Jankowicz.
    We will now receive testimony from Dr. Herb Lin. Dr. Lin 
studies cyber policy, information warfare and influence 
operations, and is a senior research scholar at Stanford 
University. He is the author of ``Bytes, Bombs, and Spies.''
    Dr. Lin, you are now recognized to summarize your testimony 
for 5 minutes.

 STATEMENT OF HERBERT LIN, SENIOR RESEARCH SCHOLAR, CENTER FOR 
  INTERNATIONAL SECURITY AND COOPERATION, STANFORD UNIVERSITY

    Dr. Lin. Thank you, Chairman Langevin, Ranking Minority 
Member Stefanik, and distinguished members. Thank you for 
inviting me to testify today. I am speaking for myself today, 
and not on behalf of any institution.
    The general thrust of my remarks is that the Department of 
Defense is poorly structured and equipped to cope with the 
information warfare threat facing the U.S. as a whole. However, 
the DOD can make a meaningful contribution in addressing part 
of the problem.
    We usually believe in a clear distinction between peace and 
war. Today, we are not in a shooting war with Russia or China, 
but we are not at peace either. Our adversaries prosecute the 
state of ``not peace'' in many ways, including cyber-enabled 
information warfare.
    Such warfare presents several new challenges. First, the 
Constitution is the foundation of U.S. Government. Deeply 
embedded into the Constitution is the concept of a marketplace 
of ideas. Here ideas publicly compete with each other, and 
truth emerges from public debate of ideas.
    But this concept emerged at a time when information was 
hard to obtain. Today the internet and social media have 
brought a deluge of information so great that no one can 
possibly access or process all of the information needed to 
evaluate any given idea.
    The second challenge is that the information marketplace 
presumes that people process information rationally, 
thoughtfully, and deliberately. However, psychological science 
has demonstrated that people often do not do so. Instead, they 
often make fast, intuitive judgements based on how they feel 
from their gut, even though everyone is, in fact, capable of 
thoughtful deliberation.
    Such judgements--fast intuitive judgements from the gut--
are usually adequate for the kinds of personal decisions found 
in everyday life, but they are inadequate when the consequences 
for error are high.
    Moreover, many of our tech companies have learned that 
supplying content that plays to our worst habits of nonrational 
thought is the way to increase user engagement which, in turn, 
increases their profitability.
    Third, the boundaries between foreign and domestic sources 
of information chaos are blurring. Russians and Americans may 
not be working side by side to sow disorder, mistrust, and 
polarization in the United States, but the scope, nature, and 
effect of their activities, even if separately conducted, are 
largely indistinguishable.
    That means, any effective effort against Russian activities 
will inevitably have collateral effects against American 
activities that are similarly oriented.
    In sum, the information warfare threat to the United States 
is different from past threats, and has the potential to 
destroy reason and reality as the basis for societal discourse, 
replacing them with rage and fantasy.
    Perpetual civil war, political extremism waged through the 
information sphere and egged on by our adversaries is every bit 
as much of an existential threat for American civilization and 
democracy as any military threat imaginable.
    Why can't DOD defend effectively against the information 
warfare threat? Fundamentally, it is because the information 
warfare threat requires a whole-of-society response, and DOD 
cannot, and is not in a position to, orchestrate such a 
response.
    More specifically, DOD policy directives prohibit 
information operations directed at U.S. audiences, regardless 
of the intent underlying them, and that includes activities 
intended to protect U.S. audiences against foreign information 
warfare operations.
    But there are also cultural constraints. DOD culture is 
oriented towards defense against physical threats--planes, 
missiles, and the like. But DOD was never designed to defend 
against nonphysical threats. Joint doctrine does not even 
acknowledge the possibility that the U.S. Armed Forces could be 
the target of adversary psychological operations.
    Nevertheless, despite existing policy and culture, DOD is 
well-positioned to assess the information warfare threat for at 
least one segment of the U.S. Government, namely the Armed 
Forces and their families.
    Every member of the U.S. military swears an oath to support 
and defend the Constitution of the United States against all 
enemies, foreign and domestic, but the vast majority receive no 
education, no instruction, on what these words mean.
    The fiscal year 2021 Defense Authorization Act called 
attention to the need to protect U.S. military personnel and 
their families from foreign malign influence and disinformation 
campaigns--that was the previously mentioned section 589E--and 
both Secretary Austin and the Congress have expressed concerns 
about extremism in the U.S. military, which is facilitated by 
exposure to foreign disinformation campaigns.
    These points suggest the need for DOD to provide 
substantial in-house training for military personnel on the 
meaning of their oaths and on civics education as a 
prerequisite foundation for such training.
    That concludes the oral portion of my testimony. Thank you 
for the opportunity. I am happy to answer any questions.
    [The prepared statement of Dr. Lin can be found in the 
Appendix on page 60.]
    Mr. Langevin. Thank you very much, Dr. Lin. Appreciate you 
being here as well.
    We will now receive testimony from Dr. Joe Kirschbaum.
    Dr. Kirschbaum, welcome back, and thank you and your team 
for all the recent support. Dr. Kirschbaum is the Director of 
the Government Accountability Office Defense Capabilities and 
Management Team. Dr. Kirschbaum, you are now recognized to 
summarize your testimony for 5 minutes.

     STATEMENT OF JOSEPH W. KIRSCHBAUM, DIRECTOR, DEFENSE 
  CAPABILITIES AND MANAGEMENT TEAM, GOVERNMENT ACCOUNTABILITY 
                             OFFICE

    Dr. Kirschbaum. Chairman Langevin, Ranking Member Stefanik, 
and members of the subcommittee, I am pleased to be here today 
to discuss the vital role of the Department of Defense's 
operations in the information environment.
    Throughout history, militaries and states have sought 
advantage through actions intended to affect the perception and 
behavior of adversaries. As we have noted today, our 
adversaries, particularly China and Russia, are taking 
advantage of emerging information technology to offset the 
United States' conventional warfighting advantages.
    Although we are focused on the Department of Defense, to 
reiterate, information operations as a whole, as an element of 
U.S. national power, are necessarily part of a whole-of-
government and whole-of-society effort.
    My testimony today describes the Department of Defense's 
information operations concepts, and DOD's actions to implement 
the 2016 strategy and address information operations 
challenges. This statement is based on reports we issued in 
late 2019 and our assessment of defense information-related 
documents.
    The terms for information operations--doctrinal terms--are 
many and varied. DOD has defined some, but inconsistency and 
potential confusion remain. Among the things the Department is 
actually working on right now is a more consistent set of 
information operations-related terms.
    To achieve greater effects in the information environment, 
combatant commanders can plan and execute operations that 
combine multiple information-related capabilities.
    Such capabilities include military information support 
operations, what was traditionally known as psychological 
warfare; military deception; cyberspace operations; 
electromagnetic warfare; operations security; and special 
technical operations.
    There are, however, many other related capabilities, such 
as public affairs, civil-military operations, and intelligence 
capabilities.
    A good example of an information operation is the effort by 
the Allies in 1944 to convince the Germans that the attack on 
occupied Western Europe would come at a place other than the 
actual target of Normandy.
    Operation Fortitude involved a number of what we would now 
call information-related capabilities. These included creation 
of fictitious military units, with all the requisite paperwork, 
associated radio transmissions and traffic, and assigning a 
real U.S. Army General--in this case, George S. Patton--to 
command those units.
    It also involved the creation of mock aircraft and landing 
craft located in southeast England, and many other intelligence 
and military deception techniques.
    While this is on a grand scale, defense planners today can 
do the same kinds of things to integrate more than one 
information-related capability to achieve desired end states.
    DOD's 2016 Strategy for Operations in the Information 
Environment was intended to significantly enhance its ability 
to conduct information operations today. However, the 
Department did not fully implement that strategy, leaving 
approximately 80 percent of the enumerated tasks incomplete.
    Among the largest omissions was the absence of an 
implementation plan, or an investment framework. The Department 
instead shifted focus to develop a joint concept of operations 
and a capabilities-based assessment, both worthy efforts. It 
then started to develop a new strategy, which remains in 
development.
    We also found gaps in DOD's leadership, oversight, and 
management. The Department assigned most responsibilities to 
the Under Secretary of Defense for Policy. However, delegating 
many of those responsibilities down to a lower level and 
failing to formalize authorities exacerbated the dispersal of 
leadership and focus.
    As you pointed out, Mr. Chairman, congressional direction 
has prompted movement in the Department--in fact, most of the 
movement. Examples include the new information operations 
cross-functional team, which may mitigate some of the problems 
we identified, and designation of the Under Secretary of Defense 
for Policy as the principal information operations adviser, 
reporting directly to the Secretary of Defense.
    Ultimately, however, the leadership the principal adviser 
exercises, and the support the Department gives them in 
implementing Department-wide strategy and vision, will be 
critical.
    DOD has integrated information-related capabilities in some 
military operations but has not addressed key planning, 
coordination, and operational challenges. This is important for 
ensuring that DOD integrates the information dimension into 
routine operational planning.
    DOD resisted our recommendation to conduct a comprehensive 
posture review in order to assess challenges. However, once 
again, Congress subsequently required the Secretary of Defense 
to conduct such a posture review.
    DOD told us they have taken initial steps to conduct this 
review, but did not provide an estimated completion date.
    In summary, there are opportunities for improved DOD 
leadership, recognition of information as a joint function, and 
better preparing the military to conduct information operations 
and counter our adversaries.
    I look forward to continuing to work with this committee, 
and the Department, to help it address these challenges and 
make the most of these opportunities.
    Chairman Langevin, Ranking Member Stefanik, and members of 
the subcommittee, this completes my prepared statement, and I 
am happy to respond to any questions.
    [The prepared statement of Dr. Kirschbaum can be found in 
the Appendix on page 80.]
    Mr. Langevin. Very good. Thank you, Dr. Kirschbaum, and I 
want to thank all of our witnesses for your testimony today. 
You do a great service to the subcommittee and to the committee 
writ large, by appearing today and giving us your 
perspective.
    Dr. Kirschbaum, let me start with you. So Congress has 
consistently encouraged the Pentagon to focus on these issues, 
including requiring the DOD to create a principal information 
operations adviser. Has the Pentagon sufficiently elevated 
dedicated information operations leadership?
    Dr. Kirschbaum. Mr. Chairman, I would say yes and no. So, 
in brief, what has happened is a diffusion of leadership; for 
example, most of the responsibilities for information 
operations were delegated down to the level of the Deputy 
Assistant Secretary for Special Operations and Combating 
Terrorism.
    As that title indicates, that is a lot to work on, and so, 
incorporating information operations into that very small staff 
has generated issues. While very capable, they are not at the 
right level, in a lot of cases, to achieve some of the results 
because of that lack of leadership.
    Now, the Department has gone back and identified the Under 
Secretary of Defense for Policy as the principal information 
operations adviser in the hopes that keeping it at that level 
will elevate importance.
    And the comparison, of course, is made to the situation 
with the principal cyber adviser. There are some differences 
that we are a little concerned about, seeing how the Department 
carries through with that.
    For example, the principal cyber adviser had a deputy who 
could leverage a deputy assistant secretary who was focused 
solely on cyber operations. The Under Secretary of Defense for 
Policy, as you appreciate, is doing just a few things. So, 
focusing on information operations will be important to see 
what level of resources, what level of attention it gets, 
assuming it is at that right level, assuming they are able to 
assign a deputy with the right focus, and, then, follow through 
with the right structural, procedural impetus in order to make 
sure momentum continues.
    Mr. Langevin. Thank you. Thank you for that answer. Mr. 
Gerstell, can you further explore why foreign-enabled malign 
influence and disinformation are a national security threat? 
And how will emerging technologies, like artificial 
intelligence, increase this threat?
    Mr. Gerstell. Thank you, Mr. Chairman. So, I think we have 
rich evidence of the fact that foreign-inspired disinformation 
is a real national security threat. The 2016 elections were 
certainly a good example of that; as you know, the Senate 
Intelligence Committee issued a five-volume bipartisan report 
finding that Russia actively intervened in our elections in an 
effort to influence them in 2016.
    It is hard to say for sure exactly what the result would 
be, but anybody would think that tampering with our democratic 
process must--must--by definition, be a national security 
issue.
    We have certainly seen foreign disinformation from China 
and Russia--which just this week, once again, was touting the 
virtues of their Sputnik vaccine and degrading the virtues and 
qualities of the American Pfizer and other COVID vaccines--
disinformation that is clearly going to hurt our public health 
and the ability of Americans to get vaccinated. Again, another 
effect on national security.
    If we want a very specific example, just quickly, back in 
last September, when there were terrible wildfires in Oregon in 
the Northwest, Russia jumped on a couple of misleading and 
false statements that were set forth in some QAnon accounts and 
really weaponized them. They, in a concerted, coherent way, 
amplified them and turned them into a detailed, rich story of 
falsehoods about who started the wildfires, claiming that 
Antifa protesters were doing it.
    It reached a point, because of what Russia was doing, that 
civilians actually set up roadblocks in Oregon, in an effort to 
stop these perceived but erroneous protesters who, of course, 
weren't there. It actually hurt people who were trying to flee 
the fire, so much so, that the Douglas County Sheriff and the 
FBI [Federal Bureau of Investigation] pleaded with the public 
to stop circulating these falsehoods.
    So we have seen how foreigners can take an existing 
division and create national security problems here on our 
soil. It stands to reason, following your other question, Mr. 
Chairman, that using technology--artificial intelligence--to 
micro-target viewers and listeners will only exacerbate the 
problem. So that is why I said, I believe the problem has the 
potential for getting worse before it gets better.
    Mr. Langevin. And from your vantage point, what can the 
United States do to protect itself from both a technological 
and policy standpoint?
    Mr. Gerstell. I think there are a wide range of tools. As I 
said in my earlier comment, and I know the other panelists 
agree with me here, disinformation has many causes. So the fact 
that it has many causes means that we also have many ways of 
treating it, to use a--sort of a medical analogy. This is a 
chronic condition, a complex chronic condition. So it is not a 
disease that will be cured by one miracle drug.
    So, I think we have a rich opportunity to use a range of 
legal tools at our disposal, perhaps by tightening up section 
230 of the Communications Decency Act, perhaps by either 
causing the industry to self-regulate or by regulating social 
media platforms to limit the virality of falsehoods, checking 
them before they spread too widely.
    We can take steps in our society to increase, as others 
have said, digital literacy, civic education, so that people 
will have a better understanding and will be better able to 
assess falsehoods.
    I think the most important thing--and I am echoing what Ms. 
Jankowicz just said, and you, Mr. Chairman, also--is, we need 
an integrated approach to this. Russia and China use an 
integrated, whole-of-government approach, including their private 
sector, to create these disinformation campaigns.
    There is an asymmetry. We don't. We need to do that, and 
that will be the key to success in this area.
    Mr. Langevin. Very insightful, well said, and I couldn't 
agree more. Thank you.
    My time has expired. I am going to now turn to Ranking 
Member Stefanik for her questions.
    Ms. Stefanik. Thank you.
    My question is for Dr. Lin. In the past, the special 
operations community and service members in the field of PSYOPs 
[psychological operations] and civil affairs had the most 
experience with information operations. It is going to be very 
important that the Department scale these skills to a wider 
force. How do we do that, and specifically how do we equip our 
cyber forces with the skills to conduct effective information 
operations?
    Mr. Moore. Mr. Lin, you are on mute still.
    Dr. Lin. All right. Thank you. Ranking Minority Member 
Stefanik, thank you for asking the question. I hate technology.
    How do we get the cyber forces to be better able to address 
the influence operations side of the house? That is a 
question--I addressed that in the paper that I submitted for 
the record, on dysfunction in the DOD about doctrine and so on.
    The short answer is that I believe that there needs to be a 
joint--something that is joint and standing, some effort, some 
entity, that pulls the cyber people and the PSYOPs people 
together, as equals.
    Cyber Command has the expertise in the information delivery 
side of the house. The PSYOPs people, the MISO people, have the 
responsibility of understanding content, and those two have to 
be put together.
    For me, trying to grow psychological expertise out of what 
are fundamentally a bunch of technical hackers, as good as they 
are, that is not their skill set. Their skill set is flipping 
bits, and so on.
    I speak as a former bit-flipper myself, and getting the 
psychological insights from others who are much more expert in 
that, I think, is the way to go.
    So there has to be a standing team, and the standing part 
is really important, because it recognizes the fact that this 
is an ongoing problem, not one of a specific campaign here or 
there.
    Ms. Stefanik. Yield back.
    Dr. Lin. I hope that answers your question.
    Ms. Stefanik. It did. Thank you. Yield back.
    Mr. Langevin. Thank you very much, Ranking Member Stefanik.
    Mr. Keating is now recognized for 5 minutes. Is Mr. Keating 
still with us? If so, you might be on mute.
    Okay. If Mr. Keating is not there, in the tradition of 
going Democrat, Republican, I will just go down the list to Mr. 
Morelle.
    Mr. Morelle. Thank you very much, Mr. Chairman. This is 
really a fascinating subject. And I am new to the committee and 
the subcommittee, so I am not entirely familiar with DOD's 
actions. But having listened now, I hear that there are calls 
for more coordination, more information-sharing, and greater 
intentionality of focus, but I am still struggling, just as a 
layperson, to see how what you have offered as recommendations 
would actually stop the disinformation 
from seeping in. Given that we have an open and democratic 
society, given that we have social media, how do we actually 
stop this, other than--well, I am just sort of curious.
    What are the tactics and the strategies we use to prevent 
this from really undermining society here in the United States 
and really creating more divisiveness?
    Ms. Jankowicz. I am happy to jump in there. Thank you, 
Congressman, for that question. You are absolutely right. There 
is not very much that we can do to instantaneously correct this 
problem. Right now, and for the past 4 or 5 years, we have been 
playing what I call ``whack a troll,'' where we want to just 
focus on offensive content, harmful content, but really we need 
a much more systematic and, in fact, endemic solution.
    And our adversaries--Russia, China, Iran--have been playing 
the long game; they are playing a generational game. They are 
not necessarily interested in getting it right every time, but 
they know that if they can chip away at the surface, eventually 
they are going to get to the core of the polarization that they 
are seeking, and keep us distracted so that they can do 
whatever it is that they are looking to do in their near 
abroads and domestically, with regard to human rights, et cetera, 
as well as achieve political goals.
    So that is why, in addition to focusing a little bit on 
content moderation, which is the topic du jour, and in 
addition to making sure that our government bodies are putting 
out authoritative information that is trusted by the 
public, we really need to start investing in what I 
call citizens-based responses.
    So all of the countries that I have studied in Central and 
Eastern Europe that have been dealing with Russian 
disinformation for much longer than we even recognized it 
existed, have all, of course, looked at the kinetic side of 
things. They have good cyber defenses, but they also invest in 
their people.
    And I know that is outside the remit of this subcommittee, but 
it just speaks to what Mr. Gerstell, Mr. Lin, and Dr. 
Kirschbaum have all touched on, that we need a whole-of-society 
response, and we really need to get out of this siloed national 
security thinking, invest in libraries, invest in public media, 
so that people have trustworthy sources of information to go 
to, and invest in awareness and civics, so that folks 
understand their role in the democratic process, because 
ultimately, that is what disinformation is trying to 
undermine--people's participation.
    Mr. Morelle. Look, yeah, I appreciate that, and I certainly 
don't want to be argumentative. I recently read Anne 
Applebaum's ``Twilight of Democracy,'' which is a frightening 
volume along similar lines. But what 
troubles me is, I can certainly envision foreign adversaries 
starting to spread, through social media and otherwise, 
arguments that a Presidential election, for instance, was 
stolen from the American public, and despite a lot of 
investigation, no evidence ever emerges that such a thing 
happened.
    And yet, you can imagine potentially a third of the 
American public believing that regardless, and that really gets 
at the foundations of American democracy. I think I would like 
to believe that that wasn't possible, but frankly, I feel like 
I just lived through this nightmare.
    And, so, I appreciate what you are saying, and I don't 
disagree with you, I am just really, really concerned that 
there may not be an answer. And I don't know that it is the 
Department of Defense's job. I don't even know how they would 
begin to do this, but having listened to all three of you, I 
just struggle with, like, okay, so what, if anything, can we do 
here?
    And I apologize, I am using up a lot of time, but if the 
other two witnesses want to respond, I would love to hear your 
thoughts as well.
    Dr. Lin. I would say, starting with education of the Armed 
Forces is a big step forward. Getting the people whose job it 
is to protect us and defend the Constitution, teaching them 
what it means to do that, getting them some real education, 
that is a meaningful step forward----
    Mr. Morelle. I am not sure--I mean, I don't mean to 
disagree with you. I think that is a great suggestion. We 
couldn't even get Members of the House of Representatives to 
defend the Constitution this past November against a suggestion 
that an election was stolen with no evidence that that is the 
case. I am not sure--if we can't get the Congress to do it, I 
don't know how we would get members of the United States 
military to do it. But again, I don't mean to be argumentative. 
I am just frustrated, and I think probably all of you are with 
where we find ourselves.
    Mr. Gerstell. Congressman Morelle, if I may add to that----
    Mr. Morelle. Sure.
    Mr. Gerstell [continuing]. I certainly share your 
frustration. I suspect probably everyone on both sides of the 
witness table, metaphorically speaking, feels that. 
But the Supreme Court has been very clear that Americans have a 
First Amendment right to receive foreign disinformation, no 
matter how outrageous it is.
    Some philosophers talk about the paradox of tolerance, 
which is that a society that is very tolerant and open to lots 
of views, also potentially has the seeds of its own 
destruction, of course, because someone could criticize the 
very society. So you are right.
    I think the best analogy, just very quickly, is the 
cybersecurity one, which is, I think cybersecurity experts will 
tell you that at the end of the day, we are probably never 
going to be able to completely eliminate cybersecurity attacks 
from a sophisticated foreign adversary.
    Instead--and we should certainly work on that, but instead, 
what we need to do is limit their effectiveness and their 
scope. And I think it is the same thing with disinformation. We 
are not going to stop it where it starts, overseas, but we can 
limit its effectiveness on our soil.
    Mr. Morelle. I have well exceeded my time, Mr. Chairman. 
Thank you for your indulgence, and I am glad you gave the 
gentleman an opportunity to answer. I yield back.
    Mr. Langevin. Sure. Thank you, Mr. Morelle.
    Now I would like to recognize Mr. Moore for 5 minutes.
    Mr. Moore. Thank you, Chairman and Ranking Member. I want 
to echo a sentiment that was given a few minutes ago: we cannot 
keep this only within the Department of Defense. Cyberspace, 
this threat, is in every aspect of our lives, from banking to 
entertainment--I mean, across the board. So just to emphasize 
the importance of this. And when we do think about our defense-
related work, our legacy platforms, our legacy weapons 
platforms, they still serve as a valuable deterrent.
    But electronic warfare and cyber operations are central to 
the future fight. I will keep my questions geared towards that, 
and making sure we can be thinking about the future. And, so, I 
will start with a question to Mr. Gerstell.
    We have heard in this committee that the artificial 
intelligence capabilities of our adversaries are rapidly 
progressing to the point where they can only be combated with 
our own AI technologies. Can you just give us some perspective? 
Is the United States winning this AI arms race? If not, what 
steps need to be taken to increase our competitiveness?
    Mr. Gerstell. Sure. Thank you very much, Congressman. I 
think the best answer I could give would be to point to 
something that has already been alluded to, which is the final 
report of the National Security Commission on Artificial 
Intelligence, which has a rich series of recommendations for 
our Nation to invest in, ranging from educating our workforce, 
to stepping up government investment, to working with the 
private sector to increase AI, and also, ultimately, a series 
of laws and recommendations on the use of AI for beneficial 
purposes and on limiting its misuse.
    So we are in an arms race, so to speak, principally with 
China, in the area of artificial intelligence. They are busy 
amassing data, including data on Americans, that could be very 
significant when coupled with artificial intelligence and 
machine learning, and used against us in nefarious ways.
    So, we have our work cut out for us. I think there is a 
large series of the Commission's recommendations that I would 
endorse, and adopting them would be a very, very important step 
for us down that road.
    Mr. Moore. Excellent. Thank you.
    On that same topic, Dr. Lin, the Chinese and Russian 
militaries are structured to integrate information-related 
capabilities, and they lack any genuine oversight, I will say. 
How can the DOD refine its current management structure to 
improve synchronization of information capabilities, while 
maintaining the merits of civilian control of the military? We, 
as a Nation, will always, you know, have proper oversight to 
the extent possible, knowing that we don't always get to fight 
against nations that value that as much as we do. But are there 
improvements that we can make to level the playing field?
    Dr. Lin. Well, certainly one of the things that I have 
thought about is, for example, the distinction that this 
committee is very well aware of, the distinction between title 
10 and title 50 authorities.
    A large part of this game is done in the intelligence 
world, sort of in the covert-action world. These operations are 
often covert, and it is an interesting question how and to what 
extent coordination between title 10 and title 50 authorities 
can be improved. I have heard people say that we need a title 
60, you know, as a combination of the two, to better 
coordinate.
    It is very hard, as long as we are very concerned about 
authorities, to achieve the kind of coordination that you are 
talking about. Neither the Chinese nor the Russians are really 
worrying very much about who has the authority to do 
[inaudible]. It is hard to imagine [inaudible] whether 
something happens because one branch does it or another branch 
does it, but we care a lot about that.
    Mr. Moore. Okay. Excellent. Thank you. For a final 
question, Ms. Jankowicz, first off, I was touched by your 
comments on your dad, and I am sorry to hear that, but I am 
sure he is proud of you.
    Is there anything you wanted to highlight in this forum, 
just on some of the things that we are doing right? And in 
meetings that I have had recently with some of the cyber 
companies in my neck of the woods out in Utah, small businesses 
and smaller operations are being more nimble. Is there an 
opportunity to leverage those types of more--I guess I will 
just reuse the term nimble--organizations to help fight this 
battle going forward?
    Ms. Jankowicz. Yeah, absolutely. Thank you, Congressman, 
and thanks for your comments about my dad.
    I mean, I think, finally, the fact that we are recognizing 
this problem, that these hearings are happening more frequently 
is a good thing. And the fact that this is a bipartisan showing 
here in this committee warms my heart frankly, and the 
leadership that you all show is really important in setting an 
example for your constituents, for the media, for everyone. So 
kudos on that.
    I do really think we need a central node in the Federal 
Government, not only to work on the intelligence issues, which 
we heard from ODNI [Office of the Director of National 
Intelligence] is going to be happening soon within ODNI, but we 
need somebody to be setting policy, and I think that is where 
DOD and the GEC [Global Engagement Center], and other bodies in 
DHS [Department of 
Homeland Security], like CISA [Cybersecurity and Infrastructure 
Security Agency], for instance, are kind of operating all in 
their own spheres. So I would like to see a lot more 
coordination.
    And on a local level, I think you are absolutely right. We 
need to really create and invest in more robust public-private 
partnership in this area, not just with the Big Tech firms, but 
with local businesses and with civil society organizations.
    You know, the most successful programs to counter 
disinformation that I have seen around the world have been ones 
that invest in those local connections, with local media, local 
civil society groups, local libraries, even local influencers 
and performers who can go there and deliver an authoritative 
message to folks that they are neighbors with, without, you 
know, the baggage of it coming from the Federal Government.
    So I think we need to think a lot more creatively, a lot 
more out of the box, and business, local business, is a great 
place to start with that.
    Mr. Moore. Thanks for the thoughtful comments, and I yield 
back. Thanks for that.
    Mr. Langevin. Thank you.
    Mr. Larsen is now recognized for 5 minutes.
    Mr. Larsen. Yeah, thanks, Mr. Chair. I appreciate it. 
Greetings from the Pacific Northwest, where you will not be 
surprised to know it is raining today. So thanks for the chance 
to say hello.
    My first question is for Dr. Kirschbaum. I usually embrace 
everything the GAO [Government Accountability Office] says--I 
want to preface my comments that way--but I do want you to 
explain a little bit more on the recommendation, really the 
criticism, about the delegation that U.S. defense policy makes 
of MISO operations, in particular, to special operations 
forces. I think your characterization that special operations 
forces focus, quote, ``only on special operations and 
counterterrorism,'' which might have been accurate 10 years 
ago, is inaccurate today. In fact, there is a bit of a debate 
going on about the role special operations needs to play in 
great power competition, which, in part, includes information 
operations, but specific to special operations.
    So can you talk a little bit about how you approach that 
particular question, and then relate that to a broader comment 
about how the Pentagon is organized? And could you grade that 
for us, for information operations?
    Dr. Kirschbaum. Mr. Larsen, thank you so much for your 
question. So, first, I want to make sure that my comments are 
not misunderstood. You are correct about Military Information 
Support Operations, PSYOP. That is exactly where that expertise 
belongs. That is where that specialty is. It is in special 
operations, and in its combination with intelligence. That is 
true.
    The comment that I made really has to do with the decision 
by the Department to move information operations writ large 
into that space, where you have very few people. And I have had 
the great opportunity to work with most of those people, and 
they get it; they understand what needs to be done. They have 
written a lot of the things and the direction that kind of 
point the way to where the Department is going. However, I 
think they are a little stymied in being able to get traction 
in the rest of the Department.
    So, for example, when we talk about what you have to do to 
kind of inculcate info operations and understanding throughout 
the Department, it kind of goes to what Dr. Lin was talking 
about: you need a broader, joint understanding. And, so, you 
take advantage of those individual specialties, like MISO, you 
take advantage of cyber, you take advantage of all these other 
things, but you do it in a way that everyone understands how to 
integrate that, which is why I said it needs to be integrated, 
operationally, into the planning cells for the J-2s, the J-3s, 
and the J-5s at all the COCOMs [combatant commands].
    In terms of Department leadership, it really doesn't matter 
who has got the ball, as long as there is Department-wide 
emphasis and momentum. And that is what we have seen lacking. 
Depending on a very small number of people to carry the ball, 
to implement the strategy, to carry out the capabilities-based 
assessments, to do all the things we have asked them to do over 
and over again, hasn't worked. They haven't gotten traction 
throughout the Department. They have not gotten the support 
they need. That is where identifying the principal information 
operations adviser, keeping it at the level it is, relying on 
those existing staff, and giving them the support is hopefully 
the way to make that stick.
    Mr. Larsen. Yeah, maybe when either this subcommittee or 
the full committee has an opportunity to talk to Under 
Secretary of Defense [for] Policy Kahl about his view on this, 
now that he has been approved by the Senate, we can have a 
chance to talk to him.
    I noted that the clock didn't start exactly on time, but it 
was adjusted, so I will assume I do have a minute 40 left, and 
go to Ms. Jankowicz.
    Because the Pentagon is the Pentagon, and because it has to 
operate outside, not inside, the country, how should we look at 
fitting the Pentagon IO [information operations] function into 
a larger coordinated effort with other government operations?
    Ms. Jankowicz. Thank you, Congressman. I think the 
important thing here, again, is the central node: taking into 
account the defense intelligence gathering that is going on, 
sharing that in the interagency, making sure that priorities 
out in the field in our areas of conflict are lined up with 
what the Department of State is doing in their programming. And 
then again, I think the Department of Defense has an 
opportunity to really be the laboratory for educating the 
Federal workforce about information operations. Service members 
are certainly a target, and their families are. And there have 
been multiple studies about catfishing and other things 
directed against the Armed Forces.
    So, educate them and then roll that out more broadly to the 
rest of the Federal workforce. And I think it is the biggest 
opportunity that the Department of Defense has with this 
challenge.
    Mr. Larsen. Yeah, good, thanks. Thank you very much. And 
thank you, Chair Langevin. I appreciate it very much. I will 
yield back.
    Mr. Langevin. Thank you, Mr. Larsen.
    Let's see, Mr. Fallon is recognized for 5 minutes.
    Mr. Fallon. Thank you, Mr. Chairman, I appreciate it.
    Mr. Langevin. Thank you, Mr. Fallon.
    Mr. Fallon. Can you hear me? Sorry.
    Mr. Langevin. Yeah go, ahead.
    Mr. Fallon. Oh, wonderful. Thank you. I wonder if the panel 
can answer some questions, one of which is: among rule-of-law 
Jeffersonian democracies in the world, what countries are the 
gold standard [inaudible] emulating vis-a-vis cyber 
disinformation?
    Ms. Jankowicz. Well, I can jump in there, Congressman. In 
my research I look at a number of countries in Central and 
Eastern Europe, again, that have been dealing with this for 
decades now. Estonia is one I always like to bring up. Of 
course, it is quite a small country, only 1.3 million people. 
But in 2007, they were hit with a cyber attack as well as what 
I call beta disinformation, pre-social media, at the hands of 
the Russians, which caused a riot in which one person died. And 
the cyber attack, of course, took down their banking, as is 
well known, and many of their other e-governance operations in 
Estonia.
    And that was a real wake-up call, along with kind of a 
reinvigoration during the annexation of Crimea in 2014. And as 
a result, the Estonian Government has really invested in cyber 
operations, they have invested in Russian-language media to 
reach out to that disenfranchised population. And they have 
invested in really building trust between the Estonian 
Government and the ethnic Russian population there.
    And I think that is a great model for a whole-of-society, a 
whole-of-government solution. And if fluffy little Estonia can 
do it, I think that the United States of America should be able 
to do something similar as well.
    Dr. Lin. I was just going to say that Finland also is 
another example of whole-[inaudible]-country, whole-
[inaudible]-society approach to disinformation. They have been 
dealing with it for a lot longer than most of the other 
countries in the world. And they emphasize this throughout 
society, and it is very much a part of their educational 
regime.
    Mr. Fallon. Is it fair to say that Russia is the most 
adroit at this, or is China catching up, or are they on par?
    Dr. Lin. Different people have different judgements about 
that. I think the Russians are most pernicious because they--it 
is easier to tear down stuff than it is to build something up. 
And the Russians are extraordinarily good at tearing stuff 
down. And the Chinese are getting there, but for my money, it 
is the Russian threat that I am most concerned about right now.
    Mr. Fallon. I think the Russians have had 600 years of 
practice in that regard. What are we doing offensively to 
combat this? Because in a lot of ways, when we are talking 
about totalitarian regimes, we just need to get information out 
and give it to their people. Are we taking specific steps--
because, you know, the old adage is that the best defense is a 
good offense. Are you all aware of efforts that we are making, 
and do we need to focus more on that as well?
    Dr. Lin. I had a little bit on this in my written 
testimony. I think that the biggest policy question that we 
have to address as a country is how and to what extent, if at 
all, we should be adopting the techniques of the Russians in 
prosecuting information warfare on offense. I will point out 
that our offensive information warfare efforts don't help 
defend the United States; offensive information warfare can 
only influence other populations.
    Do we want to adopt the tactics of the Russians in this? I 
am very uncomfortable about that as an American citizen. On the 
other hand, it is pretty clear that speaking the truth, just 
the truth, doesn't work very well. And Americans believe that 
speaking truth, that the truth will eventually win. Maybe 
eventually, but it sure doesn't--there is good evidence that it 
doesn't always win in the short term. And how far are we 
willing to go down that path? That is a very tough policy 
question that is way above my pay grade to answer.
    Mr. Fallon. Do you believe--does the panel believe--that 
forming an information command would be something that we 
should explore?
    Dr. Kirschbaum. Mr. Fallon, this is Joe Kirschbaum. So I am 
not sure a command is necessary. The reason that your question 
piqued my interest is that I remember, more than 10 years ago, 
before Cyber Command was stood up, having a conversation with 
someone in the Department of Defense, and they asked me, What 
would be your biggest surprise after we eventually stand up 
this U.S. Cyber Command, you know, however many years from 
now--I forget exactly what they asked me. And my answer to them 
was, my number one surprise would be if it is still called U.S. 
Cyber Command, because of the nature of what we are talking 
about now; the information environment involves so much more, 
and cyber is a part of it.
    So people have argued, in fact, that maybe Cyber Command 
should be expanded. We are agnostic on that; we, obviously, 
don't have an opinion on that. But those are the kind of things 
to think about. It is, on the one hand, too broad to be just 
one organization, but you definitely have got to make sure that 
everyone understands what that breadth means, and who is 
involved, and get them working the correct way. That is more 
important than establishing an organization.
    Mr. Langevin. Thank you very much. The gentleman's time has 
expired.
    Mr. Khanna is recognized now for 5 minutes.
    Mr. Khanna. Thank you, Mr. Chairman. And thank you to all 
of the panelists for your testimony. Many of you have spoken 
about the importance of the United States maintaining our 
strategic advantage in AI and in industries of the future. I 
wonder if any of the panelists have followed the bipartisan 
effort that Senator Schumer, Senator Young, Representative 
Gallagher, and I [have undertaken] with the Endless Frontiers 
Act, which would put $100 billion over 5 years in the National 
Science Foundation, and create a technology directorate to make 
sure America is collaborating with the private sector to lead 
in the industries of the future, a bipartisan bill that has six 
Republican Senators and a number of Republicans and Democrats 
on it in the House. And I wonder if any of the panelists have 
comments about the importance of that legislation?
    Mr. Gerstell. Congressman, I would simply say that that is 
exactly the part of the effort that we talk about when we say 
we need a whole-of-society effort. And the National Security 
Commission on Artificial Intelligence, to which we have made 
many allusions, certainly underscores the need for a highly 
trained and skilled workforce. And the legislation that you just 
described would be a significant step in that direction.
    The Office of the Director of National Intelligence in its 
Global Trends 2040 Report, talking about what future scenarios 
would look like, made great reference to the fact that it would 
be critically important for our country to have a really 
skilled workforce to be able to deal with the challenges of the 
digital revolution. So anything we can do in that regard is 
clearly going to have very significant dividends. That by 
itself isn't going to stop disinformation, no one suggests that 
it would, but it is part of the overall solution.
    Mr. Khanna. Let me ask you this: I was reading--I am going 
to ask two different questions. I read the report that Eric 
Schmidt and others did on the National Security Commission on 
Artificial Intelligence. So, I think one of the critical points 
in there is that right now, AI traditionally requires 
voluminous data. But when you are a child and you are 
learning, let's say, the word ``dog,'' it is not like we give a 
child thousands of data points or pictures of dogs. They 
see a few dogs, and they learn the word ``dog,'' which suggests 
that the human mind is far more complex and sophisticated than 
current AI. And there is work being done at MIT [Massachusetts 
Institute of Technology] and other places to try to understand 
how the human mind actually comprehends with probabilistic 
modeling that would allow AI to operate without voluminous 
data.
    Could you speak to how much of a comparative advantage that 
would be over China, given that China has a data advantage if 
we are able to have AI that doesn't require as much data?
    Mr. Gerstell. I am not sure I have the expertise on that 
particular topic. I don't know if the other panelists do.
    Dr. Lin. I know enough about that to be dangerous. So 
please don't take my word as gospel. It is definitely worth an 
inquiry. I will just point out that the Chinese are aware of 
this, too, and they also understand the importance of 
understanding the neurophysiology of the human brain.
    And, so, I think that to assume that we could go down that 
path and the Chinese wouldn't, I think doesn't work. It is true 
that the Chinese have many data advantages in some ways, and in 
other places we have better data advantages. But to assume that 
the Chinese aren't aware of the importance of neurophysiology 
and so on in the human brain, I think is probably not correct.
    Mr. Khanna. You always have good insight. And I wasn't 
suggesting that China was unaware of--well, I do think leading 
research is being done in the U.S., but more that the data 
advantage that China has is enormous if we don't have 
alternative innovations.
    The final question I have is, I don't know if any of the 
panelists have studied what Finland has done. I was reading 
somewhere that they have this extraordinary intervention at the 
age of 6, because the Russian disinformation campaign was a big 
problem there. And that this digital literacy campaign has, 
presumably, or at least from what I have read, worked in having 
a more informed citizenry that doesn't fall for disinformation. 
A, is that true? Are any of you familiar with the program in 
Finland? And, B, do you have any ideas of what digital literacy 
would look like in the United States?
    Ms. Jankowicz. I am happy to take that one, Congressman. 
Yes, absolutely, that is true. It was not only on Comedy 
Central with Samantha B, but there are many academic studies of 
this as well. And the program starts as early as 5, actually, 
with students getting exposure to what is an ad versus what is 
your Saturday morning cartoon? So, really, not just media 
literacy, but general informational awareness.
    And I would say the United States needs to go one step 
farther when we are talking information literacy. We often 
think about this as something that we can fairly easily, even 
given our federal education system, do in schools. But I would 
say we need to reach voting age adults as well. And how can we 
do that? I mentioned libraries before. Libraries maintain a 
very high level of trust across partisan divides in the United 
States. We have a lot of them. They are looking for their 
raison d'etre in the 21st century. And I think this is a great 
vehicle to deliver this sort of training.
    In the Czech Republic, they have a similar program. I like 
to call this the peas-in-the-mashed-potatoes approach. It is 
targeted at elderly people, teaching them how to use their cell 
phones or iPads to FaceTime their grandchildren, just basic 
computer skills. But they also sneak in some information 
literacy in there. And that, again, gets to the need to be 
creative with these sorts of approaches and think outside of 
our normal education and national security boxes.
    But the most important thing is not only having a 
nonpartisan messenger; the curriculum itself needs to be 
nonpartisan, and we need to make sure that we are giving people 
the tools they need to evaluate the information that they are 
trying to gather, to make decisions at the ballot box, to, you 
know, make economic decisions, et cetera. It shouldn't be 
motivated by any partisan agenda.
    Mr. Khanna. Well, thank you. I would look forward to 
working with you, maybe in a bipartisan way. I think that 
would be a very worthy project for the Congress if we can 
design a form of digital literacy for students and adults. And 
with that, Mr. Chair, I yield back my 
time.
    Mr. Langevin. Thank you, Mr. Khanna.
    Mrs. Bice is recognized for 5 minutes.
    Mrs. Bice. Thank you, Mr. Chairman. This is really for any 
of the panelists. You know, it is crucial for our Nation to 
have our own robust, offensive information operations 
capabilities in place to influence adversary actions, deceive 
enemies, and to try to stay ahead of the adversarial decision 
making in times of war. What role do you feel is proper for the 
military in this area?
    Dr. Kirschbaum. So, Mrs. Bice, the Department of Defense's 
role really is, at its heart, an operational military role. So 
it is at the operational level of war, you know, below the 
strategic level. That is primarily what we have been looking 
at, what we are talking about: how to make sure that the 
commander at the combatant command level, as he or she is 
working with partners at the ambassador level, or with regional 
allies and partners, understands what we are trying to achieve, 
and gets that done. So those are the campaigns that we talked 
about that are taking place below the threshold of conflict all 
the time. That is the primary thing that we are talking about 
here in terms of what the military's role is.
    Now, the whole-of-government approach brings it up a level, 
to strategy: where does the United States fit in with its 
allies and partners? That is a much broader question--whole-of-
government, whole-of-society. In this case, the Department 
should plug in to whatever efforts are being done and led out 
of places like the State Department or whatever organizations 
get created in the future. You know, during the Cold War, we 
had the United States Information Agency that organized a lot 
of those things; it orchestrated large campaigns to support 
information for our allies, our partners, and beyond, into the 
Iron Curtain, for example. That is a huge undertaking that no 
longer exists. That is gone. That has been swept away. And we 
can't necessarily just recreate it, nor should we, but we 
should think about how we do that. And the military would plug 
in to those efforts in addition to maintaining its own 
battlefield capabilities.
    Mrs. Bice. That is all I have, Mr. Chairman. I yield back. 
Thank you.
    Mr. Langevin. Very good. Is there any member who hasn't 
been recognized yet that wants to be recognized?
    I think we have gotten to everybody.
    Okay. With that, I just want to thank our witnesses for 
your testimony today. It has been very insightful and very 
helpful to our work. I know that I had additional questions, 
and other members may have additional questions that we would 
like to submit for the record. If you could respond to those, 
it would be very helpful as well.
    So with that, again, thank you to our witnesses. I deeply 
value your expertise and your contributions to this important 
conversation in helping us to understand and get our arms 
around these challenges. With that, the hearing stands 
adjourned. Have a great weekend, everyone.
    [Whereupon, at 4:27 p.m., the subcommittee was adjourned.]

     
=======================================================================

                            A P P E N D I X

                             April 30, 2021
      
=======================================================================


              PREPARED STATEMENTS SUBMITTED FOR THE RECORD

                             April 30, 2021

=======================================================================

           
   [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

      
=======================================================================


              QUESTIONS SUBMITTED BY MEMBERS POST HEARING

                             April 30, 2021

=======================================================================

      

                   QUESTIONS SUBMITTED BY MR. MOULTON

    Mr. Moulton. I am disheartened by the dramatic drop in the public's 
trust and confidence in the U.S. military from 70% to 56% that you 
point out in your written testimony, Mr. Gerstell. Trust between the 
people and the military is vital to a democratic nation and to the 
health of an All-Volunteer Force. While this drop in confidence may be 
influenced by external disinformation, do you believe service members' 
own social media activity, personal or professional, may play a role in 
negatively impacting the public's views on the military? Are there 
policy recommendations you would make to the services to ensure the 
U.S. military retains the public's trust without impeding troops' 
freedom of speech?
    Mr. Gerstell. Thank you Representative Moulton for the opportunity 
to respond to your questions. I am not an expert on military matters so 
I will address this from the point of view of a former national 
security official who has studied online disinformation generally. As 
you know, a number of academicians, cyber researchers and think tanks 
have sought to determine the extent to which trust in societal 
institutions can be undermined--and thus democracy corroded--by 
disinformation and the corresponding expression of extremist views. 
Surveys indicate that reinforcing and amplifying factors play a key 
role in instilling and confirming hateful or erroneous beliefs in 
people exposed to extremist speech and false information. The identity 
of the communicators spreading the speech or disinformation, and 
corroboration and enhancement by opinion leaders, are all factors in 
promoting the ``effectiveness'' of extremist speech and disinformation. 
It thus stands to reason that when the general public sees social media 
posts by members of the military espousing hateful or extremist 
positions that are aligned with what the public might be predisposed to 
accept based on prior exposure to disinformation from non-military 
sources, it inevitably combines to shape the public's view of the 
military. That type of reinforcing and corroborating action has a 
potent effect on influencing what people believe. In short, it's hard 
to believe that social media posts (positive and negative) by members 
of the military don't have any effect on the public's perception of our 
armed forces. As you note, it is of course vital that our military 
enjoys the strong approval and trust of the American public, for 
purposes of recruiting, assistance to veterans and obviously support in 
times when our troops are in harm's way. Social media activity by 
members of the military that does not reflect well on that institution 
can have an insidious and ultimately pernicious effect on this level of 
needed approval and trust. Countering problematic speech is difficult 
given how strongly our nation prizes freedom of speech, and it is 
sometimes hard to draw the line between improper hateful expressions 
that should be curtailed for the good of society, and merely 
distasteful if not repugnant opinions. But the mere fact that it's 
difficult to draw the line doesn't mean we should abandon any effort in 
this regard. Indeed, we have legal room to maneuver in this area; the 
law allows stricter regulation of the armed services than the general 
public, and the First Amendment is not absolute (to be clear, this is 
not to suggest any diminution of the latter's scope). Secretary 
Austin's stand-down day was an important substantive as well as 
symbolic step, and clearly the military can do more with internal 
training and education. But many young men and women come to the 
military with little knowledge of how our government works or the 
underlying values upon which our democracy was founded, because of the 
almost total dearth of civic education in high school and lower grades. 
Fixing that problem alone would help minimize extremism in the 
military.
    Mr. Moulton. Mr. Gerstell, you have advocated for an integrated 
disinformation center within the Federal Government, aligning the many 
departments and agencies that have a role in information and digital 
communications and creating a central node for responsibility over this 
issue. The NSCAI has made a similar recommendation. Can you describe in 
more detail what you envision this center to look like? What 
authorities or capabilities would this center need to be effective?
    Mr. Gerstell. Representative Moulton, the establishment of an 
integrated ``disinformation'' center, bringing together all relevant parts 
of the federal government as well as the private sector, is one of the 
most crucial steps we can take in tackling the problem of 
disinformation.
    While purely domestically generated disinformation is indeed a 
problem, it is made much worse by amplification and expansion by 
foreign adversaries that exploit the natural divisions in our society; 
and of course, those foreign parties themselves are often the initial 
source of the disinformation. Thus, my comments below will focus on 
foreign-propelled disinformation.
    To determine how best to counter foreign disinformation, we need to 
first understand how our foreign adversaries create and spread 
disinformation. Those adversaries, especially Russia and China, engage 
in coordinated, integrated disinformation campaigns involving many 
elements of their governments. For example, when China decided to push 
the falsehood that its system of government was more successful at 
fighting the COVID-19 pandemic than ``weak, corrupt Western 
governments,'' the messaging started at the top, from the Ministry of 
Foreign Affairs, and was disseminated in a concerted way through the 
Twitter accounts of over 130 Chinese diplomats stationed around the 
world; Chinese-controlled news media and websites picked up the line 
and spread it too, and then seemingly corroborated it with further 
postings on social media and secondary news stories about how the 
message was reverberating around the globe. Russia's disinformation 
campaigns fomented by the GRU and other organs of the Russian state are 
if anything even more coordinated, so as to create the impression of an 
overwhelming number of ``independent'' news sources and social media 
accounts all espousing the Russian disinformation. In addition to 
creating inauthentic Facebook, Twitter and YouTube accounts owned by 
false personas (often with AI-generated fake profile pictures), the 
Russians might also enlist private sector proxies, such as the Internet 
Research Agency in St. Petersburg, to further promote the Russian 
falsehoods. The Russians carefully monitor our domestic social media, 
seizing on tendentious statements, conspiracy theories, and outright 
falsehoods, and then amplify and elaborate on them through their 
integrated disinformation machine.
    This system of whole-of-government campaigns to promote online 
malicious disinformation is so different from our American values and 
the way our government operates abroad, that we have difficulty in 
appreciating the effectiveness of our adversaries' endeavors. And yet, 
to be successful in countering it, we must be equally integrated, and 
not regard online disinformation as a one-off expression on a 
particular social media account, or as something that can be simply 
rebutted with a press release from a government agency.
    Thus, to fully apprehend, let alone effectively counter, the scope 
of foreign disinformation aimed at us, we need the active cooperation 
of the major social media platforms, the intelligence community and law 
enforcement to share current information about the sources and scale of 
disinformation campaigns. Artificial intelligence can clearly play a 
major role here in analyzing massive amounts of data on social media, 
combining information about foreign cyber activity from government and 
private sector sources, and in other ways assisting in the overall 
effort to identify and respond to disinformation. We would then be able 
to rebut falsehoods at an earlier stage, and that would entail 
consistent messaging from the White House, the State Department, the 
Departments of Defense, Justice and others. Our federal government has 
historically been reluctant to correct errors circulating in news 
media, let alone social media (partly out of First Amendment concerns 
and the restricted role of government relative to the private sector). 
But the efforts, for example, of the Department of Homeland Security in 
rebutting false claims--both domestic and foreign-sourced--of election 
fraud in last year's elections show how the federal government can make 
its voice heard in impactful ways. Moreover, if the federal government 
provides more detailed information to the news media, think tanks, 
cyber researchers and the like, they can be part of a national effort 
to stem disinformation.
    While it is possible that some additional legal authorities may be 
needed on the margins (for example, mandatory reporting by private 
sector companies of foreign cyber maliciousness), the reality is that 
we can make much progress now, without new legislation, if the 
executive branch makes this a high priority and directs agencies to 
work together in a coherent way. Among other things, the intelligence 
community should be told that disinformation is a higher priority 
national security threat, additional resources should be dedicated for 
that purpose, and a greater effort can be made to declassify relevant 
information to assist social media companies in identifying and 
stopping foreign online malice.
    These steps by the federal government, working with the private 
sector, are within our grasp and will help reduce the scope and 
influence of online disinformation. Obviously, the problem is complex, 
and other societal elements such as more civic education must be part 
of an overall solution--but the federal government can and should take 
the first critical steps now.
    Mr. Moulton. While I am concerned about military readiness, 
disinformation is clearly not just a military problem. As we face 
increasing efforts to mislead the American public and sow distrust and 
disunity, we see social media companies dodge substantive efforts to 
block disinformation's spread. If disinformation is not or cannot be 
eliminated, how would you advise we instead make ourselves harder 
targets? Ms. Jankowicz, you advise bringing local and Federal 
Government entities in health and education into the discussion. Can 
you describe in more detail how these departments and agencies might 
contribute to increased public digital literacy, which is clearly a 
matter of national security in addition to public health and public 
safety?
    Ms. Jankowicz. Thank you for the question, Mr. Moulton. Building 
societal resilience at home is one of the most important aspects of 
responding to disinformation. Our adversaries use pre-existing fissures 
in our society--such as economic inequality, systemic racism, and hot-
button issues like gun rights--to drive us further apart. Their efforts 
are amplified by broad-based misunderstandings of how the traditional 
and social media ecosystem operates. It can be difficult for national 
institutions to deliver resonant messages to the most vulnerable 
populations, however. Those that already distrust government are 
unlikely to be convinced by a public service announcement encouraging 
them to ``take care before they share.'' This is where local government 
can play a critical role in building awareness of the tools and tactics 
of disinformation and building information literacy and civics more 
broadly. They can also serve as the connective tissue between funding 
sources and the organizations best positioned to deliver such 
interventions. I emphasize bringing state and local departments of 
health, education, arts, as well as local libraries to the forefront of 
America's counter-disinformation effort, because they know their local 
communities, their vulnerabilities, and the issues important to them 
best. In my research in Central and Eastern Europe, I have come across 
several local initiatives built on such bespoke local expertise. They 
include:
      - In Estonia, where ethnic Russians and Russian-speakers 
are vulnerable to Kremlin-backed disinformation, the Integration 
Foundation offers free courses in Estonian language, cultural 
activities, and consultations about citizenship requirements both in 
Tallinn and Narva, a city on the border with Russia, where much of the 
ethnic Russian population is concentrated.
      - In the Czech Republic, recognizing that the elderly are 
particularly susceptible to disinformation but hesitant to engage with 
counter disinformation programming, organizations attempting to build 
media literacy in the local population offered basic computer literacy 
training (such as how to use FaceTime to stay in touch with your 
grandchildren) and snuck in basic information literacy tenets to the 
curriculum. I call this the ``peas in the mashed potatoes'' approach.
      - In the Republic of Georgia, one organization trains 
artists (singers, actors, musicians, comedians) from outside of the 
capital, Tbilisi, in recognizing and responding to disinformation. The 
artists then travel to their home region and put on a show 
incorporating what they've learned. This is ``infotainment'' at its 
best, delivered by influencers with credibility in an engaging and 
accessible format.
    In the United States, state and local governments might fund 
similar programs. They could develop information literacy curricula to 
be delivered by local librarians (still highly trusted across the 
political spectrum). They might identify local civil society groups to 
partner with influencers with connections to the locality to act as 
trusted third-party messengers. In times of health emergencies, rampant 
democratic vulnerabilities, or developing public safety issues, such 
trusted conduits can be invaluable in getting authoritative information 
out to the public. It is important to recognize this approach is, by 
necessity, long-term. As I often remark, we cannot fact-check our way 
out of the crisis of truth and trust in which we find ourselves. But we 
can slowly build citizens' ability to recognize disinformation and 
introduce friction into the sharing process. Just like most Americans 
now know to ignore spam emails from purported Nigerian princes 
promising to make them millionaires, we can train them to spot and 
resist sharing the dubious information they encounter online. I am 
including several links to other writing I have done on this topic 
below. Thank you for the opportunity to testify on these important 
issues.
    Mr. Moulton. A third of troops have reportedly declined the Covid 
vaccine, undermining our troops' readiness well before we have entered 
into conflict. As I wrote in a recent Time magazine op-ed, this issue 
has demonstrated the ability of targeted disinformation campaigns to 
undermine troops' confidence in the emerging science and technology 
that underpin national security. How do you advise we protect troops 
from ongoing targeted disinformation campaigns and protect military 
readiness?
    Dr. Lin. I agree entirely with your position that disinformation 
can be (and is indeed sometimes) a threat to military readiness. 
However, the DOD is not in a position to protect troops from all 
sources of disinformation, simply because everyone, including troops, 
can obtain information from multiple sources. That said, the DOD does 
have control over a variety of information sources to which the troops 
may be exposed.
    For example, cable television is available on many if not all 
bases. One could reasonably ask the question--which cable TV channels 
(or shows carried on those channels) broadcast large amounts of 
disinformation that are relevant to national security? For example, DOD 
would be fully within its prerogatives to forbid military bases from 
carrying RT (formerly Russia Today) on cable TV--and indeed, I have no 
knowledge that RT is carried on cable TV at any U.S. military base. But 
certain domestic cable channels have also carried programming with 
disinformation that threatens national security, such as disinformation 
related to Covid vaccines--and DOD has no obligation to make those 
channels (or shows) available on military bases either, even though 
off-base, everyone, including troops, has the right to access them as 
they see fit.
    The same goes for Internet access provided on base. To the extent 
that the troops use DOD facilities to access the Internet, there is no 
reason that DOD should not block access to sites that are known to provide 
substantial amounts of disinformation that threaten national security, 
even though DOD cannot forbid the troops from accessing such sites 
using their own resources (such as personal smart phones that they pay 
for themselves).
    Both of these measures regarding cable TV and internet access on 
base require DOD to determine the nature of disinformation that is 
threatening to national security and to identify the channels and sites 
that are the most common purveyors of such disinformation. This will be 
an ongoing challenge rather than an assessment that can be done once 
and then left alone.
    Such measures alone will not make a substantial dent in the problem 
that you describe. Over the longer term, I refer back to my testimony 
in which I call for DOD to take a more active role in training the 
troops on what it means to support and defend the Constitution against 
all enemies, foreign and domestic. Such training presupposes an ability 
to engage in critical thought and to have information literacy skills, 
and to the extent that these skills need to be strengthened in the 
troops, DOD has an obligation to address them in its training efforts.
    Mr. Moulton. Dr. Kirschbaum, it is my understanding that each of 
the services defines information warfare in varying ways, and therefore 
staffs and plans for information warfare differently. Does this limit 
our ability to effectively execute information warfare in a joint 
environment?
    Dr. Kirschbaum. There are indeed differences in how the services 
define and use terms related to operations in the information 
environment. The term ``information warfare'' technically is no longer 
part of joint doctrine and hasn't been since 2006. In its former 
definition, it covered activities DOD would need to perform to 
influence the actions of adversaries as well as the protection of our 
own information. It had both offensive and defensive elements. However, 
the context for its place in joint doctrine suggested that information 
warfare was something done in the early phases of a crisis or conflict. 
In other words, the perception might be that information warfare was 
something done only when there was a war. The broader term 
``information operations,'' on the other hand, accepted that such 
activities could occur in peace and war. Some services or individuals 
continue to use the term ``information warfare.'' The U.S. Navy, for 
example, has embraced the term in naval doctrine while also recognizing 
how its sister services (U.S. Marine Corps and U.S. Coast Guard) use 
different terms and definitions (Operations in the Information 
Environment and Information Operations, respectively). The Navy's term 
implies more of a wartime set of activities, while the other terms 
imply broader application. But there is significant overlap. As we have 
discussed in this hearing, one of the fundamental challenges we face 
today is that activities, competition, and conflict are occurring every 
day globally. Our adversaries have prepared for this and view the 
information environment as a useful arena to pursue and secure their 
interests while degrading our own. This is particularly true in the 
area short of armed conflict (often referred to as the ``gray zone''). 
Our adversaries operate freely in this space as we struggle to define 
the lines between peace and war where there may be none. While there is 
a large degree of generality and vagueness to the idea of the 
information environment, it is important to avoid confusion between the 
services and, more importantly, among combatant commanders about the 
importance of the information environment and our ability to operate 
effectively--in offense or defense. These definitions and other lexicon 
issues must be addressed if DOD is going to develop a cohesive, 
holistic, and joint strategy for Information (as a joint function); the 
Information Environment (i.e. current battle space); and activities, 
capabilities, operations, and security functions that will be employed 
in that battlespace. It is our understanding that officials within DOD 
understand this--as they have tried to thread this needle for years 
while struggling to update the 2016 DOD Strategy for Operations in the 
Information Environment. This will be an important part of the 
department's ongoing discussions about the right terms and the right 
context to ensure that the entire joint force can adequately plan for, 
and operate, in the information environment every single day.

                                  [all]