[House Hearing, 117 Congress]
[From the U.S. Government Publishing Office]


                        ACCELERATING DISCOVERY:
                   THE FUTURE OF SCIENTIFIC COMPUTING
                      AT THE DEPARTMENT OF ENERGY

=======================================================================

                                HEARING

                               BEFORE THE

                         SUBCOMMITTEE ON ENERGY

                                 OF THE

                      COMMITTEE ON SCIENCE, SPACE,
                             AND TECHNOLOGY
                        HOUSE OF REPRESENTATIVES

                    ONE HUNDRED SEVENTEENTH CONGRESS

                             FIRST SESSION

                               __________

                              MAY 19, 2021

                               __________

                           Serial No. 117-16

                               __________

 Printed for the use of the Committee on Science, Space, and Technology
 
[GRAPHIC NOT AVAILABLE IN TIFF FORMAT]


       Available via the World Wide Web: http://science.house.gov
       
                               __________

                    U.S. GOVERNMENT PUBLISHING OFFICE                    
44-540PDF                 WASHINGTON : 2021                     
          
-----------------------------------------------------------------------------------       
       
       

              COMMITTEE ON SCIENCE, SPACE, AND TECHNOLOGY

             HON. EDDIE BERNICE JOHNSON, Texas, Chairwoman
ZOE LOFGREN, California              FRANK LUCAS, Oklahoma, 
SUZANNE BONAMICI, Oregon                 Ranking Member
AMI BERA, California                 MO BROOKS, Alabama
HALEY STEVENS, Michigan,             BILL POSEY, Florida
    Vice Chair                       RANDY WEBER, Texas
MIKIE SHERRILL, New Jersey           BRIAN BABIN, Texas
JAMAAL BOWMAN, New York              ANTHONY GONZALEZ, Ohio
BRAD SHERMAN, California             MICHAEL WALTZ, Florida
ED PERLMUTTER, Colorado              JAMES R. BAIRD, Indiana
JERRY McNERNEY, California           PETE SESSIONS, Texas
PAUL TONKO, New York                 DANIEL WEBSTER, Florida
BILL FOSTER, Illinois                MIKE GARCIA, California
DONALD NORCROSS, New Jersey          STEPHANIE I. BICE, Oklahoma
DON BEYER, Virginia                  YOUNG KIM, California
CHARLIE CRIST, Florida               RANDY FEENSTRA, Iowa
SEAN CASTEN, Illinois                JAKE LaTURNER, Kansas
CONOR LAMB, Pennsylvania             CARLOS A. GIMENEZ, Florida
DEBORAH ROSS, North Carolina         JAY OBERNOLTE, California
GWEN MOORE, Wisconsin                PETER MEIJER, Michigan
DAN KILDEE, Michigan                 VACANCY
SUSAN WILD, Pennsylvania
LIZZIE FLETCHER, Texas
VACANCY
                                 ------                                

                         Subcommittee on Energy

                 HON. JAMAAL BOWMAN, New York, Chairman
SUZANNE BONAMICI, Oregon             RANDY WEBER, Texas, 
HALEY STEVENS, Michigan                  Ranking Member
JERRY McNERNEY, California           JIM BAIRD, Indiana
DONALD NORCROSS, New Jersey          MIKE GARCIA, California
SEAN CASTEN, Illinois                RANDY FEENSTRA, Iowa
CONOR LAMB, Pennsylvania             CARLOS A. GIMENEZ, Florida
DEBORAH ROSS, North Carolina         PETER MEIJER, Michigan
                        
                        
                        C  O  N  T  E  N  T  S

                              May 19, 2021

                                                                   Page

Hearing Charter..................................................     2

                           Opening Statements

Statement by Representative Jamaal Bowman, Chairman, Subcommittee 
  on Energy, Committee on Science, Space, and Technology, U.S. 
  House of Representatives.......................................     9
    Written Statement............................................    10

Statement by Representative Randy Weber, Ranking Member, 
  Subcommittee on Energy, Committee on Science, Space, and 
  Technology, U.S. House of Representatives......................    11
    Written Statement............................................    12

Statement by Representative Eddie Bernice Johnson, Chairwoman, 
  Committee on Science, Space, and Technology, U.S. House of 
  Representatives................................................    13
    Written Statement............................................    14

Statement by Representative Frank Lucas, Ranking Member, 
  Committee on Science, Space, and Technology, U.S. House of 
  Representatives................................................    14
    Written Statement............................................    16

                               Witnesses:

Dr. J. Stephen Binkley, Principal Deputy Director, Office of 
  Science at the Department of Energy
    Oral Statement...............................................    18
    Written Statement............................................    20

Dr. Georgia Tourassi, Director, National Center for Computational 
  Sciences at Oak Ridge National Laboratory
    Oral Statement...............................................    29
    Written Statement............................................    31

Dr. Karen Willcox, Director, Oden Institute for Computational 
  Engineering and Sciences at The University of Texas at Austin
    Oral Statement...............................................    44
    Written Statement............................................    46

Dr. Christopher Monroe, Co-Founder and Chief Scientist, IonQ, 
  Inc.
    Oral Statement...............................................    56
    Written Statement............................................    59

Dr. Seny Kamara, Associate Professor, Brown University
    Oral Statement...............................................    63
    Written Statement............................................    65

Discussion.......................................................    70

             Appendix I: Answers to Post-Hearing Questions

Dr. J. Stephen Binkley, Principal Deputy Director, Office of 
  Science at the Department of Energy............................    94

Dr. Georgia Tourassi, Director, National Center for Computational 
  Sciences at Oak Ridge National Laboratory......................   100

Dr. Karen Willcox, Director, Oden Institute for Computational 
  Engineering and Sciences at The University of Texas at Austin..   101

            Appendix II: Additional Material for the Record

Letter submitted by Representative Frank Lucas, Ranking Member, 
  Committee on Science, Space, and Technology, U.S. House of 
  Representatives................................................   108

 
                        ACCELERATING DISCOVERY:
                   THE FUTURE OF SCIENTIFIC COMPUTING
                      AT THE DEPARTMENT OF ENERGY

                              ----------                              


                        WEDNESDAY, MAY 19, 2021

                  House of Representatives,
                            Subcommittee on Energy,
               Committee on Science, Space, and Technology,
                                                   Washington, D.C.

     The Subcommittee met, pursuant to notice, at 11:03 a.m., 
via Zoom, Hon. Jamaal Bowman [Chairman of the Subcommittee] 
presiding.
[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

     Chairman Bowman. This hearing will now come to order. 
Without objection, the Chairman is authorized to declare a 
recess at any time.
     Before I deliver my opening remarks, I wanted to note 
that, today, the Committee is meeting virtually. I want to 
announce a couple of reminders to the Members about the conduct 
of this hearing. First, Members should keep their video feed on 
as long as they are present in the hearing. Members are 
responsible for their own microphones. Please also keep your 
microphones muted unless you are speaking. Finally, if Members 
have documents they want to submit for the record, please email 
them to the Committee Clerk, whose email address was circulated 
prior to the hearing.
     Good morning, and thank you to all of our witnesses who 
are joining us virtually to discuss the importance of 
scientific computing at the Department of Energy (DOE). This 
hearing is one of a series on research and development (R&D) 
activities sponsored by the Department of Energy's Office of 
Science. Today, we will be examining the current status and 
needs of DOE's scientific computing programs, as well as the 
research, development, and workforce training required to 
ensure that DOE and the Nation maintains its leadership in this 
crucial area.
     Stewardship of DOE's scientific computing ecosystem is led 
by the Office of Science's Advanced Scientific Computing 
Research program, or ASCR. ASCR is also DOE's main sponsor of 
research in foundational areas such as applied mathematics and 
computer science. This year, ASCR was funded at just over $1 
billion, about 1/7 of the total Office of Science budget.
     DOE possesses some of the most powerful supercomputers in 
existence. It will deploy the Nation's first exascale system 
this year, signaling an exciting new era in the field of 
scientific computing. Housed at several national laboratories, 
DOE's supercomputers help researchers analyze huge data sets 
and test complex computational models, greatly accelerating the 
pace of discovery in the design of life-saving medical 
treatments, advanced manufacturing, and the prediction of 
climate systems, among many other fields of research. DOE's 
supercomputing ecosystem serves as a critical resource for 
academic and industry users from the U.S. and around the world. 
I am looking forward to discussing with our witnesses the real-
world applications of these incredible systems, and how 
Congress can ensure that they are continuously maintained and 
improved.
     It is also critically important for DOE to support 
research that will lay the groundwork for future computing 
capabilities. We are fast approaching the point at which the 
computing architectures we have relied upon for decades will 
reach their physical and economic limitations. Therefore, ASCR 
must continue to invest in the applied mathematics, computer 
science, and the game-changing technology development 
activities that will enable powerful new paradigms like quantum 
computing.
     As we craft a forward-looking Office of Science 
authorization bill, I will be looking to our witnesses for 
insights into how we in Congress can ensure that these 
activities are robustly supported. As we will explore today, 
scientific computing holds tremendous promise for accelerating 
scientific discovery. But we need to use these capabilities 
responsibly, ethically, and to advance the public good. For 
example, as computing and artificial intelligence become more 
powerful, we must ensure that algorithms are designed to 
protect people's privacy and eradicate bias. We must also stop 
these tools from fortifying the structures of systematic 
racism, as we have seen happen with things like predictive 
policing and facial recognition technology. This will only 
become more important as DOE's supercomputing capabilities are 
used to process, analyze, and store sensitive information, such 
as biomedical datasets.
     Let's also discuss how to retain a strong role for the 
public sector here, to fully tap into computing's potential to 
help solve humanity's most pressing problems, from curing 
diseases to addressing the climate emergency. And let's involve 
the public, especially marginalized communities, in shaping the 
development and aims of new technologies like these so that all 
can share in the benefits equally. As you will hear from our 
witnesses today, we need to pursue an agenda of scientific 
computing for the people.
     Finally, as I have said before, research and 
infrastructure funding represent just one piece of the puzzle. 
We need a skilled and diverse workforce to maintain the 
vitality of DOE's scientific computing ecosystem long into the 
future. I am particularly interested in leveraging programs 
such as the Computational Science Graduate Fellowship to forge 
closer connections between the Department and minority-serving 
institutions. We can all agree on the need for greater 
diversity, equity, and inclusion across our research 
enterprise.
     I want to again thank our excellent panel of witnesses 
assembled today, and I look forward to hearing your testimony.
     [The prepared statement of Chairman Bowman follows:]

    Good morning, and thank you to all of our witnesses who are 
joining us virtually today to discuss the importance of 
scientific computing at the Department of Energy.
    This hearing is one of a series on research and development 
activities sponsored by the DOE's Office of Science. Today, we 
will be examining the current status and needs of DOE's 
scientific computing programs as well as the research, 
development, and workforce training required to ensure that 
DOE, and the nation, maintains its leadership in this crucial 
area.
    Stewardship of DOE's scientific computing ecosystem is led 
by the Office of Science's Advanced Scientific Computing 
Research program, or ASCR. ASCR is also DOE's main sponsor of 
research in foundational areas such as applied mathematics and 
computer science. This year, ASCR was funded at just over a 
billion dollars, about one-seventh of the total Office of 
Science budget.
    DOE possesses some of the most powerful supercomputers in 
existence. It will deploy the nation's first exascale system 
this year, signaling an exciting new era in the field of 
scientific computing. Housed at several national laboratories, 
DOE's supercomputers help researchers analyze huge data sets 
and test complex computational models, greatly accelerating the 
pace of discovery in the design of life-saving medical 
treatments, advanced manufacturing, and the prediction of 
climate systems, among many other fields of research. DOE's 
supercomputing ecosystem serves as a critical resource for 
academic and industry users from the U.S. and around the world. 
I am looking forward to discussing with our witnesses the real-
world applications of these incredible systems, and how 
Congress can ensure that they are continuously maintained and 
improved.
    It is also critically important for DOE to support research 
that will lay the groundwork for future computing capabilities. 
We are fast approaching the point at which the computing 
architectures we have relied upon for decades will reach their 
physical and economic limitations. Therefore, ASCR must 
continue to invest in the applied mathematics, computer 
science, and the game-changing technology development 
activities that will enable powerful new paradigms like quantum 
computing. As we craft a forward-looking Office of Science 
authorization bill, I will be looking to our witnesses for 
insights into how we in Congress can ensure that these 
activities are robustly supported.
    As we will explore today, scientific computing holds 
tremendous promise for accelerating scientific discovery. But 
we need to use these capabilities responsibly, ethically, and 
to advance the public good. For example, as computing and 
artificial intelligence become more powerful, we must ensure 
that algorithms are designed to protect people's privacy and 
eradicate bias. We must also stop these tools from fortifying 
the structures of systemic racism, as we have seen happen with 
things like predictive policing and facial recognition 
technology. This will only become more important as DOE's 
supercomputing capabilities are used to process, analyze, and 
store sensitive information, such as biomedical datasets.
    Let's also discuss how to retain a strong role for the 
public sector here, to fully tap into computing's potential to 
help solve humanity's most pressing problems - from curing 
diseases to addressing the climate emergency. And let's involve 
the public, especially marginalized communities, in shaping the 
development and aims of new technologies like these - so that 
all can share in the benefits equally. As you will hear from 
one our witnesses today, we need to pursue an agenda of 
scientific computing for the people.
    Finally, as I have said before, research and infrastructure 
funding represent just one piece of the puzzle. We need a 
skilled and diverse workforce to maintain the vitality of DOE's 
scientific computing ecosystem long into the future. I am 
particularly interested in leveraging programs such as the 
Computational Science Graduate Fellowship to forge closer 
connections between the Department and Minority-Serving 
Institutions. We can all agree on the need for greater 
diversity, equity, and inclusion across our research 
enterprise.
    I want to again thank our excellent panel of witnesses 
assembled today, and I look forward to hearing your testimony. 
With that, I yield back.

     Chairman Bowman. Finally, I want to note that it is a busy 
day on the Hill, and I may have to step out briefly to ask 
questions on another Committee.
     With that, I now recognize Mr. Weber for an opening 
statement.
     Mr. Weber. Well, thank you, Mr. Chair, and I'll be glad 
to, you know, conduct the hearing while you're gone.
     I do want to thank you for hosting this hearing and to our 
esteemed witness panel for being here this afternoon or 
technically, I guess, this morning. I'm excited to hear about 
the critical advanced scientific computing research and 
development activities being carried out through the DOE, 
Department of Energy's Office of Science.
     The Advanced Scientific Computing Research program, ASCR 
as you referred to it, is one that enjoys bipartisan support as 
a priority within the Office of Science. For the past 30 years, 
researchers within this program have led advances in 
mathematics and computing that form the foundation for those 
complex models and simulations. These developments, in turn, 
have translated to increased knowledge and understanding of 
everything from bioenergy and climate change to Alzheimer's 
disease and health models.
     Today, ASCR hosts some of the world's most powerful 
supercomputers and a high-speed network that moves enormous 
volumes of scientific data at light speed. In the rapidly 
evolving fields of quantum computing and artificial 
intelligence, ASCR is dedicated to maintaining U.S. 
competitiveness and leadership. The program also supports DOE's 
goal of completing the world's first exascale computing system 
this year and a second system within the next year. As our 
competitors race to develop exascale systems on their own, 
DOE's strong support of advanced computing research within ASCR 
is essential to maintaining U.S. leadership in this field.
     And it's more than just hardware that needs additional 
focus, I might add. We need significant modifications to 
today's tools and techniques to deliver on the promise of high-
performance computing. Researchers are in need of a new suite 
of software tools, programming models, and applications to 
enable effective use of exascale systems. Without software and 
application R&D, we will simply have high-powered machines 
collecting dust.
     Additionally, in order to fully and effectively support 
innovation in next-generation science, DOE must also support 
and also encourage cross-cutting research initiatives within 
the Department, as well as other Federal agencies. Within the 
Office of Science alone, ASCR resources and capabilities can be 
used to drive innovation in computational chemistry and 
nanomaterials for energy applications, improve simulations of 
fusion energy reactors, and enhance our ability to predict 
changes in the global climate with next-generation Earth system 
models.
     Other Federal agencies could also capitalize on these 
unique, world-leading resources. As authorized by the Energy 
Act of 2020, the Department of Veterans Affairs (VA) is 
partnering with DOE to use high-performance computing in 
analyzing massive amounts of health data. This data analysis 
will help the VA better understand diseases and improve 
veterans' overall quality of life.
     We should seek to build upon and expand partnerships like 
this so that the entire Federal Government benefits from ASCR's 
tools, as well as its technologies. At the end of the day, 
we're all supporting one thing: U.S. leadership in science, 
technology, and innovation. There is no Federal entity in a 
better position to lead this change than DOE's Office of 
Science.
     That's why I am pleased that we are very close to 
finalizing legislation that provides strong support and long-
term guidance for the Office of Science. We're making sure that 
rubber meets the road, and that the U.S. research enterprise is 
well-equipped with all available resources to successfully 
overcome the generational challenges they face.
     I want to again thank my colleagues for their bipartisan 
outreach and collaboration. And I want to thank the witnesses 
for offering and bringing their effort, their input.
     Thank you, Mr. Chairman, and I yield back.
     [The prepared statement of Mr. Weber follows:]

    Thank you, Chairman Bowman, for hosting this hearing, and 
thank you to our esteemed witness panel for being here this 
afternoon. I am excited to hear about the critical advanced 
scientific computing research and development activities being 
carried out through the Department of Energy's (DOE) Office of 
Science.
    The Advanced Scientific Computing Research Program, or ASCR 
program, is one that enjoys bipartisan support as a priority 
within the Office of Science. For the past thirty years, 
researchers within this program have led advances in 
mathematics and computing that form the foundation for complex 
models and simulations. These developments, in turn, have 
translated to increased knowledge and understanding of 
everything from bioenergy and climate change to Alzheimer's 
disease and health models.
    Today, ASCR hosts some of the world's most powerful 
supercomputers and a high-speed network that moves enormous 
volumes of scientific data at light speed. In the rapidly 
evolving fields of quantum computing and artificial 
intelligence, ASCR is dedicated to maintaining U.S. 
competitiveness and leadership. The program also supports DOE's 
goal of completing the world's first exascale computing system 
this year and a second system within the next year.
    As our competitors race to develop exascale systems of 
their own, DOE's strong support of advanced computing research 
within ASCR is essential to maintaining U.S. leadership in this 
field. And it's more than just hardware that needs additional 
focus.
    We need significant modifications to today's tools and 
techniques to deliver on the promise of high-performance 
computing. Researchers are in need of a new suite of software 
tools, programming models, and applications to enable effective 
use of exascale systems. Without software and application R&D, 
we will simply have high-powered machines collecting dust.
    Additionally, in order to fully and effectively support 
innovation in next- generation science, DOE must also encourage 
cross-cutting research initiatives within the Department and 
with other Federal agencies. Within the Office of Science 
alone, ASCR resources and capabilities can be used to drive 
innovation in computational chemistry and nanomaterials for 
energy applications, improve simulations of fusion energy 
reactors, and enhance our ability to predict changes in the 
global climate with next generation Earth System Models.
    Other federal agencies could also capitalize on these 
unique, world-leading resources. As authorized by the Energy 
Act of 2020, the Department of Veterans Affairs is partnering 
with DOE to use high performance computing in analyzing massive 
amounts of health data. This data analysis will help the VA 
better understand diseases and improve veterans' overall 
quality of life.
    We should seek to build upon and expand partnerships like 
this so that the entire federal government benefits from ASCR's 
tools and technologies. At the end of the day, we are all 
supporting one thing: U.S. leadership in science, technology, 
and innovation. There is no federal entity in a better position 
to lead this charge than DOE's Office of Science.
    That is why I am pleased we are very close to finalizing 
legislation that provides strong support and long-term guidance 
for the Office of Science. We are making sure rubber meets the 
road, and that the U.S. research enterprise is equipped with 
all available resources to successfully overcome the 
generational challenges they face.
    I want to again thank my colleagues for their bipartisan 
outreach and collaboration. And I want to thank the witnesses 
for offering their input on our efforts. Thank you, Mr. 
Chairman, and I yield back the balance of my time.

     Chairman Bowman. Thank you so much, Mr. Weber.
     The Chair now recognizes the Chairwoman of the Full 
Committee, Ms. Johnson, for an opening statement.
     Chairwoman Johnson. Thank you very much, Mr. Bowman, for 
holding this hearing, and Ranking Member Weber. And I also want 
to thank our witnesses for your participation, and I have 
enjoyed reading your thoughtful written testimony.
     The Department of Energy has long been a leader in 
advancing new energy technologies, as well as the fundamental 
and the foundational sciences of physics, chemistry, 
engineering and math and computational science that support 
energy innovation. High-performance computing, or 
supercomputing, is one area the Department has led for decades, 
and DOE shows no signs of slowing down. The Department 
currently stewards two of the top three fastest supercomputers 
in the world. And as we will learn more about from our 
witnesses here today, the United States is on track to finish 
building the first exascale computer in the world this year. 
These systems serve as critical resources for academic and 
industrial users and are a key component of our economic 
competitiveness, scientific leadership, and national security.
     In the past, high-performance computers were needed almost 
solely for specialized scientific and engineering applications. 
Now, as we enter the world where thousands of devices all 
around us are generating millions of bytes of data every 
minute, high-performance computers can be used to fundamentally 
improve our quality of life. Public policies play a critical 
role in supporting the advancement of these capabilities and 
enabling our society and economy to directly benefit from them. 
Additional Federal investments in high-performance computing 
will enable the development of new industries, grow our 
technology economy, and advance our technological leadership 
internationally.
     All that said, as we continue to support the development 
and use of these breakthrough technologies, we almost--we must 
also do everything we can to ensure that we are doing this in a 
responsible and ethical manner even in the face of competition 
from our adversaries.
     I thank you again for being here, and I look forward to 
this important discussion today, and I yield back.
     [The prepared statement of Chairwoman Johnson follows:]

    Thank you, Chairman Bowman, for holding this hearing, and I 
also want to thank this excellent panel of witnesses for your 
participation and thoughtful written testimony.
    The Department of Energy has long been a leader in 
advancing new energy technologies, as well as the foundational 
sciences of physics, chemistry, engineering, mathematics, and 
computational science that support energy innovation.
    High performance computing, or supercomputing, is one area 
the Department has led in for decades, and DOE shows no signs 
of slowing down. The Department currently stewards two of the 
top three fastest supercomputers in the world. And as we will 
learn more about from our witnesses here today, the United 
States is on track to finish building the first exascale 
computer in the world this year.
    These systems serve as critical resources for academic and 
industrial users, and are a key component of our economic 
competitiveness, scientific leadership, and national security.
    In the past, high performance computers were needed almost 
solely for specialized scientific and engineering applications. 
Now, as we enter a world where thousands of devices all around 
us are generating millions of bytes of data every minute, high 
performance computing can be used to fundamentally improve our 
quality of life.
    Public policies play a critical role in supporting the 
advancement of these capabilities, and in enabling our society 
and economy to directly benefit from them. Additional federal 
investments in high performance computing will enable the 
development of new industries, grow our technology economy, and 
advance our technological leadership internationally.
    All that said, as we continue to support the development 
and use of these breakthrough technologies, we must also do 
everything we can to ensure that we are doing this in a 
responsible and ethical manner. Even in the face of competition 
from our adversaries.
    Thank you all again for being here, and I look forward to 
this important discussion today. With that I yield back.

     Chairman Bowman. Thank you so much for your opening 
statement, Madam Chairwoman.
     The Chair now recognizes the Ranking Member of the Full 
Committee, Mr. Lucas, for an opening statement.
     Mr. Lucas. Thank you, Chairman Bowman, for hosting this 
hearing, and thank you to all our witnesses for being with us 
this afternoon.
     Earlier this month, the Energy Subcommittee held a hearing 
on the Department of Energy's Office of Science, which 
emphasized the essential role of DOE in our Federal research 
enterprise and highlighted our shared support of these 
programs.
     Today, we have an opportunity to examine the activities of 
another Office of Science program in Advanced Scientific 
Computing Research, or ASCR. Advanced computing research and 
infrastructure is the backbone of scientific discovery, not 
just at the Department of Energy but at U.S. research 
institutions nationwide. Through the ASCR program, DOE supports 
the development of tools and technologies in high-performance 
computing, applied mathematics, advanced networks, data 
analytics, and next-generation computing initiatives. It also 
hosts some of the most advanced computing resources in the 
world at its national laboratories.
     There is a great potential for Federal agencies and U.S. 
industry partners to leverage ASCR's unique computing 
resources. With adequate support, DOE's program will 
revolutionize our relationship with advanced technology and our 
capacity for scientific progress. This work is vital to our 
clean energy economy, our national security, and our leadership 
in science and technology.
     Yet we know that our international competitors like China 
are outpacing us in basic research investment and are closing 
the gap in key computing focus areas like artificial 
intelligence and quantum sciences. Expanding our capacities in 
these fields requires a strategic effort with strong Federal 
investment and active public-private partnerships.
     That's why in this Congress I've introduced legislation to 
address those challenges. My bill, the Securing American 
Leadership in Science and Technology Act, SALSTA, roughly 
doubles funding for ASCR over the next 10 years.
     Another bill I introduced, the Quantum User Expansion for 
Science and Technology Act, QUEST, establishes a program at the 
Department of Energy to expand public-private partnerships for 
quantum resource use and encourage greater participation in the 
development of quantum information sciences (QIS).
     Mr. Chairman, at this time, I'd like to ask unanimous 
consent to submit for the record a letter from the Quantum 
Industry Coalition on the need to maximize the value of the 
U.S. quantum industry and the role that DOE and its national 
laboratories can play in this high-priority work.
     Chairman Bowman. Without objection.
     Mr. Lucas. Thank you, Mr. Chairman.
     I'm also proud to join my colleague and the Ranking Member 
of the Investigations and Oversight Subcommittee, Jay 
Obernolte, on a bill to strengthen the other high-priority 
computing research program carried out at the Department. This 
week, Representative Obernolte introduced the Next Generation 
Computing Research and Development Act, which authorizes 
various DOE advanced scientific computer programs. These will 
support beyond excellent energy computing, computing workforce 
development, and applied mathematics and software development 
activities. This bill, along with the QUEST Act and SALSTA, is 
an important step to move forward in improving our Nation's 
global standing in science and technology.
     We know that maintaining U.S. leadership will require a 
shared commitment to prioritize DOE and its Office of Science, 
and nowhere is this clearer than in the advanced computing 
space. The United States relies on computing capacities that 
only the Department of Energy can provide. We know that the 
Nation who takes the lead in advanced computing will set the 
stage for the next generation of technologies and technology 
standards. We cannot afford to fall behind in this race.
     Last week, I was encouraged by the progress made by our 
friends in the Senate to recognize the important role the 
Department of Energy plays in advancing U.S. innovation. But 
DOE and the national labs shouldn't be an afterthought when we 
consider the U.S. research enterprise. They're integral to our 
scientific progress. That's why Chairman--woman Johnson and I 
have been working on bipartisan Office of Science legislation 
that will make a strong commitment to the Department of Energy 
and its work, including successful programs like ASCR.
     This legislation to support research at the Department of 
Energy will go hand-in-glove with the NSF (National Science 
Foundation) For the Future Act, which supports basic research, 
STEM education, and technology transfer at the National Science 
Foundation. Together, these research bills will solidify the 
long-term stability of our international leadership in science.
     I once again want to thank our witnesses for being here 
today. I look forward to a productive discussion. Thank you, 
Chairman Bowman, and I yield back the balance of my time.
     [The prepared statement of Mr. Lucas follows:]

    Thank you, Chairman Bowman for hosting this hearing, and 
thank you to all our witnesses for being with us this 
afternoon.
    Earlier this month, the Energy Subcommittee held a hearing 
on the Department of Energy's Office of Science which 
emphasized the essential role of DOE in our federal research 
enterprise and highlighted our shared support of its programs.
    Today, we have an opportunity to examine the activities of 
another Office of Science program in Advanced Scientific 
Computing Research, or ASCR. Advanced computing research and 
infrastructure is the backbone of scientific discovery, not 
just at the Department of Energy but at U.S. research 
institutions nationwide. Through the ASCR (``Oscar'') program, 
DOE supports the development of tools and technologies in high 
performance computing, applied mathematics, advanced networks, 
data analytics, and next-generation computing initiatives. It 
also hosts some of the most advanced computing resources in the 
world at its national laboratories.
    There is great potential for federal agencies and U.S. 
industry partners to leverage ASCR's unique computing 
resources. With adequate support, DOE's program will 
revolutionize our relationship with advanced technology and our 
capacity for scientific progress. This work is vital to our 
clean energy economy, our national security, and our leadership 
in science and technology.
    Yet we know that our international competitors like China 
are outpacing us in basic research investment and are closing 
the gap in key computing focus areas like artificial 
intelligence and quantum sciences. Expanding our capacities in 
these fields requires a strategic effort with strong federal 
investment and active public-private partnerships.
    That's why, this Congress, I've introduced legislation to 
address these challenges. My bill, the Securing American 
Leadership in Science and Technology Act (SALSTA), roughly 
doubles funding for ASCR over ten years. Another bill I 
introduced, the Quantum User Expansion for Science and 
Technology Act (QUEST) Act, establishes a program at the 
Department of Energy to expand public-private partnerships for 
quantum resource use and encourage greater participation in the 
development of quantum information sciences.
    Mr. Chairman, at this time I'd like to ask unanimous 
consent to submit for the record, a letter from the Quantum 
Industry Coalition, on the need to maximize the value of the
    U.S. quantum industry, and the role that DOE and its 
national laboratories can play in this high-priority work.
    I'm also proud to join my colleague and Ranking Member of 
the Investigations and Oversight Subcommittee, Jay Obernolte, 
on a bill to strengthen other high-priority computing research 
carried out by the Department.
    This week, Representative Obernolte introduced the Next 
Generation Computing Research and Development Act, which 
authorizes various DOE advanced scientific computing programs. 
These will support beyond-exascale and energy efficient 
computing, computing workforce development, and applied 
mathematics and software development activities. This bill, 
along with the QUEST Act and SALSTA, is an important step 
forward in improving our nation's global standing in science 
and technology.
    We know that maintaining U.S. leadership will require a 
shared commitment to prioritize DOE and its Office of Science. 
And nowhere is this clearer than in the advanced computing 
space.The U.S. relies on computing capabilities that only the 
Department of Energy can provide. We know that the nation who 
takes the lead in advanced computing will set the stage for the 
next generation of technologies and technology standards. We 
cannot afford to fall behind in this race.
    Last week, I was encouraged by the progress made by my 
friends in the Senate to recognize the important role the 
Department of Energy plays in advancing U.S. innovation. But 
DOE and the National Labs shouldn't be an afterthought when we 
consider the U.S. research enterprise. They're integral to our 
scientific progress. That's why Chairwoman Johnson and I have 
been working on bipartisan Office of Science legislation that 
will make a strong commitment to the Department of Energy and 
its work-including successful programs like ASCR.
    This legislation to support research at the Department of 
Energy will go hand-in-glove with the NSF For the Future Act, 
which supports basic research, STEM education, and technology 
transfer at the National Science Foundation. Together, these 
research bills will solidify the long-term stability of our 
international leadership in science.
    I once again want to thank our witnesses for being here 
today. I look forward to a productive discussion. Thank you 
Chairman Bowman and I yield back the balance of my time.

     Chairman Bowman. Thank you, Mr. Lucas, for your opening 
statement.
     If there are Members who wish to submit additional opening 
statements, your statements will be added to the record at this 
point.
     At this time, I would like to introduce our witnesses. 
First, Dr. J. Stephen Binkley is the Acting Director and 
Principal Deputy Director in the Office of Science at the U.S. 
Department of Energy. Prior to his experience in various 
leadership positions in DOE, Dr. Binkley has held senior 
positions at DOE's Sandia National Laboratories and the 
Department of Homeland Security. He has conducted research in 
theoretical chemistry, materials science, computer science, 
applied mathematics, and microelectronics.
     Next, Dr. Georgia Tourassi is the Director of the National 
Center for Computational Sciences at the Oak Ridge National 
Laboratory. She also holds appointments as an Adjunct Professor 
of Radiology at Duke University and as a Professor of the 
Bredesen Center Data Science Program at the University of 
Tennessee at Knoxville.
     Next, Dr. Karen Willcox is Director of the Oden Institute 
of Computational Engineering and Sciences, Associate Vice 
President for Research, and Professor of Aerospace Engineering 
and Engineering Mechanics at the University of Texas at Austin. 
She holds the W.A. ``Tex'' Moncrief, Jr. Chair in simulation-
based engineering and sciences and the Peter O'Donnell, Jr. 
Centennial Chair in Computing Systems.
     Dr. Christopher Monroe is Co-Founder and Chief Scientist 
at IonQ Inc. and the Gilhuly Family Distinguished Presidential 
Professor of Electrical and Computer Engineering and Physics at 
Duke University. He is an atomic physicist and quantum engineer 
specializing in the isolation of individual atoms as the core 
of a quantum computer.
     Last but certainly not least, Dr. Seny Kamara is an 
Associate Professor of Computer Science at Brown University 
where he codirects Brown's Computing for the People Project and 
the Encrypted Systems Lab. He is also affiliated with Brown's 
Center for Human Rights and Humanitarian Studies, the Data 
Science Initiative, and the Policy Lab. Kamara is a principal 
scientist at MongoDB, a company that provides one of the most 
widely used platforms to store and process data.
     Thank you all for joining us today. As our witnesses 
should know, you will each have 5 minutes for your spoken 
testimony. Your written testimony will be included in the 
record for the hearing. When you all have completed your spoken 
testimony, we will begin with questions. Each Member will have 
5 minutes to question the panel. We will start with Dr. 
Blinky--Blinkley, excuse me. Dr. Binkley, please begin.

              TESTIMONY OF DR. J. STEPHEN BINKLEY,

                   PRINCIPAL DEPUTY DIRECTOR,

         OFFICE OF SCIENCE AT THE DEPARTMENT OF ENERGY

     Dr. Binkley. OK. Thank you, Chairman Bowman and Ranking 
Member Weber. I'm pleased to come here before you today to 
discuss the scientific computing capabilities of the Department 
of Energy, including the forthcoming exascale systems.
     DOE computing traces its roots back to the Manhattan 
Project where extensive use was made of computers. During the 
1950's, John von Neumann, the pioneer in computing, advocated 
for a program that would advance computer development. Over the 
years, ever more powerful computing capabilities were developed 
at the national laboratories beginning with the Lawrence 
Livermore and Los Alamos National Laboratories.
     DOE and its predecessor agencies have supported applied 
mathematics and computer science, along with major investments 
in computer hardware and computational science that have been a 
major driver of progress in high-performance computing, 
spurring the U.S. computing industry forward. DOE computing 
applications have expanded from their original national defense 
focus to a broad portfolio of scientific research and 
significant use by industry beginning with the establishment of 
the leadership computing facilities at Argonne and Oak Ridge 
National Laboratories in 2004.
     Today, DOE computing is a partnership between the National 
Nuclear Security Administration (NNSA) and the Department's 
Office of Science. Our two organizations are working hand-in-
hand to advance high-performance computing, including the 
exascale computing project.
     The strategic importance of high-performance computing has 
grown enormously. High-performance computing has become an 
essential pillar not just of America's national security but 
also of our leadership in science. DOE's supercomputing has 
brought major computational and driven advances in a wide range 
of fields such as climate science, fusion energy, and high 
energy and nuclear physics, materials science, chemistry, 
particle accelerator design, and biology, to name a few.
     Over the last 6 years, we have been very focused on 
achieving exascale computing. The first exascale computing 
system is scheduled for delivery at the Oak Ridge National 
Laboratory to be complete by October of this year. The second 
system will go to Argonne National Laboratory in 2022, and a 
third system to Lawrence Livermore National Laboratory in 2023.
     Exascale has the capability to deepen our understanding of 
climate change and hasten the development of clean energy. 
Partnerships between the Office of Science and NNSA with major 
computing and microelectronic vendors have been key in the 
development of exascale. A series of five partnership programs 
have brought DOE-supported researchers to work hand-in-glove 
with U.S. high-performance computing vendors, including AMD 
(Advanced Micro Devices), Cray, IBM (International Business 
Machines), Intel, Nvidia, and HPE (Hewlett Packard Enterprise) 
to overcome the key technical hurdles in exascale. In total, 
DOE has invested $460 million in this effort alone, matched by 
at least an additional $307 million contributed by industry.
     Current and planned upgrades to Office of Science 
scientific user facilities, including light sources, neutron 
scattering sources, nanoscale, and genomic facilities will 
bring more sophisticated and precise observations and vastly 
larger data outputs. Artificial intelligence and machine 
learning will play a key role in this.
     AI also holds the promise of more sophisticated and 
autonomous facility operations. It has the potential to monitor 
observations and adjust instrument operations in real time to 
further enhance efficiency and utilization of the facilities. 
DOE's ESnet provides ultrahigh broadband connectivity across 
the DOE laboratories as connectivity will be increasingly vital 
as facility operations are controlled computationally.
     We are looking forward and beyond exascale to new 
frontiers such as quantum information science. Leadership in 
science remains indispensable to our high-performance--to the 
country's prosperity, and high-performance computing is key. 
Continued stewardship and development of the skilled HPC 
workforce is essential. Our Computational Science Graduate 
Fellowship program is one such activity. Since its 
establishment in 1991, the program has sponsored over 450 
fellows from more than 60 universities. DOE's response to the 
COVID-19 pandemic demonstrates the enormous value of DOE's 
high-performance computational research resources.
     In summary, opportunities for accelerated scientific 
discovery will be enabled by current--the current era of high-
performance computing marked by the advent of exascale systems 
and the rapid development of AI and machine learning.
     And I'll end there.
     [The prepared statement of Dr. Binkley follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
     Chairman Bowman. Thank you so much, Dr. Binkley.
     Dr. Tourassi, you're now recognized.

               TESTIMONY OF DR. GEORGIA TOURASSI,

                   DIRECTOR, NATIONAL CENTER

                   FOR COMPUTATIONAL SCIENCES

                AT OAK RIDGE NATIONAL LABORATORY

     Dr. Tourassi. Chairman Bowman, Ranking Member Weber, 
Chairwoman Johnson, Ranking Member Lucas, and distinguished 
Members of the Committee, thank you for the opportunity to 
appear before you today. My name is Georgia Tourassi. I lead 
the Department of Energy's Oak Ridge Leadership Computing 
Facility, OLCF, at the Oak Ridge National Laboratory in 
Tennessee. I'm a biomedical engineer and a computational 
scientist by education and training.
     High-performance computing has been the cornerstone for 
the Nation's scientific advancement, technology innovation, 
competitive advantage, and economic prosperity. Its impact on 
global competitiveness has long been embodied by the saying, 
``You must out-compute to outcompete.'' With advances in data 
technologies, machine learning, and AI, this saying can be 
amended to, ``You must learn faster to outcompete.''
     OLCF has been a global leader in high-performance 
computing for nearly 30 years. Currently, OLCF hosts Summit, 
the Nation's most powerful supercomputer for open science. 
Summit is in high demand for modeling and simulation, data 
analytics, and AI to better understand climate change, develop 
new ways to produce clean energy, design advanced materials, 
advance public health, and overall push the frontiers of 
science. The request for time on Summit is up to five times 
more than the hours available.
     Our facility is both deliberate and responsive to national 
needs. In the past year, I have experienced firsthand our 
staff's Herculean efforts to be both fast and offer world-
leading computing resources and computational and data 
expertise in the fight against the coronavirus. Through the 
COVID-19 High-Performance Computing Consortium, Summit and our 
competent staff, using world-leading AI, helped accelerate 
discovery, understand the virus, and inform management of the 
pandemic response. I would like to thank Congress for the CARES 
Act funds OLCF received to augment Summit and help support the 
COVID-19 research community.
     Now, in 2021, OLCF is at the brink of delivering the first 
exascale computer in the United States called Frontier. This 
supercomputer will perform calculations up to eight times 
faster than Summit and will keep the United States at the 
forefront on high-performance computing. To prepare, the DOE's 
Exascale Computing Project is developing critical applications 
across many scientific and technical disciplines to run on 
Frontier on day 1.
     In addition, exascale will offer training opportunities to 
grow a more high-tech and computationally savvy workforce in 
our Nation. It is imperative for the United States to expand 
and enhance the national research computing ecosystem.
     DOE has asked us to deliver Frontier 1 year earlier than 
planned, and we're focusing our efforts on meeting that 
schedule. Once the system is delivered, we will need to 
properly fund the operation and applications to solve complex 
real-world problems in partnership with leading research 
institutions, industry, and other Federal agencies.
     We need to continuously invest in new technologies such as 
AI and accelerated computing methods to maintain our 
competitive advantage and ensure our global leadership. We need 
to make investments in a national data infrastructure that 
makes the most of our high-performance computing and national 
data assets. The COVID-19 pandemic demonstrated the vital 
importance of having established interagency programs and data 
integration ahead of these anticipated crises and the utility 
of high-performance computing and AI for rapid, complex, real-
world data analysis.
     The DOE leadership computing facilities are uniquely 
positioned to support and integrate its research 
infrastructure, combining our leadership computing with 
national experimental facilities and Federal data assets to 
deliver unprecedented technological, scientific, and economic 
advantages to the Nation.
     We know high-performance computing is high on other 
nations' priorities. We know that China, Japan, and the 
European Union are all investing heavily in exascale computing, 
and this has implications for both our national security and 
our overall global competitiveness. We cannot afford to be left 
behind.
     Thank you again for the opportunity to testify. I welcome 
your questions on this important topic.
     [The prepared statement of Dr. Tourassi follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
     Chairman Bowman. Thank you so much, Dr. Tourassi.
     Dr. Willcox, you are now recognized.

                TESTIMONY OF DR. KAREN WILLCOX,

                  DIRECTOR, ODEN INSTITUTE FOR

             COMPUTATIONAL ENGINEERING AND SCIENCES

              AT THE UNIVERSITY OF TEXAS AT AUSTIN

     Dr. Willcox. Thank you, Chair Bowman, Ranking Member 
Weber, Chair Johnson, Ranking Member Lucas, and Members of the 
Subcommittee.
     Today, I have three main points. First, the future of 
scientific computing must be interdisciplinary. Second, the DOE 
ecosystem that supports mission-driven basic research in 
scientific computing is a national scientific treasure. And 
third, the future of scientific computing hinges critically on 
the availability of a highly skilled workforce passionate about 
addressing the Nation's challenges in science, security, and 
sustainability.
     So first, on the interdisciplinary future of scientific 
computing, the pace at which scientific computing can 
accelerate discovery and innovation will be limited by the rate 
at which we address foundational challenges that currently 
limit the complexity, scale, and trustworthiness of 
computational tools. This requires scientific computing 
research that draws on many fields, including computer science, 
computational science, the mathematical sciences, the domain 
sciences, and engineering.
     Particularly important is the role of the field of 
computational science. Computational science differs from 
computer science because at its core, computational science 
involves developing mathematical models and simulations rooted 
in physical and mechanistic principles.
     As we look to the future of scientific computing, the 
boundaries between computational science and computer science 
are becoming increasingly blurred. The future of scientific 
computing will involve new approaches that span the two fields 
such as AI and machine learning, and indeed the DOE has been at 
the forefront of defining notions such as AI for science and 
scientific machine learning.
     However, when it comes to AI approaches in science and 
engineering, we must be careful not to chart our course based 
entirely on the successes of data science and machine learning 
in vastly different domains such as social media and online 
retail. We must instead recognize that energy, environmental, 
and nuclear challenges by their very nature require predictions 
that go well beyond the available data. There's a critical need 
to quantify uncertainty and to make informed decisions that 
account for risk. The future of scientific computing will only 
address these needs through a balanced investment in the 
foundational mathematical sciences and in computational 
science, along with data science and computer science. And we 
must also not underestimate the criticality of continuing to 
invest in experimental research and development since advancing 
discoveries through computational models really requires 
validation.
     That brings me to my second point on the value of the 
DOE's mission-driven basic research ecosystem and its role in 
addressing these challenges. DOE supports for basic research at 
the national labs and at the Nation's universities has fostered 
interdisciplinary computing research in a way that community-
driven basic research has struggled to achieve. And as one 
example, I highlight the Mathematical Multifaceted Integrated 
Capabilities Centers, or MMICCs, of the DOE applied math 
program. These centers focus on applied math basic research but 
strongly driven by application needs. For example, our AEOLUS 
(Advances in Experimental Design, Optimization and Learning for 
Uncertain Complex Systems) MMICC is addressing the basic 
mathematical research needs for advanced materials and additive 
manufacturing.
     The MMICC program has been transformational in how it has 
shaped my own basic research portfolio, and one of the critical 
elements, first, the size of the center is large enough to 
bring together a diverse team that includes mathematicians, 
computer scientists, computational scientists, engineers, and 
domain experts spanning universities and national labs. This in 
turn enables a much-needed holistic approach for a complex 
system.
     Second, the long funding horizon provides the stability to 
invest in challenging, high-payoff basic research ideas.
     And third, the mission-driven nature challenges my 
mathematical research to target problems that are of high 
relevance to practitioners if the focus on basic research 
permits us to lay long-lasting foundations.
     My final point is that achieving this future vision for 
scientific computing hinges critically on the availability of a 
highly skilled workforce. The challenges in front of us are 
twofold. First is training the workforce with the 
interdisciplinary skills that cut across the mathematical 
sciences, computing, and domain sciences, and second is 
ensuring a strong, diverse pipeline of highly trained 
professionals who remain committed to scientific and 
engineering domains rather than being lured away by more 
lucrative positions in commercial and business sectors. A 
critical part of this training is the immersive research 
experiences enabled by basic research grants such as the MMICC 
program I described earlier. Maintaining a strong investment in 
DOE basic research funding for universities while also 
continuing to support the collaborative and academic alliance 
programs at the national labs is absolutely critical to 
addressing the Nation's future workforce needs.
     Thank you, and I look forward to your questions.
     [The prepared statement of Dr. Willcox follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
     Chairman Bowman. Thank you so much, Dr. Willcox.
     Dr. Monroe, you are now recognized.

              TESTIMONY OF DR. CHRISTOPHER MONROE,

           CO-FOUNDER AND CHIEF SCIENTIST, IONQ, INC.

     Dr. Monroe. Good morning. Thank you, Mr. Chairman, Members 
of the Subcommittee, for this opportunity to testify before you 
today. I'm here on behalf of IonQ, a company that builds 
quantum computers. IonQ is headquartered in College Park, 
Maryland, and was spun out of the University of Maryland and 
Duke University about 5 years ago. I'm also a Professor of 
Electrical and Computer Engineering and Physics at Duke 
University. I have over 2 decades of experience in the field of 
quantum computing technology from both academic and industrial 
perspectives, and I'm here to talk about the future of 
computing in terms of quantum information.
     Quantum computers, as you may have heard, are--they're as 
revolutionary as they are challenging to grasp and build. Their 
might, given these challenges, demand special attention. As you 
know, the 2018 National Quantum Initiative, or NQI, was 
initiated by the House Committee on Science, Space, and 
Technology to ensure that the United States remains at the 
forefront of this technology. The NQI endowed the Department of 
Energy, the National Science Foundation, and the National 
Institute of Standards and Technology (NIST) with coordination 
from the Departments of Defense (DOD) and the intelligence 
community to stimulate foundational research in quantum 
computing and other quantum technologies and translate this 
technology from laboratory to industry.
     So how does a quantum computer work? It's essentially not 
hard. It's just that quantum computers follow laws of physics 
that have no analogy in everyday life, so it's confounding. 
It's not exactly that it's hard. Information in quantum 
computers can exist in superposition; that is, multiple values 
can be stored and processed simultaneously in a single memory 
device. But each time you expand a quantum computer by just a 
single bit--we call them quantum bits or qubits--its power 
essentially doubles. So with just 300 quantum bits, that's a 
pretty small chunk of matter, a quantum computer can process 
more possibilities than there are atoms in the entire universe. 
This massive parallelism in quantum computers allows certain 
computations to be performed that could never be accomplished 
using regular computers.
     So a few far-reaching applications for this new mode of 
computing include optimization of complex problems dealing with 
huge amounts of data, including logistics and things like 
pattern recognition; secondly, molecular and materials design 
for energy, medical and defense applications; and finally, 
security, including secure communication, encryption, and 
decryption or code-breaking.
     IonQ has collaborative projects in all of these areas, and 
one thing that's interesting in this field is it's such an 
early stage of this technology that it's----
     Chairman Bowman. Mr. Monroe, time has run out. Can you 
just finish that last point?
     Dr. Monroe. Yes. I think somehow the clock never started, 
I noticed.
     Chairman Bowman. Yes, that's probably right. That's weird.
     Dr. Monroe. I think I was talking for about 2 minutes 
but----
     Chairman Bowman. Yes, let's put 2 minutes on the clock if 
we can.
     Dr. Monroe. OK. I'll go quickly. So it's critical that 
quantum computer builders co-design applications with the 
systems they build. It's really not physics anymore, but all of 
the physical and chemical sciences, all of the engineering 
fields, computer science, algorithm design, economics, and even 
social sciences. So it's no surprise that one of the most 
important applications in quantum computers is energy and that 
the Department of Energy is an important player in advancing 
this field.
     So IonQ machines and those built by others are still too 
small to beat regular computers in these types of problems, but 
we're just at the beginning of this commercial phase, and this 
situation will change very soon.
     The core of a quantum computer is exotic, and its key 
attribute is isolation. It involves devices either cooled to 
nearly absolute zero temperature--that's negative 460 degrees--
or in the case of our technology at IonQ, we use individual 
atoms suspended in a small vacuum chamber and poke with laser 
beams. I should also note that this technology was developed at 
the National Institute of Standards and Technology in the 
1990's where I worked with David Wineland in developing the 
first quantum logic gate.
     So, as exotic as this is, the core technology is not 
necessarily the main challenge. The real challenge as I see it 
is creating the workforce to understand how to deploy quantum 
systems. I like to say at the universities we typically don't 
build components for people to use, but industry so far has 
been a little bit slow to develop this technology because they 
don't have a basis in quantum. And so this is, I think, 
historically where government laboratories can play a role. At 
IonQ, our systems are now available on cloud servers, and at 
Duke University we're setting up a quantum--Duke Quantum Center 
that will be a scientific user facility that will serve the 
scientific use cases. And this is very important for the field.
     So I want to conclude. I think I--it was a lot faster than 
I thought it would be. But now is a critical time for DOE, NSF, 
NIST, DOD, and the intelligence community to redouble and 
coordinate their efforts in translating quantum computers to 
the real world. One example is the QUEST program that 
Congressman Lucas mentioned that would subsidize access to 
industrial quantum computers. Another is endowing the NSF with 
a technologically driven division mandate. This is a 
particularly good way to ensure that this emerging technology 
gets used in the field.
     I'm a member of many advisory boards in Europe and in 
Asia, and I'm well aware of the coordinated investments 
overseas. The United States must lead the race to build quantum 
computers and other quantum technologies, and I think that the 
programs and continued stewardship of the National Quantum 
Initiative by DOE and the other agencies I mentioned are 
critical to continued leadership.
     I again think the Committee and Chairman for his 
leadership and for the opportunity to testify today even though 
I took, I think, 4 minutes. Thank you.
     [The prepared statement of Dr. Monroe follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
     Chairman Bowman. Thank you, Dr. Monroe. Apologies about 
the issues with the clock.
     Dr. Kamara, you are now recognized.

                 TESTIMONY OF DR. SENY KAMARA,

             ASSOCIATE PROFESSOR, BROWN UNIVERSITY

     Dr. Kamara. Thank you. Chairman Bowman, Ranking Member 
Weber, Chairwoman Johnson, Ranking Member Lucas, and 
distinguished Members of the Committee, I appreciate the 
opportunity to testify at today's hearing on the future of 
scientific computing at the Department of Energy.
     By the end of the year, the Oak Ridge National Lab will 
receive the world's first exascale supercomputer. This computer 
will be able to process 10 to the 18 or one quintillion 
operations per second. It is hard to overstate how difficult 
this is to achieve and what an accomplishment it is. This 
considerable leap in computing power will open the doors to new 
discoveries and significantly impact a multitude of fields, 
including medicine, meteorology, cosmology, and artificial 
intelligence.
     It is clear that the world-class research and high-
performance computing that has been conducted by U.S. 
universities, national labs, and industry in order to achieve 
exascale computing will affect our lives for the foreseeable 
future. But as we enter the era of exascale computing, I would 
like to provide a word of caution. I'm sure we can all agree 
that computing and the technologies it enables have had a 
tremendous impact on society. Because of this, it is easy to 
assume that technological progress always leads to positive 
outcomes and that new technologies benefit everyone equally. 
But this is not the case. Technology, like policy, can have 
disparate impact. It can enable positive outcomes for some and 
cause great harms to others.
     Consider, for example, advances in facial recognition 
which allow us to log into our smartphones faster but also 
enables suspicion-less mass surveillance with the progress in 
computer vision and robotics that enables new drones that can 
deliver medicine to hard-to-reach rural areas or missiles at 
the push of a button by somebody sitting in a room thousands of 
miles away. We must always remind ourselves that technology is 
not inherently good and does not always benefit everyone 
equally by default. In fact, we need to think hard about the 
harms technology can cause and work even harder to mitigate 
those harms.
     One of the many important applications of exascale 
computing is artificial intelligence and machine learning, for 
example, to predict how a cancer patient might respond to a 
particular treatment. But as we know, thanks to the work of 
scholars like Cathy O'Neil, Joy Buolamwini, and Timnit Gebru 
and to outlets like Pro Publica, machine learning algorithms 
can be biased and can exhibit different behaviors on different 
populations. And as has been widely documented, these biases in 
machine learning most often harm people of color and those from 
marginalized communities.
     So while we should appreciate that thousands of world-
class scientists and engineers across the country are 
diligently working toward making exascale machine learning for 
cancer a reality, we also have to ask how many are working on 
ensuring that these cancer treatment prediction models work for 
people of all genders and of all races?
     The investments we are making in exascale computing will 
improve national security, the U.S. economy, and industry, but 
will everyone benefit equally from this investment? Will the 
13-year-old girl from Washington Heights, New York, benefit 
from this investment as much as the tech, energy, and 
pharmaceutical industries? Will there be as much effort to use 
these supercomputers in the fight against sickle-cell anemia as 
other diseases?
     Exascale computing is not only an incredible achievement 
but it's an incredible resource with the power to shape our 
lives and those of future generations. As such, we must be 
careful and thoughtful about how we make use of it. In 
particular, it is incumbent upon us to make sure that we deploy 
and use this resource in a manner that is fair and inclusive 
that benefits not only the powerful but those who have 
historically been marginalized by society and by technology.
     Thank you, and I look forward to answering your questions.
     [The prepared statement of Dr. Kamara follows:]
    [GRAPHICS NOT AVAILABLE IN TIFF FORMAT]
    
     Chairman Bowman. Thank you, Dr. Kamara.
     At this point, we will begin our first round of questions. 
The Chairman now recognizes himself for 5 minutes.
     Dr. Kamara, I'm going to start with you. Thank you so much 
for your testimony and for your attention to making sure that 
all people benefit equally from investments in scientific 
computing and other new technologies. And I understand that you 
are cofounding a new research institute at Brown University 
called Computing for the People. Can you talk more about why we 
should incorporate these kinds of questions into the R&D agenda 
from the very beginning? What can we do as Congress to design 
research programs that will prevent harmful applications down 
the road, keep the needs of marginalized communities and all 
people in mind, and prepare the computing workforce to engage 
with these issues as well?
     Dr. Kamara. Yes, so, as I said in my statement, it's clear 
that computer science and technology have had a huge impact, 
but the reality is that, as a field, we haven't really centered 
the problems of marginalized groups. It's just not something 
that comes naturally to the field. And there's many reasons for 
that. Some include the lack of diversity in computing, which is 
something that is well-documented. And so there's no natural 
way for--you know, for computer science research and technology 
research to really address those problems. And so this is why 
we're building this institute at Brown, and the motivation is 
to really make it a priority, right, understanding the problems 
that marginalized communities face and how technology can help 
and really focusing on that as our main motivation. So that's 
what we're doing.
     The way that Congress can help is both in funding this 
kind of research and in asking questions, right, making sure 
that the needs of all people are addressed by technology and by 
computing research, just as we're doing in this hearing today.
     Chairman Bowman. I was muted. Sorry about that. How much 
is a deep dive understanding and analysis of implicit bias in 
algorithms and computer science a part of this work that you're 
referring to?
     Dr. Kamara. Yes, it's crucial. So when we design--and 
it's--and I also want to highlight that it's not only at the 
level of research. It's also the level of education. So when we 
teach our students computer science and we teach them how to 
design algorithms, we're not teaching them how to think about 
bias and how to address it. We're not teaching them how to 
think about energy efficiency and how to design algorithms that 
are not only fast but also that minimize the amount of energy 
consumption. So these are all things that we want to do in our 
institute. And basically we want to integrate what we call 
responsible computing into computer science education and into 
computer science research as well.
     Chairman Bowman. OK.
     Dr. Kamara. Learning how to find biases in data, learning 
how to find biases in algorithms is a crucial part of that.
     Chairman Bowman. Thank you very much.
     Dr. Tourassi, thanks for your testimony as well. I was 
intrigued by your suggestion for investing in a national data 
infrastructure that could be housed at DOE labs and that could 
play an important role in democratizing AI. Can you say more 
about what this would look like?
     Dr. Tourassi. Absolutely. We know that the explosive 
growth of AI is based on the three pillars, the supercomputing, 
the algorithms, and data. We talk a lot about investments on 
the supercomputing side and algorithms, but data is the fuel 
that will make the engine--the airplane fly. And, as I pointed 
out in my written and oral testimony, the past few years have 
taught us the importance of having a data infrastructure that 
makes the most of our Federal data assets, examples with the 
partnership with the National Cancer Institute, as well as with 
the Veterans Administration.
     Building on that thread, because that infrastructure was 
in place and the interagency partnership was in place with the 
VA, we were able to pivot fast to address challenges with the 
COVID-19 pandemic because the infrastructure was in place to 
accept data, new data related to COVID-19 cases in the veteran 
population to start doing within 48 hours large-scale 
epidemiological studies and observational studies.
     So this is what I meant in my statement that this was an 
important lesson learned that we need to be proactive and to 
put all resources necessary to support that infrastructure. DOE 
has a long history of building and sustaining successfully data 
infrastructures and enable--enabling the broader scientific 
community to make the most of it.
     Chairman Bowman. Thank you so much, Dr. Tourassi.
     I now recognize Mr. Weber for 5 minutes.
     Mr. Weber. Thank you, Chairman. I appreciate that. And I'm 
going to go back to you, Dr. Tourassi. We're going to keep you 
on the hot seat for a little longer.
     First of all, I'd like to congratulate you on all your 
hard work at Oak Ridge as Director of its Leadership Computing 
Facility. I'm sure you just didn't wake up one day and decide 
you were going to get into something like that. You had 
probably been studying and working at that a long time. And I 
know I speak for all of us when I say I cannot wait to see 
Frontier in action this fall, so congratulations.
     You've noted, Doctor, that Frontier will on day one of its 
operation no less, be ready to run applications across two 
dozen scientific and technical disciplines. Can you take a 
moment for us not-so-technical people maybe to explain why, 
when it comes to the race to the world's fastest supercomputer 
that the hardware development can't be our only focus? Talk to 
us, please.
     Dr. Tourassi. Absolutely. As you said, the tool by itself 
is not the enabler. We need to see the scientific impact. And 
scientific impact is measured in many different ways. First of 
all, by the breadth of application domains that benefit by the 
tool, how much faster we can do scientific discovery across 
that breadth of applications, and also how efficiently we use 
the tool.
     So in partnership with the Exascale Computing Project, 
there is a portfolio of applications that have many critical 
mission areas, some of which I highlighted in my report. And we 
are working very closely with the exascale computing team to 
make sure that the necessary software tools will be in place to 
enable the scientific community that will need them, as I said, 
from day one. Some of these applications in climate change, in 
renewable energy, in advanced manufacturing, in biology and 
public health because we know that these are pressing 
application areas with great societal impact.
     Mr. Weber. Absolutely, thank you. How does the utility or 
usability of DOE's supercomputer like Summit, for example, 
we're talking about competitors. How does it compare to some of 
our competitors' supercomputers?
     Dr. Tourassi. So, clearly, these metrics are not monitored 
and publicly known, but we can say for sure for the United 
States and the leadership computing facilities not only we are 
oversubscribed, we operate in the high 90's efficiency. 
Anecdotally, some of our competitors are certainly behind, so 
they have the tool, not necessarily the most effective use of 
the tool. But, as I said, this is a race. We cannot just relax 
in that race. We need to keep moving forward and making sure 
that we are spearheading the specific domain.
     Mr. Weber. Well, thank you for that. I'm quite sure China 
has all the information on everybody's capabilities, so we know 
that.
     So we do want to keep up with our competitors. We'd love 
to have that information. And point out to us why it really 
matters. I know we want to be first in the race, but what other 
examples would you use other than just time and being first? 
Why does it matter that we get ahead of our competitors? Give 
us some examples that tie to our country. Can you do that?
     Dr. Tourassi. Well, certainly the issues of national 
security are extremely important, and we know that 
supercomputers have played a very important role in addressing 
applications in the national security space, and we expect that 
space to grow even further.
     As the other panelists and you all mentioned the issues of 
adversarial use of the technologies we develop, so we'd need to 
be very much aware that this is not only in our horizon, it is 
present.
     Mr. Weber. Well, perfect. That's kind of where I hoped you 
were going.
     I'm going to jump over to you, Mr. Binkley--or Dr. 
Binkley, I'm sorry. As I mentioned in my opening statement, I 
believe that in order to fully realize the potential of the 
world-leading computing capabilities stewarded by our national 
laboratories, the Office of Science must prioritize expanding 
ASCR research partnership with other Federal agencies. Very 
quickly, Department of Energy's position to conduct scientific 
computing research to resolve diverse challenges, how so in 
your opinion? I'm going to be over time a little bit.
     Dr. Binkley. So we have in fact over the last half a dozen 
years really put priority on growing the ASCR program. It was 
clearly in recognition of the need to develop and then deploy 
exascale. And, as has been elaborated on here in the last 5 or 
6 minutes, concurrent with the development of the exascale 
computer systems has been a very concerted effort to develop 
applications that are ready--that will be ready on the day the 
machine is first turned on.
     Mr. Weber. Yes.
     Dr. Binkley. The other thing that we've done is we have 
worked through the National Science and Technology Council 
processes to coordinate closely with other Federal agencies, 
including the National Science Foundation, NASA (National 
Aeronautics and Space Administration), NOAA (National Oceanic 
and Atmospheric Administration), et cetera, and have tried to 
encourage the other agencies to also make investments in the 
high-performance computing capabilities for their specific 
applications. And in the case of the National Institutes of 
Health, as Dr. Tourassi has pointed out, we--one of the 
exascale applications is focused on cancer problems stemming 
from the National Institutes of Health. And I'll stop there.
     Mr. Weber. All right. Well, thank you. I appreciate that. 
And then, Mr. Chairman, thank you for the indulgence.
     Staff. Ms. Bonamici is next.
     Ms. Bonamici. Thank you so much to Chair Bowman and 
Ranking Member Weber, and thank you to the witnesses.
     In the district I represent in northwest Oregon, 
researchers at Intel are developing the foundation for exascale 
computers, commercially viable quantum systems, and also 
partnering with the Department of Energy to advance other high-
performance computing technologies. And I know these efforts 
will help us transition to clean energy economy, better predict 
extreme weather events, strengthen preventative medicine, 
improve emergency response, and more.
     So, Dr. Binkley, it's sometimes challenging to 
conceptualize the benefits of the Department of Energy's work 
on scientific computing. I am the Co-Chair of the House Oceans 
Caucus, so I especially appreciated your specific example of 
the work Lawrence Berkeley National Laboratory (LBNL) and Los 
Alamos and the University of Bristol are doing to provide more 
accurate pictures of retreating ice sheets contributing to sea 
level rise. How will future exascale capabilities strengthen 
our understanding and response to the climate crisis, and how 
can Congress better support this work expanding SciDAC 
(Scientific Discovery through Advanced Computing) partnerships 
across the DOE?
     Dr. Binkley. So let me take your last question first. 
The--with the incoming Biden-Harris Administration, there has 
been a reorganization within the Department of Energy that 
brings the applied energy programs back into the Under 
Secretary for Science organization, and that gives us very 
close coupling with the applied energy programs. That is 
between the Office of Science and the applied energy programs. 
And I'm hopeful that that will set the stage where we can 
expand SciDAC to reach the other parts of the Department.
     Going back to the question about climate simulations, our 
Earth systems model, which has been in development now since 
about 5 or 6 years and is slated to be, you know, up and 
running on the exascale computer in the fall, we are 
systematically increasing the resolution of that model by use 
of the--you know, the power of the exascale computers to have 
higher resolution, more predictive models of climate effects 
and then, you know, we will be able to predict areas that are 
going to be problematic in the future. You know, that will 
become, I think, a fairly standard tool for predictive earth 
systems modeling. I'll stop there.
     Voice. You are muted.
     Ms. Bonamici. Sorry about that. Dr. Binkley, thank you. I 
do want to get in another question.
     Dr. Kamara, we know that technology is not developed or 
used in a vacuum. The growing body of evidence suggests that, 
left unchecked, digital tools can absorb and replicate systemic 
biases that are ingrained in the environment in which they are 
designed. For example, one hiring algorithm often referenced 
for its apparent biases identified high-performers as anyone 
named Derek who played lacrosse even though those features had 
no connection for the actual jobs which the firm was screening. 
Unfortunately, digital tools are opaque in their design and 
operation, and it's hard to hold it accountable. So what steps 
can Congress take to prevent harmful bias in the research, 
development, and commercialization of supercomputing 
technologies? And will advances in neuromorphic computing help 
to end/or minimize bias or simply make it harder to detect?
     Dr. Kamara. Yes, so there are a lot of ways that bias can 
creep into algorithms, and there's a lot of research going on 
on how to mitigate that, so a lot of people are thinking hard 
about that. And that's great, but what we also have to be 
careful of is that just because we can automate something 
doesn't mean that we should, right? So it's not just--it's not 
enough to just say, OK, well, let's design unbiased algorithms. 
We can make some effort, you know, toward that, but we also 
have to ask ourselves should we be using algorithms, you know, 
in these particular cases, right? So should we be using 
algorithms for--you know, for estimating the probability that 
someone is going to commit a crime? Is that the right use of 
this technology? OK. So there's a lot of different sort of 
aspects of this that we have to think about.
     And the way Congress can help is in funding more research 
on fair algorithms and also on providing some structure for 
auditing algorithms. That is also an important component of 
this is that we need to be able to say, OK, well, you know, if 
an algorithm is going to be deployed, right, if we are going to 
make the decision to deploy some kind of algorithm in a 
particular use case, then we really should have rules in place 
for how we're going to audit those algorithms, how we're going 
to make sure that they don't discriminate and that they are not 
biased.
     Ms. Bonamici. Great. And your thoughts on neuromorphic 
computing and what that means to bias and minimizing bias?
     Dr. Kamara. Well, that's a little bit outside of my scope 
of expertise, but I would definitely say that, you know, these 
different types of sort of--as we--as we diversify the kinds of 
algorithms that we're using, it's going to be harder and harder 
to detect bias, right? So we also have to really be really, 
really mindful of this. It's easy to think that, well, just 
because an algorithm is running or some kind of algorithm--
there's many different types of--that, you know, there's sort 
of this veneer of objectivity, but this really isn't the case, 
right, and so we have to be really vigilant against that.
     Ms. Bonamici. Thank you, Mr. Chairman. I yield back.
     Staff. Mr. Baird is next.
     Mr. Baird. Thank you, Mr. Chairman, and thank you, Ranking 
Member Weber, for holding this session. It's very interesting 
and exciting to me. Of course, my background is agriculture, 
and so I'm going to relate agriculture to this quantum 
computing.
     I recently introduced a bill, 2961, that really says that 
the Department of Energy Biological Innovation Opportunities 
Act, and that's making sure that DOE has sufficient 
infrastructure to be able to provide access to quantum 
computing to researchers and university people, as well as the 
industry. So I was trying to mention that because it'll be a 
part of Ranking Member Lucas's SALSTA bill and keeping us in 
the leadership around the world.
     But, Dr. Monroe, you noted in your written testimony that 
translating quantum computing to practical applications will 
create opportunities for the workforce. I would also suggest 
that it will provide students and researchers the ability to 
analyze large-scale, complex, practical situations. And an 
example of this is the National Institute of Food and 
Agriculture's Genome to Phenome Initiative. It's designed to 
look at plant materials and give plant researchers the ability 
to analyze those plants and look for the visual characteristics 
and tie that to the genome. And that all is designed to improve 
our ability to raise food and feed a hungry world in the 
future.
     So, Dr. Monroe, would you care to elaborate--you were 
talking about practical applications and the need for that. 
Would you care to elaborate on what quantum computing can do 
for us in agriculture and how you think that fits into the 
progress of our world?
     Dr. Monroe. Yes, thank you for that question. So, yes, 
sorry for my garbled statement. I think I had to kind of jump 
around a little bit on the hot seat there.
     So quantum computers appear to be good at solving generic 
optimization problems, that is, problems that have way too many 
inputs that we can't sample all of the configurations on a 
conventional computer. Those are called combinatorial 
optimization problems, and we take guesses with even high-
performance computers. And as I hinted, there are problems out 
there that we'll never be able to solve because the number of 
configurations is just way too big.
     And so you mentioned agriculture, but I would maybe 
broaden in it to pharmaceutical, energy, and, you know, gas and 
oil industry. The need to understand the structure of 
molecules, this is something computers are also very poor at 
because even a small molecule, if it has more than a few 
hundred electrons--and this is--you know, even a caffeine 
molecule has 100 electrons--we can't easily model that molecule 
to see how it interacts with others to form catalyzers, better 
fuels, to form better drugs even.
     But at the same time there's a logistics problem even in 
big pharma if you have 10,000 compounds, which combination of 
10 of them make a good drug for something. We may have models 
of that, but, once again, we can't optimize that.
     And on materials science side--and this is closely related 
to what you had mentioned in plant science and agriculture--can 
we develop new materials that harvest sunlight much more 
efficiently than known materials today? It's--this--for the 
same reason. We can't compute or simulate the behavior of these 
very fundamental things.
     So, you know, quantum computing is just starting, but 
that--those are the types of problems that are naturally 
attackable by a quantum computer. And I'll put it this way. 
Those problems will never be solved using conventional 
computers. If they are to be solved, they will demand a quantum 
computer. So it is a very researchy field now, but we're 
starting to launch into building devices. And the Department of 
Energy over the last half-dozen years has played a big role in 
their laboratories in trying to translate that research, basic 
research to product. And I think industry is starting to play a 
role as well, very big-named industries that you've heard of.
     Mr. Baird. You're exactly right. You know, this genome to 
phenome, they take these spray rigs that are 60 foot wide with 
cameras down into look at each and every plant, and there's no 
way in my background in the past that we could have had the 
ability to analyze that much data, so thank you for that.
     And I see I've got about 13 seconds left, and I have other 
questions I would like to ask the other witnesses, but I thank 
all of them for being here and appreciate this opportunity, and 
I yield back.
     Staff. Mr. McNerney is next.
     Mr. McNerney. Well, first of all, I want to thank the 
witnesses. It's great to hear your viewpoints and share the 
excitement about what's going on with these exascale computers.
     Dr. Monroe, you mentioned some of the potential 
applications, including pattern recognition in huge data sets, 
molecular and material design, and secure communications. The 
DOE and several other agencies are engaged in the National 
Quantum Initiative, which was authorized by Congress in 
December 2018. This initiative includes basic research 
activities and establishes large-scale Quantum Information 
Science Research Centers. How have these activities and the DOE 
research ecosystem in general helped strengthen the Nation's 
quantum computing industry?
     Dr. Monroe. Thanks for the question. Indeed, I think DOE 
has shown in force their establishment of five very highly 
scoped centers distributed throughout the country that each 
have a separate mission and aim in the general field of quantum 
information, not just quantum computing but quantum sensing, 
things like quantum simulation. And these--I think the DOE 
laboratories are a great place, think about this, because the 
DOE labs in a sense are that ideal combination of having 
seasoned engineers, device people, but also not being a 
stranger to the weird laws of quantum physics that underlie 
these devices. So things are just getting started.
     I--as a disclaimer, I am a key contributor to a DOE 
quantum center headquartered at Lawrence Berkeley National 
Laboratories in your State, and it's a very exciting 
collaboration that will--our consortium will build systems, and 
we're interested in sort of the system engineering of--when you 
build a big system, it's a--sort of breathing thing that's not 
just a sum of the individual parts. And with the DOE laboratory 
at LBNL and also Sandia National Laboratory, New Mexico is 
involved, we hope to really further the field.
     But I should also mention the National Science Foundation 
and NIST are big players. They have historically been at the 
field since the beginning in quantum, and they--you know, one 
beauty of our national system in funding science is that we 
have many agencies, all with different missions, and, you know, 
DOE obviously I think the centerpiece are the laboratories. 
NSF, they are more vertically organized in a certain sense. 
They can bring together physicists with computer scientists and 
everything in between to come to bear on this. So I think, you 
know, collaboration between those agencies is really what's 
going to keep the American in the lead in this field.
     Mr. McNerney. Excellent. Dr. Binkley, you first discussed 
the importance of strengthening the core research that feeds 
into technologies like quantum sensors and networks. Would you 
discuss that a little bit, please?
     Dr. Binkley. Yes, the--one of the major premises of the 
DOE activities in quantum information science is to be really 
focused on the fundamental science first and then follow that 
through with technology developments to get practical 
applications. And so, you know, there--if you look across the 
Office of Science portfolio of activities, you know, it's 
chemical and material sciences--well, know, those are systems 
that are governed by the laws of quantum mechanics. It's 
nuclear physics and high-energy physics governed by the laws of 
the standard model. And, essentially, quantum mechanics 
permeates through all of the physical sciences activities that 
are supported by the Office of Science. And, you know, that is, 
I think, a major focus of how we're organizing our QIS 
activities.
     Mr. McNerney. Well, thank you. Moving on, Dr. Monroe, in 
your written testimony you note the proposed QUEST program. How 
much annual funding is needed for this program? Do you think it 
would make sense to ramp up the program from a relatively 
smaller amount over the first few years?
     Dr. Monroe. I think there's some amount of sense there. I 
think quantum computing systems that are capable, that itself 
is ramping up right now. I think things started maybe 4, 5 
years ago, very small systems, and these are provided by 
companies you've heard of, some maybe you haven't like my 
company IonQ but also, you know, Google, IBM, other companies 
are putting these devices out there. They're very expensive 
and, you know, I speak from experience that industry, the bet 
that quantum computing will have a commercial payoff, it's--you 
know, it's going to be a long-term bet, and it's not easy for 
companies to play that risk, and I think this will allow the 
Department of Energy and the U.S. Government to help connect 
these systems to users and help subsidize the use of those 
machines.
     And I'll also say--and one thing that--the wildcard to me 
is that the killer application for quantum computing, I don't--
I'm not sure when it happens. I'm not--I can't predict what it 
will be, but I'm pretty sure that it will happen out of left 
field from somebody that thought of a problem that I don't know 
about or none of us makers know about, and getting them 
connected to the system will just hasten progress in the field 
because as soon as we hit that killer application, I think it's 
just going to explode.
     So, you know, your mentioning of ramping up the access 
program. I certainly think that's maybe not a bad play, but I 
think it needs to get going. We need more industrial 
involvement in this field, less risk. Thank you.
     Mr. McNerney. Well, I've run over my time, so I'm going to 
yield back at this point.
     Staff. Ranking Member Lucas is next.
     Mr. Lucas. Thank you. As I've made clear, I believe that 
ensuring strong support for the Department of Energy and its 
world-leading national lab system is essential for our global 
leadership in science and technology. And this is one of my 
highest priorities as Ranking Member of the Science Committee.
     So I ask my first question to Dr. Monroe and then I'd like 
the rest of the panel's comments, too. And bear in mind that 
part of our responsibility as Ranking Member and my colleague 
as Chairman of this Committee is not only to try and create the 
right policy here in the Science Committee but also we have to 
be able to persuade our colleagues in the body as a whole and 
ultimately the American taxpayers that we are on the right 
track.
     So I ask the following question in that vein. In your 
respective areas of expertise, what would it mean to U.S. 
leadership in advanced scientific computing if we fail to 
provide adequate support for DOE Office of Science? Start with 
you, Dr. Monroe, and whoever in the panel would care to touch 
on that.
     Dr. Monroe. Sure, I'll----
     Mr. Lucas. And this is a message from my colleagues who 
may not be spending time listening to this Committee hearing, 
which flabbergasts me, I acknowledge.
     Dr. Monroe. Sure, I'll answer very briefly. In my own 
field of quantum computing that, again, we're at the early 
stages of this field, and it is absolutely critical that 
industry eventually take it over. And I think the DOE labs in 
particular but also university laboratories supported by NSF 
and NIST under the NQI is essential that they are able to make 
the investments necessary to keep the United States ahead of 
the world in this field. We're ahead now. We have mighty 
industry just waiting in the wings to take it over. We have to 
translate it. It's absolutely critical given the coordinated 
investments from folks around the world.
     Mr. Lucas. Anyone else care to take a stab?
     Dr. Tourassi. So I could go next. Certainly if we----
     Mr. Lucas. Please.
     Dr. Tourassi [continuing]. Start with investments, 
essentially, we are going to stagnate scientific innovation. We 
will stop innovating not only across basic sciences but also 
across applied sciences. Since we are using quantum computing 
as an example, as Dr. Monroe said, this is the beginning of a 
promising and disruptive technology, but it will take more than 
a decade for that technology to become reality for all of us. 
What happens in between? And it is classical computing that 
will lead to those innovations of materials, new materials that 
are needed to advance quantum computing. So it is a relay, and 
we cannot just pause and wait for the next big thing to come 
out from another nation.
     Dr. Binkley. I could go next. There's another dimension to 
this that I think is really important to keep in mind, and that 
is that to be first in innovation, to be first in economic 
security, and so on, national security, you know, it's really a 
race for getting the best and brightest people into the United 
States to do new research in areas like quantum information 
science that have not been a topic of heavy investment before. 
You know, it's essentially--the best tools attract the best 
people, and that works in our advantage in an international 
context. And so having, you know, the best workforce ultimately 
is the--something that we need to strive for.
     Dr. Willcox. I echo Dr. Tourassi's comments and to say 
that under this the Office of Science would absolutely stagnate 
and in fact set back our ability to tackle the critical 
challenges in energy security and environment particularly, and 
the Nation, the world just can't afford that.
     Mr. Lucas. With that, I thank you. You make very 
compelling cases. My time is winding down. I would observe one 
other the thing. We are very sensitive in Congress these days 
about not only international competition but how we protect 
U.S. research from theft, at the same time encouraging 
transparency and a cooperative environment. Anyone in a few 
seconds who'd like to touch on that, I'm more than pleased to 
hear what you have to say. But that's the struggle we're also 
facing, how to give you the tools, create the environment, but 
at the same time preserve your good work and the work being 
supported by the American taxpayers.
     Dr. Binkley. Well, I would add, Congressman Lucas, that's 
something that we are really focused on. In fact, the meeting I 
was in just prior to this hearing was a National Science and 
Technology Council meeting focused on research security. And 
the--I can attest to the fact that a number of--in fact, all of 
the major science funding agencies of the Federal Government 
are really focused on this problem these days, and it's a very 
difficult problem. You know, we're trying to develop policies 
that will protect the U.S. interest. It's a very thorny issue. 
And in the Department of Energy, we've already--because we have 
such a large research establishment in our national labs, you 
know, we have over the last 3 or 4 years begun implementation 
of policies to help protect the results of research from being, 
you know, taken illicitly.
     Mr. Lucas. Thank you, Doctor, and thank you to the entire 
panel. My time is expired. I yield back, Mr. Chairman.
     Staff. Ms. Stevens is next.
     Ms. Stevens. Great, thank you so much. Congresswoman Haley 
Stevens from southeastern Michigan and a major fan of 
supercomputers and supercomputer technology and today's 
hearing. I couldn't be more grateful.
     In particular, I've had the privilege of working on a 
supercomputer program called the National Digital and 
Engineering Manufacturing Consortium working in partnership is 
a public-private partnership between Purdue, OSU (Ohio State 
University), and a handful of small businesses in the Great 
Lakes area funded through the Economic Development 
Administration.
     Also, I think I'm getting some feedback, Weber. That might 
be you. I think I'm getting Weber's chit chat, which I love 
hearing, but, you know, I want----
     Chairman Bowman. Mr. Weber, please mute your mic.
     Ms. Stevens. OK, great, thanks. Thank you, Mr. Chair. 
Thanks.
     So we get this NDEMC (National Digital Engineering & 
Manufacturing Consortium) program. We saw it with Jaco Plastics 
was able to develop a new product line resulting in 100 new 
jobs.
     And, Dr. Tourassi, first of all, it's always a privilege 
to have somebody from Oak Ridge at a science hearing, and we 
have a tremendous amount of respect for your capabilities and 
partnerships out of Oak Ridge and all that you have 
represented. And your hearing touched a little bit on this, but 
I was just wondering if you could shed some additional light--
and this could also be open to anybody, but I'm just feeling 
like Oak Ridge might have some insight into this with 
supercomputer technologies, public-private partnerships, and 
also abilities to lead to the creation of jobs either at small 
enterprises or large and what else we could be doing in 
Congress to advance these opportunities.
     Dr. Tourassi. So, as you know, building the supercomputers 
represent the strong partnership with our vendors, and from 
that point on, it's what kind of science we enable, right? 
These are the scientific innovations that will lead to 
technology transfer, and they will lead to also job growth 
opportunities.
     We talk a lot about artificial intelligence these days and 
about what artificial intelligence will do to our workforce, 
both positive and negative effects. And secondly, when I'm--I'm 
an optimist and a proponent of developing a computational 
savvy, aware, and ready workforce. And I see that all of the 
leadership computing facilities being pivotal in that space as 
well. It is not only how to use the computers, it is not only 
how to do the science, but how exactly we develop the workforce 
that can support the data infrastructure that we mentioned 
earlier. And that is a different flavor of data--of the 
workforce that is absolutely needed for our Nation. So I would 
encourage all of you to keep thinking along those lines as 
well. It takes a lot of effort to collect, curate, and manage 
data. These are technical expertise. And I know that we don't 
have enough in the Nation, so we should be building that 
workforce and create opportunities for smaller companies to 
play a role in that.
     Ms. Stevens. Yes, and I--you know, we could go on here, 
too, because part of why we had this NDEMC program, the 
National Digital Engineering Manufacturing Consortium, was 
particularly because of cost. And that was about 10 years ago 
now, so I don't know if any of our other panelists while I have 
about a minute left could shed some light on--in addition to 
Dr. Tourassi's really great statements around workforce and 
some of the barriers to entry because of workforce and human 
capital, but any barriers to entry that we need to be 
considering for small to midsize enterprises with supercomputer 
technology and costs in the year 2021? I don't know if Dr. 
Willcox or Dr. Monroe have any insights into that.
     Go ahead, Dr. Monroe.
     Dr. Willcox. Go ahead, Dr. Monroe.
     Dr. Monroe. OK, yes. Yes.
     Ms. Stevens. You unmuted first.
     Dr. Monroe. OK. Yes, thank you for the question. Indeed, 
so IonQ is a sub-100-person company notably about to go public 
in the next month, and of course we're still building machines 
and looking for use cases. And, as a small company, I think 
it--you know, while this hearing is predominantly about high-
performance conventional computing, as it turns out, I think 
was mentioned that you do need high-performance computers to 
not only optimize and run but use quantum computers. They have 
to be developed in parallel. And this is why actually we have--
we do have a relationship with Oak Ridge, a working 
relationship for--they can run small algorithms on our 
machines, and we in exchange can maybe use some of the advanced 
computing machinery located in Tennessee.
     So, indeed, it's important. These are capital-intensive 
purchases to--you know, to be able to have your own array of 
GPUs (graphics processing units) on your own site. We don't 
have to do that all the time, but it's very important in my own 
field that we're able to get access to very high-performance 
computing machines, and we do rely on DOE for some of the----
     Ms. Stevens. Right, you're relying on DOE and our labs and 
maybe universities. Well, I'm out of time, but this was great. 
Thank you, Mr. Chair, great hearing.
     Staff. Mr. Feenstra----
     Ms. Stevens. I yield back.
     Staff. Oh, forgive me. Forgive me, Ms. Stevens. Mr. 
Feenstra is next.
     Mr. Feenstra. Well, thank you. Thank you, Chairman Bowman 
and Ranking Member Weber. I want to thank you to the witnesses 
for their testimony and in sharing their extensive research and 
experience. It is truly outstanding, all the leaps that we're 
getting into on the cutting-edge of this technology.
     I want to direct this question to Dr. Tourassi, and thank 
you for being here. I mean, it's just a pleasure to see you. 
Accurate weather forecasting is incredibly important in my 
district not only for the farmers in Iowa as they predict the 
growing seasons and the harvest but also for covering severe 
weather events for the public. Dr. Tourassi, in your testimony 
you highlighted the use of Summit to achieve a milestone in 
global weather forecasting simulations. Can you elaborate on 
the milestone and how it will help in the future of weather 
forecasting and how the advent of exascale supercomputing will 
also be able to build on this?
     Dr. Tourassi. Absolutely. And this is something that Dr. 
Binkley addressed earlier on. Simulating modeling climate and 
weather forecasting is one of the most complicated problems 
because we deal with many different physical systems 
interacting with each other. So--and this is something very 
similar that happens with the biomedical and the biological 
space. If you try to understand only one dimension of the 
problem, you just miss the big picture. This is what high-
performance computing and supercomputing has done throughout 
the years because it enables modeling and simulation and 
weather forecasting by increasing the complexity of the 
components we add in the equation. We can take into account 
atmospheric patterns. We can take into account soil information 
or ocean information, and that's how these models, with the 
support of Summit and in the future with Frontier, will provide 
models that are more detailed, and they are more precise. Now, 
of course, it's an oxymoron in some ways to say we will do 
weather forecasting in a more precise way because the 
atmospheric system itself is a chaotic system. Still, though, 
we're decreasing the uncertainty according with which we do 
these predictions.
     The way the community has been moving--and again, I'm 
speaking a little bit at a high level because I'm a biomedical 
scientist--we know that when you're dealing with a problem that 
is very difficult to predict, effectively, you try to create a 
number of models and just throw them in a pile and see what 
they do. Think of it as an ensemble approach. And then we 
aggregate the results of these models.
     As the supercomputers are getting bigger and bigger, what 
are we able to do? First of all, run more complex models. 
Second, run more of them, therefore increasing the precision of 
our predictions. And this is what we are experiencing with 
Summit, and this is what we will see with Frontier even more.
     Mr. Feenstra. Wow, thank you so much for that answer. I've 
got one more question quickly. Dr. Binkley, as part of 
Argonne's Early Science Program, a team from Iowa State and 
Ames Lab is rewriting a software program called NWChem for the 
exascale era. The new program could provide an increase in the 
size of chemical systems that can develop new methods for 
converting biomass into biofuels. This could be groundbreaking. 
Can you talk about how exascale computers can be used for 
breakthroughs such as in the many different areas including 
biofuels?
     Dr. Binkley. Well, certainly, the example that you're 
citing, NWChem, which is one of the foremost quantum chemistry 
codes, it's been around for a number of years, and it was 
developed originally at the Pacific North--yeah, Pacific 
Northwest National Laboratory, and, you know, making an effort 
to bring that software set into operation on the exascale 
computers is a major step, and it's something that our Basic 
Energy Sciences Program has been supporting.
     There are a number of other codes that I think are going 
to have transformational effects. There--we--there are fusion 
simulation models that are under development for use on the 
exascale computers. There's also software for understanding 
seismic events. You know, there is--if you look at the list of 
the 24 exascale applications that are slated for first use, a 
number of them I think are poised to have breakthroughs.
     Mr. Feenstra. Well, thank you so much. And my time is up. 
Thanks, everyone, for being on the panel. This is great 
information. I yield back.
     Staff. Mr. Casten is next.
     Mr. Casten. Thank you so much. I really appreciate our 
witnesses being here, and I have some questions for you, Mr. 
Monroe. My district is just north of Argonne National Lab and 
just to the east of Fermi, which means I can't claim any of it 
but spent a lot of time at both. And of course that's where a 
lot of the really groundbreaking deployment of quantum 
computing is happening.
     I want to first just be a little bit nerdy with you. When 
you were listing some of the types of questions that you can 
uniquely answer with quantum computing, I think you mentioned 
optimization problems, encryption, decryption, some issues 
around protein chemistry. All of these sort of classes--sort of 
NP hard problems where the solution space is huge and you can 
just jam through a lot of data, as you look at sort of the 
opportunities, is it uniquely to those classes of problems that 
just you can churn through a lot more data fields a lot more 
quickly, or are there other classes of problems that you're 
also excited about?
     Dr. Monroe. Maybe I'll answer it in two ways. Thanks for 
the question. I'm always glad to put the nerd hat on. So the 
one disclaimer I'll say is that it's very hard to prove that a 
quantum computer can find the absolute optimal configuration of 
variables in some complex model. However, it could be a 
heuristic, meaning that it could do better than any 
conventional computer. You don't really need proof. All you 
need to do is show, well, it gave me a better answer. It gave 
me a shorter path----
     Mr. Casten. That's correct.
     Dr. Monroe [continuing]. Of somewhere out subject to 
certain constraints. You don't need to prove it. And that's 
sort of why quantum computers have to be deployed. You need to 
build them and deploy them as soon as possible and sort of see 
how well they perform.
     So the other way I might answer this question--this might 
be too universal--is that I think every application in a 
quantum computer can be cast in terms of an optimizer. 
Optimization is a very general thing. Even code-breaking 
application, which is actually factoring numbers into their 
primes, that's a very hard problem. It's easy to multiply, it's 
hard to do the inverse, to factor. You can cast that in terms 
of an optimization problem. It's optimizing the two numbers 
that actually multiply to give the original number, and that 
has, you know, revolutionary impacts on security.
     But one thing I want to conclude here with is you 
shouldn't think--I hesitate to describe quantum computers as 
big data machines. They don't process big data in a way that 
you can get access to all the big data. You can't compute 
massive--you can't model every molecule in the atmosphere and 
predict exactly what the weather is going to be. You need 
problems that sort of have a very small--a very simple answer 
at the end like there's a model of climate I have in the upper 
atmosphere. I don't know how to solve it. I don't know what 
conditions will validate this model. It's those problems that 
quantum computers can do. They take lots of--they sample lots 
of data, but you only get very--there's like a winnowing down 
to getting only a very small amount of information at the end, 
so it's a little bit subtle. And so high-performance computers 
are one-to-one machines. They can compute brute force function 
evaluation with all these inputs and all these outputs. Quantum 
computers do something a little differently.
     Mr. Casten. Fascinating. I'd love to follow up on that 
with more time than we have, but let me leave with a question 
to you but invite all of our witnesses to respond with your 
thoughts. The--when I've gone down and toured these facilities 
at our labs, I'm always struck by the fact that part of the 
excitement is these NP hard huge data set problems for all 
classes of supercomputing. And then the other conversation we 
always end up getting into is the ethical questions around what 
does it mean to have a computer that's capable of asking 
questions that our brains aren't capable of thinking of and 
what's the appropriate boundaries there? And I'd welcome any of 
your thoughts on what we should be doing as a legislative body 
to--you know, maybe we're doing enough already to put those 
ethical boundaries in place and really understand how do we 
make sure that--you know, that we don't--we don't hit whatever 
that moment was in Terminator where the computers are smarter 
than we are and we can't figure out why.
     Dr. Monroe. Well, I'll answer very briefly there and leave 
it to the other witnesses. This is a little bit passive, but I 
think the only--I think there's an impetus for us to get there 
no matter what because if we don't understand some 
revolutionary form of computing and we don't get there, others 
will. And so I think that that's some type of ethics. At least 
we're at the forefront of that new technology. I agree with you 
it can be vexing when you get a new technology, how to use it 
ethically and so forth, but, you know, it's very important to 
get there and not just ignore it and decide not to get there. 
And if we do that, others will.
     Mr. Casten. Well, thank you. I'm out of time but would 
welcome any thoughts that the rest of the panelists have in 
writing afterwards. Thank you, I yield back.
     Staff. Mr. Obernolte is next.
     Mr. Obernolte. Thank you very much, Mr. Chairman, and 
thank you to our witnesses for a fascinating hearing. Also 
thank you for allowing me to participate. I know I'm not a 
regular Member of the Subcommittee, but this is a subject that 
I find extremely interesting.
     I think it's important to recognize just how much this 
concept of beyond exascale computing has the potential to 
change computer science and its contribution to humanity in 
general. I mean, we're talking about computers that can 
actually perform more calculations per second than the human 
brain can, we can approach these artificial intelligence 
problems in ways that we just hadn't been able to in the past, 
so it's a very exciting thing to be part of, certainly 
something that we as a Federal Government need to be 
stimulating investment and research into.
     So a question for Dr. Binkley. When you were talking in 
your testimony about beyond exascale computing, you mentioned 
that quantum computing is probably going to be the lead in 
that. And my question to you is do you think that quantum 
computing is the only technology that's capable of beyond 
exascale power, or are there other technologies, traditional 
technologies that might also be capable of that?
     Dr. Binkley. Well, I don't think that quantum computing is 
going to lead to the elimination of the type of computing that 
can be achieved on current supercomputers, including exascale. 
I mean, as Chris Monroe has pointed out, quantum computers are 
very specialized in the type of calculations that they can do. 
Conventional computers, including exascale, use numerical 
methods and follow principles of sort of orderly input and 
orderly output. And so I think that what is going to come next 
after exascale, if you look at the typical exascale computer, 
each node in the computer has both a conventional CPU (central 
processing unit) chip and some type of a graphical processor. 
And it's the combination of those two that allows one to attain 
10 to the 18 floating-point operations per second.
     Experiments are beginning now to look at other types of 
computing elements that can be included into a supercomputer, 
and so other types of GPUs, one could imagine neuromorphic 
chips that could be incorporated. I think there's a lot of 
research that can be done in those areas, and I think that the 
lineage that has led us up to exascale still has more steps to 
go through as we go forward.
     Mr. Obernolte. Well, thank you, Dr. Binkley. As a computer 
scientist myself, I find this extremely interesting and really 
inspirational.
     Before I go, if I could just make everyone aware of a 
piece of legislation that I've introduced. It's H.R. 3284 that 
would direct the Department of Energy to establish a program 
for capabilities beyond exascale. And one thing that I think we 
need to do more talking about is the need to research energy-
efficient computing because if you look at the current 
generation of exaflop-capable supercomputers, one thing that 
strikes me is the pure amount of power that they consume. The 
El Capitan supercomputer that's being installed at Lawrence 
Livermore Laboratories I believe is going to have a power 
consumption on the order of 40 megawatts, so the amount of 
energy they consume and the heat that they consume is going to 
quickly become a barrier to our ability to employ these kinds 
of technologies. So as part of this bill I'm suggesting that we 
also establish a program to stimulate research into energy-
efficient computing.
     And then of course I know the testimony was mentioned in 
support for the Computational Science Graduate Fellowship 
Program. That's a program that I am very supportive of. I think 
that at the same time that we are stimulating research into 
these technologies, we also need to make sure that our academic 
population and our workforce is prepared to employ these 
technologies. It's going to be no good to make the tools if 
we're not also stimulating the kind of workforce that's going 
to be able to help us take that to the next level. And this 
bill would also do that. So I would certainly invite everyone's 
support and participation on that bill.
     And I'll yield back, Mr. Chair, but thank you very much 
for letting me be part of this discussion.
     Staff. Mr. Lamb is next.
     Mr. Lamb. Thank you, Mr. Chairman. And I apologize up 
front if I have any internet connection problems here. It 
hasn't been the best day for me on that.
     The first thing that I wanted to ask was for Mr. Monroe. 
There's a statement on the Department of Energy's website that 
says--that right now we're at the same point in quantum 
computing that scientists in the 1950's were with computers. Do 
you consider that an accurate statement as far as kind of 
placing in context where we might be in the development of 
quantum computing?
     Dr. Monroe. Yes, thanks for the question. It's an 
interesting comparison, and I buy into that with the caveat 
that we're sort of on this experiential curve on progress and 
so, you know, 10 years back in the 1950's is maybe like 2 years 
now, so, you know, all of us students of history, it's 
wonderful to see the progression from vacuum tubes to germanium 
transistors to silicon and then silicon we built silicon out. 
It took 20 or 30 years for that to happen. We don't expect that 
to be the same with quantum. I think we are on the cusp of 
several different technologies, superconducting circuits. At 
IonQ we work with individual atoms. We know how to scale these 
things. We have to put the, you know, engineering effort into 
it.
     So the comparison is a very good one because we are at 
very low levels now. We're not--we don't have high--we don't 
have Windows for quantum yet. You know, we're working with 
individual gates, very low-level stuff that they were doing in 
the 1950's with silicon transistors. But we also know that 
software exists for classic computers, conventional computers 
that we can also deploy and accelerate in quantum. So, yes, 
it's 1950's in quantum, but, you know, we're sort of moving 
much faster.
     Mr. Lamb. Good, thank you. And you put in your testimony 
that you thought the Endless Frontiers Act would be helpful to 
our efforts to go even further in quantum computing. Could you 
maybe just say a little bit more concretely why that is? 
Because I'm a supporter of that bill. I think it's incredibly 
important. I do think there is some hesitation for those 
familiar with the NSF that it could somehow maybe detract from 
the core basic science mission of the NSF, so would you be able 
to address that and assuage those concerns?
     Dr. Monroe. Yes, I--my understanding is there are many 
different approaches to re-tasking the NSF to have a technology 
edge to them. My understanding is also it's not a zero-sum 
game, that the NSF will still have in its core mission the 
ability to do research, blue-skies research, research for 
research sake.
     You know, I want to link that to technology, especially 
quantum technology and why NSF is very well-suited to play a 
big role in the development of the tech, not just the science, 
and that is the future of quantum computing, we need 
applications, and right now they're coming in the name of 
science, mainly in universities. At Duke, in the University of 
Maryland, we're deploying our laboratory systems to do models 
of black holes and wormholes believe it or not, things that 
maybe a company would never do. And it's those scientific 
applications that are happening right now. They're also 
happening at Department of Energy laboratories across the 
country.
     Companies won't pay for this. They're not going to build a 
device to do black holes. So having an agency like the NSF that 
has an eye on the blue-skies fundamental science research, 
building and using quantum computers, making a user facility, 
for instance, is a really good idea because it's going to tide 
us over until industry really controls the building of these 
devices. And then we can use those devices for new science just 
like we're using exascale and high-performance computers at Oak 
Ridge and other DOE labs for current science.
     Mr. Lamb. I agree. I agree. Thank you.
     I do want to just sneak in one more question. Dr. 
Tourassi, thank you for everything you're doing, very excited 
for the launch of Frontier this year. Is there in somewhat 
simple layman's terms a way to describe a problem that we 
cannot solve until we get Frontier? I understand that it'll 
probably solve other problems faster. Are there new actual 
types of problems that this platform will allow us to solve 
that we couldn't before?
     Dr. Tourassi. That's a good question. Actually, what we 
are hoping that the next machines will enable is that pure 
integration of modeling and simulation with large-scale AI, 
bringing together models and observational data that we can get 
from our different experimental facilities and from the 
different Federal agencies. That level of computing has not 
been--has not happened yet.
     It goes back to the example that was given earlier about 
the phenome-genome association studies, which actually are a 
very challenging and attractive problem for exascale computing 
when we're looking at large populations of, let's say, humans 
when we want to do the--at the population level phenome-genome 
associations, that is which will drive new treatments, 
specialized treatments, precision medicine.
     Mr. Lamb. Great, thank you. I'm out of time. Mr. Chairman, 
I yield back.
     Staff. Ms. Ross is next.
     Ms. Ross. Thank you, Mr. Chairman, and thank you for 
holding this hearing and to all the panelists for joining us 
today.
     The Committee Members have heard this before, but to the 
panelists, I represent the Research Triangle area of North 
Carolina. I represent Wake County. And the Research Triangle 
area has a growing ecosystem of hundreds of innovative and 
collaborative companies, including science and technology 
firms, government agencies, academic institutions, including 
Duke, startups and nonprofits. The IBM Quantum Hub at NC State 
University is a cross-disciplinary center of quantum computing 
education, research development, and implementation. The IBM 
Quantum Hub at NC State works with researchers from Duke and 
UNC (University of North Carolina) Chapel Hill to help partners 
develop quantum teams and explore promising use cases and 
promotion of quantum computing in real-world applications.
     And so, Dr. Monroe, congratulations on the quantum 
computing user facility at Duke. And I want to see how these 
partnerships work. But there's a real-world application that 
we've been dealing with these past couple weeks on 
cybersecurity and how we might be able to collaborate between 
research institutions and the private sector and use quantum 
computing to maybe prevent some of the kinds of things that 
we're seeing, particularly when we're dealing with 20th century 
technology that hasn't moved into the 21st century. And I don't 
know if the Committee has dealt with this earlier. I was in 
another Committee meeting. But I'd love to hear from you about 
how quantum computing might help us with cybersecurity.
     Dr. Monroe. OK. Thank you for the question, Congresswoman. 
So indeed the known killer application of quantum computers is 
code-breaking, cracking the most popular data encryption 
schemes we have. It looks like that application is one of the 
hardest out there. We need much bigger machines to do that. 
That said, we also are--this has been said before. We're at 
such an early stage in the game. We know we can shrink the 
problem, we can make it more efficient through software by 
being more clever on how we structure not only the instructions 
that we run the quantum computer but the quantum computer 
itself, how we control it. And this is why at Duke, for 
instance, we're collaborating quite--we're starting a very 
large collaboration with the colleagues you mentioned at NC 
State, and they're experts in computer architecture. We make 
machines, and, you know, this is sort of the marriage that we 
really like the direction of that.
     Now, making code-breaking problems easier is something 
that won't happen in isolation. It won't happen by me or, you 
know, it won't happen by somebody that doesn't have a machine, 
so I think they have to work in unison. And, you know, I've--at 
IonQ we've worked closely with IBM. IBM also writes software 
that supports our system, the expression of our particular 
hardware. And, you know, IBM I think has really, you know, 
taken a lead at educating the public on using systems. They 
were the first to put their system on the cloud, and we've sort 
of followed a few years later. So it's a wonderful ecosystem 
down here in North Carolina, but, you know, also at the company 
IonQ, which is a Duke-Maryland startup inside the beltway in 
Washington, they--they're very close to the National Security 
Agency and, you know, we--we're--we understand that community's 
needs in this field.
     So it's--I don't know what else to say. It's a very 
exciting time for the field. We'll take problems anywhere they 
come from, but security is a big one, and I think one--one last 
comment, what's not mentioned so much is the energy grid. 
That's a huge logistics problem, and I was horrified to find 
out that apparently I think the way energy is coordinated 
across the country, it's not--maybe it's not done very smartly. 
It's not my field, I shouldn't comment on that, but we do know 
that's one of the most vexing optimization problems we have, 
and the news in the last few weeks, especially here in North 
Carolina, is that, boy, it would be nice if we made that a 
little smarter in the future. I'm not sure exactly how quantum 
will play a role there, but if it's an optimization problem, we 
want to take a crack at it.
     Ms. Ross. Well, you anticipated my second question, and 
that is how open are, you know, some of our big energy 
companies to your help?
     Dr. Monroe. Well, you know, at IonQ, we are--you know, 
we're not a company that can afford to talk to 50 different 
companies in parallel, but we do have conversations with a few 
energy companies that have teams on the ground that are ready, 
they want to deploy quantum for their uses. These could be oil 
and gas or, you know, related to big pharma or, you know, 
developing new fuels. It is a researchy time in the field right 
now, but they're very open to working with companies like ours, 
like IBM, and so forth and going in the future. So in terms of 
the energy grid, that's definitely not my area of expertise, 
yes, so, thank you.
     Ms. Ross. Thank you, Mr. Chair. I've exceeded my time, and 
I yield back.
     Staff. Dr. Foster is next.
     Mr. Foster. Thank you, and thanks to all of our witnesses 
here.
     I--well, first off, for those of you that don't know me, 
I'm Congress's Ph.D. physicist and also the--Congress's AI 
programmer, although I only have really dabbled in TensorFlow 
recently.
     The question I have was a more general one about where you 
see this whole field is going of advanced computing. It seems 
to me that it's likely to fragment, that traditionally we've 
had a whole big line of, you know, megaflops, big floating 
point pipelines to--for local partial differential equations, 
and that will continue for all the usual reasons from, you 
know, weather maps to nuclear weapons and everything in 
between. And so that will continue perhaps with an overlay of 
AI to do the grid scale optimization on the fly but largely 
just, you know, a similar thing.
     And then if you look at what's happening in AI, the state-
of-the-art AI engines are not in the Federal realm any more. 
They are--you know, if you're just high-speed execution of GPT-
3 and other AI interesting algorithms, you know, the best 
engines are from places like Google for that. And I don't 
believe that the Federal Government is likely to be able to 
keep up with that, that industry will continue to pull ahead. 
In that case, the Federal investment will largely be in paying 
for university groups to have access to these very advanced AI 
engines, and that's probably as ambitious as we should be on 
that front.
     And then, you know, farther down the chain of, I guess, 
narrower calculation of widths, you have things like 
neuromorphic engines, which may end up as not floating-point or 
even low-precision integer but single-bit computing engines for 
neuromorphic thing--and so those will be experimental for a 
while and may or may not become, you know, really important.
     And so from a Federal point of view we have to decide 
where to put--you know, there's a bipartisan agreement that we 
should something like double the Federal budget, but we have to 
understand, you know, whether to write, you know, a big new--a 
check for a big new initiative wholly in National Science 
Foundation, whether we should sort of scale up all of the 
existing enterprises and then see what they're able to produce 
in a few years. And a big part of that decision is whether we 
think we have to--whether we're going to--what the mixture of 
small research and big facilities is going to be because that--
anyway, so I'd be--I guess, Steve, we can start with you and 
what you see as DOE's role in that.
     Dr. Binkley. Well, clearly, Bill, the--our programs are 
really intended to advance the state-of-the-art in scientific 
computing writ large, and so that essentially encompasses all 
of the technology paths that you just outlined. I mentioned 
earlier that the--sort of the next obvious move post-exascale 
is to look at heterogeneous computing systems where, in 
addition to GPUs, one could have neuromorphic chips and other 
computational technologies, and those types of machines would 
provide avenues to solve essentially new problems.
     But also I think continued investment in quantum computing 
is going to be necessary. I don't--I do see a separation 
between sort of the classical Turing-type computation and 
quantum computation, and so there may--you know, that may 
eventually divide--you know, there may be a parting of the 
roads there. But at least at this stage in time I think we need 
to really look at the diverse portfolio of technologies and not 
foreclose any options.
     Mr. Foster. Yes, well, are there areas where you sort of 
believe we'll not be able to compete with industry? You know, 
I'm thinking, as I mentioned, specifically of AI. Are there 
areas there where it's clear already that the government is not 
going to be leading?
     Dr. Binkley. Well, when it--so there are certainly AI 
applications that are going to be dominated by the high-tech 
companies, Googles and so on. The one area, though, that I 
don't see Google moving into is there is--with the advent of 
the exascale machines and the type of GPUs they have, they are 
actually the most powerful AI engines that exist. And the--you 
know, this very, very high end of AI is something that I think 
is going to still be the domain of the leadership computing 
class facilities. You may disagree, but, I mean, I think----
     Mr. Foster. No, I--you know, I'd be interested because 
there's a different narrative coming out of Silicon Valley----
     Dr. Binkley. Yes.
     Mr. Foster [continuing]. And a lot of worry that small 
startups in AI simply can't compete because they do not have 
access to Googles and so on, Amazon's big AI engines so----
     Dr. Binkley. Yes.
     Mr. Foster. Now, does anyone else have a--Georgia or----
     Dr. Tourassi. Yes, I would like to add a couple of data 
points to make the case of why the supercomputers and exascale 
computers will actually be the powerful AI engines, and I would 
like to draw these experiences from the exascale computing 
project and particularly the one that was mentioned earlier to 
support cancer. When we started that effort, the specific 
application we were focusing was not a traditional application 
that anybody would have thought about supercomputers, but it 
was a challenge application for us to start building frameworks 
to prepare for the scientific community.
     Well, in the process, we actually faced the philosophical 
debate, do you fit hardware to algorithms or do you build 
algorithms, modify algorithms to fit hardware? Typically, we 
think of I have a specific application, I will prepare a 
certain type of hardware to fit that particular application. 
Well, in the case of a supercomputer in transit, we had to do 
the opposite. And what we discovered in the end is that for 
this particular application, by modifying the algorithm to make 
the most of the supercomputer, we exceeded performance, state-
of-the-art performance. So that opened a completely new way of 
thinking about using supercomputers for applications that 
traditionally were not thought suitable to supercomputing.
     The second example I would like to give was the very 
recent Gordon Bell award for the COVID-19 pandemic that brought 
together lots of modeling and simulation of the complete 
envelope, viral envelope of 305, you know, million atoms, but 
there was an AI workflow to accelerate that modeling and 
simulation, so that coupling of AI and modeling and simulation 
that was only possible on Summit. So I do believe that we see 
plenty of examples out there to support that these general-
purpose machines will be able to deliver a lot even in the AI 
space.
     Mr. Foster. All right, fascinating discussion. I've 
exceeded my time, so I guess I have to yield back.
     Chairman Bowman. Thank you, everyone. Before we bring the 
hearing to a close, I want to thank our witnesses for 
testifying before the Committee today. The record will remain 
open for 2 weeks for additional statements from the Members and 
for any additional questions the Committee may ask of the 
witnesses.
     The witnesses are excused, and the hearing is now 
adjourned. Thank you all. Have a good day.
     [Whereupon, at 1:06 p.m., the Subcommittee was adjourned.]

                               Appendix I

                              ----------                              


[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

                              Appendix II

                              ----------                              


                   Additional Material for the Record

[GRAPHICS NOT AVAILABLE IN TIFF FORMAT]

                                 [all]