In the spring of 1943, Hans Bethe, a theoretical physicist and professor at Cornell University, left Ithaca, New York, for a classified government site in Los Alamos, New Mexico. Once there, he led the theoretical division of the Manhattan Project, which developed the atomic bomb. Bethe was just one of dozens of academics pulled from elite American research universities into wartime service, applying their intellectual training to solve critical national security challenges. When the war ended, Bethe returned to Cornell, where he helped transform the university into a hub of Cold War–era research, working to invent—among other innovations—the synchrotron, one of the world’s first particle accelerators. That development, in turn, paved the way for the creation of advanced radar systems and semiconductors.
Bethe’s career path epitomized the long-lasting and mutually beneficial partnership between U.S. universities and the government. Before 1940, U.S. federal support for scientific research was minimal and mainly limited to agriculture and public health. But during World War II, the government turbocharged its funding for research and development and boosted it again during the Cold War. The government extended grants to a kaleidoscopic variety of academic efforts that included conducting basic physics experiments, developing materials to enable hypersonic flight, and inventing artificial intelligence algorithms. This funding often constituted the only reliable support for long-term, high-risk projects that private industry, focused on near-term profits, typically neglected.
Now, President Donald Trump’s administration is moving to sever the link between academia and government by freezing billions of dollars in federal grants to top research institutions. This act may score political points among those who view academia as a left-leaning “ivory tower” insulated from ordinary Americans and private enterprise. But it reflects a dangerous misunderstanding of how the United States became militarily and commercially dominant in the first place. Research universities have long undergirded the country’s national security, particularly through defense research, and they continue to train the pipeline of talent that powers both government and industry. Practically speaking, cutting their support does not represent a principled political stance—it is a friendly-fire assault on U.S. national security.
BETTER TOGETHER
The defense partnership that developed between universities and the federal government during World War II marked a turning point in the relationship between science and state in America. Before the war, most American scientific research was funded by foundations, university endowments, and private donations. In 1945, Vannevar Bush—the Raytheon cofounder who became vice president of the Massachusetts Institute of Technology and then directed the government’s Office of Scientific Research and Development, which sponsored wartime military R&D—prepared a report called Science, the Endless Frontier. Federal funding for research had already ballooned from $69 million in 1940 to $720 million in 1944. Bush, who had overseen much of the country’s wartime scientific mobilization, argued that the government must not stop boosting universities’ funding. In his report, he emphasized the importance of basic science research to the United States’ prosperity and security. Because modern war required “the use of the most advanced scientific techniques,” he wrote, “colleges, universities, and research institutes” would have to “meet the rapidly increasing demands of industry and government for new scientific knowledge,” and so “their basic research should be strengthened by use of public funds.”
This report became a blueprint for maintaining and expanding federal support for university research in peacetime. Institutions such as the Massachusetts Institute of Technology, the California Institute of Technology, and Stanford University quickly secured new federal grants and transformed themselves into hubs of scientific innovation, many with a direct connection to defense. MIT, for example, created the Research Laboratory of Electronics, which—supported by $1.5 million in annual funding from the Defense Department—extended the university’s wartime research in microwave, atomic, and solid-state physics into engineering applications. By the late 1940s, grants from the Defense Department and the Atomic Energy Commission accounted for 85 percent of MIT’s research budget. This model—in which universities received federal funding for defense-oriented research—quickly spread, and by 1949, such grants made up 96 percent of all public funding for university research in the physical sciences.
The experiment in federally funding university research proved so successful that it became a permanent feature of U.S. government strategy. After the Soviet Union launched the Sputnik satellite in 1957, the United States responded by creating the Advanced Research Projects Agency (ARPA) to fund high-risk, high-reward scientific research—much of it conducted at universities. One early ARPA project, a collaboration with Stanford and UCLA, led to the development of ARPANET, the direct precursor to today’s internet. What began as a government investment in secure communication technology revolutionized the way the whole world exchanged information.
Universities, for their part, converted U.S. taxpayers’ dollars into innovations that made the country prosper. Nowhere was this more evident than at Stanford, where federal defense contracts and research funding supported a culture of innovation that helped create Silicon Valley. Faculty members such as Frederick Terman, who aggressively expanded the university’s statistics and engineering departments to win more Defense Department grants, encouraged students to commercialize their research, enabling the founding of companies such as Hewlett-Packard and Fairchild Semiconductor that would become cornerstones of the computing revolution.
While many other countries, such as France and the United Kingdom, continued to direct government funding for scientific research mainly toward government labs, the United States built a decentralized research system anchored in its universities. This decentralized system not only accelerated technological progress but also helped defense-related innovations flow into private commerce, giving U.S. industry a clear edge that the Soviet Union struggled to match, despite its extensive investments in technical education. By the end of the twentieth century, this system of federally funded university research had become the backbone of the United States’ global leadership.
LONG-TERM RELATIONSHIP
The same alliance that propelled Cold War–era breakthroughs continued to propel innovation after the Cold War—and to underwrite U.S. national security. But since the early 1990s, the stakes have become even more complex. Rapidly advancing technologies such as artificial intelligence, hypersonics, space systems, and quantum computing are creating new national security challenges as well as potential solutions. Although private companies such as OpenAI and Google are popularizing new AI models, the core technologies that power these systems were developed by researchers trained in university labs sustained by decades of publicly funded research. Without substantial U.S. government investment in universities, there would be no AI revolution to commercialize.
Indeed, academic research rarely stays confined to university labs. The flow of knowledge and expertise from academia into industry is what transforms abstract scientific insights into deployable technologies with strategic and economic value. Many universities have so-called technology transfer offices that work to patent inventions, license new technologies, and support startups. Through these initiatives, discoveries made on campuses migrate into the commercial sector and startup ecosystem, preserving the United States’ dominance in advanced technology. Today’s driverless vehicles, for instance, rely on light detection and ranging (LIDAR) systems that originated in federally funded missile-tracking research at MIT.
This migration of ideas is accompanied by a migration of people. American graduate programs in engineering, applied physics, and computer science are among the most respected in the world, attracting top-tier talent and serving as engines of innovation. These programs function as incubators for the workforce that goes on to power the defense sector, the tech industry, and government research agencies. For example, Jensen Huang came to the United States to study electrical engineering at Oregon State University and earned his master’s in engineering at Stanford. The year after he graduated from Stanford, he founded the semiconductor company Nvidia, which has enabled the AI revolution.
Students trained in federally supported labs often move fluidly between academia, national laboratories, and private industry. Ashlee Vance’s biography of Elon Musk depicts the university-to-SpaceX pipeline: “Musk would personally reach out to the aerospace departments of top colleges and inquire about the students who had finished with the best marks on their exams.” Poaching from top aerospace departments allowed SpaceX to go from a risky startup to the world’s premier launch provider at a time when the United States’ dependence on Russian launch systems posed serious national security risks.
Yet the advantage conferred by the United States’ decentralized research funding system is no longer assured. Rivals have studied the U.S. model closely and are moving aggressively to replicate it. China, in particular, is racing to close the gap by pouring state investment into its universities. The People’s Liberation Army now collaborates with leading Chinese technical institutes to accelerate the development of dual-use technologies, particularly in AI, space systems, and cyber warfare. But there is one advantage China cannot easily replicate: the openness of U.S. universities. Authoritarian states can flood labs with money, but they cannot manufacture the academic freedom and economic dynamism that make the U.S. university system a magnet for global talent.
MISTAKEN IDENTITY
The Trump administration’s recent moves are stripping university labs of their funding by freezing Defense Department research grants to institutions it deems ideologically noncompliant—effectively targeting the very research pipelines that sustain national security innovation. Weakening the university-defense research partnership is a strategic miscalculation with far-reaching consequences. Universities are the channels through which scientific discoveries yield real-world applications and talented youth become world-changing entrepreneurs and innovative defenders of national security. The Trump administration might argue that it is willing to continue funding if universities align with its ideological demands. But ceding independence in exchange for scientific funding would undermine the very rigor and openness that have given the U.S. university system its edge for decades.
If the government allows ideological discomfort to disrupt its alliance with research universities, it will sacrifice its advantage in innovation and competitiveness. Cutting off Defense Department funding to universities will not halt defense innovation. But it will help ensure that such innovation moves elsewhere. Some talent will drift toward private firms, where pressure to generate short-term profit often precludes a focus on projects that align with long-term national security priorities. Other talent and resources may move to foreign institutions eager to capitalize on U.S. retrenchment. To enable these shifts out of political pique is not principled—it is self-defeating.