Congress is considering “The United States Innovation and Competition Act,” which includes large new subsidies for research and technology industries. The bill includes $81 billion for the National Science Foundation, $52 billion for semiconductors, and billions for various other programs. President Biden’s Jobs Plan similarly proposed new subsidies for research and technology.

America faces competitive challenges, but a lack of research spending does not seem to be a weakness. Scott Lincicome shows that U.S. R&D spending has been trending upward in recent decades as a share of the economy.

Government subsidies often generate damaging side effects. More subsidies for research may displace private research, steer markets in low-value directions, and tie up researchers in bureaucratic knots. Terence Kealey discusses pitfalls of science subsidies here.

Semiconductor entrepreneur T.J. Rodgers recently argued against subsidies to his industry. He noted that “in 1987 the Sematech consortium began spending $500 million in government funds that did zero for the industry,” and that “‘free government money’ induces horribly inefficient spending and undeserved payouts to executives and shareholders.”

Technology expert Jeffrey Funk has a great article on innovation in American Affairs. He critiques venture capital markets and current U.S. research funding. I am not on board with some of Funk’s ideas, but he nicely summarizes the bureaucratic distortions of government-funded research that policymakers should consider before increasing subsidies further. Funk writes:

University engineering and science programs are also failing us because they are not creating the breakthrough technologies that America and its start-ups need.

This decline in technological breakthroughs cannot be attributed to a lack of funding: governments have been funding university research for more than half a century, yet research productivity has declined overall, including research into semiconductors, agriculture, and pharmaceuticals. Other than the internet being commercialized in the 1990s—the technological foundations of which were created in the 1960s and 1970s—few new science-based technologies have emerged in the last thirty years. And the small number of successes were mostly achieved by foreign competitors: lithium-ion batteries, OLEDs, and solar cells, for instance, were commercialized by Japanese, Korean, and Chinese companies.

Even Nobel Prize–winning research seems to lead to fewer technological breakthroughs than in the past, according to a survey of top scientists.

Furthermore, looking at some of these prizes in more detail reveals that much of the research work was done at corporate and not university labs. For instance, among Nobel Prizes for physics and chemistry awarded since 2000 in lithium-ion batteries, LEDs, charge-coupled devices, lasers, integrated circuits, and optical fiber, nine of the seventeen recipients did their work at corporate labs. The only high-impact award that solely involved university research was graphene.

Many scientists point to the nature of the contemporary university research system, which began to emerge over half a century ago, as the problem. They argue that the major breakthroughs of the early and mid-twentieth century, such as the discovery of the DNA double helix, are no longer possible in today’s bureaucratic, grant-writing, administration-burdened university. The idea of scientists following their hunches to find better explanations and thus better products and services has yielded to the reality of huge labs pursuing grants to keep staff employed. Young scientists have become mere cogs in a grant-seeking machine, forced to suppress their curiosity and do what they are told by senior colleagues who are overwhelmed by administrative work. Two-author papers, like the one describing the structure of DNA, have been replaced by hundred-author papers. Scientific merit is measured by citation counts and not by ideas or by the products and services that come from those ideas. Thus, labs must push papers through their research factories to secure funding, and issues of scientific curiosity, downstream products and services, and beneficial contributions to society are lost.

Nobel laureates have similar criticisms of the contemporary culture of academic research. Various laureates in biochemistry, biology, computer science, and physics have claimed that they would now be denied funding for their prizewinning research because of grant-issuing bodies’ preference for less risky projects; one physicist even claims he could not get a job today. In today’s climate every project must succeed, and thus our scientists study only marginal, incremental topics where the path forward is clear and a positive result is virtually guaranteed.

One option is to recreate the system that existed prior to the 1970s, when most basic research was done by companies rather than universities. This was the system that gave us transistors, lasers, LEDs, magnetic storage, nuclear power, radar, jet engines, and polymers during the 1940s and 1950s. Apart from these past successes, there are a number of structural reasons why conducting basic research at corporate labs is likely to produce more useful results than in universities.

First, corporate scientists are focused more on solving problems, whereas scientists in universities must also take on the administrative work of writing papers—often in collaboration with dozens of coauthors—managing PhD students and postdocs, reading dissertations and draft papers, writing letters of recommendation, and filing grant proposals to keep themselves, their students, and their staff employed. Unlike their predecessors at Bell Labs, IBM, GE, Motorola, DuPont, and Monsanto seventy years ago, top university scientists are more administrators than scientists now—one of the greatest misuses of talent the world has ever seen. Corporate labs have smaller administrative workloads because funding and promotion depend on informal discussions among scientists and not extensive paperwork.

Second, the informal discussions and collaboration characteristic of corporate labs allow the scientists who work there to make better decisions about both the merits of different designs in the short term and problem-solving approaches for the long term. These informal discussions can also focus on issues of cost and performance: how to measure them and how to improve the technologies along these metrics. Such discussions rarely occur in universities because their goal is the publication of research rather than the development of new products and services.

Third, conducting basic research at corporate laboratories can help avoid the problem of hyper-specialization in academia. Because publications are the key output of university professors, there has been a growing number of journals over the last fifty years to accommodate the growing number of university scientists, and these journals have become increasingly specialized. For example, Nature now publishes more than 144 journals and the Institute of Electrical and Electronics Engineers more than 200. This growing specialization turns professors into narrowly focused researchers unable to understand not only the needs of the marketplace but also the metrics of cost and performance for a new technology, which should dictate long-term goals. Relocating more basic research to corporate labs can reduce this specialization by placing scientists in an organization whose goal is to commercialize new technologies.