The US needs a new social contract to boost corporate investment in innovation – and preserve its technological leadership.
Over the past century, America’s competitive advantage has been built largely on the willingness of its big corporations to make substantial investments in scientific research and apply scientific knowledge in downstream R&D. Yet in recent decades, US corporations have been gradually withdrawing from scientific research, preferring instead to access scientific knowledge from universities and startups.
Once-formidable industrial labs have been shut down. In the early and mid-20th century, the DuPont Central Research & Development organization was run on a par with top academic chemistry departments. In the 1990s, however, DuPont’s attitude toward research changed. The company began emphasizing the business potential of projects over the knowledge gained from basic research. The number of patents the company filed with the US Patent and Trademark Office increased from about 1,600 in 1994 to roughly 3,500 in 2012 – while the number of journal articles authored by DuPont scientists fell from 749 in 1994 to 245 in 2015. The priority was shifting from upstream scientific research to downstream development – and in 2016, following pressure from activist investors, DuPont’s Central Research & Development organization ceased to operate as an independent research unit. (It was merged with the firm’s engineering division.)
The decline of corporate science can be seen in aggregate statistics as well. Over the past three decades, the composition of business R&D has changed to include less ‘R’ and more ‘D’. The share of research, both basic and applied, in total business R&D expenditure in the US fell from about 30% in 1985 to below 20% in 2015. The decline of the corporate lab was especially striking given that the size of America’s public corporations increased substantially during this period. For example, net sales at GE grew from around US$25 billion in 1980 to around $100 billion in 1998 – yet the number of doctorate holders employed at GE’s corporate research laboratory dropped from 1,649 in 1979 to only 475 in 1998.
Today, the result is that the US is losing its historic lead in innovation. Rival nations are quickly catching up in R&D investments and R&D intensity. Companies from East Asia have taken the lead in key industrial segments where established American corporations have cut investment in scientific research. Yet that corporate investment is vital for preventing further erosion, because it is a unique activity that cannot be replicated by other players in the US innovation ecosystem.
To incentivize investments in scientific research, the US government should renew its ‘social contract’ with large corporations, centred on science-based innovation. Leading corporations must commit to invest in long-term, upstream research, which will further the national interest as well as shareholder value. In return, they should expect support and forbearance from policymakers, anchoring the legitimacy of corporate ‘bigness’ in advancing science: in other words, recognizing the vital role that big businesses can play in exploiting the transformative technologies of the digital era.
America’s competitive position
America still leads in technology fields where technical advances rely heavily upon scientific knowledge. An example is the emerging field of quantum computing, where US corporations are global leaders in both science and technology. Large corporations – including IBM, Intel, Microsoft, Amazon and Google – have invested heavily in quantum computing, often working in close partnership with universities and startups. Maintaining this lead requires continued efforts, as the Chinese government and corporations have also recognized the importance of the field and are investing heavily to catch up.
In October 2019, for instance, Google announced that its Sycamore quantum machine had reached “quantum supremacy”, the milestone of solving a problem beyond the practical reach of classical computers. (Sycamore solved a problem in 200 seconds that would have taken traditional computers about 10,000 years.) Yet in December 2020, researchers at the University of Science and Technology of China presented findings indicating that they too had reached quantum supremacy.
At the same time, America’s lead has shrunk in mature technology fields, such as semiconductors. Back in the 20th century, both the science and the technology underlying microprocessor design and production were American.
Today, while US corporations still play an important (though less dominant) role in semiconductor science, they no longer lead in semiconductor fabrication. Most semiconductor manufacturing has moved overseas. In 2020, only 12% of the world’s chips were manufactured in the US, compared to 22% in Taiwan, 21% in South Korea, 15% each in China and Japan, and 8% in Europe. The increased distance between research and manufacturing may have made it harder for American corporations to apply scientific knowledge in downstream semiconductor invention and innovation.
Increasing innovation investment
What are the mechanisms that drive investments in innovation? Free markets and open societies allow individuals to experiment with new ideas and bring them to fruition as innovations and commercial applications. But that does not guarantee an optimal level of investment in science and technology, for two reasons. First, when ideas are at an embryonic stage, it is often difficult to estimate the potential returns on investment. Second, in competitive markets, it may be difficult to appropriate a significant fraction of those returns. Self-interested private companies may be reluctant to make the necessary investments, especially when they occupy only a small fraction of the market.
To complement the invisible hand of free markets, we need the visible hands of the state and large corporations. Government agencies and corporate leaders can direct resources at scale toward promising sectors when the prospects of potential returns are uncertain and far in the future. Indeed, the federal government, through universities, research institutions and federal laboratories, has financed the lion’s share of investments in scientific research since World War II. Large federal procurement contracts have incentivized corporate investments in R&D by guaranteeing demand for the resulting innovations. In turn, large corporations have made important contributions to research and invention. Fundamental advances in technologies as varied as the transistor, laser, and the scanning tunnelling microscope – used for imaging surfaces at the atomic level – were born in corporate labs, earning Nobel Prizes for corporate scientists.
To retain its comparative advantage in the 21st century, the US must encourage close interactions between the different components of its innovation ecosystem – large corporations, universities, startups, and government laboratories. Innovation thrives in ecosystems that combine openness and diversity with a degree of central direction and support. A recent example is the accelerated development of Covid-19 vaccines. Historically, infectious disease vaccines took several years to develop. Launched on 15 May 2020, the US government’s Operation Warp Speed awarded approximately $13 billion to six vaccine companies – Moderna, Pfizer/BioNTech, Janssen, AstraZeneca, Sanofi/GSK, and Novavax – to develop or manufacture vaccines. Just seven months later, on 14 December 2020, the first vaccine shots were administered, demonstrating the potential of large-scale public-private partnerships to address significant societal problems.
The role of corporate science and innovation
Over recent years, corporations have increasingly been building their own innovation ecosystems. Corporate venture capital funds have provided funding to early-stage companies with high growth potential. And corporate research contracts have grown to represent a significant share of funding for university science and technology. Under the current division of innovative labour, universities specialize in scientific research, startups translate it into invention, while large corporations specialize in product development and commercialization. Conventional wisdom suggests that increased specialization leads to efficiency.
However, corporate science is different from science performed in universities, startups, and national labs. Compared to university research, large corporations’ research is more mission-oriented in nature. To invest their own money in scientific exploration, for-profit corporations must have at least a reasonable idea of how new knowledge might benefit their bottom line.
It is precisely the fact that corporate science is more applied that makes it so socially valuable. University science expands the frontiers of human knowledge. Corporate science produces results that help to bridge the vast chasm between ‘pure’ science and applications. This translational effort has historically proven crucial for connecting abstract scientific ideas to concrete prototypes.
The concern is that despite the rise of corporate innovation ecosystems, the overall picture for corporate science is one of decline. The visible hand of large corporations has atrophied in recent years. This has reduced the innovation potential of the US economy. The reduction has not been uniform across industries: in some sectors, such as AI, large American corporations still dominate and conduct important research. In others, such as biotech, translational research is now performed effectively by universities and university spinoffs, often backed by venture capital. In yet other sectors such as ‘clean tech’, large investments in infrastructure are crucial to overcome the under-provision of public goods. This is where the US is lagging.
Universities and other research institutions have been stepping in. Increasingly asked to demonstrate their direct societal impact, they have ramped up technology transfer, industrial collaboration and entrepreneurship education. These developments are welcome, but there are limits to what universities can realistically achieve, owing to the fundamental differences between university and corporate research. University research is more curiosity-driven than mission-focused, favours insight over solutions to specific problems, and generally requires significant additional integration and transformation before its scientific insights can become economically useful.
Conversely, large corporate labs have many characteristics that make them very valuable for science-based innovation. Large corporations have access to significant resources, can more easily integrate multiple knowledge streams, and can direct their research toward solving specific practical problems, with a view to producing commercial applications. This is not to downplay the important contributions that universities and small firms make to American innovation. Rather, the point is that large corporate labs have complementary capabilities that may be difficult to replace.
‘Bigness’ and public goods
Even as many established corporations are withdrawing from research, some of the leading technology firms continue to invest in scientific research. The research laboratories of big corporations are an important source of scientific advances and technical breakthroughs. Big businesses are vital for applying these advances in the development of commercially viable products and processes – and their size matters. These companies have been the source of many inventions known as general purpose technologies (GPTs) due to their broad applications within and across industries. GPTs not only benefit the corporations that first create them, but society as a whole: they are public goods with large social returns.
GPTs combine knowledge from distinct scientific fields. Industrial labs are organized around technical problems, while university labs are organized by the rigid boundaries of academic departments. Arguably, the most important GPT of the 20th century – the transistor – was invented by the largest and most renowned US corporate lab, Bell Labs. Bell’s advantage over universities and startups lay in its ability to attract world-leading scientists to work on a common mission, patiently providing them with the resources needed to accomplish scientific breakthroughs that straddled multiple fields. By their nature, GPTs have broad applicability: a company that successfully creates a GPT will naturally try to apply it across different markets, and to integrate its existing products around the shared general-purpose technology. Large firms with broad scope have an inherent advantage here, so science and GPTs go hand-in-hand with corporate ‘bigness’. That will be critical for the GPTs of the future, such as quantum computing and AI.
To retain its technological leadership, therefore, the US should renew its social contract with large corporations, recognizing that increasing corporate innovation is critical to increasing the potential of the economy as a whole. In turn, large US corporations will find that performing upstream scientific research and applying it to downstream technology development allows them to remain competitive when facing foreign corporate giants. US technological leadership has undoubtedly been eroded in recent decades, but there is every reason to believe it can be revitalized for the decades ahead.
Sharon Belenzon is professor of strategy at the Fuqua School of Business, Duke University.