University of Sussex Professor Mariana Mazzucato is making headlines with her 2013 book The Entrepreneurial State, which argues that government, not the private sector, ultimately drives technological innovation. In a series of detailed case studies from information technology, pharmaceuticals, biotech, and other industries, she contends that government labs and public agencies are mainly responsible for the fundamental, high-risk discovery and development that make these technologies possible, with profit-seeking entrepreneurs jumping in only later, after the difficult work has been done.
This is a very old argument, skillfully brought to life in Mazzucato’s writings (and a popular TED talk). Remember President Obama’s “you didn’t build that” remark to entrepreneurs, during his 2012 presidential campaign? “Somebody invested in roads and bridges. If you’ve got a business — you didn’t build that. Somebody else made that happen. The Internet didn’t get invented on its own. Government research created the Internet so that all the companies could make money off the Internet.”
The view that private actors are short-sighted, and that only government can afford (or is willing) to make the long-term, high-risk, patient investments in research and development needed for technological progress, appears in every basic economics textbook. Even economists who are generally favorable toward free markets and limited government will say that, sure, the market is good at producing shoes or trucks or laptop computers, but it cannot provide basic research: that is a “public good” only government can supply. The New York Times recently opined:
Fundamental innovations such as nuclear power, the computer and the modern aircraft were all pushed along by an American government eager to defeat the Axis powers or, later, to win the Cold War. The Internet was initially designed to help this country withstand a nuclear exchange, and Silicon Valley had its origins with military contracting, not today’s entrepreneurial social media start-ups. The Soviet launch of the Sputnik satellite spurred American interest in science and technology, to the benefit of later economic growth.
There are several problems with this kind of argument. First, it confuses technological innovation (impressive to engineers) with economic innovation (valuable to consumers). Second, it confuses gross and net benefit: of course, when government does X, we get more X, but is that X more valuable than the Y we could otherwise have had? (Frédéric Bastiat, call your office.) Third, it confuses the treatment and selection effects of government spending: government typically funds scientific projects that would have been undertaken anyway, so a main benefit of government spending on science and technology is simply to increase the wages of science and technology workers. Fourth, as writers like Terence Kealey have pointed out, if you look carefully at the details of the sorts of programs lauded by the Times, you find they were grossly inefficient, ineffective, and potentially harmful. (Kealey offers a powerful critique of Mazzucato’s specific views here.)
Does War Drive Innovation?
It’s useful to illustrate these points by considering the specific argument that war is an important, and even necessary, source of scientific progress, because technologies developed by the state to fight wars often have important civilian uses. Innovation is a side benefit of war, say war’s defenders.
Social science textbooks also assume that war spurs innovation, noting, for example, that the large-scale manufacture of penicillin and the development of nylon and aerosol sprays were accelerated by the Second World War. That war, we’re told, also gave us everything from atomic energy and jet engines to the world’s first electronic computing devices, built to break German military ciphers. Moreover, key innovations in management practice came out of the Second World War, we’re reminded, including operations research and techniques for improving logistics and procurement.
The Second World War changed the nature of scientific research as well. After the war, large-scale, federally funded laboratories devoted to practical applications of new research replaced the small academic laboratories that had existed before it. Naturally, these new laboratories were geared toward producing the technologies the federal government wanted, and scientists flocked to these jobs and well-funded new facilities.
It’s true that many (though not all) of these technologies were developed — typically not invented, but refined — by government scientists working on military projects. The question nevertheless remains as to whether or not this model of innovation benefits society at large. Is this a “good side” of war?
“Crowding Out” and Interest-Group Politics
The answer is no, for multiple reasons. First, if we look at each of these cases carefully, we find that the government was usually inefficient, chose bad technologies that crowded out other, privately funded technologies, and locked research into directions that the private sector would likely never have supported.
But there is a more basic theoretical problem with the claim that military research gives us great new technologies we otherwise wouldn’t have.
It is certainly true that governments spend money on building things or doing things that otherwise would not have been built or done. But this is not necessarily a good thing.
Take the Egyptian pyramids, for example. Had there been no pharaoh, commanding a huge budget, with the ability to mobilize vast quantities of resources (including labor), there would be no pyramids. But were the pyramids unambiguously good for the people of Egypt? They were not, of course; they were simply monuments to the power of the pharaoh and the state religion. To this day, governments build monuments to themselves all the time, whether they’re huge statues or atomic bombs. Sure, without the federal government, we might not have the Lincoln Memorial. Is that an argument for government?
Pyramids and statues are cases of the state producing a good that the private sector likely would not have produced in any form. But even when the government merely shapes the development of private goods and technologies, the distorting effects on the final outcomes of research and development can be significant.
We can see these distortions in the work of Vannevar Bush, who helped initiate the Manhattan Project. During the Second World War, Bush was chairman of the National Defense Research Committee (NDRC) and later director of the Office of Scientific Research and Development (OSRD).
Bush wanted a peacetime successor to the OSRD and pushed for the creation of the National Science Foundation, which was established in 1950. The NSF was controversial: Truman vetoed an earlier version in 1947 because of its lack of accountability. A key figure was Senator Harley Kilgore of West Virginia, who initially opposed Bush’s plan to distribute the money through universities (Kilgore preferred government-owned labs) but later agreed to Bush’s model. As Kealey describes it, Kilgore’s goal was not to generate new knowledge. Rather,
Kilgore wanted to create a reserve of scientifically trained personnel who could be mobilized for strategic purposes. … The National Science Foundation, therefore, was created in 1950, in the same year (and for the same reasons) as the National Security Council. (Economic Laws of Scientific Research, p. 154.)
A few scholars have recognized the potentially harmful effects of this approach. Best known is the “distortion thesis” of historian Paul Forman, which holds that WWII and Cold War national security concerns distorted the path of the physical sciences.
In the realm of technology there is the “crowding out” thesis, most closely associated with Seymour Melman, which maintains that during the Cold War government-funded R&D crowded out commercial R&D. As summarized by the distinguished historian of technology David Hounshell,
Research, development and manufacture for a single customer (the national security state or the military) led firms and whole industries into a kind of fatal attraction, which ultimately undermined their ability to compete in the global economy in which consumers had very different wants than those of the military; “spin offs” from military projects into the civilian economy simply did not compensate for the drawbacks of being dependent on military contracting.
Again, the Broken Window Fallacy
We see once again the relevance of Frédéric Bastiat’s Broken Window Fallacy. The research and development institutions created and sustained by government are like the new pane of glass in the broken window: we see the window being repaired, but we cannot see what might have been produced with those same resources had the glass never been broken.
Similarly, we see what is produced by government scientists performing R&D for the state, but we do not see what we would have had, had the market been able to function in the absence of a giant militaristic government.
There is no doubt that military spending had a substantial effect on technological innovation. But was it a good one? Military spending distorts the efforts of scientists and engineers, redirecting them toward particular projects that do not necessarily generate benefits for consumers.
Military-funded R&D, like any government-funded project, does not have to pass a market test, so there is no way to know whether it is actually beneficial to consumers. We cannot rely on the judgments of government scientists and scholars to tell us which technologies are “best.” Remember Betamax? The experts told us that Betamax was superior to VHS from an engineering point of view. Yet, in the end, VHS proved economically superior: consumers ultimately chose VHS over Beta. Betamax failed the market test in spite of its arguably superior technology.
Today, when we look at private companies like Google, Apple, and Facebook and marvel at their innovations, we should remember that these companies are constantly subject to market tests: the goods and services they introduce must be accepted by consumers to be profitable. When they succeed, we know they are creating value for society, because consumers have chosen their products and services over others.
Success, for government-funded researchers and engineers, on the other hand, means winning grants and contracts, and getting more money from the taxpayer, who has little say in what gets done.
The reality is far more complicated than the myths repeated by those who claim that many of the technologies and innovations we now value were produced single-handedly by government. Yet the historical reality does not diminish the ease with which Obama and other fans of government spending can point to innovations like the internet and the interstate highways and say “you didn’t build that.” We can only speculate about what might have been produced had the market been allowed to function. Likewise, we can still see the pyramids today and marvel at the innovation that went into their construction, but, unfortunately, the wealth and labor stolen from ordinary Egyptians to build them have long been forgotten.
This article originally appeared in the Mises Daily at: https://mises.org/library/government-spending-innovation-true-cost-higher-you-think