World War II brought about a substantial increase in US federal R&D spending, and it may be regarded as an imperfect natural experiment in what happens when a country transitions from a relatively “laissez faire” (with many caveats) model of scientific support to one with heavy government involvement. In addition, WWII may have amounted to a society-wide forced learning-by-doing program. It has been argued that WWII caused GDP growth to be higher than it would otherwise have been. Here I examine those claims.
The chart below shows the evolution of federal R&D expenditures in the US, and the one after it shows expenditures as a % of GDP for the various sectors that fund research.
It’s hard to find time series from before 1953, but Research – A National Resource (1939) says that “the Federal government spent in 1937 124 million dollars”, and 108 million in 1938 (Table 1, pg. 66).
From the BEA, GDP (current dollars, gross) in 1937 was 93bn dollars, so federal R&D expenditure would have been 0.13% of GDP, substantially less than the figure from the AAAS for 1953 (0.71%).
As a comparison, Bush (1945) gives aggregate industry+federal expenditures, from which we can get relative numbers: against a 1940 GDP of 102.9bn, they amount to 0.3%, while in 1953 that number was 1.4%. Under the assumption of an even industry/federal split, federal expenditures in 1940 would have been of the same order of magnitude as in 1937 (even if we assume a 60% federal share, we get a similar number), meaning that the increase in expenditure happened later in the war, in a relatively sudden way.
Even if the GDP or R&D figures are slightly off, the 5.46x difference is hard to hide except with heavy differences in accounting methodology over what does and doesn’t count as R&D, so the fact that there was a sharp increase seems true.
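The back-of-the-envelope arithmetic behind those shares can be reproduced directly (figures as quoted above):

```python
# Federal R&D as a share of GDP, 1937 vs 1953.
# Figures from the sources cited above: $124M federal R&D in 1937
# (Research - A National Resource, 1939), $93bn GDP in 1937 (BEA),
# and 0.71% of GDP in 1953 (AAAS).
rd_1937_musd = 124        # federal R&D spending, millions of current USD
gdp_1937_musd = 93_000    # GDP, millions of current USD

share_1937 = 100 * rd_1937_musd / gdp_1937_musd  # ~0.13% of GDP
share_1953 = 0.71                                # AAAS figure, % of GDP

print(f"1937 share: {share_1937:.2f}% of GDP")               # 0.13%
print(f"increase: {share_1953 / round(share_1937, 2):.2f}x")  # 5.46x
```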
Yet a 5x increase in R&D funding over 16 years (even as a share of GDP) did not do much for GDP growth or innovation, or so I’ll argue.
A case for the opposite view, that the war had some good effects, can be built on top of Robert Gordon’s The Rise and Fall of American Growth. The book has a plot (ht/ to Matt Clancy) showing what happened with US GDP and output, with a seemingly consistent departure from the pre-war trend.
Cookie-cutter endogenous growth models would predict exactly this: a step change in R&D funding leads to a gradual increase and then stabilisation in the rate of growth.
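As an illustrative sketch (not a calibrated model; all parameters are made up), consider a first-generation endogenous-growth setup where the steady-state growth rate is proportional to the R&D share of output, with growth adjusting gradually to its new target after a step in funding:

```python
# Toy endogenous-growth sketch: steady-state growth g* = delta * s_R,
# where s_R is the R&D share of GDP. A step in s_R (the WWII funding jump)
# raises the growth rate, which converges gradually to its new level.
# delta and adjust are purely illustrative parameters.
def growth_path(years, s_r_before, s_r_after, step_year, delta=2.0, adjust=0.3):
    targets = [delta * (s_r_before if t < step_year else s_r_after)
               for t in range(years)]
    g = [targets[0]]
    for t in range(1, years):
        # partial adjustment toward the (possibly new) steady-state rate
        g.append(g[-1] + adjust * (targets[t] - g[-1]))
    return g

path = growth_path(years=30, s_r_before=0.013, s_r_after=0.071, step_year=10)
print(f"growth just before the step: {path[9]:.3f}")
print(f"long-run growth after it:    {path[-1]:.3f}")
```

The pattern is a ramp-up followed by stabilisation at a permanently higher growth rate, which is the shape the plot above seems to show.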
However, if we choose a different time frame for our regression, we get a different picture:
Was World War II a continuation of the pre-war trend, or were the pre-war years unusual? The pre-war years were the Roaring Twenties, which coincided with an economic bubble that ended in the Great Depression, so the figure above may be unfair. But not too unfair: those were also years when many great discoveries and inventions were made; if they eventually found application during and after WWII, that too would have raised growth, temporarily.
The war began as the economy was recovering from the bust, and the post-war period coincided with a period of expansion of trade: many countries also grew faster after WWII. And if those aren’t enough confounders, that period was also the period of the baby boom. Was this expansion elsewhere due to the diffusion of technology from the US, itself a result of war preparation?
This cannot be settled just by looking at time series without other controls, so one might want to look at TFP instead.
And so what did war do for TFP? Gordon also argues that World War II caused an increase in the rate of growth of TFP:
Why would the war cause more TFP growth?
Robert Gordon argues in his book that WWII taught the entire economy to operate efficiently: the manufacturing sector learned better practices (by doing) as Liberty ships and other war materiel were being churned out. After the war, that knowledge was retained and then applied to the civilian sector. Gordon is not shy here; he goes as far as saying that
The most novel aspect of this chapter is its assertion that World War II itself was perhaps the most important contributor to the Great Leap. [The Great Leap being a period of strong TFP growth between 1920 and 1970]
Gordon acknowledges arguments that, prior to his book, were put forward to support the opposite view. In particular, that war restricted civil investment, and retarded private capital accumulation.
Tyler Cowen (2014) argued that war instils a sense of urgency into governments such that
the very possibility of war focuses the attention of governments on getting some basic decisions right — whether investing in science or simply liberalizing the economy. Such focus ends up improving a nation’s longer-run prospects.[…]
Fundamental innovations such as nuclear power, the computer and the modern aircraft were all pushed along by an American government eager to defeat the Axis powers or, later, to win the Cold War. The Internet was initially designed to help this country withstand a nuclear exchange, and Silicon Valley had its origins with military contracting, not today’s entrepreneurial social media start-ups. The Soviet launch of the Sputnik satellite spurred American interest in science and technology, to the benefit of later economic growth.
The last time I looked at this topic was in 2015, and I missed the article above. My conclusion won’t surprise you at this stage: war or defense R&D is not good for economic growth; that was also the overall view expressed in the Handbook of the Economics of Innovation. Since then, there have been no additional systematic looks at the literature, nor relevant papers on the relation between war, military R&D expenditures, and productivity or growth in general. But there is a series of papers from Alexander Field that are worth looking into; more on that in a second.
In my 2015 article I addressed some of the examples cited by Gordon, particularly those related to aerospace. As one example, jet engines were in development before the war, and airplanes did not get better during the war relative to the pre-war trend, a point Field also makes in his paper. Rockets are another example: one could argue that war tilted the development of rocketry towards larger rockets (military satellites tend to be larger and heavier) that were more expensive (defense contractors are paid on a cost-plus basis, which doesn’t incentivise cost savings), instead of focusing on the cost reductions that matter more for commercial purposes; an odd case where I think subsidising R&D led to negative consequences… if one discounts the military benefits, of course!
Nuclear power is another good example of why war is bad for innovation. Nuclear power evokes mushroom clouds and radiation poisoning because of its military origins, and that same military origin may also have biased the development of the technology towards designs better suited for nuclear submarines than for large scale power generation, as mentioned in the article cited at the end of my article on progress in nuclear power. Imagine instead a counterfactual where nuclear power is developed more slowly, but with more attention paid to safety and cost. Wouldn’t adoption be more widespread?
While I don’t reject the argument that Cowen makes – in fact I embraced a closely related argument in my recent article on skyscraper construction speeds – one also has to factor in other effects of war that may be deleterious for productivity and innovation.
The opposite side is taken by a paper by Alexander J. Field (2019) that has been in the making for some time now (it keeps growing every time I have a look, currently standing at 83 pages), where he analyses TFP growth in the same time period, finding that TFP growth in 1929-1941 was greater than what Gordon suggests (3% vs 1.82%, Table 3), and far lower in the war period, 1941-1948 (2.01% vs 3.39%). Both authors do more than calculate the relevant values: they also make an effort to explain why, and Field in addition explains the specific ways one gets from his numbers to Gordon’s, or vice versa.
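To get a feel for what those annualized rates imply, one can compound them over each period (a back-of-the-envelope calculation using the figures quoted above):

```python
# Cumulative TFP gains implied by Field's and Gordon's annualized growth
# rates for the two periods discussed above.
periods = {
    "1929-1941": {"years": 12, "Field": 0.0300, "Gordon": 0.0182},
    "1941-1948": {"years": 7,  "Field": 0.0201, "Gordon": 0.0339},
}

results = {}
for period, d in periods.items():
    for author in ("Field", "Gordon"):
        # compound the annual rate over the length of the period
        results[(period, author)] = (1 + d[author]) ** d["years"] - 1
        print(f"{period} ({author}): "
              f"{100 * results[(period, author)]:.0f}% cumulative TFP gain")
```

Under Field’s numbers the pre-war period, not the war itself, is where the larger cumulative TFP gain occurred.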
Why do I side with Field? There are a series of heuristics in his favour:
- The paper came later, so Field has had time to incorporate work released since Gordon’s book.
- There are no criticisms of it available.
- Gordon himself has helped Field with his paper, sharing his calculations with him (and also decreasing the risk of misunderstanding Gordon).
- Field explains the specific methodological differences that lead to his result, and justifies each of them.
- On top of that, I had a strong prior view on this based on my pre-existing research on related areas, especially on the transferability of military spinoffs to the civilian sector, and in general on what I wrote for my article about war and growth.
- The next section of this post coheres with his arguments.
How right do I think he is? How likely is it that Gordon or others will show that WWII actually was good for growth? I think it’s unlikely; to put it in numbers, I would give “Field is broadly right in his assessment” a credence of 80%.
Looking directly at outputs
Long-time readers of Nintil will remember my “No Great Technological Stagnation”, wherein I argued that trends in the rate of improvement of diverse types of technologies seem to be doing fine. But one can also look at whether there was an acceleration in 1939-1945, or between the pre-1939 and post-1945 periods. There aren’t that many time series going far back into the past, but many continued as usual through that period: batteries, energy transportation, automotive engines, the efficiency of conversion of electricity into light, heat, or motive power. One can make a case for Moore’s law representing a trend break with respect to units of compute per $: early computers arose around the time of the war, were put to use during it, and were heavily invested in after it. But: if one looks at enough trends, one is bound to find at least one that gets better in the period of interest.
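For concreteness, the trend-break check described above amounts to something like the following. The data here is synthetic (a steady 5%/yr improvement), purely to show the method:

```python
# Trend-break check: fit log-linear trends to a technology metric before and
# after a cutoff year and compare slopes. Synthetic data for illustration.
import math

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

years = list(range(1920, 1971))
metric = [math.exp(0.05 * (y - 1920)) for y in years]  # steady 5%/yr trend
logs = [math.log(m) for m in metric]

pre = slope([y for y in years if y < 1945],
            [l for y, l in zip(years, logs) if y < 1945])
post = slope([y for y in years if y >= 1945],
             [l for y, l in zip(years, logs) if y >= 1945])
print(f"pre-1945 trend: {pre:.3f}/yr, post-1945 trend: {post:.3f}/yr")
```

For the series mentioned above (batteries, engines, lighting efficiency), the claim is that pre- and post-war slopes come out roughly equal, as they do by construction in this synthetic example.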
One may reply: but this tells us nothing about breakthroughs; there are no trends for the internet or, say, the appearance of touchscreens. Measuring those is harder, but the attempts that have been made, like Huebner (2005), do not find anything like an acceleration.
Perhaps one should look at the impact on health, as a large chunk of that extra funding has gone to health research. But even then, we don’t find an acceleration in life expectancy gains, or a change in the rate of decline of cancer rates (there is little data on this one, so we can’t say that much).
Robert Gordon ends his book saying that innovativeness, broadly measured, has declined; one could conclude that it has stayed the same depending on how one counts that innovativeness, but an acceleration seems the least likely of the options to me, if we count the aggregate evidence in his book.
Patents could be another – highly imperfect – indirect measure, and again we find that the war period did not induce a change with respect to the previous era.
So why didn’t that extra R&D translate into growth?
It could be argued that such an increase shouldn’t have been expected to do much for TFP to begin with, because a large part of it was defense R&D, not civilian R&D (Mowery & Rosenberg, 1998). This, to me, is the key to the explanation, in line with what Field argues in his paper and with the consensus view. Military R&D competes with civilian R&D for resources, without many useful spillovers. When one is prompted to think of military R&D, perhaps ARPANET comes to mind, but in the grand scheme of things such projects were small, and cheap. Think instead of large defense contractors developing faster fighter jets and more powerful bombs. Now imagine all those engineers and scientists building cars and better batteries.
Or perhaps there are situations where more money for research results not in more or better research, but in better-paid engineers and scientists, as Goolsbee (1998) found. This makes sense: if not everyone is capable of doing high-level research, then pouring money into it will help only insofar as the relevant research is constrained by material resources. If it is not, you’ll get higher prices, not a greater quantity. Whether or not this phenomenon is still around is a topic for another post.
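A toy version of Goolsbee’s point (the supply curves and numbers here are made up for illustration): with a constant-elasticity supply of scientists, a budget increase splits between more hires and higher wages depending on the elasticity, and at zero elasticity it goes entirely into wages.

```python
# Toy model: R&D budget B is spent hiring scientists at the market wage,
# B = wage * quantity, with constant-elasticity supply quantity = k * wage**eps.
# Solving the two equations gives wage = (B / k) ** (1 / (1 + eps)).
def scientists_hired(budget, eps, k=1.0):
    wage = (budget / k) ** (1 / (1 + eps))
    return k * wage ** eps, wage  # (quantity hired, wage paid)

for eps, label in [(0.0, "perfectly inelastic supply"), (2.0, "elastic supply")]:
    q1, w1 = scientists_hired(100, eps)
    q2, w2 = scientists_hired(200, eps)  # the budget doubles
    print(f"{label}: scientists x{q2 / q1:.2f}, wages x{w2 / w1:.2f}")
```

With perfectly inelastic supply, doubling the budget doubles wages and hires nobody extra, which is the Goolsbee scenario.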
Or perhaps it was a demographic effect. One of the fun predictions of endogenous growth models is that the rate of growth of the economy is strongly tied to the rate of growth of the population (more people, more ideas). The post-WWII years were also the years of the baby boom, and from 1940 to 1970 the median age of Americans actually decreased. In recent times, it is precisely demographics that is being explored as a potential cause of the recent TFP growth slowdown (recently featured in the Nintil links: Hopenhayn et al., 2018, but also Vandenbroucke and Ozimek et al.).
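In semi-endogenous growth models of the Jones type, that population link is explicit: the long-run growth rate of ideas is pinned down by population growth. A minimal sketch, with hypothetical parameter values:

```python
# Semi-endogenous growth (Jones-style): ideas evolve as
#   A_dot = delta * L_R**lam * A**phi,  with population growing at rate n.
# On the balanced growth path this pins down g_A = lam * n / (1 - phi):
# faster population growth means faster long-run idea growth.
# lam and phi below are hypothetical, not calibrated values.
def long_run_idea_growth(n, lam=1.0, phi=0.5):
    return lam * n / (1 - phi)

print(long_run_idea_growth(n=0.015))  # faster, baby-boom-era population growth
print(long_run_idea_growth(n=0.005))  # slower, more recent population growth
```

So a baby boom raises the growth rate on this account, and a demographic slowdown drags it down, which is the mechanism the slowdown papers cited above explore.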
I could not find a paper that explicitly controls for everything I’ve mentioned across countries, so we only have the disparate pieces of evidence I’ve presented above. In the aggregate I do think that my prior belief that WWII (and war, preparation for war, and military R&D spending) was not good for innovation holds; but let me know in the comments if you disagree, and why, and I will revise this post accordingly.
(The code for the plot above is in the Open Nintil repo)
Imperfect because of the difficulty of controlling for all the factors above. Also, it was a very idiosyncratic boost in R&D expenditure. A related question that is also worth studying is what increases in the NIH budget did to diverse metrics.