Military funding of science, as part of the broader history of military technology, has had a powerful transformative effect on the practice and products of scientific research since the early 20th century. Particularly since World War I, advanced science-based technologies have been viewed as essential elements of a successful military.
World War I is often called "the chemists' war", both for the extensive use of poison gas and for the importance of nitrates and advanced high explosives. Poison gas, introduced in 1915 with chlorine produced by the powerful German dye industry, was used extensively by the Germans and the British; over the course of the war, scientists on both sides raced to develop ever more potent chemicals and to devise countermeasures against the newest enemy gases.[1] Physicists also contributed to the war effort, developing wireless communication technologies and sound-based methods of detecting U-boats, resulting in the first tenuous long-term connections between academic science and the military.[2]
World War II marked a massive increase in the military funding of science, particularly physics. In addition to the Manhattan Project and the resulting atomic bomb, British and American work on radar was widespread and ultimately highly influential in the course of the war; radar enabled detection of enemy ships and aircraft and made possible the radar-based proximity fuze. Mathematical cryptography, meteorology, and rocket science were also central to the war effort, with military-funded wartime advances having a significant long-term effect on each discipline. The technologies in use by the war's end, including jet aircraft, radar, proximity fuzes, and the atomic bomb, were radically different from pre-war technology; military leaders came to view continued advances in technology as the critical element for success in future wars. The advent of the Cold War solidified the links between military institutions and academic science, particularly in the United States and the Soviet Union, so that military funding continued to expand even during a period of nominal peace. Funding spread to the social sciences as well as the natural sciences, and emerging fields such as digital computing were born of military patronage. Since the end of the Cold War and the dissolution of the Soviet Union, military funding of science has decreased substantially, but much of the American military-scientific complex remains in place.
The sheer scale of military funding for science since World War II has prompted a large body of historical literature analyzing the effects of that funding, especially on American science. Since Paul Forman's 1987 article "Behind quantum electronics: National security as a basis for physical research in the United States, 1940-1960," there has been an ongoing historical debate over precisely how and to what extent military funding affected the course of scientific research and discovery.[3] Forman and others have argued that military funding fundamentally redirected science, particularly physics, toward applied research, and that military technologies predominantly formed the basis for subsequent research even in areas of basic science; ultimately, the very culture and ideals of science were colored by extensive collaboration between scientists and military planners. Daniel Kevles has presented an alternative view: while military funding provided many new opportunities for scientists and dramatically expanded the scope of physical research, scientists by and large retained their intellectual autonomy.