Can marketplace science be trusted?
Four years after the first issue of Nature was published, the US National Academy of Sciences (NAS) faced an existential crisis. In October 1873, one of its original members demanded the expulsion of another member for swindling.
Josiah Whitney, California’s state geologist, accused Benjamin Silliman Jr, professor of applied chemistry at Yale University in New Haven, Connecticut, of accepting large sums from California oil companies in return for favourable, possibly fraudulent, science. Silliman responded forcefully that company funding for science was evidence of responsibility, not misconduct: companies needed objective “technical opinions”. Without science, swindling would be more common, he argued.
NAS president Joseph Henry, secretary of the Smithsonian Institution and a former consultant to Samuel F. B. Morse, inventor of the telegraph, had to agree. If the NAS expelled every member who had ever consulted for a private company, it would not survive. Henry rejected the efforts to remove Silliman. More importantly, he resolved to expand the NAS membership; new members were to be judged on the basis of their research, not on the source of their income [1]. By the 1870s, it was already clear that industry relied on science.
The Silliman–Whitney controversy marked a watershed in the relationship between science and industry. For US scientists, as well as many in Britain and Europe, private companies had become valuable patrons, supplying both funds for research and problems to be researched, and gainful employers providing short-term commissions. Likewise, companies regarded scientists and their findings as valuable to the development of their respective industries.
Over the next 150 years, relations between science and industry continued to evolve — in four significant stages. Scientists moved from part-time consultants to full-time corporate researchers, and then to academic entrepreneurs. Industry grew from a scattering of local businesses to a concentration of large companies, and on to multinational corporations with global reach. Although these transformations might seem symbiotic, and even inevitable, the very fact that US scientists and industries emerged as leaders and exemplars (in terms of employment, funding, publishing, patenting and innovating) serves as a cautionary reminder of the contingent nature of such developments.
Consultancy (1820–80)
At the heart of the NAS crisis was an essential tension in the relations between science and industry: can the pursuit of knowledge be corrupted by the pursuit of profit? To Whitney and his allies, the answer was obviously yes. Their ‘pure’ science needed to be practised in places protected from the profit motive, such as government agencies or well-endowed universities. Silliman and supporters of ‘applied’ science, by contrast, believed the interactions between science and industry to be mutually advantageous. Indeed, the emergence of a distinct kind of endeavour called applied science characterized a new era in which research would address more and more industrial concerns, and private enterprise would, ideally, become a steady supporter of that work [2].
The profession of scientific consulting goes back to the early nineteenth century, when individuals or groups of capitalists occasionally commissioned scientists to examine prospects in farming, mining, transportation (canals and railroads) and manufacturing. These fee-for-expertise engagements were short term and advisory. By the 1870s, changes in US commercial law (similar to those in British and European law) allowed the formation of limited-liability, joint-stock companies. These businesses, with their large pools of funds and numerous shareholders looking for investment assurances, regularly consulted scientists. As the engagements became both more routine (continuous testing and analysing of existing products and processes) and more investigative, scientists began to receive lucrative contracts and retainers [1].
In the United States, geologists were among the most active consultants during the Gilded Age, a period of rapid economic growth from the 1870s to the 1890s, especially in precious-metal mining in the area west of the Mississippi River. In Britain and Germany, the most prolific consultants were chemists, because of their essential expertise in new products such as acids, soaps, paints and especially synthetic dyes, including mauve and alizarin. Consulting chemists also found themselves in prominent public roles as expert witnesses in sensational patent cases. Witness-box quarrelling among chemists made good newspaper copy, and it highlighted profound developments in the chemical industries. Changes in patent law in the United States, Britain and Germany allowed inventors to claim those new chemical products and processes as their intellectual property (IP) instead of judging them to be scientific discoveries, which were, by definition, unpatentable.
Industry (1880–1940)
At the turn of the twentieth century, the independent consulting scientist was replaced by the salaried researcher in new industrial laboratories. These labs represented the incorporation of applied science; that is, the creation of a separate place within the organization for ‘research and development’ — a phrase that entered the lexicon at this time.
In Germany, the largest dye companies, such as Bayer, Hoechst and BASF, were the first to establish dedicated labs for chemical research. These were connected to production departments, also staffed by university-trained chemists, and to specialized legal departments, from which the new products and processes were submitted for patenting. This type of industrialized invention, with close connections between German academic chemistry and company labs, was firmly established before the First World War [3].
In the United States, the prototype for the industrial research lab appeared in the electrical industry, when inventor Thomas Edison set up an ‘invention factory’ in Menlo Park, New Jersey, in 1876. Edison wanted to replace what had been an unpredictable act of creative genius with a regular and reliable system. He recruited machinists, mechanics, chemists, physicists and mathematicians to work on technical problems connected to telegraphy and electric lighting. Although their efforts were collaborative, only the ‘Wizard of Menlo Park’ (the singular inventor) was listed on more than 1,000 US patents, including those for the phonograph (1878) and electric light bulb (1880) [4].
The looming expiration of that original light-bulb patent and the threat from other lighting companies impelled General Electric (GE), the corporation that took over Edison’s Electric Light Company and all his patents, to establish the aptly named Research Laboratory in 1900 in Schenectady, New York. This proved profitable within a decade — commercially, with the invention of a new light bulb that restored GE to its dominant market position, and professionally, with the recruitment of more than 250 engineers and scientists.
A few other large US corporations followed suit and pioneered their own formal research and development (R&D) labs — DuPont (1903), Westinghouse Electric (1904), American Telephone and Telegraph (AT&T, 1909) and Eastman Kodak (1912).
It was the First World War and the embargo on all German products, especially chemicals, that was the catalyst for the golden age of ‘industrial research’, a neologism of the 1920s. Between 1919 and 1936, US corporations established more than 1,100 labs in nearly all industries — petroleum, pharmaceuticals, cars, steel — thereby dominating the world’s industrial research. In 1921, these labs employed roughly 3,000 engineers and scientists; by 1940, there were more than 27,000 researchers. At the end of the Second World War, the figure was nearly 46,000 [5].
This remarkable proliferation reflected the massive scale of vertically integrated corporations that controlled nearly all areas of their respective industries, from natural resources through R&D to mass production and mass marketing. Industrial research was also fuelled by radical changes in US patent law that allowed these behemoths to claim the IP of their employees. The inventor was now the corporation.
During the Great Depression, critics singled out modern big business for its ruinous consequences to society — unemployment, overproduction and bankruptcy. Having research in thrall to industry raised the alarm, again, that capitalism corrupted science. So corporate captains and R&D directors marshalled the cornucopia of wondrous consumer products (‘technology’ in the new parlance) created by their science-based industries. In this story, science in industry was good; it guaranteed efficacy, efficiency and safety. In words that nineteenth-century consulting scientists would have understood, consumers could trust these modern technologies (and their corporations) because of the R&D.
Read the full article at www.nature.com