In virtually any college, you’ll find the departments that represent the trinity of the basic sciences: biology, chemistry, and physics. The fact that these departments are their own separate entities may reinforce the illusion that the subjects taught are distinct from one another. In reality, it’s difficult to impose rigorous boundaries between these disciplines. Scientific knowledge flows back and forth between seemingly distinct disciplines. Even among scientists and engineers, these delineations are constantly revised, deconstructed or reinforced.
I am currently a second-year medical student. Looking back, one of the most challenging (but rewarding) aspects of being a pre-med was completing my pre-medical course requirements across a number of different disciplines. These weren’t just classes to weed out physician-hopefuls; the concepts I learned were essential for elucidating the pathophysiology of many diseases. Chemistry taught me how to understand the reactions in organic chemistry, which allowed me to understand the processes of protein interactions in biochemistry, which in turn helped me piece together the mechanisms of molecular diseases. By becoming familiar with the basics of a broad array of scientific disciplines, I was free to mix and match these concepts as needed in determining why a patient was so sick.
One of my personal idols, chemist Paul Lauterbur, famously said that all science is interdisciplinary. To prove his point, he underscored in his 2003 Nobel Laureate lecture that he, though a chemist, would be sharing the prize for physiology or medicine with a physicist. In his words, the formal categorization of scientific knowledge exists for administrative and didactic convenience rather than ontological reality.
In fact, this interdisciplinary nature can be observed in Lauterbur’s scientific contributions, which made the development of magnetic resonance imaging possible. He combined concepts from different fields to create a successful, novel technology.
Lauterbur’s career trajectory itself highlights the importance of an interdisciplinary approach to science. Lauterbur was observing a mouse tissue sampling study via NMR in a biology lab when he first devised the idea of imaging with NMR. He then consulted with local mathematicians to see whether his theories were feasible, and they validated his ideas. To test whether his theory could be realized through radiofrequency coils, he consulted a physics textbook, “The Principles of Nuclear Magnetism,” by Anatole Abragam. With this, he completed a series of experiments that confirmed his ideas. Lauterbur succeeded because he did not strictly categorize his work and was comfortable using ideas from other fields.
Following the publication of his work in Nature, Lauterbur invited several scientists from multiple disciplines to share data and collaborate on projects. He could slip between chemistry, biology, mathematics and physics and combine ideas from all of these fields. He encouraged collaboration among various scientists and caught the attention of businesses interested in developing machines with tremendous applications in medicine. And while many approached him after his success to say that they or their mentors had come up with similar ideas in the past, Lauterbur distinguished himself from the rest by actually realizing his idea, thanks to this revolutionary mindset.
Many upper-level science courses, particularly in a field as broad as biology, require extensive knowledge of other disciplines. Fortunately, many university departments offer a number of opportunities that encourage cross-disciplinary thinking and application. In addition, many exciting new applications in the sciences are already creating interdisciplinary collaborations. All science majors should seek instruction outside of their immediate majors to become more capable of linking seemingly disparate ideas together. In this way, an interdisciplinary scientific education leads to innovation and success.
Yoo Jung is a medical student at Stanford University. She is a recent graduate of Dartmouth College and a co-author of the book, “What Every Science Student Should Know,” a guide for college students interested in majoring in STEM (University of Chicago Press). You can find her on Twitter @YooJKim and on her website at yoojkim.com.
Scientific literacy – what it is, how to recognize it, and how to help people achieve it through educational efforts – remains a difficult topic. The latest attempt to inform the conversation is a recent National Academies report, “Science Literacy: Concepts, Contexts, and Consequences” (https://www.nap.edu/download/23595).
While there is a lot of substance to take away from the report, three quotes seem particularly telling to me. The first is from Roberts [1], who points out that scientific literacy has “become an umbrella concept with a sufficiently broad, composite meaning that it meant both everything, and nothing specific, about science education and the competency it sought to describe.”
The second quote, from the report’s authors, is that “In the field of education, at least, the lack of consensus surrounding science literacy has not stopped it from occupying a prominent place in policy discourse” (p. 2.6). And finally, “the data suggested almost no relationship between general science knowledge and attitudes about genetically modified food, a potentially negative relationship between biology-specific knowledge and attitudes about genetically modified food, and a small, but negative relationship between that same general science knowledge measure and attitudes toward environmental science” (p. 5.4).
“Flat Earth” – the Flammarion engraving (1888). Credit: Wikipedia
Recognizing the scientifically illiterate
So, perhaps it would be useful to consider the question of scientific literacy from a different perspective, namely, how can we recognize a scientifically illiterate person from what they write or say? What clues imply illiteracy?[1]
To start, let us consider the somewhat simpler situation of standard literacy. Assume we ask a person a clearly composed question; we might expect the illiterate person to have trouble correctly interpreting what a reasonable answer should contain. Constructing a literate answer implies two distinct abilities: the respondent needs to be able to accurately interpret what the question asks, and they need to recognize what an adequate answer contains.
These are not innate skills; students need feedback and practice in both, particularly when the question is a scientific one. In my own experience with teaching, as well as in data collected in the context of an introductory course [2], all too often a student’s answer consists of a single technical term, spoken (or written) as if a word = an argument or explanation.
We need a more detailed response in order to accurately judge whether an answer addresses what the question asks (whether it is relevant) and whether it has a logical coherence and empirical foundations, information that is traditionally obtained through a Socratic interrogation.[2] At the same time, an answer’s relevance and coherence serve as a proxy for whether the respondent understood (accurately interprets) what was being asked of them.
So what is added when we move from standard to scientific literacy? What is missing from the illiterate response? At the simplest level, we are looking for mistakes, irrelevancies, and failures in logic or in recognizing contradictions within the answer, explanation, or critique. The presence of unnecessary language suggests, at the very least, a confused understanding of the situation.[3]
A second feature of a scientifically illiterate response is a failure to recognize the limits of scientific knowledge; this includes an explicit recognition of the tentative nature of science, combined with the fact that some things are, theoretically, unknowable scientifically. For example, is “dark matter” real or might an alternative model of gravity remove its raison d’être?[4]
When people speculate about what existed before the “big bang” or what is happening in various unobservable parts of the multiverse, have they left science for fantasy? Similarly, speculation on the steps leading to the origin of life on Earth (including what types of organisms, or perhaps better put, living or pre-living systems, existed before the “last universal common ancestor”), the presence of “consciousness” outside of organisms, or the probability of life elsewhere in the universe can be seen as transcending either what is knowable or what is likely to be knowable without new empirical observations.
While this can make scientific pronouncements somewhat less dramatic or engaging, respecting the limits of scientific discourse avoids doing violence to the foundations upon which the scientific enterprise is built. It is worth being explicit: universal truth is beyond the scope of the scientific enterprise.
The limitations of scientific explanations
Acknowledging the limits of scientific explanations is a marker of understanding how science actually works. As an example, while a drug may be designed to treat a particular disease, a scientifically literate person would reject the premise that any such drug could, given the nature of its interactions with other molecular targets and physiological systems, be without side effects, and would recognize that these side effects will vary depending upon the features (genetic, environmental, historical, physiological) of the individual taking the drug. While scientific knowledge reflects a social consensus, it is constrained by rules of evidence and logic (although this might appear to be anachronistic in the current post-fact age).
Even though certain ideas are well established (Laws of Conservation and Thermodynamics, and a range of evolutionary mechanisms), it is possible to imagine exceptions (and revisions). Moreover, since scientific inquiry is (outside of some physics departments) about a single common Universe, conclusions from different disciplines cannot contradict one another – such contradictions must inevitably be resolved through modification of one or the other discipline. A classic example is Lord Kelvin’s estimate of the age of the Earth (~20-50 million years) and estimates of the time required for geological and evolutionary processes to produce the observed structure of the Earth and the diversity of life (hundreds of millions to billions of years), a contradiction resolved in favor of an ancient Earth by the discovery of radioactivity.
Scientific illiteracy in the scientific community
There are also suggestions of scientific illiteracy (or perhaps better put, sloppy and/or self-serving thinking) in much of the current “click-bait” approach to the public dissemination of scientific ideas and observations. All too often, scientific practitioners, who we might expect to be as scientifically literate as possible, abandon the discipline of science to make claims that are over-arching and often self-serving (this is, after all, why peer-review is necessary).
A common example [of scientific illiteracy practiced by scientists and science communicators] is provided by studies of human disease in “model” organisms, ranging from yeasts to non-human primates. There is no doubt that such studies have been, and continue to be, critical to understanding how organisms work (and are certainly deserving of public and private support), but their limitations need to be made explicit: while a mouse that displays behavioral defects (for a mouse) might well provide useful insights into the mechanisms involved in human autism, an autistic mouse may well be a scientific oxymoron.
Discouraging scientific illiteracy within the scientific community is challenging, particularly in the highly competitive, litigious,[5] and high-stakes environment we currently find ourselves in.[6] How best to help our students, both within and outside the scientific disciplines, avoid scientific illiteracy remains unclear, but it is likely to involve establishing a culture of Socratic discourse (as opposed to posturing). Understanding what a person is saying, what empirical data and assumptions it is based on, and what it implies or predicts are necessary features of literate discourse.
1. Roberts, D.A. Scientific literacy/science literacy. In S.K. Abell & N.G. Lederman (Eds.), Handbook of Research on Science Education (pp. 729-780). Mahwah, NJ: Lawrence Erlbaum, 2007.
2. Klymkowsky, M.W., J.D. Rentsch, E. Begovic, and M.M. Cooper. The design and transformation of Biofundamentals: a non-survey introductory evolutionary and molecular biology course. LSE Cell Biol Edu, 2016, in press.
3. Lee, H.-S., O.L. Liu, and M.C. Linn. Validating measurement of knowledge integration in science using multiple-choice and explanation items. Applied Measurement in Education, 2011, 24(2): 115-136.
4. Henson, K., M.M. Cooper, and M.W. Klymkowsky. Turning randomness into meaning at the molecular level using Muller’s morphs. Biol Open, 2012, 1: 405-10.
[1] Assuming, of course, that what a person says reflects what they actually think, something that is not always the case.
[2] This is one reason why multiple-choice concept tests consistently over-estimate students’ understanding [3].
[3] We have used this kind of analysis to consider the effect of various learning activities [4].
Mike Klymkowsky is a Professor of Molecular, Cellular, and Developmental Biology at the University of Colorado Boulder. In the area of biology education research, he developed (with Kathy Garvin-Doxas) the NSF-supported Biological Concepts Instrument (BCI), as well as a suite of virtual laboratory activities in molecular biology with Tom Lundy. He has been involved with the general question of how to develop more rigorous, coherent, and engaging courses and curricula in the biological sciences, including a re-designed introductory evolutionary, molecular, and systems biology course – Biofundamentals and general chemistry – Chemistry, Life, the Universe & Everything (CLUE), both with Melanie Cooper (Chemistry – Michigan State University). See “About this Blog” for more on Mike’s work.
Forecasters have predicted an increased chance of a colder-than-usual start to this winter due to a change in Arctic conditions and ‘unusual’ tropical rainfall patterns.
The Met Office said the conditions mean there is a 30 per cent likelihood the mercury will plunge at the beginning of this winter – the highest risk of a cold start since the bitterly cold season of 2010/11.
But the agency said it was too early to predict whether it would be a snowy, wet or dry three-month period from November.
Reports from Seal River, just north of Churchill, posted by Churchill Wild on July 26 were crowing about seeing lots of bears onshore, with a veritable beehive of activity over the weekend of 16/17 July:
“This has without a doubt been Churchill Wild’s most spectacular start to the summer polar bear watching season. …Bear numbers are up spectacularly this year and all are looking very fat and healthy, perhaps much to the chagrin of climate change “experts.” Our best day for the seductive white carnivores over the past week featured 21 polar bears sighted between the Lodge and our whale swim spot! … The ice pack, which was still visible a week ago [i.e, 17 July or so], has finally dissipated and pushed a large number of bears on to our coastline here at Seal River, with the end result being many very happy cameras!” [my bold]
You can’t believe the ridiculous stuff the Alarmists and the MSM push!!
Researchers at the University of Auckland studied aerial images (pictured) of the islands from 1945 up until 2010 and found that a new island has grown from decimated remains
Those (non) disappearing Pacific Islands: Björn Lomborg, writing for the Wall Street Journal –
Once a year or so, journalists from major news outlets travel to the Marshall Islands, a remote chain of volcanic islands and coral atolls in the Pacific Ocean, to report in panicked tones that the island nation is vanishing because of climate change. Their dispatches are often filled with raw emotion and suggest that residents are fleeing atolls swiftly sinking into the sea.
The truth is that a peer-reviewed study, published in the September 2015 issue of Anthropocene, revealed that since the middle of the 20th century the total land area of the islands has actually grown.
. . . . . . .
The media and Carbon Pollution. They call it carbon pollution. This is a lie. They are actually, and non-factually, referring to Carbon Dioxide Pollution. Look at this story from the Alarmist Sydney Morning Herald (SMH). (There are a zillion examples like this from the Main Stream Media (MSM).)
Australia emitted 549.3 mega-tonnes (Mt) of carbon dioxide in 2014-15, up 0.8 per cent on the year before but down nearly 3 per cent on projections. Emissions increases were recorded in the electricity, transport, fugitive emissions and industrial and power generation sectors and were offset only by a strong decline in agricultural emissions.
Is the SMH ignorant? Is it really Carbon Dioxide Pollution?
Carbon dioxide is a colorless, odourless, nontoxic gas. We exhale it. Carbon dioxide is a necessary component for photosynthesis and the growth of green vegetation. (link)
What happens when they can’t get their “dirty weather,” as Al Gore calls it? Then they’ll just have to define down what a disaster is.
Eleven years ago, Gore swore that “the science is extremely clear now.” Global warming was “magnifying” the “destructive power” of the “average hurricane,” he said. Man’s impact on the environment “makes the duration, as well as the intensity of the hurricane, stronger.”
So why does the SMH use a graphic pushing a lie? The Alarmists kept saying that man-made global warming would increase hurricanes in mainland America, but the numbers have fallen to zero. What do these clowns do? Redefine hurricanes.
These clowns, the Alarmists supported by the MSM, have pushed so many false positions, and yet they still think that they are beyond ridicule.
No. The tide has turned. The world has turned its collective back on the man-made global warming hoax. However, it is still costing citizens trillions.
The Polar Vortex is on the move unusually early this year, forecasters have revealed – and say it could strike the US in January.
A recent study claimed Arctic sea-ice loss is causing the Polar Vortex to shift and as a result, winters are expected to get longer and more bitter. Now, forecasters say it is ‘unprecedentedly early’.
Sensational new study shows western government climate models rely on a fatally flawed 1920s algorithm. Scientists say this could be the breakthrough that explains why modern computers are so awful at predicting climate change: the simulations “violate several known Laws of Thermodynamics.”
“This paper examines what was originally calculated as the greenhouse effect theory by Lewis Fry Richardson, the brilliant English mathematician, physicist and meteorologist.
In 1922 Richardson devised an innovative set of differential equations. His ingenious method is still used today in climate models. But unbeknown to Richardson he had inadvertently relied upon unchecked (and fatally flawed) numbers supplied by another well-known British scientist, W. H. Dines.”
Unfortunately for Richardson, Dines wrongly assumed that Earth’s climate is driven by terrestrial (ground) radiation as the only energy source, rather than by the sun. Derek Alker specifically draws attention to the key fact that:
“One of the main points the paper makes is that in the Dines model each layer of the atmosphere is THE energy source, NOT the sun, which is omitted in his table, nor the earth, as the excel model proves.”
Richardson had taken the Dines numbers at face value and did not detect the error when combining them with his own. Alker continues: “The archives show Richardson never double-checked the Dines work (see below) and the records do not show that anyone else has ever exposed it.”
The outcome, says Alker, is that not only has the original Richardson & Charney computer model been corrupted – but all other computer climate models since. All government researchers use these core numbers and believe them to be valid even though what they seek to represent can be shown today as physically impossible.
Alker adds:
“My paper specifically describes how the theory Dines calculated in his paper violates several of the known Laws of Thermodynamics, and therefore does not describe reality.
The greenhouse effect theory we know of today is based on what Richardson had formulated from the Dines paper using unphysical numbers created by Dines. But Dines himself later suggested his numbers were probably unreliable.”
Unfortunately, Dines died in the mid-1920s and did not inform Richardson of the error. Then, in the late 1940s, Richardson began working with another world figure in climate science, Jule Charney, as the duo constructed the world’s first computer climate model. It was then that the dodgy Dines numbers infected the works.
Alker, who studied the archives scrupulously for his research, reports that there is no published evidence that Richardson understood Dines’s calculation method. And we think he and Charney put the Dines numbers into the world’s first computer model verbatim.
In essence, the ‘theory’ of greenhouse gas warming from the Dines numbers can be shown to start with a misapplication of Planck’s Law, which generates grossly exaggerated ‘up’ and nonexistent ‘down’ radiative emissions figures. Then, layer by layer, part of the downward radiation is added to the layer below, which is in violation of the Second Law of Thermodynamics.
Thereby, like a domino effect, this bogus calculation method becomes GIGO (“garbage in, garbage out”) to all computers that run the program. Alker adds:
“What the climate simulations are doing is creating energy layer by layer in the atmosphere that shouldn’t be there (it has no other source than of itself). It is then destroyed layer by layer (it is absorbed and then discarded – in effect destroyed). This is all presented in such a way to give the appearance that energy is being conserved, when it is not being conserved.”
****
[1] Alker D.,‘Greenhouse Effect Theory within the UN IPCC Computer Climate Models – Is It A Sound Basis?’ (October 30, 2016), principia-scientific.org; https://principia-scientific.com/publications/PROM/GHE-UNIPCC.pdf (accessed online: November 02, 2016)
Planting trees is a cost-effective way to tackle urban air pollution, which is a growing problem for many cities. A study by the US-based Nature Conservancy (TNC) reported that the average reduction of particulate matter near a tree was between 7% and 24%.
Particulate matter (PM) consists of microscopic particles that become trapped in the lungs of people breathing polluted air. PM pollution could claim an estimated 6.2 million lives each year by 2050, the study suggests. Lead author Rob McDonald said that city trees were already providing a lot of benefits to people living in urban areas.
“The global average surface temperature has increased over the 20th century by about 0.6°C.”
This Met Office graph of global temperatures for 1860-2000 was included in the IPCC’s 2001 report. It provided visual clarification of the 0.6°C temperature increase since 1900.
Overview: The sun has been completely spotless on 21 days in 2016 and it is currently featuring just one lonely sunspot region. In fact, on June 4th of this year, the sun went completely spotless for the first time since 2011 and that quiet spell lasted for about four days.
Sunspot regions then reappeared for the next few weeks on a sporadic basis, but that was followed by several more completely spotless days on the surface of the sun. The increasingly frequent blank sun is a sign that the next solar minimum is approaching and there will be an even greater number of spotless days over the next few years.
At first, the blankness will stretch for just a few days at a time, then it will continue for weeks at a time, and finally it should last for months at a time when the sunspot cycle reaches its nadir. The next solar minimum phase is expected to take place around 2019 or 2020. The current solar cycle is the 24th since 1755, when extensive recording of sunspot activity began, and it is the weakest in more than a century, with the fewest sunspots since cycle 14 peaked in February 1906.
Every year it seems we get a new study linking cold winters to global warming and “melting” Arctic ice. Only last month I debunked the latest attempt, and I assumed this must be the same study. Turned out I was wrong! This is what Unscientific American has to say about the latest paper:
The polar vortex in recent years has brought the kind of miserable cold to northern states that made it hard to breathe outside. We’re probably in for more of the same.
That’s the finding of a new study published yesterday in the journal Nature: as the Arctic warms, the polar vortex is shifting toward Europe. That in turn will bring more bursts of frigid cold to North America.
Those temperature drops could lead to miserable days in February and March, the research finds. Conversely, those drops in temperature could offset some of global warming’s effect in those regions, said Martyn Chipperfield, professor of atmospheric chemistry at the University of Leeds and a co-author of the paper.
“Climate change can lead to extremes; it’s not like a regular change, everyone to the same extent at all times and places,” he said. “Despite the overall warming, you can get in places like the Northeastern U.S. extreme cold events. That’s consistent with climate change and global warming.”
The polar vortex is a fast-moving band of air that encircles the frigid Arctic in winter months and traps the cold air there. Its movement is part of a decades-long change.
The polar vortex has actually “shifted persistently” away from North America and into Europe and Asia over the last 30 years, researchers found. That results in cooling over North America but warmer winters in Europe.
As global warming decreases sea ice, the sun’s warmth absorbed by the ocean is instead released from the ocean for a longer period of time, which disrupts the vortex.
When the vortex weakens, a growing number of climate scientists argue, the cold Arctic air migrates to lower latitudes, as happened in early 2014 and 2015. The sudden and somewhat prolonged burst of cold broke pipes and water mains and more than doubled energy bills in places like New York and New England as it wreaked havoc across a wide swath of the country.
It was 7 or 8 years ago, when I was sitting in my office at the university. At that time I was a Ph.D. student in the Machine Intelligence group at the computer science department. One of the department professors knocked on my door; he was looking for my supervisor because he needed advice on how to perform some test to verify some properties of his recent experiments. Well, Finn is not here, but maybe I can help. What’s your problem? He wanted to know about different statistical tests and how and why they work. Why they work? This question puzzled me immensely, because I had taken many advanced classes in statistics and read countless books on the subject, but I could not recall any of them answering or discussing why they work.
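That question is worth making concrete. As a minimal sketch (my own illustration, not anything from the conversation above, and not whatever specific test the professor needed), the logic behind a simple two-sample permutation test can be written out in a few lines of Python: if the null hypothesis is true and the group labels are arbitrary, shuffling the labels should produce differences at least as large as the observed one reasonably often.

```python
import numpy as np

def permutation_test(a, b, n_permutations=10_000, seed=0):
    """Two-sample permutation test for a difference in means.

    Why it works: under the null hypothesis both samples come from the
    same distribution, so the group labels are arbitrary.  Shuffling the
    labels many times builds the distribution of the statistic expected
    by chance; the p-value is the fraction of shuffles producing a
    difference at least as extreme as the observed one.
    """
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # relabel the observations at random
        diff = pooled[:a.size].mean() - pooled[a.size:].mean()
        if abs(diff) >= abs(observed):
            hits += 1
    return observed, hits / n_permutations

# Hypothetical data, purely for illustration
group_a = [5.1, 4.8, 6.2, 5.9, 5.5]
group_b = [4.2, 4.9, 4.4, 5.0, 4.1]
diff, p = permutation_test(group_a, group_b)
print(f"observed difference = {diff:.2f}, two-sided p ≈ {p:.3f}")
```

Nothing in this sketch invokes a distributional formula; the “why” sits directly in the resampling step, which is one way to start answering the professor’s question before reaching for textbook tests.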
Marine scientists have detected a tenfold increase in the population of phytoplankton species in the North Atlantic from 1965 to 2010. The rapid growth rate of the marine microorganisms corresponds with an increase in carbon dioxide levels. (Jeff Schmaltz/NASA Earth Observatory | Wikimedia Commons)
******
Abnormal levels of carbon dioxide in the North Atlantic are being linked to the rapid growth of the plankton population in the ocean over the past 45 years, according to a study featured in the journal Science.
A team of marine researchers, led by associate professor Anand Gnanadesikan of Johns Hopkins University, discovered that the population of microscopic marine algae known as coccolithophores in the North Atlantic experienced a tenfold increase from 1965 to 2010.
This recent finding contradicts earlier assumptions made by scientists that the phytoplankton would find it difficult to produce plates of calcium carbonate as ocean waters become increasingly acidic.
A nearly invisible layer of microscopic, plantlike organisms drifts through the surface of the ocean all over the globe, their movements mostly at the whim of ocean currents. When conditions are right, populations of these tiny creatures explode in vast “blooms” that span miles and miles of ocean. Called phytoplankton, they are the base of the entire ocean food web.
Like plants, phytoplankton have chlorophyll to absorb sunlight and use photosynthesis to produce food. In the Arctic Ocean, photosynthesis is frequently limited by the low angle of the winter Sun and short daylight hours, as well as by the blanket of sea ice that melts and refreezes with the changing seasons.
Over the last 30 years, however, the Arctic has warmed, and larger areas of the Arctic Ocean are now free of sea ice in the summer, which means phytoplankton are getting more sunlight. The result is that phytoplankton productivity has increased by about 20 percent based on satellite estimates of the amount of chlorophyll in the water.
Growth of microscopic phytoplankton in the Southern Ocean could double in the next 80 years because of climate change, according to scientists. The microscopic organisms form the basis of the entire food web, feeding everything from small fish and krill to giant whales.
Professor Phillip Boyd from the Institute of Marine and Antarctic Studies (IMAS) in Hobart has been measuring how changes in conditions affect phytoplankton in a lab setting. He has been using sophisticated modelling to predict changes.
He said oceans are warming because of climate change, and are also absorbing more carbon and becoming more acidic. Nutrient levels are also rising, and there are changes to the amount of light that penetrates the ocean.
“We designed an experiment where we basically threw the kitchen sink at this phytoplankton species in the lab,” he said.
“We changed the acidity of the ocean, the nutrients supplied, the amount of light and the amount of this trace element iron.
“We found that in a future ocean, in the sub-Antarctic waters our lab experiment showed that the growth rate will be about twice as high as it is at present.” He said the change would mean significant knock-on effects higher in the food chain. “For example the species diatoms are important food for krill and for other species,” he said.
“So you would expect then to see higher levels of productivity and therefore a more productive food web in the future for the sub-Antarctic.”
Results may vary between hemispheres
But the increase in productivity is not likely to extend to oceans in the northern hemisphere. “When we talk about climate change and a changing ocean, the stories are normally very doom and gloom,” Professor Boyd said.
“But really it’s going to be a balance, there will be some regions of the ocean, particularly at the higher latitudes, where we may see more productivity. The flip side of that is that, from other computer modelling simulations, they’re suggesting that at the lower latitudes we may see a decrease in productivity.
“So there may be quite a change from region to region in ocean productivity.”
He said people had sometimes attempted these experiments changing only one or two of the conditions likely to be affected by climate change. He warned that his findings showed that not modelling all of the forecast changes produced extremely varied results.
“You may get a very different picture,” he said. “For example, in a different experiment published as part of this study we see that if we only change four out of the five properties we get a very different result.
“It looks like a very detrimental effect on phytoplankton.” The findings will be published in the journal Nature Climate Change today.
Written by Henrik Svensmark, DTU Space, National Space Institute, Technical University of Denmark
Now and then new results appear that suggest that the idea of cosmic ray influence on clouds and terrestrial climate does not work. “Sun-clouds-climate connection takes a beating from CERN” is the latest news story which is based on a new paper from the CLOUD collaboration at CERN [1].
It is important to note that the new CLOUD paper is not presenting an experimental result with respect to the effect of cosmic-ray-generated ions on clouds, but a result of numerical modeling. CLOUD uses its experimental measurements to estimate the typical nucleation of various aerosols of small size (1-3 nm). However, for an aerosol to affect clouds (and climate) it must first grow to 50-100 nm to become a cloud condensation nucleus (CCN). CLOUD then uses a numerical model to estimate the effect of cosmic rays on the growth process, and finds that the effect of cosmic rays on the number of CCN over a solar cycle is insignificant.
[The Altiplano-Puna plateau in the central Andes features vast plains punctuated by spectacular volcanoes, such as the Lazufre volcanic complex in Chile seen here.] Credit: Noah Finnegan
A new analysis of the topography of the central Andes shows the uplifting of Earth’s second highest continental plateau was driven in part by a huge zone of melted rock in the crust, known as a magma body.
The Altiplano-Puna plateau is a high, dry region in the central Andes that includes parts of Argentina, Bolivia, and Chile, with vast plains punctuated by spectacular volcanoes. In a study published October 25 in Nature Communications, researchers used remote sensing data and topographic modeling techniques to reveal an enormous dome in the plateau.
About 1 kilometer (3,300 feet) high and hundreds of miles across, the dome sits right above the largest active magma body on Earth. The uplifting of the dome is the result of the thickening of the crust due to the injection of magma from below, according to Noah Finnegan, associate professor of Earth and planetary sciences at UC Santa Cruz and senior author of the paper.
“The dome is Earth’s response to having this huge low-density magma chamber pumped into the crust,” Finnegan said.
The uplifting of the dome accounts for about one-fifth of the height of the central Andes, said first author Jonathan Perkins, who led the study as a graduate student at UC Santa Cruz and is now at the U.S. Geological Survey in Menlo Park, Calif.
“It’s a large part of the evolution of the Andes that hadn’t been quantified before,” Perkins said.
The other forces uplifting the Andes are tectonic, resulting from the South American continental plate overriding the Nazca oceanic plate. The subduction zone where the Nazca plate dives beneath the western edge of South America is the source of the magma entering the crust and feeding volcanic activity in the region. Water released from the subducting slab of oceanic crust changes the melting temperature of the overlying wedge of mantle rock, causing it to melt and rise into the overriding plate.
Perkins and Finnegan worked with researchers at the University of Arizona who had used seismic imaging to reveal the remarkable size and extent of the Altiplano-Puna magma body in a paper published in 2014. That study detected a huge zone of melted material about 11 kilometers thick and 200 kilometers in diameter, much larger than previous estimates.
“People had known about the magma body, but it had not been quantified that well,” Perkins said. “In the new study, we were able to show a tight spatial coupling between that magma body and this big, kilometer-high dome.”
Based on their topographic analysis and modeling studies, the researchers calculated the amount of melted material in the magma body, yielding an estimate close to the previous calculation based on seismic imaging. “This provides a direct and independent verification of the size and extent of the magma body,” Finnegan said. “It shows that you can use topography to learn about deep crustal processes that are hard to quantify, such as the rate of melt production and how much magma was pumped into the crust from below.”
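For a rough sense of the scale involved (a back-of-envelope illustration using only the dimensions quoted above, not the topographic-modeling calculation the researchers actually performed), treating the seismically imaged melt zone as a simple cylinder gives a volume of a few hundred thousand cubic kilometers:

```python
import math

# Illustrative only: approximate the melt zone as a cylinder, using the
# dimensions quoted from the 2014 seismic study (~11 km thick, ~200 km across).
thickness_km = 11.0
diameter_km = 200.0

volume_km3 = math.pi * (diameter_km / 2) ** 2 * thickness_km
print(f"Cylindrical approximation: ~{volume_km3:,.0f} cubic kilometers of partial melt")
# Prints roughly 345,000 km^3; the real body is irregular, so treat this
# as an order-of-magnitude figure only.
```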
The Altiplano-Puna Volcanic Complex was one of the most volcanically active places on Earth starting about 10 million years ago, with several super-volcanoes producing massive eruptions and creating a large complex of collapsed calderas in the region. Although no major eruptions have occurred in several thousand years, there are still active volcanoes and geothermal activity in the region. In addition, satellite surveys of surface deformation since the 1990s have shown that uplifting of the surface is continuing to occur at a relatively rapid rate in a few places. At Uturuncu volcano located right in the center of the dome, the uplift is about 1 centimeter (less than half an inch) per year.
“We think the ongoing uplift is from the magma body,” Perkins said. “The jury is still out on exactly what’s causing it, but we don’t think it’s related to a supervolcano.”
The growth of the crust beneath the Altiplano-Puna plateau, driven by the intrusion of magma from below, is a fundamental process in the building of continents. “This is giving us a glimpse into the factory where continents get made,” Perkins said. “These big magmatic systems form during periods called magmatic flare-ups when lots of melt gets injected into Earth’s crust. It’s analogous to the process that created the Sierra Nevada 90 million years ago, but we’re seeing it now in real time.”
In addition to Perkins and Finnegan, the coauthors of the paper include Kevin Ward, George Zandt, and Susan Beck at the University of Arizona and Shanaka de Silva at Oregon State University. This research was funded by the National Science Foundation.
Jonathan P. Perkins, Kevin M. Ward, Shanaka L. de Silva, George Zandt, Susan L. Beck, Noah J. Finnegan. Surface uplift in the Central Andes driven by growth of the Altiplano Puna Magma Body. Nature Communications, 2016; 7: 13185 DOI: 10.1038/NCOMMS13185
Sunday’s early-morning quake near the town of Norcia is the biggest in Italy since the Magnitude-6.9 Irpinia event in the south of the country in 1980.
Back then, some 2,500 people died and more than 7,000 were injured. Thankfully, we are not expecting loss of life on that scale here. In part this is because of the strides made in recent years in improving readiness and reaction. But the fact that the population of central Italy is currently living on such an acute alert status also will have further limited any dreadful consequences.
We have now seen three Magnitude-6 tremors in Italy’s Apennines region in just three months.