The Internet of Things, activated through apps, promises tremendous convenience to homeowners. But it may also prove irresistible to hackers. [Photo Credit: Carlos Gonzalez for The New York Times]
SAN FRANCISCO — The so-called Internet of Things, its proponents argue, offers many benefits: energy efficiency, technology so convenient it can anticipate what you want, even reduced congestion on the roads.
Now here’s the bad news: Putting a bunch of wirelessly connected devices in one area could prove irresistible to hackers. And it could allow them to spread malicious code through the air, like a flu virus on an airplane.
Researchers report in a paper to be made public on Thursday that they have uncovered a flaw in a wireless technology that is often included in lights, switches, locks, thermostats and many of the other components of the much-ballyhooed “smart home” of the future.
The researchers, from the Weizmann Institute of Science near Tel Aviv and Dalhousie University in Halifax, Canada, focused on the Philips Hue smart light bulb and found that the wireless flaw could allow hackers to take control of the bulbs.
That may not sound like a big deal. But imagine thousands or even hundreds of thousands of internet-connected devices in close proximity. Malware created by hackers could be spread like a pathogen among the devices by compromising just one of them.
And they wouldn’t have to have direct access to the devices to infect them: The researchers were able to spread infection in a network inside a building by driving a car 229 feet away.
Just two weeks ago, hackers briefly denied access to whole chunks of the internet by creating a flood of traffic that overwhelmed the servers of a New Hampshire company called Dyn, which helps manage key components of the internet.
Security experts say they believe the hackers found the horsepower necessary for their attack by taking control of a range of internet-connected devices, but the hackers did not use the method detailed in the report being made public Thursday. One Chinese wireless camera manufacturer said weak passwords on some of its products were partly to blame for the attack.
Though it was not the first time hackers used the Internet of Things to power an attack, the scale of the effort against Dyn was a revelation to people who didn’t realize that having internet-connected things knitted into daily life would come with new risks.
“Even the best internet defense technologies would not stop such an attack,” said Adi Shamir, a widely respected cryptographer who helped pioneer modern encryption methods and is one of the authors of the report.
The new risk comes from a little-known radio protocol called ZigBee. Created in the 1990s, ZigBee is a wireless standard widely used in home consumer devices. While it is supposed to be secure, it has not been subjected to the kind of scrutiny applied to other security methods used around the internet.
In virtually any college, you’ll find the departments that represent the trinity of the basic sciences: biology, chemistry, and physics. The fact that these departments are their own separate entities may reinforce the illusion that the subjects taught are distinct from one another. In reality, it’s difficult to impose rigorous boundaries between these disciplines. Scientific knowledge flows back and forth between seemingly distinct disciplines. Even among scientists and engineers, these delineations are constantly revised, deconstructed or reinforced.
I am currently a second-year medical student. Looking back, one of the most challenging (but rewarding) aspects of being a pre-med was completing my pre-medical course requirements across a number of different disciplines. In retrospect, these weren’t just classes to weed out physician-hopefuls; the concepts I learned were important for elucidating the pathophysiology of many diseases. Chemistry taught me how to understand the reactions in organic chemistry, which allowed me to understand the processes of protein interactions in biochemistry, which helped me piece together the mechanisms of molecular diseases. By becoming familiar with the basics of a broad array of scientific disciplines, I was free to mix and match these concepts as needed in determining why a patient was so sick.
One of my personal idols, chemist Paul Lauterbur, famously said that all science is interdisciplinary. To prove his point, he underscored in his 2003 Nobel Laureate lecture that he, though a chemist, would be sharing the prize for physiology or medicine with a physicist. In his words, the formal categorization of scientific knowledge exists for administrative and didactic convenience rather than ontological reality.
In fact, this interdisciplinary nature can be observed in Lauterbur’s scientific contributions, which made the development of magnetic resonance imaging possible. He combined concepts from different fields to create a successful, novel technology.
Lauterbur’s career trajectory itself highlights the importance of an interdisciplinary approach to science. Lauterbur was observing a mouse tissue sampling study via NMR in a biology lab when he first devised the idea of imaging with NMR. He then consulted with local mathematicians to see whether his theories were feasible, and they validated his ideas. To test whether his theory could be realized through radiofrequency coils, he consulted a physics textbook, “The Principles of Nuclear Magnetism,” by Anatole Abragam. With this, he completed a series of experiments that confirmed his ideas. Lauterbur succeeded because he did not strictly categorize his work and was comfortable using ideas from other fields.
Following the publication of his work in Nature, Lauterbur invited several scientists from multiple disciplines to share data and collaborate on projects. Lauterbur could slip between chemistry, biology, mathematics and physics and combine many ideas from these fields. He encouraged collaboration among various scientists and caught the attention of businesses that went on to develop machines with tremendous applications in medicine. And while many approached him after his success to say that they or their mentors had come up with similar ideas in the past, Lauterbur distinguished himself from the rest by actually realizing his idea, thanks to this revolutionary mindset.
Many upper-level science courses, particularly in a field as broad as biology, require extensive knowledge of other disciplines. Fortunately, many university departments offer a number of opportunities that encourage cross-disciplinary thinking and application. In addition, many exciting new applications in the sciences are already creating interdisciplinary collaborations. All science majors should seek instruction outside of their own immediate majors to become more capable of linking seemingly disparate ideas together. In this way, an interdisciplinary scientific education leads to innovation and success.
Yoo Jung is a medical student at Stanford University. She is a recent graduate of Dartmouth College and a co-author of the book, “What Every Science Student Should Know,” a guide for college students interested in majoring in STEM (University of Chicago Press). You can find her on Twitter @YooJKim and on her website at yoojkim.com.
Forecasters have predicted an increased chance of a colder-than-usual start to this winter due to a change in Arctic conditions and ‘unusual’ tropical rainfall patterns.
The Met Office said the conditions mean there is a 30 per cent likelihood the mercury will plunge at the beginning of this winter – the highest risk of a cold start since the bitterly cold season of 2010/11.
But the agency said it was too early to predict whether it would be a snowy, wet or dry three-month period from November.
You can’t believe the ridiculous stuff the Alarmists and the MSM push!!
Researchers at the University of Auckland studied aerial images (pictured) of the islands from 1945 up until 2010 and found that a new island has grown from decimated remains
Those (non) disappearing Pacific Islands: Björn Lomborg, writing for the Wall Street Journal –
Once a year or so, journalists from major news outlets travel to the Marshall Islands, a remote chain of volcanic islands and coral atolls in the Pacific Ocean, to report in panicked tones that the island nation is vanishing because of climate change. Their dispatches are often filled with raw emotion and suggest that residents are fleeing atolls swiftly sinking into the sea.
The truth is that a peer-reviewed study, published in the September 2015 issue of Anthropocene, revealed that since the middle of the 20th century the total land area of the islands has actually grown.
. . . . . . .
The media and Carbon Pollution. They call it carbon pollution. This is a lie. They are actually, and non-factually, referring to carbon dioxide. Look at this story from the Alarmist Sydney Morning Herald (SMH). (There are a zillion examples like this from the Main Stream Media (MSM).)
Australia emitted 549.3 mega-tonnes (Mt) of carbon dioxide in 2014-15, up 0.8 per cent on the year before but down nearly 3 per cent on projections. Emissions increases were recorded in the electricity, transport, fugitive emissions and industrial and power generation sectors and offset only by a strong decline in agricultural emissions.
Is the SMH ignorant? Is it really Carbon Dioxide Pollution?
Carbon dioxide is a colourless, odourless, non-toxic gas. We exhale it. Carbon dioxide is a necessary component of photosynthesis and the growth of green vegetation.
What happens when they can’t get their “dirty weather,” as Al Gore calls it? Then they’ll just have to define down what a disaster is.
Eleven years ago, Gore swore that “the science is extremely clear now.” Global warming was “magnifying” the “destructive power” of the “average hurricane,” he said. Man’s impact on the environment “makes the duration, as well as the intensity of the hurricane, stronger.”
So why does the SMH use a graphic pushing a lie? The Alarmists kept saying that man-made global warming would increase hurricanes in mainland America, but the numbers have dropped to zero. What do these clowns do? Redefine hurricanes.
These clowns, the Alarmists supported by the MSM, have pushed so many false positions, and yet they still think that they are beyond ridicule.
No. The tide has turned. The world has turned its collective back on the man-made global warming hoax. However, it is still costing citizens trillions.
Sensational new study shows western government climate models rely on a fatally flawed 1920s algorithm. Scientists say this could be the breakthrough that explains why modern computers are so awful at predicting climate change: simulations “violate several known Laws of Thermodynamics.”
“This paper examines what was originally calculated as the greenhouse effect theory by Lewis Fry Richardson, the brilliant English mathematician, physicist and meteorologist.
In 1922 Richardson devised an innovative set of differential equations. His ingenious method is still used today in climate models. But unbeknown to Richardson he had inadvertently relied upon unchecked (and fatally flawed) numbers supplied by another well-known British scientist, W. H. Dines.”
Unfortunately for Richardson, Dines wrongly assumed that earth’s climate is driven by terrestrial (ground) radiation as its only energy source, rather than the sun. Derek Alker specifically draws attention to the key fact that:
“One of the main points the paper makes is that in the Dines model each layer of the atmosphere is THE energy source, NOT the sun, which is omitted in his table, nor the earth, as the excel model proves.”
Richardson had taken the Dines numbers at face value and did not detect the error when combining them with his own. Alker continues: “The archives show Richardson never double-checked the Dines work (see below) and the records do not show that anyone else has ever exposed it.”
The outcome, says Alker, is that not only has the original Richardson & Charney computer model been corrupted, but so have all other computer climate models since. All government researchers use these core numbers and believe them to be valid, even though what they seek to represent can be shown today to be physically impossible.
Alker adds:
“My paper specifically describes how the theory Dines calculated in his paper violates several of the known Laws of Thermodynamics, and therefore does not describe reality.
The greenhouse effect theory we know of today is based on what Richardson had formulated from the Dines paper using unphysical numbers created by Dines. But Dines himself later suggested his numbers were probably unreliable.”
Unfortunately, Dines died in the mid-1920s and did not inform Richardson of the error. Then, in the late 1940s, Richardson began working with another world figure in climate science, Jule Charney, as the duo constructed the world’s first computer climate model. It was then that the dodgy Dines numbers infected the works.
Alker, who scrupulously studied the archives for his research, reports that there is no published evidence that Richardson understood Dines’s calculation method, and he believes that Richardson and Charney put the Dines numbers into the world’s first computer model verbatim.
In essence, the ‘theory’ of greenhouse gas warming from the Dines numbers can be shown to start with a misapplication of Planck’s Law, which generates grossly exaggerated ‘up’ and nonexistent ‘down’ radiative emissions figures. Then, layer by layer, part of the downward radiation is added to the layer below, which is in violation of the Second Law of Thermodynamics.
Thereby, like a domino effect, this bogus calculation method becomes GIGO (“garbage in, garbage out”) to all computers that run the program. Alker adds:
“What the climate simulations are doing is creating energy layer by layer in the atmosphere that shouldn’t be there (it has no other source than of itself). It is then destroyed layer by layer (it is absorbed and then discarded – in effect destroyed). This is all presented in such a way to give the appearance that energy is being conserved, when it is not being conserved,”
****
[1] Alker, D., ‘Greenhouse Effect Theory within the UN IPCC Computer Climate Models – Is It A Sound Basis?’ (October 30, 2016), principia-scientific.org; https://principia-scientific.com/publications/PROM/GHE-UNIPCC.pdf (accessed online: November 2, 2016)
“The global average surface temperature has increased over the 20th century by about 0.6°C.”
This Met Office graph of global temperatures for 1860-2000 was included in the IPCC’s 2001 report. It provided visual clarification of the 0.6°C temperature increase since 1900.
Every year it seems we get a new study linking cold winters to global warming and “melting” Arctic ice. Only last month I debunked the latest attempt, and I assumed this must be the same study. Turned out I was wrong! This is what Unscientific American has to say about the latest paper:
The polar vortex in recent years has brought the kind of miserable cold to northern states that made it hard to breathe outside. We’re probably in for more of the same.
That’s the finding of a new study published yesterday in the journal Nature: as the Arctic warms, it is shifting the polar vortex to Europe. That in turn will bring more bursts of frigid cold to North America.
Those temperature drops could lead to miserable days in February and March, the research finds. Conversely, those drops in temperature could offset some of global warming’s effect in those regions, said Martyn Chipperfield, professor of atmospheric chemistry at the University of Leeds and a co-author of the paper.
“Climate change can lead to extremes; it’s not like a regular change, everyone to the same extent at all times and places,” he said. “Despite the overall warming, you can get in places like the Northeastern U.S. extreme cold events. That’s consistent with climate change and global warming.”
The polar vortex is a fast-moving band of air that encircles the frigid Arctic in winter months and traps the cold air there. Its movement is part of a decades-long change.
The polar vortex has actually “shifted persistently” away from North America and into Europe and Asia over the last 30 years, researchers found. That results in cooling over North America but warmer winters in Europe.
As global warming decreases sea ice, the sun’s warmth absorbed by the ocean is released back into the atmosphere over a longer period of time, which disrupts the vortex.
When the vortex weakens, a growing number of climate scientists argue, the cold Arctic air migrates to lower latitudes, as happened in early 2014 and 2015. The sudden and somewhat prolonged burst of cold broke pipes and water mains and more than doubled energy bills in places like New York and New England as it wreaked havoc across a wide swath of the country.
Marine scientists have detected a tenfold increase in the population of phytoplankton species in the North Atlantic from 1965 to 2010. The rapid growth rate of the marine microorganisms corresponds with an increase in carbon dioxide levels. (Jeff Schmaltz/NASA Earth Observatory | Wikimedia Commons)
******
Abnormal levels of carbon dioxide in the North Atlantic are being linked to the rapid growth of the ocean’s plankton population over the past 45 years, according to a study featured in the journal Science.
A team of marine researchers, led by associate professor Anand Gnanadesikan of Johns Hopkins University, discovered that the population of microscopic marine algae known as coccolithophores in the North Atlantic experienced a tenfold increase from 1965 to 2010.
This recent finding contradicts earlier assumptions made by scientists that the phytoplankton would find it difficult to produce plates of calcium carbonate as ocean waters become increasingly acidic.
A nearly invisible layer of microscopic, plantlike organisms drifts through the surface of the ocean all over the globe, their movements mostly at the whim of ocean currents. When conditions are right, populations of these tiny creatures explode in vast “blooms” that span miles and miles of ocean. Called phytoplankton, they are the base of the entire ocean food web.
Like plants, phytoplankton have chlorophyll to absorb sunlight and use photosynthesis to produce food. In the Arctic Ocean, photosynthesis is frequently limited by the low angle of the winter Sun and short daylight hours, as well as by the blanket of sea ice that melts and refreezes with the changing seasons.
Over the last 30 years, however, the Arctic has warmed, and larger areas of the Arctic Ocean are now free of sea ice in the summer, which means phytoplankton are getting more sunlight. The result is that phytoplankton productivity has increased by about 20 percent based on satellite estimates of the amount of chlorophyll in the water.
The growth rate of microscopic phytoplankton in the Southern Ocean could double in the next 80 years because of climate change, according to scientists. The microscopic organisms form the basis of the entire food web, feeding everything from small fish and krill to giant whales.
Professor Phillip Boyd from the Institute of Marine and Antarctic Studies (IMAS) in Hobart has been measuring how changes in conditions affect phytoplankton in a lab setting. He has been using sophisticated modelling to predict changes.
He said oceans are warming because of climate change, and are also absorbing more carbon and becoming more acidic. Nutrient levels are also rising, and there are changes to the amount of light that penetrates the ocean.
“We designed an experiment where we basically threw the kitchen sink at this phytoplankton species in the lab,” he said.
“We changed the acidity of the ocean, the nutrients supplied, the amount of light and the amount of this trace element iron.
“We found that in a future ocean, in the sub-Antarctic waters our lab experiment showed that the growth rate will be about twice as high as it is at present.” He said the change would mean significant knock-on effects higher in the food chain. “For example the species diatoms are important food for krill and for other species,” he said.
“So you would expect then to see higher levels of productivity and therefore a more productive food web in the future for the sub-Antarctic.”
Results may vary between hemispheres
But the increase in productivity is not likely to extend to oceans in the northern hemisphere. “When we talk about climate change and a changing ocean, the stories are normally very doom and gloom,” Professor Boyd said.
“But really it’s going to be a balance, there will be some regions of the ocean, particularly at the higher latitudes, where we may see more productivity.
“The flip side of that is that from other computer modelling simulations, they’re suggesting that at the lower latitudes we may see a decrease in productivity.
“So there may be quite a change from region to region in ocean productivity.”
He said people sometimes attempted these experiments by changing only one or two of the conditions likely to be affected by climate change. He warned that his findings showed that failing to model all of the forecast changes produced extremely varied results.
“You may get a very different picture,” he said. “For example, in a different experiment published as part of this study we see that if we only change four out of the five properties we get a very different result.
“It looks like a very detrimental effect on phytoplankton.” The findings will be published in the journal Nature Climate Change today.
The greatest validation of scientific contribution is a peer-reviewed academic publication. But the face of academic publishing is changing as traditional journal publishers have come under attack from proponents of open access, which could change the mode of knowledge distribution in the sciences as we know it.
In selective traditional journals, editors and reviewers not only scrutinize a submission for whether the authors’ conclusions are valid, but also whether the work will generate buzz within the field. If the findings are deemed sound and significant, the paper is accepted for publication, typically after edits. At this point, the authors pay a publishing fee based on price models that usually resemble magazine advertising.
Scientific literacy – what it is, how to recognize it, and how to help people achieve it through educational efforts – remains a difficult topic. The latest attempt to inform the conversation is a recent National Academies report, “Science Literacy: Concepts, Contexts, and Consequences” (https://www.nap.edu/download/23595).
While there is lots of substance to take away from the report, three quotes seem particularly telling to me. The first, from Roberts [1], points out that scientific literacy has “become an umbrella concept with a sufficiently broad, composite meaning that it meant both everything, and nothing specific, about science education and the competency it sought to describe.”
The second quote, from the report’s authors, is that “In the field of education, at least, the lack of consensus surrounding science literacy has not stopped it from occupying a prominent place in policy discourse” (p. 2.6). And finally, “the data suggested almost no relationship between general science knowledge and attitudes about genetically modified food, a potentially negative relationship between biology-specific knowledge and attitudes about genetically modified food, and a small, but negative relationship between that same general science knowledge measure and attitudes toward environmental science” (p. 5.4).
“Flat Earth”: the Flammarion engraving (1888), via Wikipedia.
Recognizing the scientifically illiterate
So, perhaps it would be useful to consider the question of scientific literacy from a different perspective, namely, how can we recognize a scientifically illiterate person from what they write or say? What clues imply illiteracy?[1]
To start, let us consider the somewhat simpler situation of standard literacy. Assume we ask a person a clearly composed question; we might expect the illiterate person to have trouble correctly interpreting what a reasonable answer should contain. Constructing a literate answer implies two distinct abilities: the respondent needs to be able to accurately interpret what the question asks, and they need to recognize what an adequate answer contains.
These are not innate skills; students need feedback and practice in both, particularly when the question is a scientific one. In my own experience with teaching, as well as in data collected in the context of an introductory course [2], all too often a student’s answer consists of a single technical term, spoken (or written) as if a word = an argument or explanation.
We need a more detailed response in order to accurately judge whether an answer addresses what the question asks (whether it is relevant) and whether it has a logical coherence and empirical foundations, information that is traditionally obtained through a Socratic interrogation.[2] At the same time, an answer’s relevance and coherence serve as a proxy for whether the respondent understood (accurately interprets) what was being asked of them.
So what is added when we move to scientific, in contrast to standard, literacy; what is missing from the illiterate response? At the simplest level we are looking for mistakes, irrelevancies, failures in logic, or failures to recognize contradictions within the answer, explanation or critique. The presence of unnecessary language suggests, at the very least, a confused understanding of the situation.[3]
A second feature of a scientifically illiterate response is a failure to recognize the limits of scientific knowledge; this includes an explicit recognition of the tentative nature of science, combined with the fact that some things are, theoretically, unknowable scientifically. For example, is “dark matter” real or might an alternative model of gravity remove its raison d’être?[4]
When people speculate about what existed before the “big bang” or what is happening in various unobservable parts of the multiverse, have they left science for fantasy? Similarly, speculation on the steps leading to the origin of life on Earth (including what types of organisms, or perhaps better put, living or pre-living systems, existed before the “last universal common ancestor”), the presence of “consciousness” outside of organisms, or the probability of life elsewhere in the universe can be seen as transcending either what is knowable or what is likely to be knowable without new empirical observations.
While this can make scientific pronouncements somewhat less dramatic or engaging, respecting the limits of scientific discourse avoids doing violence to the foundations upon which the scientific enterprise is built. It is worth being explicit: universal truth is beyond the scope of the scientific enterprise.
The limitations of scientific explanations
Acknowledging the limits of scientific explanations is a marker of understanding how science actually works. As an example, while a drug may be designed to treat a particular disease, a scientifically literate person would reject the premise that any such drug could, given the nature of its interactions with other molecular targets and physiological systems, be without side effects, and would expect those side effects to vary depending upon the features (genetic, environmental, historic, physiological) of the individual taking the drug. While scientific knowledge reflects a social consensus, it is constrained by rules of evidence and logic (although this might appear to be anachronistic in the current post-fact age).
Even though certain ideas are well established (Laws of Conservation and Thermodynamics, and a range of evolutionary mechanisms), it is possible to imagine exceptions (and revisions). Moreover, since scientific inquiry is (outside of some physics departments) about a single common Universe, conclusions from different disciplines cannot contradict one another – such contradictions must inevitably be resolved through modification of one or the other discipline. A classic example is Lord Kelvin’s estimate of the age of the Earth (~20-50 million years) and estimates of the time required for geological and evolutionary processes to produce the observed structure of the Earth and the diversity of life (hundreds of millions to billions of years), a contradiction resolved in favor of an ancient Earth by the discovery of radioactivity.
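As a rough illustration of where a figure like Kelvin’s comes from (a sketch of the standard conductive-cooling argument with assumed round-number inputs, not Kelvin’s own values): treat the Earth as a half-space cooling by conduction from an initially uniform temperature $T_0$, so that the measured near-surface temperature gradient fixes the age,
$$\left.\frac{\partial T}{\partial z}\right|_{z=0} = \frac{T_0}{\sqrt{\pi\kappa t}} \quad\Longrightarrow\quad t = \frac{T_0^{2}}{\pi\kappa\,(\partial T/\partial z)^{2}}.$$
Taking, purely for illustration, $T_0 \approx 2000\ \mathrm{K}$, a thermal diffusivity $\kappa \approx 10^{-6}\ \mathrm{m^2\,s^{-1}}$ and a gradient of about $2.5\times10^{-2}\ \mathrm{K\,m^{-1}}$ gives $t \approx 2\times10^{15}\ \mathrm{s}$, a few tens of millions of years. The arithmetic is sound; the model simply contains no term for radioactive heating (or for mantle convection), which is why the cross-disciplinary contradiction was ultimately resolved in favor of an ancient Earth.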
Scientific illiteracy in the scientific community
There are also suggestions of scientific illiteracy (or perhaps better put, sloppy and/or self-serving thinking) in much of the current “click-bait” approach to the public dissemination of scientific ideas and observations. All too often, scientific practitioners, who we might expect to be as scientifically literate as possible, abandon the discipline of science to make claims that are over-arching and often self-serving (this is, after all, why peer-review is necessary).
A common example [of scientific illiteracy practiced by scientists and science communicators] is provided by studies of human disease in “model” organisms, ranging from yeasts to non-human primates. While there is no doubt that such studies have been, and continue to be, critical to understanding how organisms work (and are certainly deserving of public and private support), their limitations need to be made explicit: while a mouse that displays behavioral defects (for a mouse) might well provide useful insights into the mechanisms involved in human autism, an autistic mouse may well be a scientific oxymoron.
Discouraging scientific illiteracy within the scientific community is challenging, particularly in the highly competitive, litigious,[5] and high-stakes environment we currently find ourselves in.[6] How best to help our students, both within and outside scientific disciplines, avoid scientific illiteracy remains unclear, but it is likely to involve establishing a culture of Socratic discourse (as opposed to posturing). Understanding what a person is saying, what empirical data and assumptions it is based on, and what it implies or predicts are necessary features of literate discourse.
1. Roberts, D.A., Scientific literacy/science literacy. In S.K. Abell & N.G. Lederman (Eds.), Handbook of Research on Science Education (pp. 729-780). 2007, Mahwah, NJ: Lawrence Erlbaum.
2. Klymkowsky, M.W., J.D. Rentsch, E. Begovic, and M.M. Cooper, The design and transformation of Biofundamentals: a non-survey introductory evolutionary and molecular biology course. LSE Cell Biol Edu, 2016, in press.
3. Lee, H.-S., O.L. Liu, and M.C. Linn, Validating measurement of knowledge integration in science using multiple-choice and explanation items. Applied Measurement in Education, 2011. 24(2): p. 115-136.
4. Henson, K., M.M. Cooper, and M.W. Klymkowsky, Turning randomness into meaning at the molecular level using Muller’s morphs. Biol Open, 2012. 1: p. 405-10.
[1] Assuming, of course, that what a person says reflects what they actually think, something that is not always the case.
[2] This is one reason why multiple-choice concept tests consistently over-estimate students’ understanding [3].
[3] We have used this kind of analysis to consider the effect of various learning activities [4].
Mike Klymkowsky is a Professor of Molecular, Cellular, and Developmental Biology at the University of Colorado Boulder. In the area of biology education research, he developed (with Kathy Garvin-Doxas) the NSF-supported Biological Concepts Instrument (BCI), as well as a suite of virtual laboratory activities in molecular biology with Tom Lundy. He has been involved with the general question of how to develop more rigorous, coherent, and engaging courses and curricula in the biological sciences, including a re-designed introductory evolutionary, molecular, and systems biology course – Biofundamentals and general chemistry – Chemistry, Life, the Universe & Everything (CLUE), both with Melanie Cooper (Chemistry – Michigan State University). See “About this Blog” for more on Mike’s work.
Reports from Seal River, just north of Churchill, posted at Churchill Wild on July 26 were crowing about seeing lots of bears onshore, with a veritable beehive of activity over the weekend of 16/17 July:
“This has without a doubt been Churchill Wild’s most spectacular start to the summer polar bear watching season. …Bear numbers are up spectacularly this year and all are looking very fat and healthy, perhaps much to the chagrin of climate change “experts.” Our best day for the seductive white carnivores over the past week featured 21 polar bears sighted between the Lodge and our whale swim spot! … The ice pack, which was still visible a week ago [i.e, 17 July or so], has finally dissipated and pushed a large number of bears on to our coastline here at Seal River, with the end result being many very happy cameras!” [my bold]
The Polar Vortex is on the move unusually early this year, forecasters have revealed – and say it could strike the US in January.
A recent study claimed Arctic sea-ice loss is causing the Polar Vortex to shift and as a result, winters are expected to get longer and more bitter. Now, forecasters say it is ‘unprecedentedly early’.
Planting trees is a cost-effective way to tackle urban air pollution, which is a growing problem for many cities. A study by the US-based Nature Conservancy (TNC) reported that the average reduction in particulate matter near a tree was between 7% and 24%.
Particulate matter (PM) consists of microscopic particles that become trapped in the lungs of people breathing polluted air. PM pollution could claim an estimated 6.2 million lives each year by 2050, the study suggests. Lead author Rob McDonald said that city trees were already providing a lot of benefits to people living in urban areas.
Overview: The sun has been completely spotless on 21 days in 2016 and it is currently featuring just one lonely sunspot region. In fact, on June 4th of this year, the sun went completely spotless for the first time since 2011 and that quiet spell lasted for about four days.
Sunspot regions then reappeared for the next few weeks on a sporadic basis, but that was followed by several more completely spotless days on the surface of the sun. The increasingly frequent blank sun is a sign that the next solar minimum is approaching and there will be an even greater number of spotless days over the next few years.
At first, the blankness will stretch for just a few days at a time, then it’ll continue for weeks at a time, and finally it should last for months at a time when the sunspot cycle reaches its nadir. The next solar minimum phase is expected to take place around 2019 or 2020. The current solar cycle is the 24th since 1755, when extensive recording of sunspot activity began, and it is the weakest in more than a century, with the fewest sunspots since cycle 14 peaked in February 1906.
It was 7 or 8 years ago, when I was sitting in my office at the university. At the time I was a Ph.D. student in the Machine Intelligence group at the computer science department. One of the department’s professors knocked on my door; he was looking for my supervisor because he needed advice on how to perform some test to verify some properties of his recent experiments. Well, Finn is not here, but maybe I can help. What’s your problem? He wanted to know about different statistical tests and how and why they work. Why they work? The question puzzled me immensely, because I had taken many advanced classes in statistics and read countless books on the subject, but I could not recall any of them answering or discussing why they work.
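One concrete way to see why such a test “works” is to simulate it. The sketch below is my own illustration in Python (not anything from that original conversation): it draws many datasets under the null hypothesis and checks that the two-sample t-test’s p-values are roughly uniform, which is exactly the property that makes “reject when p < 0.05” wrong only about 5 per cent of the time when there is nothing to find.

# Minimal simulation of why a two-sample t-test "works": under the null
# hypothesis its p-values should be (approximately) uniform, so rejecting
# at p < 0.05 produces false positives roughly 5% of the time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def null_pvalues(n_sims=10_000, n=30):
    """p-values from t-tests on two groups drawn from the same distribution."""
    pvals = np.empty(n_sims)
    for i in range(n_sims):
        a = rng.normal(0.0, 1.0, size=n)
        b = rng.normal(0.0, 1.0, size=n)
        pvals[i] = stats.ttest_ind(a, b).pvalue
    return pvals

p = null_pvalues()
print("fraction of p-values below 0.05:", round(float((p < 0.05).mean()), 3))
print("deciles of the null p-value distribution:",
      np.quantile(p, np.linspace(0.1, 0.9, 9)).round(2))

Repeating the same simulation with a genuine difference in means shows the other half of the story: the p-values pile up near zero, which is what gives the test its power.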
Written by Henrik Svensmark, DTU Space, National Space Institute, Technical University of Denmark
Now and then new results appear suggesting that the idea of a cosmic-ray influence on clouds and terrestrial climate does not work. “Sun-clouds-climate connection takes a beating from CERN” is the latest such news story, based on a new paper from the CLOUD collaboration at CERN [1].
It is important to note that the new CLOUD paper is not presenting an experimental result with respect to the effect of cosmic-ray-generated ions on clouds, but a result of numerical modeling. CLOUD is using its experimental measurements to estimate the typical nucleation of various aerosols of small size (1-3 nm). However, for an aerosol to affect clouds (and climate) it must first grow to 50-100 nm to become a cloud condensation nucleus (CCN). CLOUD then uses a numerical model to estimate the effect of cosmic rays on the growth process, and finds that the response of the number of CCN to cosmic rays over a solar cycle is insignificant.
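As a rough back-of-the-envelope illustration of why that growth step matters (my own toy sketch with assumed, order-of-magnitude parameter values, not numbers from the CLOUD paper): a freshly nucleated particle of a few nanometres has to grow to roughly 50-100 nm while competing against loss by coagulation onto pre-existing aerosol, and the fraction that survives is very sensitive to the assumed growth rate.

# Toy estimate of how many freshly nucleated particles (~2 nm) survive to
# CCN size (~70 nm). Illustrative assumptions only: a constant growth rate
# and a constant coagulation-sink loss rate (real sinks weaken as particles
# grow, so this somewhat overstates the losses).
import math

def ccn_survival_fraction(growth_rate_nm_per_h, coag_sink_per_h,
                          d_start_nm=2.0, d_ccn_nm=70.0):
    """Fraction of nucleated particles that reach CCN size."""
    hours_to_grow = (d_ccn_nm - d_start_nm) / growth_rate_nm_per_h
    return math.exp(-coag_sink_per_h * hours_to_grow)

# Assumed values: growth rates of 1-5 nm per hour, coagulation sink ~0.05 per hour.
for gr in (1.0, 2.0, 5.0):
    frac = ccn_survival_fraction(gr, coag_sink_per_h=0.05)
    print(f"growth rate {gr:.0f} nm/h -> {100 * frac:.1f}% reach CCN size")

Nothing in this toy settles who is right; it only illustrates that getting from measured nucleation rates to CCN numbers requires modeling a growth-versus-loss competition, which is the modeling step being questioned above.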