February 3, 2020 JV

Imprimatur

An imprimatur (from Latin, “let it be printed”) is an official declaration by the Catholic Church that a book may be published and that its content can be trusted.  It is the ultimate quality check for publications in the Catholic Church.  While laws and standards are valued by all scientists, any “imprimatur” on a scientific finding is temporary at best, because the purpose of scientific enquiry is not to confirm current belief.  Instead, its focus is on breaking existing hypotheses or theories with new empirical evidence.  Science is therefore never “settled” and should never be about “consensus.”  The late author and film producer Michael Crichton put it well: “In science, consensus is irrelevant.  What is relevant is reproducible results.  The greatest scientists in history are great precisely because they broke with the consensus.  There is no such thing as consensus science.  If it’s consensus, it isn’t science.  If it’s science, it isn’t consensus.  Period.”

When former US President Obama called the UN IPCC report “the gold standard” of international climate science, what exactly did he mean?  Was the former leader aware of the genesis and integrity of the data that underpinned its findings?  Was it even appropriate to mix science and politics by lending the weight of the presidency to the assertion (Obama, 2014) that “ninety-seven percent of scientists agree: climate change is real, man-made and dangerous”?  History will judge his actions and his grasp of science.

Tarnished?

Unfortunately, the gilded veneer of scientific endeavour had already begun to tarnish.  For example, in an anonymous survey (Fanelli, 2009), 2% of scientists admitted to having fabricated, falsified or modified data or results at least once, and 33% admitted to other questionable research practices.  When asked about the behaviour of colleagues, the admission rates were much higher: 14% and 72% respectively.  Fanelli states: “A popular view … sees fraudsters as just a ‘few bad apples.’  Increasing evidence, however, suggests that known frauds are just the ‘tip of the iceberg’, and that many cases are never discovered.”  Freedman (2010) identified a long list of reasons why experts are often wrong, including pandering to audiences, lack of oversight, reliance on flawed evidence provided by others, and failure to take important confounding variables into account.  Finally, in a paper published in Nature, Park et al. (2014) summarised research on publication bias, careerism, data fabrication and fraud to explain how scientists converge on false conclusions.  They described the phenomenon of “herding”, which exposes the scientific community to an inherent risk of converging on an incorrect answer and raises the possibility that, under certain conditions, science may not be self-correcting.

What criteria did President Obama use when conferring his golden imprimatur?  This is unknown, but one would expect, as a minimum, that all data used to inform policy decisions be free of bias, and that the data, process and assumptions be subject to appropriate peer review.  The following timeline plots some significant influences and events with respect to climate-science integrity.

Five main themes emerge from this diagram:

  1. Mann and Jones allegedly conspired to deceive and hide data from the scientific community in producing and defending the hockey-stick graph.
  2. The UN IPCC selected Mann and Jones as lead authors and was complicit in point 1.
  3. Proponents of man-made global warming have been rewarded financially for it.
  4. Opponents of the man-made position are sued in the courts and smeared on social media.
  5. Accurate temperature measurement shows little warming from 2005 onwards.

Once the evidence from this series of articles has been examined, one may be tempted to ask: how could this happen without significant mainstream news coverage?  Were the various climate scientists and political leaders employed or recognised by the UN IPCC not hand-picked?  If these events are true (and they can be tested), then the actors (including the mainstream media) were either grossly incompetent or acted to deceive when measuring the earth’s temperature, interpreting it and reporting it to recommend policy decisions.  You get to decide which is more likely to be true.

Origins

NASA scientist James Hansen made the first claim for Anthropogenic (man-made) Global Warming (AGW) in an address to the US Congress in 1988.  However, it was Professor Michael Mann’s hockey-stick graph (published in 1998) that proved most influential.  Within a few years, both the UN IPCC and former US Vice President Al Gore were using this graph as a centre-piece in their respective productions: the Third Assessment Report (2001) and the film An Inconvenient Truth (2006).  Mann’s interpretation of the data asserts that the onset of industrialisation, circa 1910, and the subsequent rise in carbon dioxide levels in the atmosphere have brought about an “unprecedented” rise in the earth’s temperature.  Is the measured temperature rise “genuine” and “unprecedented” in its magnitude?  Let’s examine these claims further.

The measurement of temperature across the entire earth, over a century or so, is crucial to the man-made climate-change argument.  It must be measured to a high level of precision in order to show any discernible difference against the background variation caused by nature.  The global average surface temperature is constructed from measured daily highs and lows.  Although sporadic record highs tend to feature more prominently in news reports, it is the hidden, continual rise in the lows that will be examined more closely in this series of articles.  On land, temperature has been monitored by weather-station thermometers since about 1880 (with automation in the past few decades).  Automated sea-surface temperature measurement commenced in the 1950s, with manual methods used before then.  Satellites have also measured land and sea surface temperatures since the late 1970s.  Paleoclimate proxies (i.e. physical, chemical and biological materials preserved within ice cores, tree rings, corals, etc.) provide temperature records from millennia past through to modern times.
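As a rough sketch of the general approach only (the station readings and baselines below are invented for illustration; real datasets involve many thousands of stations plus gridding, area weighting and homogenisation), the daily mean at a station is typically taken as the midpoint of the recorded high and low, converted into an anomaly against that station’s long-term baseline, and then averaged across stations:

    # Illustrative sketch only: the station records and baselines below are
    # invented for the example, not taken from any real dataset.

    def daily_mean(t_max, t_min):
        """Daily mean temperature as the midpoint of the recorded high and low."""
        return (t_max + t_min) / 2.0

    def station_anomaly(t_max, t_min, baseline):
        """Departure of the daily mean from the station's long-term baseline."""
        return daily_mean(t_max, t_min) - baseline

    # (t_max, t_min, baseline) in degrees C for three hypothetical stations on one day
    stations = [
        (31.2, 18.4, 24.0),   # rural site
        (33.0, 22.9, 24.5),   # urban site -- note the warmer overnight low
        (29.8, 17.1, 23.1),
    ]

    anomalies = [station_anomaly(hi, lo, base) for hi, lo, base in stations]
    mean_anomaly = sum(anomalies) / len(anomalies)
    print(f"Mean anomaly across stations: {mean_anomaly:+.2f} degrees C")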

Does it Measure Up?

In order to draw definite trends from global temperature data, its resolution needs to be within ± 0.1 degrees C.  A common rule of thumb is that instrument accuracy should be about three times finer than the required resolution, i.e. roughly ± 0.03 degrees C.  Few instruments used in the 20th century offered these capabilities, and the temperature data suffer from other systemic issues.  Is it therefore possible to cobble together a temperature data set with high integrity and acceptance that shows a discernible warming trend over the past century?  Let’s see.
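A back-of-the-envelope check of that rule of thumb (a sketch only; the ± 0.5 degree figure for a hypothetical 20th-century thermometer is an assumed value for comparison, not a sourced one):

    # Sketch of the resolution/accuracy arithmetic discussed above.
    # The 0.5 degree instrument figure is a hypothetical value for comparison only.

    required_resolution = 0.1                    # degrees C, the trend resolution sought
    required_accuracy = required_resolution / 3  # rule of thumb: ~3x finer than resolution
    print(f"Required instrument accuracy: +/-{required_accuracy:.3f} degrees C")

    hypothetical_instrument_accuracy = 0.5       # degrees C (assumed, for illustration)
    print("Hypothetical instrument meets the requirement:",
          hypothetical_instrument_accuracy <= required_accuracy)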

Las Vegas at night (Thermal Imaging)

Land surface temperature measurements suffer from systematic errors and biases, including the Urban Heat Island (UHI) effect.  Verifying UHI is simple: stand near a brick building or a paved road after sunset on a hot day, then contrast that sensation with standing on a grassy field or near a bubbling brook at the same hour.  It can take many hours for the stored heat to dissipate fully from urban structures, and this can raise the evening minimum by several degrees.  A thermometer is hardly required to measure the difference.
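Because the daily mean in the land record is the midpoint of the high and the low (as sketched earlier), any urban lift in the overnight minimum flows directly into the mean.  A minimal illustration with invented numbers (the 2-degree figure is assumed purely for the example):

    # Illustration only: the temperatures and the 2-degree UHI figure are invented
    # to show how a warm-biased overnight low propagates into the daily mean.

    def daily_mean(t_max, t_min):
        """Daily mean as the midpoint of the recorded high and low."""
        return (t_max + t_min) / 2.0

    t_max, t_min = 32.0, 18.0   # hypothetical rural high and low (degrees C)
    uhi_bias = 2.0              # hypothetical lift in the overnight minimum (degrees C)

    rural_mean = daily_mean(t_max, t_min)
    urban_mean = daily_mean(t_max, t_min + uhi_bias)

    # A +2 degree bias in the minimum adds +1 degree to the daily mean.
    print(f"Daily-mean difference caused by the biased low: {urban_mean - rural_mean:+.1f} degrees C")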

TO BE CONTINUED
