Dodgy Data Homogenisation Practices and Deficient Temperature Data Collection
Much of Dr Jennifer Marohasy’s research over the past decade has been on how the Bureau of Meteorology (BoM) takes raw temperature data and ‘homogenises’ it. The concerns raised are not with the concept of homogenisation per se, but with the non-transparent and non-repeatable way the BoM undertakes it, and the consequent unreliability of the trends produced from the homogenised data. Articles relevant to this are listed in Part I of this post.
A second line of inquiry concerns the collection of the raw temperature data itself, particularly in the period since mercury thermometers were replaced by digital recording devices (the last few decades), and the ad hoc practices of altering those records uncovered by Dr Marohasy and others. Articles relevant to this are listed in Part II of this post.
Below you will find links to key articles, and edited extracts in italics. This list is not exhaustive.
Dr Marohasy is editor of Climate Change: The Facts 2020, in preparation, further information on which – including how to donate to support climate change research – is available here.
Articles about flaws in the Bureau of Meteorology’s approach to homogenisation of temperature data in Australia.
The importance of the question of homogenisation is shown by a chart prepared by Dr Marohasy for Chapter 9 of Climate Change: The Facts 2017, which shows how homogenisation changed a cooling trend to a warming trend for data collected at the weather station at Rutherglen.
Jennifer Marohasy, The Spectator Australia, 11 March, 2019
The extensive remodelling is not denied by the Australian Bureau of Meteorology. Rather it is justified on the basis that temperatures are now measured using a non-standard method (spot readings) from non-standard equipment (custom built probes in automatic weather stations). Apparently, we need to know how hot it was back then, relative to the equipment used now – so temperatures are remodelled. To be clear, there are three factors that potentially confound how hot it was back then – or now: the equipment, how it is used, and the remodelling, which is often referred to as homogenisation.
The largest single change in the new ACORN-SAT Version 2 temperature database is a drop of more than 13 degrees Celsius at the town of Wagga on 27 November 1946.
Jennifer Marohasy, IPA Today, 16 February, 2019
Temperatures are changed through a process known as homogenisation, and then changed again, sometimes by as much as 6.4 degrees Celsius for the one day.
For example, on 8th February 1933 the maximum temperature as measured at Albany in Western Australia was 44.8 degrees Celsius. Then, six years ago it was changed to 51.2 degrees Celsius! Recently it was changed again, this time to 49.5 degrees Celsius.
Jennifer Marohasy, IPA Today, 10 August 2017
(The Government should initiate a) four-step process to restore confidence in the recording and handling of historical temperature data (including):
Establish a Parliamentary Enquiry, through the House Energy and Environment Committee, with Terms of Reference that include:
a. When and why the policy of recording actual measurements from weather stations into ADAM was modified through the placement of limits on the lowest temperature that an individual weather station could record;
b. Scrutiny of the methodology used by the Bureau in the remodelling of individual temperature series from ADAM for the creation of ACORN-SAT that is used to report climate change trends;
c. Scrutiny of the complex area weighting system currently applied to each of the individual series used in ACORN-SAT;
d. Clarification of the objectives of ACORN-SAT, specifically to ensure public expectations are consistent with the final product;
e. Clarification as to why statistically-relevant uncertainty values generally increase, rather than decrease with homogenisation.
Brett Hogan, IPA Review, July 2017, PDF Version here.
Australia’s official temperature record is put through a process of ‘homogenisation’ to remove data imperfections caused by changes in the quality, location or amenity of the measuring equipment, or human error…
Particular examples show the scale of the changes. At Amberley in Queensland, what was a cooling trend of 1.0 degree over the course of a century became a warming trend of 2.5 degrees. Similarly, at Deniliquin in New South Wales, cooling of 0.7 degrees over the course of a century was homogenised to warming of 1.0 degree. Forty years of data at Bourke in NSW was deleted, and not even adjusted, because the information wasn’t collected using standard equipment. A 1.7 degree cooling trend over a century is now a mild warming trend, and the site high of 51.7 degrees in 1909 was simply disregarded despite a similar temperature the same day at nearby Brewarrina, and despite both being recorded in standard equipment.
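The mechanism by which an adjustment can reverse a trend, as at Amberley and Deniliquin, can be illustrated with a toy calculation. The numbers below are invented for illustration, not actual station data:

```python
# Toy illustration: a step adjustment applied to the early part of a
# record can flip the sign of its linear trend. All values here are
# invented for illustration; they are not actual station data.

def slope(values):
    """Ordinary least-squares slope of values against their index."""
    n = len(values)
    xbar = (n - 1) / 2
    ybar = sum(values) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(values))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

# 100 years of synthetic annual means with a slight cooling trend.
raw = [20.0 - 0.01 * year for year in range(100)]

# A homogenisation-style step adjustment: cool the first 50 years by
# 1.5 C, as might be done to account for a presumed site move.
adjusted = [t - 1.5 if year < 50 else t for year, t in enumerate(raw)]

print(f"raw trend:      {slope(raw):+.4f} C/year")
print(f"adjusted trend: {slope(adjusted):+.4f} C/year")
```

The point of the sketch is that a single step adjustment larger than the underlying trend is enough to change a cooling slope into a warming one, which is why the size and justification of each adjustment matters.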
Jennifer Marohasy, IPA Today, 6 June, 2017
There are many chapters in this book (Climate Change: The Facts 2017, available here) about ‘homogenisation’ (chapters 5, 6, 7, 8 and 9 by Anthony Watts, Tony Heller, Dr Tom Quirk, Jo Nova and me, respectively). Homogenisation, in essence, involves the remodelling of data, and is now a technique integral to the development of key official national and global measures of climate variability and change – including those endorsed by the IPCC.
It is generally stated that without homogenisation temperature series are unintelligible. But Dr Jaco Vlok from the University of Tasmania and I dispute this – clearly showing that there exists a very high degree of synchrony in all the maximum temperature series from the State of Victoria, Australia – beginning in January 1856 and ending in December 2016 (chapter 10). The individual temperature series move in unison suggesting they are an accurate recording of climate variability and change. But there is no long-term warming trend. There are, however, cycles of warming and cooling, with the warmest periods corresponding with times of drought.
Jennifer Marohasy, IPA Today, 25 November 2016
The difference between the official adjusted maximum temperature for Rutherglen on 13th January 1939 and the actual measured value is rather large: more than 5 °C. Historical temperature data is used to model and forecast the likely impact of future bushfires, with Fire Danger Indices sensitive to small changes in temperature. But which values should be used as inputs when modelling bushfire behaviour: the raw data, or the adjusted ACORN-SAT values?
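How sensitive fire-danger modelling is to the temperature input can be gauged from the McArthur Mark 5 Forest Fire Danger Index, in the exponential fit published by Noble et al. (1980). The drought factor, humidity and wind speed below are illustrative values, not data from any station:

```python
import math

def ffdi(temp_c, drought_factor=10.0, rel_humidity=20.0, wind_kmh=20.0):
    """McArthur Mk 5 Forest Fire Danger Index (Noble et al. 1980 fit)."""
    return 2.0 * math.exp(
        -0.450
        + 0.987 * math.log(drought_factor)
        - 0.0345 * rel_humidity
        + 0.0338 * temp_c
        + 0.0234 * wind_kmh
    )

# Because temperature enters through exp(0.0338 * T), a 5 C difference
# in the input inflates the index by exp(0.0338 * 5), about 18 per cent,
# regardless of the other inputs.
print(f"FFDI at 40 C: {ffdi(40.0):.1f}")
print(f"FFDI at 45 C: {ffdi(45.0):.1f}")
```

This is why a 5 °C discrepancy between raw and adjusted historical maxima is not a cosmetic matter for bushfire modelling: it feeds multiplicatively into the index.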
Articles about issues with how temperature data is collected, recorded, and published by the Bureau of Meteorology.
Jennifer Marohasy, IPA Today, 16 May 2018
To be clear, my issues continue to be less with the actual policies, protocols and best-practice manuals already in place than with the increasing evidence that these are being systematically ignored…
Historically, maximum air temperature was measured worldwide by mercury thermometers. But over recent decades there has been a transition to electronic probes in automatic weather stations…
… it could be concluded that the current system is likely to generate new record hot days for the same weather – because of the increased sensitivity of the measuring equipment and the absence of any averaging/smoothing. To be clear, the highest one-second spot reading is now recorded as the maximum temperature for that day at the 563 automatic weather stations across Australia that are measuring surface air temperatures.
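Why a highest-one-second rule tends to produce higher daily maxima than an averaging rule can be sketched with simulated readings. The noise level and one-minute window below are assumptions chosen for illustration, not Bureau specifications:

```python
import random

random.seed(42)

# Simulate one hour of one-second readings: a steady 35.0 C signal
# plus short-lived fluctuations (the 0.3 C spread is illustrative).
signal = [35.0 + random.gauss(0.0, 0.3) for _ in range(3600)]

# Daily-maximum candidate under a spot-reading rule:
spot_max = max(signal)  # highest single one-second reading

# Daily-maximum candidate under an averaging rule: the highest
# one-minute running mean of the same readings.
minute_means = [sum(signal[i:i + 60]) / 60 for i in range(len(signal) - 59)]
averaged_max = max(minute_means)

# The spot maximum can never be lower than the averaged maximum, so
# for identical weather the spot-reading rule favours new record highs.
print(f"max of 1-second spot readings: {spot_max:.2f} C")
print(f"max of 1-minute averages:      {averaged_max:.2f} C")
```

The gap between the two figures depends only on how noisy the probe is; the quieter instrument and the averaged record cannot set the same records as the spot-read one.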
Jennifer Marohasy, IPA Today, 12 February, 2018
On 23 September 2017, a new record hot day for Victoria was claimed at the Mildura airport using an electronic probe in an automatic weather station (AWS) housed in a Stevenson screen.
The BoM claims that measurements from such devices are ‘comparable’ to measurements from traditional mercury thermometers, which were used to measure official air temperatures at Mildura from 13 June 1889 until 1 November 1996.
There is no documentation, however, supporting this contention for Mildura or any of the other nearly 500 AWS spread across the Australian continent. Furthermore, the BoM does not have World Meteorological Organisation accreditation, or any other form of accreditation (e.g. ISO 17025), for any of its AWS.
Jennifer Marohasy, IPA Today, 12 November, 2017
The Bureau have since acknowledged that their method of recording temperatures from electronic sensors is not accredited, though they claim it nevertheless gives readings equivalent to mercury thermometers…
I can confirm that the values recorded manually on the A8 forms from the mercury thermometers for the period November 1996 to December 2000 (at Mildura) are significantly different from the official values recorded by the electronic sensors. If we consider just the values for September, the mean difference is statistically significant at the 0.05 level of probability, and is +0.34 °C, +0.27 °C and +0.28 °C for the years 1997, 1998 and 1999, respectively.
While the current head, Andrew Johnson, claims the Bureau has always taken one-second readings from electronic sensors, this is at odds with a letter from Sue Barrell, Bureau of Meteorology, to Dr Peter Cornish dated 6th February 2013, available online here.
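The form of the paired comparison described above can be reproduced in outline. The readings below are invented stand-ins for the A8-form and probe values, used only to show how such a test is computed, not to reproduce the Mildura result:

```python
import math

# Hypothetical paired September maxima (degrees C) from a mercury
# thermometer and a co-located electronic probe -- invented values
# standing in for the A8-form comparison described above.
mercury = [28.1, 30.4, 25.9, 27.3, 31.0, 29.5, 26.8, 28.7, 30.1, 27.9]
probe   = [28.5, 30.6, 26.3, 27.6, 31.4, 29.8, 27.0, 29.1, 30.4, 28.2]

# Paired t-test on the differences (probe minus mercury).
diffs = [p - m for p, m in zip(probe, mercury)]
n = len(diffs)
mean = sum(diffs) / n
var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
t_stat = mean / math.sqrt(var / n)

print(f"mean difference: {mean:+.2f} C")
print(f"t statistic:     {t_stat:.2f} (df = {n - 1})")
# Compare t_stat against the two-tailed critical value for df = 9
# (2.262 at the 0.05 level) to judge significance.
```

A persistent positive mean difference of a few tenths of a degree is small relative to daily weather, but it is exactly the size of signal that matters when declaring new records or computing trends.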
Jennifer Marohasy, The Spectator Australia, 19 October 2017.
Electronic sensors, progressively installed in Bureau weather stations in place of mercury thermometers beginning some twenty years ago, can capture rapid changes in temperature…
…to ensure consistency with measurements from mercury thermometers there is an international literature, and there are international standards, that specify how spot-readings need to be averaged – a literature and methodology being ignored by the Bureau…
In Australia, our Bureau takes not five-minute averages, nor even one-minute averages… the Bureau just takes one-second spot-readings.
We still need an inquiry into the Bureau of Meteorology:
John Roskam and Jennifer Marohasy, IPA Today, 11 September, 2017.
The release of the Bureau of Meteorology’s internal review only underlines the need for a transparent parliamentary inquiry into the manipulation of temperature data, according to free market think tank the Institute of Public Affairs.
“This is a vindication of our own investigations. The Bureau has now acknowledged that it had inadvertently set limits on how cold temperatures could be recorded at Goulburn and Thredbo,” said Dr Marohasy.
John Roskam, Executive Director at the IPA said, “The Bureau has now acknowledged that for many years temperature readings have been wrong for at least two weather stations. The Bureau’s claim that, of its 695 automatic weather stations, the only two at which there were problems were by coincidence the two identified by Dr Jennifer Marohasy makes the Bureau a scientific and statistical laughing stock – the chances of this happening are approximately 1 in 120,000.”
Jennifer also discussed the need for an Inquiry into the BoM with Alan Jones on Sky News in August 2017, and the interview can be seen here.