Robots Recreating Past Temperatures – Are Best to Avoid Australian Data

At an artificial intelligence (AI) conference in New York recently, Sean Gourley explained Wiener’s Law: automation will routinely tidy up ordinary messes but will occasionally create an extraordinary one – a mess that so mimics what could have been that the line between what is real and what is fake becomes impossible to discern, even for the experts.

AI research over the last couple of years at the University of Tasmania could have been a check on the existing mess of historical temperature reconstructions – reconstructions that suggest, the world over, that every year is hotter than the last. Except that Jaco Vlok began with the Australian Bureau of Meteorology’s temperature datasets without first undertaking adequate quality assurance (QA).

Remember the infamous Climategate emails, and in particular the ‘Harry read me files’? Harry, working at the Climate Research Unit (CRU) at the University of East Anglia, wrote:

Getting seriously fed up with the state of the Australian data. so many new stations have been introduced, so many false references … so many changes that aren’t documented. Every time a cloud forms I’m presented with a bewildering selection of similar-sounding sites, some with references, some with WMO codes, and some with both. And if I look up the station metadata with one of the local references, chances are the WMO code will be wrong (another station will have it) and the latitude/longitude will be wrong too.

For years, the Australian Bureau of Meteorology has been capitalizing on a mess that by its very nature throws up ‘discontinuities’ that can subsequently be ‘homogenized’ – and so Blair Trewin is obliged to apply algorithms to ensure every reconstruction shows steadily rising temperatures, in accordance with theory.

As Christopher Booker explained some years ago:

What is tragically evident from the Harry Read Me file is the picture it gives of the CRU scientists hopelessly at sea with the complex computer programmes they had devised to contort their data in the approved direction, more than once expressing their own desperation at how difficult it was to get the desired results.

In short, Phil Jones at the Climatic Research Unit in the UK, Gavin Schmidt at NASA GISS in New York, and even David Jones at the Australian Bureau in Melbourne have overseen the reworking of climate data until it fits the theory of catastrophic anthropogenic global warming (AGW).

They have, in fact, become the masters of Wiener’s Law, without actually knowing the first thing about AI.

They have overseen the use of algorithms – independently of the checks and balances routinely applied in the mainstream AI community – to recreate past temperatures. In the process the Medieval Warm Period (MWP) and the temperature extremes of the late 1930s, so evident in the raw data for both Australia and the US, have been removed from our historical temperature records. Thus we have the Paris Accord, and a federal election in Australia in which both candidates for Prime Minister are committed to saving the environment from rising temperatures even if it means ruining the economy.

The history of science would suggest that disproving a failed paradigm is always more difficult than replacing one, and so I thought beginning afresh with the latest AI techniques had merit. But this work is only likely to succeed if the Australian raw temperature database – known as ADAM – is reworked from the beginning. Otherwise artificial warming from both the Urban Heat Island (UHI) effect and the Bureau’s new electronic probes in Automatic Weather Stations (AWS), which record hotter for the same weather, will keep creating hockey sticks as inescapably as Groundhog Day.

While artificial intelligence, and in particular artificial neural networks (ANNs), is now considered a mature technology used for a variety of tasks that require pattern recognition, decision making and forecasting – their capacity is denied by mainstream climate scientists. One reason is that leading climate scientists claim the natural climate cycles have been so perturbed by carbon dioxide that the patterns no longer persist. This is of course little more than a hypothesis – one that can be tested using ANNs as a research tool.

It has been my experience that the raw measurements of any variable associated with weather and climate, when arranged chronologically, show a pattern of recurring cycles.

These oscillations may not be symmetrical, but they will tend to channel between an upper and lower boundary – over and over again. Indeed, they can be decomposed into a few distinct sine waves of varying phase, amplitude and periodicity.  It could be the case that they represent actual physical phenomena, which drive continuous climate change.
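The decomposition described above can be sketched with a discrete Fourier transform. The series below is synthetic – its periods, amplitudes and noise level are invented for illustration, not drawn from any real temperature record:

```python
import numpy as np

# Synthetic monthly series built from two sine components plus noise
# (illustrative only; nothing here is fitted to real data).
rng = np.random.default_rng(0)
t = np.arange(240)  # 20 years of monthly values
series = (1.5 * np.sin(2 * np.pi * t / 12)    # annual cycle
          + 0.4 * np.sin(2 * np.pi * t / 60)  # slower ~5-year oscillation
          + 0.2 * rng.standard_normal(t.size))

# The discrete Fourier transform expresses the series as a sum of sine
# waves; each coefficient carries one component's amplitude and phase.
coeffs = np.fft.rfft(series)
freqs = np.fft.rfftfreq(t.size, d=1.0)  # cycles per month
amplitudes = 2 * np.abs(coeffs) / t.size

# The two strongest non-constant components recover the periods built in.
order = np.argsort(amplitudes[1:])[::-1] + 1  # skip the constant (DC) term
top_periods = sorted(1.0 / freqs[order[:2]])  # ~12 and ~60 months
```

On real station data the spectrum is far noisier, and any dominant periods would need to be tested for statistical significance rather than read off directly.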

If this is the case, it may be possible to forecast the climate – including temperature, wind speed and direction, and even rainfall – by understanding its component parts. As long as the relationships embedded in the complex oscillation continue into the future, a skilful weather and climate forecast is mathematically possible using ANNs – despite chaos theory.
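As a toy illustration of that claim – and only an illustration, on a synthetic cyclical signal rather than any Bureau record – a minimal single-hidden-layer ANN with fixed random hidden weights and a least-squares output layer can learn to predict the next value of a noisy oscillation from its recent lags:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic noisy oscillation (invented for illustration): a 24-step cycle.
t = np.arange(300)
x = np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(t.size)

# Predict x[t] from the previous 6 values (a lagged input window).
lags = 6
X = np.stack([x[i:i + lags] for i in range(t.size - lags)])
y = x[lags:]

# Minimal neural network: random fixed hidden weights, tanh activation,
# and an output layer fitted in closed form by least squares.
W = rng.standard_normal((lags, 32))
b = rng.standard_normal(32)
H = np.tanh(X @ W + b)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

# One-step-ahead error approaches the noise floor of the signal.
mse = np.mean((H @ beta - y) ** 2)
```

Such a forecast works only for as long as the underlying cycle persists – which is exactly the assumption at issue; if the signal’s structure changes, the forecast skill disappears with it.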

Skilful weather and climate forecasts using ANNs represent a new application for an existing technology. Indeed, if only a fraction of the resources spent applying this technology to mining social media data for advertising could be diverted to the goal of better climate forecasting, I’m sure major advances would be made very quickly. But in the case of Australia, the databases will first need to be reworked to instil some integrity.

In particular, every time there is a significant equipment change (for example, a change from a mercury thermometer to an electronic probe in an automatic weather station) that temperature series needs to be given a new ID. In this way the ANN has some hope of separating the real patterns of climate change from the artificial warming embedded with the new equipment … or the growth of a city.
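A minimal sketch of that bookkeeping, using pandas – the station ID ‘STN123’, the changeover date and all values are hypothetical, not real Bureau metadata:

```python
import pandas as pd

# Hypothetical annual maxima for one station (all values invented).
obs = pd.DataFrame({
    "date": pd.to_datetime(["1995-06-30", "1996-06-30", "1997-06-30",
                            "1998-06-30", "1999-06-30", "2000-06-30",
                            "2001-06-30", "2002-06-30"]),
    "tmax": [41.2, 40.8, 41.5, 40.9, 42.1, 41.8, 42.4, 41.9],
})

# Assumed date the mercury thermometer was replaced by an electronic probe.
changeover = pd.Timestamp("1999-01-01")

# Give the post-changeover readings a new series ID, so the two instruments
# are never silently merged into one "continuous" record.
obs["series_id"] = ["STN123-mercury" if d < changeover else "STN123-probe"
                    for d in obs["date"]]

counts = obs.groupby("series_id")["tmax"].count().to_dict()
```

An ANN (or any other model) trained on series keyed this way sees the instrument change as a boundary between two records, rather than as a warming step inside one.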

Innovation, while usually technological, often has a real political implication.  For example, with the invention of the printing press in the 1430s, suddenly there was an efficient way of replicating knowledge – it became harder to control the information available to the masses.

Since the printing press, there have been many other inventions that have dramatically improved our quality of life including the invention of the steam engine in 1712, the telephone in 1876, penicillin in 1928 and personal computing as recently as the 1970s.  Today more people are living longer, healthier and more connected lives thanks to these and other innovations.  But when we consider the history of any single invention we find that it rarely emerged easily: there was initially confusion, followed by resistance.

The history of innovation (and science) would suggest that new and superior technologies take hold only where there is opportunity for competition. Of course, this does not bode well for the adoption of AI for weather and climate forecasting by meteorological agencies, because they are government-funded monopolies. Furthermore, they are wedded to general circulation modelling – a completely different technique, based on simulation and on the assumption that next year will be hotter than the last.

To be clear, there is the added complication that simulation modelling is integral to demonstrating anthropogenic global warming, while ANNs rely exclusively on the assumption that natural climate cycles will continue. To reiterate, it has been said that because elevated levels of carbon dioxide have perturbed weather systems, ANNs will not work into the future because the climate is on a new trajectory. Conversely, if ANNs can produce skilful climate forecasts then arguably anthropogenic climate change is not as big an issue as some claim. Clearly, as with the printing press, there are political consequences that would follow the widespread adoption of AI in climate science – for historical temperature reconstructions and also for weather and climate forecasting. I’m hoping this could begin with more funding for the important work of Jaco Vlok – but perhaps not at the University of Tasmania, or with Australian temperature data.

The new report by Jaco Vlok ‘Temperature Reconstruction Methods’ can be downloaded here, and my explanation of its importance and limitations ‘New Methods for Remodelling Historical Temperatures: Admirable Beginnings Using AI’ can be downloaded here.

The feature image (at the very top) shows Jaco Vlok (left) then Jennifer Marohasy, John Abbot and JC Olivier.

Figure 50 from the new report by Jaco Vlok, showing monthly mean maximum temperatures from the 71 locations used to recreate the temperature history at Deniliquin.


If you've enjoyed reading this article from the Institute of Public Affairs, please consider supporting us by becoming a member or making a donation. It is with your support that we are securing freedom for the future.