A Carbon Neutral Climate

4 October 2018

Jennifer Marohasy and John Abbot investigate what the world’s climate would be like if the Industrial Revolution never happened.

What if there’d never been an industrial revolution? What if the ‘dark satanic mills’ decried by William Blake had never been built, and the fossil fuels stayed in the ground?

According to official climate bodies like the Intergovernmental Panel on Climate Change (IPCC), most of the recent global warming is caused by human emissions of carbon dioxide. So in our alternate universe the planet would not have warmed, right?

To explore this scenario, we used the latest big data techniques: 2,000-year-old proxy-temperature series were decomposed (a statistical technique) and the components then fed into artificial neural networks. We found there would very likely have been significant warming from the end of the Little Ice Age in the early 1800s to at least 1980, just as has been observed in the real world of ‘carbon pollution’.

These results were published in a peer-reviewed journal and were, moreover, in accord with results from experimental spectroscopy studies (which derive atmospheric temperatures by ‘reading’ the distinctive electromagnetic signatures of various molecules).


But the results were also at odds with output from the General Circulation Models (GCMs), which are the mainstay of climate science and are relied upon by the UN and government agencies worldwide. More specifically, we were able to derive an Equilibrium Climate Sensitivity (ECS) – the temperature increase expected from a doubling of carbon dioxide concentrations in the atmosphere – of 0.6 °C, about one-sixth of the average estimate from GCMs. We were not so naïve as to expect ringing endorsements of our challenge to the status quo view of human-caused climate change, but the furious reaction to the paper’s publication says much about the state of climate science today.
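An ECS figure can be turned into an expected warming with the standard simplification that warming scales with the logarithm of the concentration ratio. The sketch below is a back-of-envelope illustration only: the 280 to 410 ppm pre-industrial-to-present range is a commonly cited figure, and the 3.6 °C comparison value is an illustrative GCM-style estimate (roughly six times the 0.6 °C figure above), not a number taken from the paper.

```python
import math

# Back-of-envelope warming from an ECS value, using the standard
# logarithmic relation: delta_T = ECS * log2(C / C0).
# The ppm figures and the 3.6 C comparison value are illustrative
# assumptions, not data from the paper under discussion.

def warming(ecs_per_doubling, c0_ppm, c_ppm):
    # Warming (in C) expected when CO2 rises from c0_ppm to c_ppm.
    return ecs_per_doubling * math.log2(c_ppm / c0_ppm)

for ecs in (0.6, 3.6):
    print(f"ECS {ecs} C/doubling, 280 -> 410 ppm: {warming(ecs, 280, 410):.2f} C")
```

By construction, `warming(ecs, 280, 560)` returns exactly `ecs`, since 560 ppm is one doubling of 280 ppm.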

The rationale for believing that increasing carbon dioxide (CO2) concentrations would cause global warming rests on a speculative theory, put forward in 1896, about the absorption and emission of infrared radiation by CO2. It’s not disputed that CO2 can retain heat by absorbing infrared radiation; what is uncertain is the sensitivity of the climate to increasing atmospheric concentrations.

This sensitivity may have been grossly overestimated by the Swedish chemist who was the theory’s progenitor, Svante Arrhenius, with the overestimations persisting in the GCMs based on his theory. We just don’t know; in part because the key experiments have never been undertaken, so most of the commentary – such as the excellent critique by Dr Judith Curry – relies on the failure of the GCMs’ predictions rather than on the fundamentals of the models’ assumptions and the methods by which they are constructed.

In fact we should challenge both the assumptions and the internal logic of the GCMs, especially now that new techniques and massive computing power enable the use of artificial neural networks (ANNs), a form of machine learning: think big data and artificial intelligence. Crucially, an ANN does not rely on the assumptions of the programmers, only on the data. Our only assumption is that the same physical mechanisms affecting climate today have affected climate in the past, and will into the future. If patterns are not maintained into the future, then our models will not work.

Since 2011, we have been applying this same technology to forecast rainfall for the next month and season, and have published many papers in international climate science journals on the application of this technique, demonstrating that it is more skilful than the Australian Bureau of Meteorology’s General Circulation Models for forecasting monthly rainfall.

Our paper was published in 2017 in the journal GeoResJ (volume 14, pages 36–46). It was the first attempt to apply the latest big data mining techniques to mimic natural cycles of warming and cooling, so as to forecast an alternative 20th century in which the industrial revolution never happened. By measuring the difference between the temperature profile forecast by the models and actual temperatures, we could derive an estimate of the warming attributable to industrialisation. (Jennifer’s blog post at the time of publication can be found here.)

We used proxy temperature records because there are no thermometer records that go back as far as needed. The proxy record is based on things like tree rings and coral cores, which are frequently used in climate science to provide an indirect measure of past temperatures. Over the past 2,000 years, most of these records show cycles of warming and cooling fluctuating within a band of approximately 2 °C.

For example, there are multiple lines of evidence indicating it was about 1 °C warmer across western Europe during a period known as the Medieval Warm Period (MWP). We date the MWP from AD 986, when the Vikings settled southern Greenland, until 1234, when a particularly harsh winter took out the last of the olive trees growing in Germany. This period was succeeded by the Little Ice Age, when it was too cold for the inhabitation of Greenland. We date the end of the Little Ice Age to 1826, when Upernavik in northwest Greenland was again inhabitable after a period of 592 years.

The modern inhabitation of Upernavik also corresponds with the beginning of the industrial age. For example, it was on 15 September 1830 that the first coal-fired train arrived in Liverpool from Manchester – which some claim as the beginning of the modern era of fast, long-distance, fossil-fuelled transport for the masses.

So, the end of the Little Ice Age corresponds with the beginning of industrialisation. But did industrialisation cause the global warming?

The Northern Hemisphere data we used begins in AD 50 and ends in 2000. It was derived from studies of pollen, lake sediments, stalagmites and boreholes. Typical of most such proxy temperature series, when charted the data zigzags up and down within a band of perhaps 0.4 °C on a short time scale of perhaps 60 years. Over the nearly 2,000-year length of the record, it shows a rising trend which peaks in AD 1200 before trending down to AD 1650, and then rising to about 1980 – then dipping to the year 2000.

The decline at the end of the record is typical of many such proxy-temperature reconstructions and is known within the technical literature as ‘the divergence problem’. To be clear, while the thermometer and satellite-based temperature records generally show a temperature increase throughout the 20th century, the proxy record, which is used to describe temperature change over the last 2,000 years – a period that predates thermometers and satellites – generally dips from 1980, at least for Northern Hemisphere locations. This is particularly the case with tree ring records. Rather than address this issue, key climate scientists have been known to graft instrumental temperature data onto the proxy record from 1980 onwards, in a process known among insiders as ‘hide the decline’, resulting in the famous (ice) hockey stick shape.

In the jargon of the discipline we ‘trained’ our ANN using only proxy data for the period up to 1830 (preceding the Industrial Revolution), and then used the resulting model to develop a forecast through to 2000.
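The train-then-forecast workflow just described can be sketched in miniature. This is a toy stand-in only: the real study decomposed proxy series and fed the components to an artificial neural network, whereas here a plain least-squares fit of two sinusoidal cycles plays the model’s role, and the series, cycle periods (1,000 and 60 years) and amplitudes are synthetic illustrations, not the paper’s data.

```python
import math

# Toy sketch: 'train' a cyclical model on pre-1830 data only, forecast to
# 2000, then measure the divergence from the held-out series. All numbers
# below are synthetic illustrations, not the paper's proxy records.

YEARS = list(range(50, 2001))
SERIES = [0.5 * math.sin(2 * math.pi * y / 1000) +
          0.1 * math.sin(2 * math.pi * y / 60) for y in YEARS]
PERIODS = [1000, 60]  # cycle periods assumed known for this toy example

def basis(year):
    # sin/cos pair for each assumed cycle period
    row = []
    for p in PERIODS:
        row.append(math.sin(2 * math.pi * year / p))
        row.append(math.cos(2 * math.pi * year / p))
    return row

def solve(a, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit(years, values):
    # Least squares via the normal equations B^T B w = B^T v.
    rows = [basis(y) for y in years]
    k = len(rows[0])
    btb = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    btv = [sum(r[i] * v for r, v in zip(rows, values)) for i in range(k)]
    return solve(btb, btv)

# 'Train' only on the pre-industrial span, then forecast the rest.
train = [(y, v) for y, v in zip(YEARS, SERIES) if y <= 1830]
w = fit([y for y, _ in train], [v for _, v in train])

def forecast(year):
    return sum(wi * bi for wi, bi in zip(w, basis(year)))

# Mean absolute divergence between forecast and 'observed' series, 1880-2000.
div = [abs(forecast(y) - v) for y, v in zip(YEARS, SERIES) if y >= 1880]
mean_div = sum(div) / len(div)
print(round(mean_div, 6))  # prints 0.0: the toy series is perfectly recoverable
```

In this contrived example the divergence is essentially zero because the held-out years follow exactly the same cycles as the training years; the paper’s point is the analogous comparison when the post-1830 data are real proxy observations rather than an extension of the fitted cycles.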

Both the proxy record and our ANN forecast show a general increase in temperatures to 1980, and then a decline.

The average divergence between the proxy temperature record from this Northern Hemisphere composite and the ANN projection for the period 1880 to 2000 is just 0.09 °C. This suggests that even if there had been no industrial revolution and burning of fossil fuels, there would have still been some warming through the 20th century – to at least 1980.

Considering the results from all six geographic regions as reported in our paper, output from the ANN models suggests that warming from natural climate cycles over the 20th century would be in the order of 0.6 to 1 °C, depending on the geographical location. The difference between output from the ANN models and the proxy records is at most 0.2 °C. This was also the situation in studies from Switzerland and New Zealand. So, at most, the contribution of industrialisation to warming over the 20th century would be in the order of 0.2 °C.


The IPCC estimates warming of approximately 1 °C, and attributes it all to industrialisation. It achieves this by remodelling the proxy temperature series before comparing them with output from the GCMs. For example, the last IPCC Assessment Report concluded: ‘In the northern hemisphere, 1983–2012 was likely the warmest 30-year period of the last 1,400 years.’

If we go back 1,400 years, we have a period in Europe immediately following the fall of the Roman Empire and predating the MWP. The IPCC denies that the MWP was as warm as today, instead claiming that temperatures were flat for 1,300 years and then suddenly kicked up from sometime after 1830, and certainly after 1880 – with no decline evident after 1980.

Interestingly, in the very first report from the IPCC, which was published by Cambridge University Press in 1991, the Medieval Warm Period was shown on a chart occurring from AD 950 to 1250, and was about as warm as the present. Then in the third IPCC report, published in 2001, there is no such warm period. Instead there is a chart, shaped like an ice-hockey stick, which suggests temperatures were flat until the 20th century, when sudden and sustained warming occurs.

The IPCC’s more recent approach to historical data contradicts mainstream climate science literature which is replete with published proxy temperature studies showing that temperatures have cycled up and down over the last 2,000 years – with the Medieval Warm Period as warm or warmer than the present.

The science is far from settled. In reality, some of the data is problematic, the underlying physical mechanisms are complex and poorly understood, the literature is voluminous, and new alternative techniques (such as our method using ANNs) can give very different answers from those derived from GCMs and remodelled proxy-temperature series.

Our paper in GeoResJ received extensive coverage in the global press (for example, see James Delingpole’s piece here, and the coverage by Anthony Watts here). This led to inevitable attacks on our approach and widespread defence of the IPCC’s approach. Much was made of our use of only six proxy records out of the thousands available. The small number was appropriate for the application of a novel research technique and, in any event, we chose representative studies with good geographic coverage that enabled very detailed analysis.

Others objected to the fact that our work mines data and excludes any assumptions about physical mechanisms such as volcanoes or water vapour feedback. To insist that we incorporate equations that purport to model actual physical phenomena is to entirely miss the beauty of machine learning – a technique that works for forecasting precisely because there are recurrent cycles in the Earth’s 2,000 year temperature history.

Because our technique works so well at both monthly rainfall forecasting and forecasting temperature cycles, we conclude that CO2 cannot have significantly perturbed natural climate cycles.

This article first appeared in the August 2018 IPA Review.
