Reposted from Jennifer Marohasy’s Blog
August 12, 2021 By jennifer
I was at the Australian National University in October 2018, when the largest supercomputer in the Southern Hemisphere began running the simulations that have now been published as the IPCC’s Assessment Report No. 6 (AR6). It’s being touted as the most comprehensive climate change report ever. It is certainly based on a very complex simulation model (CMIP6).
Many are frightened by the official analysis of the model’s results, which claims global warming is unprecedented in more than 2000 years. Yet the same modelling is only claiming the Earth is warming by some fractions of a degree Celsius! Specifically, the claim is that we humans have caused 1.06 °C of the claimed 1.07 °C rise in temperatures since 1850, which is not very much. The real-world temperature trends that I have observed at Australian locations with long temperature records would suggest a much greater rate of temperature rise since 1960, and cooling before that.
Allowing some historical perspective shows that the IPCC is wrong to label the recent temperature changes ‘unprecedented’. They are not unusual in magnitude, direction or rate of change, which should diminish fears that recent climate change is somehow catastrophic.
To understand how climate has varied over much longer periods, over hundreds and thousands of years, various types of proxy records can be assembled derived from the annual rings of long-lived tree species, corals and stalagmites. These types of records provide evidence for periods of time over the past several thousand years (the late Holocene) that were either colder, or experienced similar temperatures, to the present, for example the Little Ice Age (1309 to 1814) and the Medieval Warm Period (985 to 1200), respectively. These records show global temperatures have cycled within a range of up to 1.8 °C over the last thousand years.
Indeed, the empirical evidence, as published in the best peer-reviewed journals, would suggest that there is no reason to be concerned by a 1.5 °C rise in global temperatures over a period of one hundred years – that this is neither unusual in terms of rate nor magnitude. That the latest IPCC report, Assessment Report 6, suggests catastrophe if we cannot contain warming to 1.5 °C is not in accordance with the empirical evidence, but rather a conclusion based entirely on simulation modelling falsely assuming these models can accurately simulate ocean and atmospheric weather systems. There are better tools for generating weather and climate forecasts, specifically artificial neural networks (ANNs) that are a form of artificial intelligence.
Of course, there is nowhere on Earth where the average global temperature can be measured; it is very cold at the poles and rather warmer in the tropics. So, the average global temperature for each year since 1850 could never be a direct ‘observation’, but rather, at best, a statistic calculated from measurements taken at thousands of weather stations across the world. And can it really be accurately calculated to some fractions of a degree Celsius?
AR6, which runs to over 4,000 pages, claims to have accurately quantified everything, including confidence ranges for the ‘observation’ of 1.07 °C. Yet I know from scrutinising the datasets used by the IPCC that the single temperature series inputted for individual locations incorporate ‘adjustments’ by national meteorological services that are rather large. To be clear, even before the maximum and minimum temperature values from individual weather stations are incorporated into HadCRUT5, they are adjusted. A key supporting technical paper (e.g. Brohan et al. 2006, Journal of Geophysical Research) clearly states that: ‘HadCRUT only archives single temperature series for particular location and any adjustments made by national meteorological services are unknown.’ So, the idea that the simulations are based on ‘observation’ with real, meaningful ‘uncertainty limits’ is just not true.
According to the Australian Bureau of Meteorology (BOM), which is one of the national meteorological services providing data for HadCRUT, the official remodelled temperatures are an improvement on the actual measurements. This may be because they better accord with IPCC policy; the result is a revisionist approach to our climate history. In general, the remodelled series strip the natural cycles from the actual observations, replacing them with linear trends that accord with IPCC policy.
The BOM’s Blair Trewin, who is one of the 85 ‘drafting authors’ of the Summary for Policy Makers, in 2018 remodelled and published new values for each of the 112 weather stations used to calculate an Australian average over the period 1910 to 2016, so that the overall rate of warming increased by 23%. Specifically, the linear trend for Australian temperatures had been 1 °C per century, as published in 2012 in the Australian Climate Observations Reference Network − Surface Air Temperature (ACORN-SAT) database version 1. Then, just in time for inclusion in this new IPCC report released on Tuesday, all the daily values from each of the 112 weather stations were remodelled and the rate of warming increased to 1.23 °C per century in ACORN-SAT version 2, published in 2018. This broadly accords with the change between the 2014 IPCC report (Assessment Report No. 5), which put the warming since 1850 at 0.85 °C, and this new report, which puts it at 1.07 °C, an increase of 0.22 °C (about 26%).
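The trend figures quoted above are ordinary linear trends fitted to a temperature series, with the slope rescaled to °C per century. A minimal sketch of that calculation, using made-up annual values for illustration rather than the actual ACORN-SAT data:

```python
import numpy as np

# Hypothetical annual mean temperatures (°C) for 1910-2016;
# NOT the real ACORN-SAT values, just a series with a built-in
# trend of about 1.23 °C per century plus noise.
years = np.arange(1910, 2017)
rng = np.random.default_rng(0)
temps = 21.0 + 0.0123 * (years - 1910) + rng.normal(0, 0.3, years.size)

# Least-squares linear fit: the first coefficient is the slope in °C per year.
slope_per_year, intercept = np.polyfit(years, temps, 1)
trend_per_century = slope_per_year * 100
print(f"linear trend: {trend_per_century:.2f} °C per century")
```

The fitted slope recovers roughly the trend built into the synthetic series; the same fit applied to different versions of a dataset is how the 1 °C versus 1.23 °C per century comparison is made.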
Remodelling of the data sets by the national meteorological services generally involves cooling the past, by way of dropping down the values in the first part of the twentieth century. This is easy enough to check for the Australian data because it is possible to download the maximum and minimum values as recorded at the 112 Australian weather stations for each day from the BOM website, and then compare these values with the values as listed in ACORN-SAT version 1 (that I archived some years ago) and ACORN-SAT version 2 that is available at the BOM website. For example, the maximum temperature as recorded at the Darwin weather station was 34.2 °C on 1 January 1910 (this is the very first value listed). This value was changed by Blair Trewin in the creation of ACORN-SAT version 1 to 33.8 °C. He ‘cooled’ this historical observation by a further 1.4 °C in the creation of ACORN-SAT version 2, just in time for inclusion in the values used to calculate a global average temperature for AR6. When an historic value is cooled relative to present temperatures, then an artificial warming trend is created.
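The mechanism described here, where cooling early values manufactures a warming trend, is easy to demonstrate numerically. The sketch below uses a deliberately flat, made-up series (not the real Darwin record) and applies a hypothetical step adjustment of minus 0.8 °C to all values before 1941:

```python
import numpy as np

years = np.arange(1910, 2017)
raw = np.full(years.size, 32.0)  # flat record: zero trend by construction

# Hypothetical step adjustment: cool every value before 1941 by 0.8 °C.
adjusted = raw.copy()
adjusted[years < 1941] -= 0.8

# Linear trends in °C per century.
trend_raw = np.polyfit(years, raw, 1)[0] * 100
trend_adjusted = np.polyfit(years, adjusted, 1)[0] * 100
print(trend_raw, trend_adjusted)  # ~0 for the raw series; positive after adjustment
```

A record with no trend at all acquires a warming trend of close to 1 °C per century purely from the adjustment, which is the point being made about cooling the past.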
I am from northern Australia; I was born in Darwin, so I take a particular interest in its temperature series. I was born there on 26th August 1963. A maximum temperature of 29.6 °C was recorded at the Darwin airport on that day from a mercury thermometer in a Stevenson screen, which was an official recording station using standard equipment. This is also the temperature value shown in ACORN-SAT version 1. This value was dropped down/cooled by 0.8 °C in the creation of ACORN-SAT version 2, by Blair Trewin in 2018. So, the temperature series incorporated into HadCRUT5, which is one of the global temperature datasets used in all the IPCC reports, shows the contrived value of 28.8 °C for 26th August 1963, yet on the day I was born a value of 29.6 °C was entered into the meteorological observations book for Darwin. In my view, changing the numbers in this way is plain wrong, and certainly not scientific.
The BOM justifies remodelling because of changes to the equipment used to record temperatures and because of the relocation of the weather stations, except that they change the values even when there have been no changes to the equipment or locations. In the case of Darwin, the weather station has been at the airport since February 1941, and an automatic weather station replaced the mercury thermometer on 1 October 1990. For the IPCC report (AR5) published in 2014, the BOM submitted the actual value of 29.6 °C as the maximum temperature for Darwin on 26th August 1963. Yet in November 2018, when the temperatures were submitted for inclusion in the modelling for this latest report (AR6), the contrived value of 28.8 °C was submitted.
The temperature series that are actual observations from weather stations at locations across Australia tend to show cooling to about 1960 and warming since then. This is particularly the case for inland locations in southeast Australia. For example, the actual observations from the weather stations with the longest records in New South Wales were plotted for the period to 1960, and then from 1960 to 2013, for a presentation that I gave to the Sydney Institute in 2014. I calculated an average cooling from the late 1800s to 1960 of minus 1.95 °C, and an average warming of plus 2.48 °C from the 1960s to the present, as shown in Table 1. Yet this new United Nations IPCC report claims inevitable catastrophe should warming exceed 1.5 °C, even though this can be shown to have already occurred at many Australian locations.
This is consistent with the findings in my technical report published in the international climate science journal Atmospheric Research (volume 166, pages 141-149) in 2015, which shows significant cooling in the maximum temperatures at the Cape Otway and Wilsons Promontory lighthouses, in southeast Australia, from 1921 to 1950. The cooling is more pronounced in temperature records from the farmlands of the Riverina, including at Rutherglen and Deniliquin. To repeat, while temperatures at the lighthouses show cooling from about 1880 to about 1950, they then show quite dramatic warming from at least 1960 to the present. In the Riverina, however, minimum temperatures continued to fall through the 1970s and 1980s because of the expansion of the irrigation schemes. Indeed, the largest dip in the minimum temperature record for Deniliquin occurs just after the Snowy Hydroelectricity scheme came online. This is masked by the remodelling, which dropped down/cooled all the minimum temperature observations at Deniliquin before 1971 by 1.5 °C.
In my correspondence with the Bureau about these adjustments, it was explained that irrigation is not natural and therefore there is a need to correct the record through remodelling of the series from these irrigation areas until they show warming consistent with theory. But global warming itself is not natural, if it is essentially driven by human influence, which is a key assumption of current policy. Indeed, there should be something right up front in the latest assessment of climate change by the IPCC (AR6) explaining that the individual temperature series have been remodelled before inclusion in the global datasets to ensure a significant human influence on climate in accordance with IPCC policy. These remodelled temperature series are then incorporated into CMIP6, which is so complex it can only be run on a supercomputer, and which generates scenarios for a diversity of climate parameters, from sea level to rainfall.
In October 2018, I visited the Australian National University (ANU) to watch CMIP6 at work on the largest supercomputer in the Southern Hemisphere. It was consuming obscene amounts of electricity to run the simulations for this latest IPCC report, and it is also used to generate medium to long range rainfall forecasts for the BOM. The rainfall forecasts from these simulation models even just three months in advance are, however, notoriously unreliable. Yet we are expected to believe rainfall forecasts based on simulations that make projections 100 years in advance, as detailed in AR6.
There are alternative tools for generating temperature and rainfall forecasts. In a series of research papers and book chapters with John Abbot, I have documented how artificial neural networks (ANNs) can be used to mine historical datasets for patterns and, from these, generate more accurate medium and long-range rainfall and temperature forecasts. Our forecasts don’t suggest an impending climate catastrophe, but rather that climate change is cyclical, not linear. Indeed, temperatures change on a daily cycle as the Earth spins on its axis, temperatures change with the seasons because of the tilt of the Earth relative to its orbit around the Sun, and then there are ice ages because of changes in the orbital path of the Earth around the Sun, and so on.
Taking this longer perspective, considering the Sun rather than carbon dioxide as a driver of climate change, and inputting real observations rather than remodelled/adjusted temperature values, we find recurrent temperature cycles of more than 1.07 °C during the last 2000 years. Our research paper entitled ‘The application of machine learning for evaluating anthropogenic versus natural climate change’, published in GeoResJ in 2017 (volume 14, pages 36-46), shows a series of temperature reconstructions from six geographically distinct regions and gives some graphic illustration of the rate and magnitude of the temperature fluctuations.
ANNs are at the cutting edge of AI technology, with new network configurations and learning algorithms continually being developed. In 2012, when John Abbot and I began using ANNs for rainfall forecasting, we chose a time delay neural network (TDNN), which was considered state-of-the-art at that time. The TDNN used a network of perceptrons whose connection weights were trained with backpropagation. More recently we have been using General Regression Neural Networks (GRNNs), which have no backpropagation component.
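A GRNN (Specht, 1991) is, in essence, Gaussian-kernel regression: each prediction is a distance-weighted average of the training targets, controlled by a single smoothing parameter, with no iterative weight training. The sketch below is a generic toy illustration of the technique on synthetic data, not our rainfall-forecasting code:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """General Regression Neural Network (Specht 1991): a Gaussian-kernel
    weighted average of the training targets. There is no backpropagation;
    sigma (the kernel width) is the only free parameter."""
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to stored patterns
        w = np.exp(-d2 / (2 * sigma ** 2))             # pattern-layer activations
        preds.append(np.sum(w * y_train) / np.sum(w))  # summation/output layer
    return np.array(preds)

# Toy example: learn y = sin(x) from noisy samples.
rng = np.random.default_rng(1)
X = rng.uniform(0, 2 * np.pi, (200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)
X_test = np.linspace(0.5, 5.5, 20).reshape(-1, 1)
y_hat = grnn_predict(X, y, X_test, sigma=0.3)
print(np.max(np.abs(y_hat - np.sin(X_test[:, 0]))))  # small fit error
```

Because the network simply memorises the training patterns, ‘training’ is instantaneous; the trade-off is that every prediction scans the whole training set, and the choice of sigma governs how smooth the fitted response is.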
A reasonable test of the value of any scientific theory is its utility: its ability to solve some particular problem. There has been an extraordinary investment in climate change research over the last three decades, yet it is unclear whether there has been any significant improvement in the skill of weather and climate forecasting. Mainstream climate scientists and meteorological agencies continue to rely on simulation modelling for their forecasts, such as the CMIP6 models used in this latest IPCC report. There could be a better way, and we may not have a climate catastrophe.
Further Reading/Other Information
The practical application of ANNs for forecasting temperatures and rainfall is detailed in a series of papers by John Abbot and me that are listed here: https://climatelab.com.au/publications/
Chapter 16 of the book ‘Climate Change: The Facts 2020’ provides more detail on how the Australian Bureau of Meteorology takes a revisionist approach to Darwin’s history.
There is an interactive table, based on the maximum and minimum values as originally recorded for each of the 112 Australian weather stations used to calculate the official temperature values listed in ACORN-SAT versions 1 and 2, at my website.
The feature image, at the top of this blog post, shows Jennifer Marohasy in front of the supercomputer at the Australian National University in October 2018, which was running simulations for the latest IPCC report.