Australia’s Broken Temperature Record (Part 1)



January 26, 2022 By jennifer marohasy

It could be that the last 26 years of temperature recordings by the Australian Bureau of Meteorology will be found not fit for purpose and will eventually need to be discarded. This would make for a rather large hole in the calculation of global warming – given the size of Australia.

The Australian Bureau of Meteorology undertakes industrial-scale remodelling of historic temperatures in ways that generate more global warming for the same weather through the process of homogenisation – remember the work I did some years ago on Rutherglen. The process is neither transparent nor scientific, and it lacks consistency with the Bureau’s own policies. Also of concern, since 1996 the Bureau has converted to custom-made electronic probes for temperature recording, and rather than averaging temperatures over one or five minutes, as is standard practice around the world for such equipment, the Australian Bureau is recording one-second extrema. To be clear, a ‘hottest temperature’ record is now a one-second automatic download from a supersensitive electronic probe rather than a reading from a more inert mercury thermometer. This is another way the Bureau gets more global warming for the same weather – with its own third-generation probes. I am going to explain all of this in more detail in a book I’m sketching out, to write this year.
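The effect of taking a one-second extreme instead of a time average can be illustrated with a toy simulation: a steady air temperature plus short-lived sensor noise. All numbers below are illustrative assumptions, not Bureau specifications.

```python
import random

random.seed(42)
# One hour of 1-second readings from a fast-response probe: a steady
# "true" air temperature of 30.0 C plus short-lived sensor noise.
true_temp = 30.0
readings = [true_temp + random.gauss(0, 0.3) for _ in range(3600)]

# What a one-second spot-reading record captures:
one_second_max = max(readings)

# What a record built from one-minute averages would capture:
minute_means = [sum(readings[i:i + 60]) / 60 for i in range(0, 3600, 60)]
averaged_max = max(minute_means)

# The spot-reading maximum sits above the averaged maximum even though
# the underlying "weather" never changed.
print(one_second_max > averaged_max)
```

Because the maximum of all raw samples can never be below the maximum of any window average, noise alone pushes the spot-reading record upward.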

Theoretically the probes would also bias the minima downwards – except remember Thredbo? A few years ago I showed that the Bureau has placed limits on how cold an individual weather station can record, so most of the bias is going to be warmer.

Journalist Graham Lloyd, ignoring all the problems that I’ve documented in detail over the years, was arguably premature when he published the Bureau’s claims uncritically on 14th January.

… according to the Bureau of Meteorology annual climate statement, 2021 was the coolest year in nearly a decade and wettest since 2016. By the end of 2021 – and for the first time in five years – no large parts of the country were experiencing rainfall deficits and drought conditions.

Announcing BoM’s 2021 temperature data, climatologist Dr Simon Grainger says: ‘After three years of drought from 2017 to 2019, above-average rainfall last year resulted in a welcome recharge of our water storages but also some significant flooding to eastern Australia.’

In 2021, Australia’s mean temperature was 0.56C above the 1961-1990 climate reference period. It was the 19th-warmest year since national records began in 1910, but also the coolest year since 2012. Rainfall was 9 per cent above the 1961-1990 average, making 2021 the wettest year since 2016, with November the wettest on record.

I went looking for the Annual Climate Statement, and the supporting data.  There was a note at the Bureau’s website saying the Annual Climate Statement wouldn’t be published until February.  There was no data published on 14th January, just the promise.

I am keen to see which few years they have put ahead of 2021 as hotter. But alas, I will have to wait until February. Of more concern, in February the dilemma will remain that many of the earlier annual average temperatures have been calculated with a different mix of weather stations.

Very few people realise – though I have explained all of this multiple times, including to multiple journalists – that when the Bureau of Meteorology transitioned in 2011 to the new Australian Climate Observation Reference Network – Surface Air Temperatures (ACORN-SAT) system for calculating the national average temperature, it removed 57 stations from its calculations, replacing them with 36 on-average hotter stations. This had the effect of increasing the recorded Australian average temperature by 0.42 degrees Celsius, independently of any actual change in the weather.
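The arithmetic of a changed station mix can be shown with a toy example. The station names and temperatures below are hypothetical; the real transition involved 57 removals and 36 additions.

```python
def national_mean(stations):
    """Unweighted mean of station annual-mean temperatures (deg C)."""
    return sum(stations.values()) / len(stations)

# Hypothetical station mixes; the weather at every station is unchanged:
old_mix = {"cool_A": 14.0, "cool_B": 15.0, "hot_C": 22.0}
new_mix = {"hot_C": 22.0, "hot_D": 23.0}  # cool stations dropped, hotter one added

print(national_mean(old_mix))  # 17.0
print(national_mean(new_mix))  # 22.5 - a jump with no change in the weather
```

Dropping cooler stations and adding hotter ones shifts the "national average" even when no individual station warms.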

Of the 57 stations removed from the calculation of the national average temperature, only 3 of these had closed as weather stations.   I will explain all of this in my book.  You should be able to order it for Christmas.

There are so many problems with Australia’s official temperature record: the changing combination of stations, the use of custom-designed probes without averaging, not to mention all the homogenisation.


Australia has reliable historical temperature data for the period from about 1889 until 1996 measured using liquid-in-glass thermometers – mercury for maxima and alcohol for minima.  Averaging the maxima and minima gives a mean temperature.   In some of the records there is a cooling trend to about 1960, and then warming to the present.  This is particularly the case at inland locations.  Within the longer trend there are short cycles with temperatures generally trending up during drought, and down during wetter years.

When the longest continuous temperature series are combined, with a transparent system of area weighting, as I did in an analysis of temperature trends for south-eastern Australia published as a book chapter by Elsevier in 2016, the overall warming trend is only 0.3 degrees Celsius per century (1887 – 2013). This is significantly less than the Australian Bureau of Meteorology’s 0.7 degrees Celsius for the same region over a shorter period (1910 – 2016). NASA recently announced a rate of 1.1 degrees Celsius since the late 19th-century average, the start of the industrial revolution.
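Area weighting itself is straightforward arithmetic; a minimal sketch with hypothetical trends and areas (not the values from the Elsevier chapter):

```python
def area_weighted_trend(trends, areas):
    """Combine per-station warming trends (deg C per century), weighting
    each by the area (km^2) that station is taken to represent."""
    total_area = sum(areas)
    return sum(t * a for t, a in zip(trends, areas)) / total_area

trends = [0.2, 0.5, 0.1]          # deg C per century, hypothetical
areas = [50_000, 20_000, 30_000]  # km^2 each station represents, hypothetical
print(round(area_weighted_trend(trends, areas), 4))  # 0.23
```

The point of the weighting is that a station representing a large sparsely-instrumented region counts for more than one in a dense cluster.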

The daily maximum and minimum values in ‘the national temperature dataset’ (the homogenised ACORN-SAT data) are different from the actual recorded historical values, often by several degrees, and usually cooler. The further back you go in time, the more significant the cooling, thus making what was a modest temperature increase over the period appear greater than it is.

This remodelling is in a different category from correcting for outliers that might have been caused by transcription errors or faulty equipment. It should not be confused with legitimate data-hygiene/quality assurance.

Furthermore, since 1996, at an increasing number of weather stations platinum electronic probes have replaced the traditional mercury and alcohol thermometers. For example, the Rutherglen agricultural research station has a long, continuous temperature record, with minimum and maximum temperatures first recorded using standard and calibrated equipment in a Stevenson screen back in November 1912. Considering the first 85 years of summer temperatures – unadjusted/not homogenised – the very hottest summer on record at Rutherglen is the summer of 1938/1939.

At Rutherglen, the first significant equipment change happened on 29 January 1998. That is when the mercury and alcohol thermometers were replaced with an electronic probe – custom built to the Australian Bureau of Meteorology’s own standard, with the specifications yet to be made public.

According to Bureau policy, when such a major equipment change occurs there should be at least three years (preferably five) of overlapping/parallel temperature recordings. However, the mercury and alcohol thermometers (used to measure maximum and minimum temperatures, respectively) were removed on the same day the custom-built probe was placed into the Stevenson screen at Rutherglen, in direct contravention of this policy.

In 2011, the Bureau made further changes in that it stopped averaging one-second readings from the probe at Rutherglen over one minute. The maximum temperature as recorded each day at Rutherglen is now the highest one-second spot reading from the custom-built probe. That is correct – a spot reading.

So, to reiterate, we now have a non-standard method of measuring (spot readings) from non-standard equipment (custom-built probes) making it impossible to establish the equivalence of recent temperatures from Rutherglen with historical data.

At Rutherglen, a modest rate of warming in the historical maximum temperatures of 0.7 degrees Celsius per Century was changed to 1.3 degrees Celsius in ACORN-SAT Version 2. Changes to the minimum temperature trend are more dramatic: a slight cooling trend of 0.3 degrees Celsius in the historic dataset was changed to warming of 1.9 degrees in ACORN-SAT Version 2 for Rutherglen.

There is much more detail concerning temperatures at Rutherglen and surrounding stations in a report I wrote a few years ago, that can be downloaded by clicking here.


The number of weather stations with electronic probes has slowly increased since 1996, and the Bureau now has a network of about 700, referred to as automatic weather stations (AWS). In a report released in September 2017 it acknowledged issues with the performance of just two of these: Goulburn Airport (Goulburn) and Thredbo Top Station (Thredbo). These are the same two weather stations that I reported, in blog posts on the 5th and 18th July 2017 respectively, as not recording temperatures below minus 10 degrees.

While the Bureau strenuously denied it was setting limits, Minister Josh Frydenberg nevertheless insisted on a review of the entire AWS network.

When the report was published the Bureau’s investigations confirmed that Goulburn and Thredbo were the only sites where temperature records had been affected by the inability of some Bureau AWS to read low temperatures.  What are the chances? Of the nearly 700 weather stations, I stumbled across the only two with problems.

Goulburn was discovered because my friend Lance Pidgeon lives nearby and was up early on the morning of 2 July, concerned his pipes were going to freeze and burst. While watching the live AWS temperature readings tick over at that weather station, he let me know when the record for July of minus 10.4 was reached: only to see it rounded up to minus 10.0.

Thredbo was discovered because, after making a fuss about Goulburn, I wanted to check that the Bureau had lifted the limits on readings below minus 10. So, two weeks later I decided to get up early and watch the one-second reading at one of the stations in the snow fields on the Sunday morning of 16th July thinking it might be a cold morning. Why did I choose Thredbo – of all the weather stations in the Australian Alps? Simply because my school friend Diane Ainsworth died in the landslide there twenty years ago.

I wrote this all up some years ago, at my blog, and it was republished by The Spectator Australia online, you can read more by clicking here.


When the Australian Bureau of Meteorology announces how much hotter last year was relative to earlier years, which it usually does at the beginning of each year, a reasonable person might assume it had just added up and averaged recorded temperature measurements, perhaps applying an area weighting. But it’s much more complicated than that. In an article in the Weekend Australian on January 22-23 entitled ‘BoM cools the past, warms present’, Graham Lloyd explains that the Bureau has remodelled Australia’s official temperature record for a third time in nine years. He wrote this the week after announcing to readers of The Australian that the Bureau had published the data for 2021, when it hadn’t.

The Bureau are not saying how much the most recent remodelling (December 2020) has warmed the overall trend.  They acknowledged back in November 2018 that the second remodelling added 23% to the overall warming trend.

The Bureau claims the changes are necessary because of changes in the location of recording equipment, or abrupt warming or cooling relative to other sites in the region. The article concludes with assurances from the Bureau that what it does is World’s Best Practice.

In June 2014 I gave a presentation to the Sydney Institute entitled ‘Modelling Australian and Global Temperatures: What’s Wrong? Bourke and Amberley as Case Studies’. I chose Amberley because it is a military base, with temperatures recorded by military personnel at the same location since 1941. My analysis was of the ACORN-SAT version 1 dataset for this location, for the period to 2013.

At Amberley the historic minimum temperatures showed cooling at a rate of about 1 degree per century from 1970. The Bureau changed this to warming: it first determined there were two statistical discontinuities in the data, in 1980 and 1996, and to correct for these it changed all the historical temperatures from 1996 back to 1941, creating a warming trend of 2.5 degrees Celsius per century – a combined absolute temperature increase of 1.5 degrees Celsius. This is a very large adjustment.

According to various peer-reviewed papers, and technical reports, homogenization is a technique that enables non-climatic factors to be eliminated from temperature series. It is often done when there is a site change (for example from a post office to an airport), or equipment change (from a Glaisher Stand to a Stevenson screen). But at Amberley neither of these criteria can be applied. The temperatures have been recorded at the same well-maintained site within the perimeter of the air force base since 1941.

My criticisms were published in Issue 26 of The Sydney Papers Online. The Bureau has a policy of not responding to my enquiries, submissions, or peer-reviewed journal articles. Interestingly, however, Gavin Schmidt, then the new Director of NASA’s Goddard Institute for Space Studies in New York, came to the Bureau’s defence – on Twitter.

Dr Schmidt was quite blunt about what had been done to the Amberley minimum temperature series, ‘@jennmarohasy There is an inhomogenity detected (~1980) and based on continuity w/nearby stations it is corrected. #notrocketscience’.

When I sought clarification regarding what was meant by “nearby” stations I was provided with a link to a list of 310 localities used by climate scientists at Berkeley when homogenizing the Amberley data. The inclusion of Berkeley scientists was perhaps to make the point that all the key institutions working on temperature series (the Australian Bureau, NASA, and also scientists at Berkeley) appreciated the need to adjust up the temperatures at Amberley.

But these 310 ‘nearby’ stations stretch to a radius of 974 kilometres and include Frederick Reef in the Coral Sea, Quilpie post office and even Bourke post office.

Considering the unhomogenised data for the six nearest stations that are part of the Bureau’s ACORN-SAT network (old Brisbane aero, Cape Moreton Lighthouse, Gayndah post office, Bundaberg post office, Miles post office and Yamba pilot station), the Bureau’s jump-up for Amberley creates an increase in the official temperature trend of 0.75 degrees Celsius per century. Temperatures at old Brisbane aero, the closest station that is also part of the ACORN-SAT network, also show a long-term cooling trend. Indeed, perhaps the cooling at Amberley is real. Why not consider this, particularly in the absence of real physical evidence to the contrary?

In the Twitter conversation with Dr Schmidt we suggested it was nonsense to use temperature data from radically different climatic zones to homogenize Amberley, and repeated our original question asking why it was necessary to change the original temperature record in the first place. Dr Schmidt replied, ‘@jennmarohasy Your question is ill-posed. No-one changed the trend directly. Instead procedures correct for a detected jump around ~1980.’

If Twitter had been around when George Orwell was writing his dystopian novel Nineteen Eighty-Four, I wonder whether he might have borrowed some text from Dr Schmidt’s tweets; particularly when words like ‘procedures correct’ refer to mathematical algorithms reaching out to ‘nearby’ locations across the Coral Sea and beyond the Great Dividing Range to change what was a mild cooling trend into dramatic warming, for an otherwise politically incorrect temperature series.

There is more in the article published by the Sydney Institute following the presentation that I gave there a few years ago, you can read it by clicking here.


The feature image, at the very top of this blog post, shows me at the Goulburn weather station in August a few years back.

Comments

John Tillman – January 26, 2022 10:13 am

Are any ground station “records” fit for purpose?

Thomas Gasloli (reply to John Tillman) – January 26, 2022 10:55 am


If they were interested in accurate data they would convert minute averages into hourly averages, then into daily averages. The only reason for not doing this is that they want bad data.

John Tillman (reply to Thomas Gasloli) – January 26, 2022 11:30 am

That’s just one ploy in their bag of tricks, but an important one.

H B (reply to John Tillman) – January 26, 2022 2:34 pm

Only takes 1 to stuff the data.

Loydo (reply to John Tillman) – January 26, 2022 2:46 pm

Nothing is “fit for purpose” if it refutes your ideology.

MarkW (reply to Loydo) – January 26, 2022 3:03 pm

Conversely, everything is fit for purpose if it supports your ideology.

As usual, Loydo doesn’t even bother dealing with the facts presented, just throws out insults and slinks away.

Loydo (reply to MarkW) – January 26, 2022 3:32 pm

Is UAH data fit for purpose? Please ‘splain.

John Tillman (reply to Loydo) – January 26, 2022 3:55 pm

How does that trend from the height of 20th century warmth in 1979 in any way justify destroying industrial civilization, which supports eight billion rather than one billion people?

Loydo (reply to John Tillman) – January 26, 2022 6:32 pm

I pointed out two closely correlated trends.

What do you think Jen? Pretty close? And what does it mean? That the BOM and a strikingly different methodology come up with the same trend? And which also happens to be a cool outsider? Pretty close? Or not really that close? Or close-ish but not immediately proximate in any extraordinary way?

No matter, obfuscating the glaring warming trends seems precisely what this post is about and frankly, what all your posts are about.

John Tillman (reply to Loydo) – January 26, 2022 5:00 pm

Flat temperature from 1997 to 2016, and cooling since then, refutes your antiscientific faith, as of course does cooling from 1945 to 1977 under rising CO2.

LdB (reply to Loydo) – January 26, 2022 5:09 pm

I vote for more global warming, as Perth clocked up 5 days in a row over 40 degrees C for the first time ever. We spent more time at the beach than usual and, other than the odd shark alert, it was the best summer ever.

Geoff Sherrington (reply to LdB) – January 26, 2022 5:16 pm


I do not know your data source, but Perth exceeded 5 consecutive days over 40 deg C in each of these years:
1956 1933 1961 1933 1933 1961 1965 1956 1961 1961 1933 1956 1964 2016 1978 1980 1991 1991 2016 2007 1980 1962 1956 1989 1985 1961 2003 1961 1964 1985 1989 2007 1968 1975
Geoff S

John Tillman (reply to Geoff Sherrington) – January 26, 2022 5:27 pm

So, WA, like the US, was hotter in 1933 than 2021.

Loydo (reply to LdB) – January 26, 2022 6:02 pm

You looked out your window; good for you. And would 10 or 15 in a row be ok too? Can you see outside of a western, air-conditioned, beached city like Perth?

Rich Davis (reply to Loydo) – January 26, 2022 6:26 pm

Loydodo projecting again.

Streetcred (reply to John Tillman) – January 26, 2022 6:10 pm

The BoM’s 7-day forecasts in Brisbane always run hot and rarely do they match actuals … you have to be quick to check the actuals before they ‘homogenise’ them for publication the following day.

Pillage Idiot (reply to John Tillman) – January 26, 2022 1:26 pm

“Those who vote decide nothing. Those who select the temperature stations decide everything.”

— Zombie Joseph Stalin

Tom Halla – January 26, 2022 10:13 am

There are just questions it is not polite to ask about how one supports The Narrative.

Bill Rocks – January 26, 2022 10:15 am

Oh what a tangled web they weave. All for settled science?

Peta of Newark (reply to Bill Rocks) – January 26, 2022 10:35 am

Precisely. They are trying Far Too Hard to find the trapped heat. Those ‘once-a-second’ thermometers tell as much. In many ways similar to Germany’s Engywendy.

Why? What has Australia got to feel guilty about – what are they trying to prove?

It’s obvious that in their heart-of-hearts they know it is Junk Science. Hence also the ‘Centres Of Excellence’ that are looking for the trapped heat – they are trying to convince/brainwash themselves as much as everyone else.

Do you laugh or cry?

Clyde Spencer (reply to Peta of Newark) – January 26, 2022 2:06 pm

I thought it was the “hidden heat”!

Scissor (reply to Bill Rocks) – January 26, 2022 12:27 pm

Settled science is social science.

Dennis (reply to Scissor) – January 26, 2022 4:18 pm

The “scientists” who supposedly settled climate science were exposed just before the IPCC Copenhagen Conference, when hacked emails released in two batches, dubbed “Climategate 1 & 2”, contained exchanges in which they discussed how to create a warming-trend computer model to support the climate hoax, as compared with natural climate change over time since the beginning.

John Tillman (reply to Scissor) – January 26, 2022 5:17 pm

Political “science”.

Tom.1 – January 26, 2022 10:29 am

Can you explain for us casual observers what the problem is with homogenization? I feel like I have a vague understanding, but how does it create bad data, if that is the case?

Mr. (reply to Tom.1) – January 26, 2022 11:20 am

My understanding is that science is about observation, confirmation, accuracy, precision and replication of results.

In layman’s terms –
“you can’t make sh1t up as you go along according to your own rules and call it observed & recorded data”

Tim Spence (reply to Tom.1) – January 26, 2022 11:29 am

If the raw data reads high or low it will continue to reflect increases or decreases, and so is useful, if not indispensable. Once homogenised, it corrupts the historic record, and the raw data has a tendency to disappear once homogenised data appears. Plus dozens of other things.

Tim Gorman (reply to Tim Spence) – January 26, 2022 3:19 pm

Assumption 1: Temperature profiles are nearly sine waves.

Thus the temperature readings at two different stations are related by:

Station 1 – sin(t)
Station 2 – sin(t + a)

The correlation of the two sin waves turns out to be cos(a).

“a” is affected by distance, elevation, pressure, and humidity plus probably other variables such as terrain, nearby bodies of water, and ground cover under the stations.

A distance of about 50 miles results in a correlation factor of .8 or less. This is typically low enough that most people do not consider it representative of useful correlation.
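That cos(a) figure can be checked numerically: sample the two phase-shifted sine waves over many whole periods and compute the Pearson correlation. The phase value below is arbitrary, chosen only for illustration.

```python
import math

a = 0.6          # phase lag in radians, arbitrary
n = 100_000      # samples spread over 100 whole periods
ts = [2 * math.pi * 100 * i / n for i in range(n)]
x = [math.sin(t) for t in ts]
y = [math.sin(t + a) for t in ts]

# Pearson correlation computed from scratch.
mx, my = sum(x) / n, sum(y) / n
cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
sx = math.sqrt(sum((xi - mx) ** 2 for xi in x) / n)
sy = math.sqrt(sum((yi - my) ** 2 for yi in y) / n)
r = cov / (sx * sy)

print(round(r, 3))            # matches cos(a) to three decimals
print(round(math.cos(a), 3))
```

Over whole periods the cross-term averages to cos(a)/2 and each variance to 1/2, so the correlation collapses to cos(a) exactly.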

Trying to homogenize non-correlated (or at least only lightly correlated) temperatures leads to nothing but mis-applied temperature corrections to stations being homogenized. It is certain to cause mis-identification of stations that are “wrong”. Hubbard and Lin did a study in 2002 on the application of corrections to measuring stations and their conclusion was that any type of correction should only be done on a station-by-station basis. Even such a difference in micro-climate as one station having green grass under it while another has pea gravel can result in a difference in temperature readings between stations.

The only *accurate* way to identify a discontinuity in temperature readings at a station is by analyzing the actual station readings – and that is nearly impossible, since it is difficult to separate drift in the measurement device from actual climate change.

Ms. Marohasy’s statement that “According to Bureau policy, when such a major equipment change occurs there should be at least three years (preferably five) of overlapping/parallel temperature recordings” is on point. It should, however, also require that both stations be compared to a recently calibrated third measurement device in order to determine the accuracy of each of the two stations. Assuming that the newest device is the most accurate is not a justified assumption.

Tom.1 (reply to Tom.1) – January 26, 2022 11:55 am

It’s a fair question, and I wanted to hear the author’s explanation.

Jennifer Marohasy (reply to Tom.1) – January 26, 2022 12:26 pm

Thanks Tom. I explain in some detail in the article at this link:

If you put ‘Rutherglen’ into the search engine at my site you will find a lot more. Including how they claim Rutherglen does not move in synchrony with other stations in that region … but it does, perfectly, except they ‘homogenised’ the other stations and then compared it to Rutherglen. I kid you not. Search Rutherglen and ‘Benalla’ or Rutherglen and ‘Deniliquin’ at my site.

If we consider just one example of a change forced by homogenisation, let’s take January 1939. That was when a terrible bushfire ravaged the state of Victoria, known as the Black Friday bushfire. The actual recorded minimum temperature on the hottest day at Rutherglen was 28.3 degrees Celsius. This temperature was changed in 2011 to 27.8, and then with the second lot of revisions in 2018 to 25.7. I haven’t checked to see what it is in this latest iteration (ACORN-SAT 2.2).

Anyone wanting to, say, model past temperatures during extreme events/bushfires who looks at the official Australian record is not going to get accurate data. They are going to be out by a few degrees.

A reason that Rutherglen shows continual cooling in its minimum temperatures over the 20th century is that it is in an area that saw the development of major irrigation schemes. Water cools the landscape. We can see this in the Rutherglen record, before homogenisation.

And to be clear, the temperatures at Rutherglen, until the change to the probe on 29th January 1998, were measured using standard equipment and in the same paddock at an agricultural research station.

Tom.1 (reply to Jennifer Marohasy) – January 26, 2022 12:44 pm

Thanks for the explain and the link.

MarkW (reply to Tom.1) – January 26, 2022 12:06 pm

The most common method of homogenization is to replace missing data. To do this they will examine stations that are “near” the station with the missing data. They will examine trends in the nearby stations. If they find that on average the nearby station is 1 degree warmer, then they conclude that on the day the data is missing, the nearby stations were likely 1 degree warmer as well. So they take the nearby data, subtract 1 degree, and put that in to replace the missing data.
Of course nearby can be any station within 500 to 600 kilometers.
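That infilling step can be sketched in a few lines. The function and the one-degree offset are illustrative only, not the Bureau's actual procedure.

```python
def infill_missing(target, neighbour_mean):
    """Fill None entries in `target` with the neighbour mean shifted by
    the average offset seen on days where `target` does have data."""
    offsets = [t - nm for t, nm in zip(target, neighbour_mean) if t is not None]
    offset = sum(offsets) / len(offsets)
    return [t if t is not None else nm + offset
            for t, nm in zip(target, neighbour_mean)]

# The target station runs 1 degree cooler than its neighbours, so the
# missing middle day is filled at the neighbour mean minus 1 degree.
target = [20.0, None, 22.0]
neighbour_mean = [21.0, 22.5, 23.0]
print(infill_missing(target, neighbour_mean))  # [20.0, 21.5, 22.0]
```

The filled value inherits whatever trend the neighbours carry, which is why the choice of "nearby" stations matters so much.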

The exact method of determining the average for the nearby stations, and how to use that average to replace the missing data, varies from one researcher to another, and to a large degree is considered a trade secret and not shared.

John Tillman (reply to MarkW) – January 26, 2022 12:10 pm

I thought 1200 km, but that might be diameter rather than radius. However in polar regions, the radius could be that great.

Please correct if wrong.

MarkW (reply to John Tillman) – January 26, 2022 3:06 pm

I’m working from memory, so you have as good a chance of being right as I do. I wanted to keep it on the low side so that the usual suspects wouldn’t be able to use it as a distraction.

John Tillman (reply to MarkW) – January 26, 2022 5:35 pm

“Within 1200 km of a reporting station”:

On an airline, it’s 1339 km from Churchill, Manitoba to Duluth, MN. From Tucson to Boise is 1352 km. Las Vegas to Portland, 1227 km.

Tom.1 (reply to Tom.1) – January 26, 2022 12:08 pm

A little research tells me it probably is not what I thought. It seems to be more a process of identifying discontinuities or step changes in the data that mean something changed with the instrument or its surroundings. The homogenization tries to remove the discontinuities, perhaps by looking at nearby temperature records, or some other means. Temperature readings tend to err on the high side (although not for sea surface temperatures), so to the extent that there are more of these “discontinuities” in older data, which seems logical for land-based instruments, homogenization would tend to increase the long-term trend. Not sure it’s a very big deal, since these kinds of adjustments will occur less in more recent data.

Ron Long (reply to Tom.1) – January 26, 2022 12:30 pm

I cannot imagine, as a scientist, disappearing actual older data, which seems to be a marker for misconduct.

D. J. Hawkins (reply to Tom.1) – January 26, 2022 1:03 pm


“Not sure it’s a very big deal since these kinds of adjustments will occur less in more recent data.”

Ahhh, but that’s not so. Now that they have modern instrumentation, they merrily add degrees to the latest temperatures.

Gnrnr (reply to Tom.1) – January 26, 2022 2:09 pm

Discontinuities and step changes occur at a specific time, when specific changes are made. The offset that results should be a set value until the next step change is identified. They are too lazy, or it is too difficult, to actually identify what changes occurred at each location, so they create an algorithm to homogenise the data. So on day X the alteration value might be 0.5° and on day X+1 it might be -0.25°. It isn’t done the correct way.

Clyde Spencer (reply to Tom.1) – January 26, 2022 2:15 pm

It appears that you thought you knew the answer before you asked the question. That might explain why you received so many down votes for a question. Your reputation precedes you.

You said, “Temperature readings tend to error [sic] on the high side.” Do you have a citation, or at least a reasonable explanation, for that?

Tim Gorman (reply to Tom.1) – January 26, 2022 3:39 pm

How do you differentiate between a change in the station measurement device and a change in the actual climate at that location?

If something changed in the surroundings, then why is an adjustment justified? The measurements at a station are supposed to be indicative of what is happening at that station. If someone built a pond upwind of the station, then how do you determine what adjustment is needed?

As I posted in a different message, homogenization is a losing battle because cos(a), the correlation between the temperatures at two different stations, decreases as distance, elevation, humidity, terrain, ground cover, etc. increase.

LdB (reply to Tom.1) – January 26, 2022 5:12 pm

You are trying to deal with a historic record, and you think it’s okay to change the historic data where you can’t test and check the error … Mate, I think you lack any understanding at all 🙂

Janice Moore (reply to Tom.1) – January 26, 2022 12:10 pm

What’s “homogenization” you say[, Tom Dot1]? Some kind of dairy product treatment?

Well no, not quite. It is data that has been put through a series of processes, so that the end result is like comparing the temperatures of several bowls of water that have been mixed together, then poured back into the original bowls, with the temperature of each then measured. What you get is an end temperature for each bowl that is a mixture of the other nearby bowl temperatures. ***

Admittedly, raw data can have its own problems, but there are ways my friends and I at the Pielke research team can make valid station trend comparisons without making numerical adjustments to the actual raw data. ***

Anthony Watts
(Source: )

Homogenization Analysis

Steirou, E., and D. Koutsoyiannis (2012). Investigation of Methods for Hydroclimatic Data Homogenization. European Geosciences Union General Assembly 2012, Geophysical Research Abstracts, Vol. 14, Vienna, 956-1, European Geosciences Union.

‘We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. … From the global database, GHCN-Monthly Version 2, we examined all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. … in the two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends. ***


1. Homogenization is necessary to remove errors introduced into climatic time series. 

2. Homogenization practices used until today are mainly statistical, not well-justified by experiments and rarely supported by metadata. … 

3. Homogenization is expected to increase or decrease the existing multiyear trends in equal proportions[, however,] in 2/3 of the cases, the trends increased after homogenization. 

4. The above results cast some doubts on use of homogenization procedures and … indicate that the global temperature increase during the last century is smaller than 0.7 – 0.8˚ C.

(Edited by me for readability)
(Source: )

Rud Istvan, Reply to Tom.1, January 26, 2022 12:21 pm

I gave a detailed explanation, with several examples from around the world, in the essay “When Data Isn’t” in the ebook Blowing Smoke.

Robert B, Reply to Tom.1, January 26, 2022 1:19 pm

An actual technique doesn’t create bad data unless misapplied. This thing called ‘homogenization’ is a bespoke technique made up by climate scientists with an agenda. But that isn’t the evidence that it is bad. The evidence is the stupid results that are spat out.

Martin Clark, Reply to Tom.1, January 26, 2022 3:16 pm

The problem for the believers is that it isn’t enough to fudge temperature records for the northern hemisphere. It must be repeated in the southern hemisphere, and guess what – most of the measurements in the SH are in Australia, although there has been some tampering in one of the South American countries. If those of us in business homogenised our financial records the same way as BoM do to the temperature records, we would end up in jail.
I was aware of the scam over Amberley, as I had collected a large quantity of Amberley records in order to advise a customer nearby on CRD (climate-responsive design) issues. The raw figures show a decline in average temperature.

Paul, Reply to Tom.1, January 26, 2022 4:20 pm

Casual observer? You? Bullshit!!

Rick C, January 26, 2022 10:38 am

Seismometers in Australia must surely be buzzing with the vibrations caused by long-deceased weather station personnel spinning as their carefully recorded data is tossed on the bonfires of politically correct climate science.

MarkW, Reply to Rick C, January 26, 2022 12:13 pm

I am particularly surprised to find out that each station is custom built.
The idea that you can build a climate database when each station is unique is something that is so dumb that only a climate scientist could come up with it.

This is also further proof that claims of measuring the “average” temperature to within a few thousandths of a degree are nothing but pure fantasy.

To use multiple readings to improve precision, the first and most basic requirement is that you measure the same thing, using the same instrument.
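The requirement can be checked with a toy simulation (hypothetical error size, not a claim about any real sensor): averaging repeated readings of the same thing shrinks the random error roughly as 1/√N, while single readings of many different things each keep their full error no matter how many are averaged.

```python
import random
import statistics

random.seed(0)
N = 10_000
SIGMA = 0.5  # per-reading random error in deg C (hypothetical)

# Case 1: the SAME patch of air measured N times with the SAME instrument.
# The average converges on the true value; error shrinks ~ 1/sqrt(N).
true_temp = 20.0
repeated = [true_temp + random.gauss(0, SIGMA) for _ in range(N)]
error_of_mean = abs(statistics.mean(repeated) - true_temp)

# Case 2: N DIFFERENT patches of air, one reading each. The average
# estimates the mean of the field, but every individual reading still
# carries the full per-reading error SIGMA - nothing is "improved".
true_temps = [random.uniform(10, 30) for _ in range(N)]
single_shots = [t + random.gauss(0, SIGMA) for t in true_temps]
per_reading_error = statistics.mean(
    abs(m - t) for m, t in zip(single_shots, true_temps)
)

print(error_of_mean)      # tiny, around SIGMA / sqrt(N)
print(per_reading_error)  # still roughly SIGMA * sqrt(2/pi)
```

The second print never gets smaller as N grows, which is the point: more stations measuring more places is not the same as more readings of one place.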

Using a thousand identical sensors to measure a thousand different patches of air already violates both of those requirements. But to find out that the sensors are not in fact identical blows their claims so far out of the water that you will need to use low-flying planes to find it.

John Tillman, Reply to MarkW, January 26, 2022 12:59 pm

If we were talking about actual science, that fact would be surprising. But in “climate science” (TM), just SOP.

Frank from NoVA, January 26, 2022 10:51 am

Good post re. Australian data tampering and climate modeling (redundant, I know). Expecting the antipodal CAGW support team any moment now…

MarkW, Reply to Frank from NoVA, January 26, 2022 12:13 pm

I can just hear Nick proclaiming that as long as the sensor records what you were expecting to see, it must be good.

John Tillman, Reply to MarkW, January 26, 2022 12:22 pm

I would actually like to hear Nick’s take on this.

IMO, hard to find fault with JM’s objections, but if anyone could, it would be our Nick.

BTW, a super hotty turning the Big 6-0 next year:


Which as we all know is the new 5-0.

Who lectures in pearls and an LBD. Prepared for the post-function cocktail parties. OK, maybe a medium black dress, for the formal, academic occasion.

Peter Fraser, January 26, 2022 10:56 am

The National Institute of Water and Atmospheric Research (NIWA) has been fiddling with the New Zealand temperature record too, always tending to show an upward trend. When the record was challenged, the Australian Bureau of Meteorology was asked to do a review. NIWA refused to release the paper; I think it is still not on the public record. Good work Ms Marohasy, looking forward to your book. A nice photo.

Jennifer Marohasy, Reply to Peter Fraser, January 26, 2022 12:29 pm

Thanks Peter. But I don’t know why you can’t refer to me as either Jen, Jennifer or Dr Marohasy. Please. I get tired of the ‘Ms Marohasy’.

Peter Fraser, Reply to Jennifer Marohasy, January 26, 2022 2:19 pm

My apology. I should have used the honorific Dr.

Dennis, Reply to Jennifer Marohasy, January 26, 2022 4:22 pm

I was told years ago that when a person insists on being referred to as Ms something is missing in their life and mind.

/sarc

Joseph Zorzin, January 26, 2022 11:00 am

$9.2 Trillion Per Year To Save The World
LdB, Reply to Joseph Zorzin, January 26, 2022 5:14 pm

Multiply that by about 50, because the report was done by idiots who assumed the most crazy stats for green technology.

DMacKenzie, January 26, 2022 11:29 am

If they are unfit in Australia, temps will be estimated by homogenization between “fit” stations in South America and Africa, then published in many places as “data”. This will be useful in future to check if the parameters of future models fit past data. /s

Uncle Mort, January 26, 2022 11:40 am

So what we have are tamperature measurements.

John Tillman, Reply to Uncle Mort, January 26, 2022 11:52 am

Tamperatures, period. No actual measurements need apply.

MarkW, Reply to John Tillman, January 26, 2022 12:15 pm

Do you record tamperatures monthly?

John Tillman, Reply to MarkW, January 26, 2022 12:19 pm

What difference does it make?

They’re all tampered with.

Bruce Cobb, Reply to John Tillman, January 26, 2022 12:35 pm

Please don’t throw a tamper tantrum.

John Tillman, Reply to Bruce Cobb, January 26, 2022 12:57 pm

Shameless. Funny, but shameless.

glenn holdcroft, January 26, 2022 12:12 pm

Crimate change has a well-paying agenda. Not just Aussie but most western societies.

Bob, January 26, 2022 12:13 pm

I never trust anyone to fiddle with data. If it is necessary to adjust, an explanation must be included along with the raw data.

Mr., Reply to Bob, January 26, 2022 12:30 pm

As soon as you fiddle with recorded measurements, you are dealing with numerical constructs derived through assumptions, not “data”, “observations”, “facts”, or “realities”.

Tim Gorman, Reply to Bob, January 26, 2022 3:44 pm

If an explanation is required, then a whole new data set should be started. You should not try to adjust the old data to match the new data. It’s too bad if that affects the length of the records available, but that is far better than just making it up out of nothing.

Bruce Cobb, January 26, 2022 12:25 pm

It’s like the Indiana Jones movie where he says “Bad data”.

Dnalor50, January 26, 2022 1:00 pm

Does anyone know the mass of the new temperature probes compared to the big blob of mercury and glass making up the old thermometers? A thermistor can be a tiny component standing above a circuit board on a long pair of pigtails, or it could be encased in a big blob of epoxy or similar to increase its thermal mass. Honest scientists would match the physical mass of the sensors to ensure identical response to temperature change.

John Tillman, Reply to Dnalor50, January 26, 2022 1:07 pm

Let alone comparing temperatures recorded with mercury thermometers vs. those recorded digitally, with a nearby electrical power source.

D. J. Hawkins, Reply to Dnalor50, January 26, 2022 1:10 pm

Well, honest scientists would match the response curves of the sensors to ensure identical response to identical inputs. This may or may not require identical masses. It’s important to keep your eye on what really matters.

Dnalor50, Reply to D. J. Hawkins, January 26, 2022 1:23 pm

Computers can average readings over any time period, but keeping the thermal mass of measuring devices the same is probably a good starting point.

MarkW, Reply to Dnalor50, January 26, 2022 3:35 pm

Thermal mass is especially important if you are going to measure temperatures every second.

Tim Gorman, Reply to Dnalor50, January 26, 2022 3:48 pm
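The one-second-sampling point in the sub-thread above can be sketched with a synthetic signal (entirely made-up numbers, not BoM data): a probe reporting one-second values catches individual noise spikes, while a one-minute average, a rough stand-in for the thermal inertia of a mercury thermometer, smooths them away.

```python
import math
import random

random.seed(1)

# Synthetic "true" air temperature: a slow swell plus second-to-second
# gusts, sampled once per second for ten minutes (hypothetical values).
temps = [
    30.0 + 0.5 * math.sin(2 * math.pi * t / 600) + random.gauss(0, 0.3)
    for t in range(600)
]

# A fast probe reports the single hottest one-second value...
one_second_max = max(temps)

# ...while a slow, averaging instrument effectively reports the
# hottest one-minute mean, which rides above the noise far less.
minute_means = [sum(temps[i:i + 60]) / 60 for i in range(0, 600, 60)]
averaged_max = max(minute_means)

print(round(one_second_max, 2))  # higher: a single gust sets the "record"
print(round(averaged_max, 2))    # lower: the gusts average out
```

The same weather produces two different "hottest temperatures" depending only on how fast the instrument responds.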

It’s not just the sensor; it is the whole measuring station that needs to be standardized. While the Argo floats use calibrated thermistors (calibrated at least initially), the actual uncertainty of the temperature measured by the float is +/- 0.5 C. Not much better than a LIG thermometer. Anything that changes the flow rate or salinity of the water samples measured by the thermistor, e.g. a barnacle in the water intake or output, will affect the temperature measurement.

Robert B, January 26, 2022 1:06 pm

BOM seem to do many strange things. Mildura had a weekend of 123 F and 124 F in 1906. They, reasonably, insisted that although it was read in the shade, it was not in a Stevenson screen, and so read higher than it would have with the more modern equipment.

When they did a study to correct it down by over 4 C, they used a supposedly comparable site, Deniliquin, over 300 km east of Mildura. During that month the two differed a lot, from Deniliquin being warmer than Mildura to being 14 C cooler. How it could possibly be considered comparable is beyond me.

Despite a good reason to think the readings were too high because of the equipment, they ignored the second hot day completely, even though it was reported in newspapers. They also converted the 123 F and 124 F to 50.1 C and 50.7 C, i.e. 122.18 F and 123.26 F. Basically, they assumed the readings were rounded up from 122.5 and 123.5 and, because the precision was 0.25 F, used the lowest possible values in C that could have been reported as 123 F and 124 F.
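The conversions in the comment above are easy to check with the standard formulas:

```python
def f_to_c(f):
    """Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

# Straight conversion of the recorded Fahrenheit readings:
print(round(f_to_c(123), 2))  # 50.56
print(round(f_to_c(124), 2))  # 51.11

# The lower values adopted instead, expressed back in Fahrenheit:
print(round(c_to_f(50.1), 2))  # 122.18
print(round(c_to_f(50.7), 2))  # 123.26
```

So the adopted Celsius values sit roughly half a degree Fahrenheit below the straight conversions of what was actually recorded.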

They are clearly looking for any reason to cool the past and warm the present, but ignoring any reason to adjust temperatures the other way.

Dennis, Reply to Robert B, January 26, 2022 3:42 pm

Another example from the late 1800s: a BoM weather station was located at the Bourke NSW Post Office. During a heatwave, the operator responsible for recording the data decided to check it on a Sunday, which was not normally done, and one of the hottest days on record was recorded there. BoM ignores that data and only uses records from after 1910.

Rud Istvan, January 26, 2022 1:53 pm

Dr. Marohasy, I definitely will be buying your forthcoming book if it is available in ebook form. I long ago ran out of real bookshelf space at both homes. Your Rutherglen research analysis lives on (with text credits and live footnotes to your site) in the essay “When Data Isn’t” in the ebook Blowing Smoke.

Clyde Spencer, January 26, 2022 2:03 pm

Averaging the maxima and minima gives a mean temperature.

What is being described is a special case of ‘averaging’ that should more properly be called the “mid-range.” It is really the median of two numbers: equidistant between the two extremes, and therefore strongly affected by error in either of the two measurements. It lacks the redeeming feature of a true arithmetic mean, namely the redundancy that reduces sensitivity to singular errors; also, any calculation of a standard deviation is meaningless. Therefore, one cannot say anything about the probability of observations from a population.

Phil Salmon, January 26, 2022 2:43 pm

A little OT, but here’s an interview with John Cook by YouTuber Mallen Baker. It’s quite revealing. Cook comes over as a small-minded, hate-filled person, for whom the underlying motivation of his whole “debunking” campaign is hatred of his own father-in-law.

His unexceptional intellect and lack of any scientific background at all make him highly susceptible to alarmist pseudoscience, of which he has become an obsessive devotee. He is a curious example of a real-time self-debunker with practically everything he says.

Dennis, January 26, 2022 3:33 pm

Jennifer Marohasy and colleagues have monitored various BoM automatic weather stations and discovered that the temperature readings are often not reported accurately: for example, an actual 29.2 C might become 29.9 C as reported. Spread over the network of weather stations, this can create the impression of a rising temperature trend over time.

They also noted weather stations located in or near heat sinks: roads, airport infrastructure and others. One of the original locations, at the Observatory close to the Sydney Harbour Bridge and alongside the motorway carrying traffic north onto the bridge, was once open parkland but now has the motorway and buildings not far away.

tygrus, January 26, 2022 4:29 pm

Recent headlines like “Australia has provisionally just had its joint hottest day on record” appeared regarding 13th Jan 2022.

3rd January 1909, Bourke 51.7°C (125°F) & Brewarrina 50.6°C (123°F).
What did the rest of Australian weather stations record for min/max/avg on Thurs 13th Jan 2022 compared to historical records?
The headline should say a town in Australia, NOT all of Australia.

“Hottest Day Ever in Australia Confirmed: Bourke 51.7°C, 3rd January 1909” Jennifer Marohasy
(And several other pages/comments regarding this & similar cases)

The latest Onslow observation was at the Onslow Airport. Checking the BOM records shows they previously had another station in town. The airport read ~0.7 C hotter than the town in the early decades of overlap (1941-1970), and ~1.4 C hotter towards the last decades (1991-2020). I question the significance of the news.

See January mean of daily max.

The crude comparison above may not exactly quantify a fixed offset or bias. But it is worth further investigation of the raw data, and the differences need better explanations. It’s easier to write sensational headlines than to check the details and explain the uncertainties.

Andy Pattullo, January 26, 2022 5:34 pm

Weather records are supposed to be an historical record of what actually happened, but these days the official records are simply manipulated propaganda for the climate change religion. Crying “wolf” is now the highest of “scientific” achievements, ranked high above doing actual science.

Streetcred, January 26, 2022 6:08 pm

I’m seeing more and more alarmista cr8p appearing in our Australian newspapers, particularly the Courier Mail; the others have always been ‘full of it’ … mostly without accreditation.
