Vindicated: Bureau Not Following WMO Guidelines

Two decades ago the Australian Bureau of Meteorology replaced most of the manually-read mercury thermometers in its weather stations with electronic probes that could be read automatically – so since at least 1997 most of the temperature data has been collected by automatic weather stations (AWS).

Before this happened there was extensive testing of the probes – parallel studies at multiple sites to ensure that measurements from the new weather stations tallied with measurements from the old liquid-in-glass thermometers.

There was even a report issued by the World Meteorological Organisation (WMO) in 1997 entitled ‘Instruments and Observing Methods’ (Report No. 65) that explained that, because the modern electronic probes being installed across Australia react more quickly to second-by-second temperature changes, measurements from these devices need to be averaged over a one- to ten-minute period to provide some measure of comparability with the original thermometers.

This report has a 2014 edition, which the Bureau now claims to be operating under – these WMO guidelines can be downloaded here:
http://www.wmo.int/pages/prog/www/IMOP/CIMO-Guide.html .

Further, section 1.3.2.4 of Part 2 explains how the natural small-scale variability of the atmosphere and the short time-constant of the electronic probes make averaging most desirable, and goes on to suggest averaging over a period of one to ten minutes.

I am labouring this point.

So, to ensure there is no discontinuity in measurements across the transition from thermometers to electronic probes in automatic weather stations, the maximum and minimum values need to be calculated from one-second readings that have been averaged over at least one minute.

Yet, in a report published just yesterday, the Bureau acknowledges what I have been explaining in blog posts for some weeks, and what Ken Stewart has been explaining since February: that the Bureau is not following these guidelines.

In the new report, the Bureau admits on page 22 that:

* the maximum temperature is recorded as the highest one-second temperature value in each minute interval,

* the minimum temperature is recorded as the lowest one-second value in the minute interval, and

* it also records the last one-second temperature value in the minute interval.

No averaging here!

Rather than averaging temperatures over one or ten minutes in accordance with WMO guidelines, the Bureau is entering one-second extrema.

Recording one-second extrema (rather than averaging) will bias the minima downwards and the maxima upwards. Except that the Bureau is placing limits on how cold a temperature an individual weather station can record, so most of the bias is going to be upwards.
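
To make the contrast concrete, here is a minimal Python sketch using entirely synthetic one-second readings (not Bureau data): a smooth diurnal cycle with small, fast fluctuations of the kind a quick-response probe registers. Computing the daily extrema from one-minute averages, as the WMO guidance suggests, gives narrower extremes than taking the single highest or lowest one-second value.

```python
import numpy as np

rng = np.random.default_rng(1)
seconds_per_day = 24 * 60 * 60
t = np.arange(seconds_per_day)

# Synthetic day of one-second readings: a smooth diurnal cycle plus fast,
# small-scale fluctuations of the kind a quick-response probe picks up.
temperature = (15 + 8 * np.sin(2 * np.pi * (t - 6 * 3600) / seconds_per_day)
               + 0.3 * rng.standard_normal(seconds_per_day))

# WMO-style: average the one-second readings over each minute, then take the
# daily maximum and minimum from those one-minute means.
one_minute_means = temperature.reshape(-1, 60).mean(axis=1)
max_from_averages = one_minute_means.max()
min_from_averages = one_minute_means.min()

# The practice described above: the single highest and lowest one-second
# values become the daily extrema.
max_from_seconds = temperature.max()
min_from_seconds = temperature.min()

print(f"daily max: {max_from_averages:.2f} (averaged) vs {max_from_seconds:.2f} (one-second)")
print(f"daily min: {min_from_averages:.2f} (averaged) vs {min_from_seconds:.2f} (one-second)")
```

With fluctuating readings, the one-second maximum sits above, and the one-second minimum below, the corresponding averaged values – which is the source of the bias described above.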

****

The Bureau’s new review can be downloaded here: http://www.bom.gov.au/inside/Review_of_Bureau_of_Meteorology_Automatic_Weather_Stations.pdf

I’ve also posted on this report, and limits on low temperatures, here: http://jennifermarohasy.com/2017/09/vindicated-bureau-acknowledges-limits-set-cold-temperatures-can-recorded/

This originally appeared on Dr. Jennifer Marohasy’s blog

Vindicated: Bureau Acknowledges Limits Set On How Cold Temperatures Can Be Recorded

The Bureau has a network of 695 automatic weather stations (AWS) across Australia. In a report released late yesterday it acknowledged issues with the performance of just two of these: Goulburn Airport (Goulburn) and Thredbo Top Station (Thredbo). These are the same two weather stations that I reported at my blog were not recording temperatures measured below minus 10 degrees on the 5th and 18th July, respectively.

While the Bureau strenuously denied it was setting limits, the Minister Josh Frydenberg nevertheless insisted on a review of the entire AWS network.

The Minister phoned me late yesterday to let me know that the report had just been published, and that the Bureau’s investigations confirmed that Goulburn and Thredbo were the only sites where temperature records had been affected by the inability of some Bureau AWS to read low temperatures.

What are the chances? Of the nearly 700 weather stations, I stumbled across the only two with problems.

Goulburn was discovered because my friend Lance Pidgeon lives nearby and was up early on the morning of 2 July, concerned his pipes were going to freeze and burst. He was watching the live AWS temperature readings tick over at that weather station and let me know when the record for July of minus 10.4 was reached – only to see it rounded up to minus 10.0.

Thredbo was discovered because, after making a fuss about Goulburn, I wanted to check that the Bureau had actually lifted the limits on readings below minus 10. So, two weeks later I decided to get up early and watch the one-second reading at one of the stations in the snow fields on the Sunday morning of 16th July thinking it might be a cold morning. Why did I choose Thredbo – of all the weather stations in the Australian Alps? Simply because my school friend Diana Ainsworth died in the landslide there twenty years ago.

Never mind – I’m vindicated!

The Bureau has now acknowledged that it had inadvertently set limits on how cold temperatures could be recorded at Goulburn and Thredbo.

To be clear, the equipment has a general operating range down to minus 60 degrees Celsius, but smart card readers – with a nominal range to only minus 10 degrees Celsius, and which stop reading altogether at minus 10.4 – were inserted, placing limits on the actual recordings, not the measurements.

According to the report published late yesterday, the cards were inserted into the Goulburn weather station in September 2002, and into the Thredbo weather station in May 2007. So, for a period of nearly 15 years there has been a limit on how cold temperatures can be recorded at Goulburn, and for nearly 10 years at Thredbo.

The Goulburn weather station was first opened in 1990, and had previously recorded temperatures below minus 10 degrees Celsius in 1994, 1999 and 2000 – with a record cold minus 10.9 recorded on 17 August 1994.

The Thredbo weather station opened in 1966, and recorded an average of 2.5 days a year below minus 10 degrees Celsius until 1996, when an automatic weather station was installed – replacing the previous liquid-in-glass, manually-read thermometers.

Since the AWS was first installed back in April 1997, there has been a reduction in the average number of days when temperatures have fallen below minus 10 degrees Celsius, as shown in the chart.

Further, since May 2007, when the MSI2 sensor interface card was replaced with the MSI1 card (see page 50 of the new report from the Bureau), there has been no potential to record below minus 10.4. Yet not far from this location, at Charlotte Pass, an all-time record low temperature of minus 23 degrees Celsius was recorded on 29 June 1994; this was with an old-style liquid-in-glass thermometer – not with an AWS.

How can this review possibly conclude that there are no problems with the other 693 automatic weather stations – and there has been no impact on official temperature records from the limits it now acknowledges were placed on recordings from Thredbo and Goulburn?

Surely there is now evidence enough for a proper external review to be initiated; this should be a Parliamentary Enquiry, through the House Energy and Environment Committee.

The Bureau’s report can be downloaded here: www.bom.gov.au/inside/Review_of_Bureau_of_Meteorology_Automatic_Weather_Stations.pdf

Originally published on Dr. Jennifer Marohasy’s blog

Most of the Recent Warming Could be Natural

AFTER deconstructing 2,000-year-old proxy-temperature series back to their most basic components, and then rebuilding them using the latest big data techniques, John Abbot and I show what global temperatures might have done in the absence of an industrial revolution. The results from this novel technique, just published in GeoResJ [1], accord with climate sensitivity estimates from experimental spectroscopy but are at odds with output from General Circulation Models.

According to mainstream climate science, most of the recent global warming is our fault – caused by human emissions of carbon dioxide. The rationale for this is a speculative theory about the absorption and emission of infrared radiation by carbon dioxide that dates back to 1896. It’s not disputed that carbon dioxide absorbs infrared radiation; what is uncertain is the sensitivity of the climate to increasing atmospheric concentrations.

This sensitivity may have been grossly overestimated by Svante Arrhenius more than 120 years ago, with these overestimations persisting in the computer-simulation models that underpin modern climate science [2]. We just don’t know, in part because the key experiments have never been undertaken [2].

What I do have are whizz-bang gaming computers that can run artificial neural networks (ANN), which are a form of machine learning: think big data and artificial intelligence.

My colleague, Dr John Abbot, has been using this technology for over a decade to forecast the likely direction of particular stocks on the share market – for tomorrow.

Since 2011, I’ve been working with him to use this same technology for rainfall forecasting – for the next month and season [4,5,6]. And we now have a bunch of papers in international climate science journals on the application of this technique, showing it is more skilful than the Australian Bureau of Meteorology’s General Circulation Models for forecasting monthly rainfall.

During the past year, we’ve extended this work to build models to forecast what temperatures would have been in the absence of human emissions of carbon dioxide – for the last hundred years.

We figured that if we could apply the latest data mining techniques to mimic natural cycles of warming and cooling – specifically, to forecast twentieth-century temperatures in the absence of an industrial revolution – then the difference between the temperature profile forecast by the models and actual temperatures would give an estimate of the human contribution from industrialisation.

First, we deconstructed a few of the longer temperature records: proxy records that had already been published in the mainstream climate science literature.

These records are based on things like tree rings and coral cores which can provide an indirect measure of past temperatures.  Most of these records show cycles of warming and cooling that fluctuated within a band of approximately 2°C.

For example, there are multiple lines of evidence indicating it was about a degree warmer across western Europe during a period known as the Medieval Warm Period (MWP).  Indeed, there are oodles of published technical papers based on proxy records that provide a relatively warm temperature profile for this period [7], corresponding with the building of cathedrals across England, and before the Little Ice Age when it was too cold for the inhabitation of Greenland.

I date the MWP from AD 986 when the Vikings settled southern Greenland, until 1234 when a particularly harsh winter took out the last of the olive trees growing in Germany.  I date the end of the Little Ice Age as 1826, when Upernavik in northwest Greenland was again inhabitable – after a period of 592 years.

The modern inhabitation of Upernavik also corresponds with the beginning of the industrial age. For example, it was on 15 September 1830 that the first coal-fired train arrived in Liverpool from Manchester – which some claim as the beginning of the modern era of fast, long-distance, fossil-fuelled transport for the masses.

So, the end of the Little Ice Age corresponds with the beginning of industrialisation.  But did industrialisation cause the global warming?

In our just-published paper in GeoResJ, we make the assumption that an artificial neural network (ANN) trained on proxy temperature data up until 1830 would be able to forecast the combined effect of natural climate cycles through the twentieth century.

We deconstructed six proxy series from different regions, with the Northern Hemisphere composite discussed here. This temperature series begins in 50 AD, ends in the year 2000, and is derived from studies of pollen, lake sediments, stalagmites and boreholes. Typical of most such proxy temperature series, when charted this series zigzags up and down within a band of perhaps 0.4°C on a short time scale of perhaps 60 years. Over the longer, nearly 2,000-year period of the record, it shows a rising trend which peaks in 1200 AD before trending down to 1650 AD, and then rising to about 1980 – then dipping to the year 2000: as shown in Figure 12 of our new paper in GeoResJ.

Proxy temperature record (blue) and ANN projection (orange) based on input from spectral analysis for this Northern Hemisphere multiproxy. The ANN was trained for the period 50 to 1830; test period was 1830 to 2000.

The decline at the end of the record is typical of many such proxy-temperature reconstructions and is known within the technical literature as “the divergence problem”. To be clear, while the thermometer and satellite-based temperature records generally show a temperature increase through the twentieth century, the proxy record, which is used to describe temperature change over the last 2,000 years – a period that predates thermometers and satellites – generally dips from 1980, at least for Northern Hemisphere locations, as shown in Figure 12. This is particularly the case with tree ring records. Rather than address this issue, key climate scientists have been known to graft instrumental temperature series onto the proxy record from 1980 to literally ‘hide the decline’ [8].

Using the proxy record from the Northern Hemisphere composite, we decomposed it through signal analysis and then used the resulting component sine waves as input into an ANN to generate a forecast for the period from 1830 to 2000.
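
The following is a minimal Python sketch of the general shape of that pipeline. It is not the code from the paper: the series is synthetic, and the choice of six spectral components and the network settings are placeholder assumptions for illustration only. It takes the dominant sine and cosine components from a Fourier analysis of the pre-1830 portion of a proxy-like series, uses them as inputs to a small neural network, trains only on the pre-industrial period, and then projects forward to 2000.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
years = np.arange(50, 2001)                       # AD 50 to 2000, annual steps

# Synthetic stand-in for a proxy series: two assumed natural cycles plus noise.
temp = (0.3 * np.sin(2 * np.pi * years / 950)
        + 0.15 * np.sin(2 * np.pi * years / 60)
        + 0.05 * rng.standard_normal(years.size))

train = years <= 1830                             # pre-industrial training period

# Spectral analysis of the training portion only: find the strongest components.
spectrum = np.fft.rfft(temp[train] - temp[train].mean())
freqs = np.fft.rfftfreq(train.sum(), d=1.0)       # cycles per year
strongest = np.argsort(np.abs(spectrum))[-6:]     # six dominant frequencies

# Evaluate a sine/cosine pair for each dominant frequency over the full record;
# these component waves are the ANN's inputs.
X = np.column_stack([f(2 * np.pi * freqs[k] * years)
                     for k in strongest for f in (np.sin, np.cos)])

ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
ann.fit(X[train], temp[train])                    # train on AD 50-1830 only
projection = ann.predict(X)                       # natural-cycles-only projection
```

The point of this design is that the network only ever sees the natural cycles present before industrialisation, so its projection through the twentieth century represents what those cycles alone would be expected to do.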

Figure 13 from our new paper in GeoResJ shows the extent of the match between the proxy-temperature record (blue line) and our ANN forecast (orange dashed line) from 1880 to 2000. Both the proxy record and our ANN forecast (trained on data that predates the Industrial Revolution) show a general increase in temperatures to 1980, and then a decline.

Proxy temperature record (blue) and ANN projection (orange) for a component of the test period, 1880 to 2000.

The average divergence between the proxy temperature record from this Northern Hemisphere composite and the ANN projection for the period 1880 to 2000 is just 0.09 degrees Celsius. This suggests that even if there had been no industrial revolution and burning of fossil fuels, there would still have been some warming through the twentieth century – to at least 1980.
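
Continuing the sketch above – and only as a guess at how such a figure might be computed, since the precise definition used in the paper is not given here – the divergence over the test window can be taken as the mean absolute difference between the proxy series and the ANN projection:

```python
window = years >= 1880                            # the test window discussed above
divergence = np.abs(temp[window] - projection[window]).mean()
print(f"mean divergence 1880-2000: {divergence:.2f} degrees Celsius")
```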

Considering the results from all six geographic regions as reported in our new paper, output from the ANN models suggests that warming from natural climate cycles over the twentieth century would be in the order of 0.6 to 1 °C, depending on the geographical location. The difference between output from the ANN models and the proxy records is at most 0.2 °C; this was the situation for the studies from Switzerland and New Zealand.  So, we suggest that at most, the contribution of industrialisation to warming over the twentieth century would be in the order of 0.2°C.

The Intergovernmental Panel on Climate Change (IPCC) estimates warming of approximately 1°C, but attributes this all to industrialisation.

The IPCC comes up with a very different assessment because it essentially remodels the proxy temperature series before comparing them with output from General Circulation Models. For example, the last IPCC Assessment Report concluded that:

“In the northern hemisphere, 1983-2012 was likely the warmest 30-year period of the last 1,400 years.”

If we go back 1,400 years, we have a period in Europe immediately following the fall of the Roman empire, and predating the MWP.  So, clearly the IPCC denies that the MWP was as warm as current temperatures.

This is the official consensus science: that temperatures were flat for 1,300 years and then suddenly kicked up sometime after 1830, and certainly after 1880 – with no decline at 1980.

To be clear, while mainstream climate science is replete with published proxy temperature studies showing that temperatures have cycled up and down over the last 2,000 years – spiking during the Medieval Warm Period and then again recently to about 1980 as shown in Figure 12 – the official IPCC reconstructions (which underpin the Paris Accord) deny such cycles.  Through this denial, leaders from within this much-revered community can claim that there is something unusual about current temperatures: that we have catastrophic global warming from industrialisation.

In our new paper in GeoResJ, we not only use the latest techniques in big data to show that there would very likely have been significant warming to at least 1980 in the absence of industrialisation, we also calculate an Equilibrium Climate Sensitivity (ECS) of 0.6°C. This is the temperature increase expected from a doubling of carbon dioxide concentrations in the atmosphere. It is an order of magnitude less than estimates from General Circulation Models, but in accordance with values generated from experimental spectroscopic studies and other approaches reported in the scientific literature [9,10,11,12,13,14].

The science is far from settled. In reality, some of the data is ‘problematic’, the underlying physical mechanisms are complex and poorly understood, the literature is voluminous, and new alternative techniques (such as our method using ANNs) can give very different answers from those derived from General Circulation Models and remodelled proxy-temperature series.

Dr Jennifer Marohasy is a Senior Fellow at the Institute of Public Affairs. This was originally published at her blog.

Key References

Scientific publishers Elsevier are making our new paper in GeoResJ available free of charge until 30 September 2017, at this link:

https://authors.elsevier.com/a/1VXfK7tTUKabVA

1. Abbot, J. & Marohasy, J. 2017. The application of machine learning for evaluating anthropogenic versus natural climate change, GeoResJ, Volume 14, Pages 36-46. http://dx.doi.org/10.1016/j.gf.2017.08.001

2. Abbot, J. & Nicol, J. 2017. The Contribution of Carbon Dioxide to Global Warming, in Climate Change: The Facts 2017, Institute of Public Affairs, Melbourne, Editor J. Marohasy, Pages 282-296.

4. Abbot, J. & Marohasy, J. 2017. Skilful rainfall forecasts from artificial neural networks with long duration series and single-month optimisation, Atmospheric Research, Volume 197, Pages 289-299. DOI: 10.1016/j.atmosres.2017.07.01

5. Abbot, J. & Marohasy, J. 2016. Forecasting monthly rainfall in the Western Australian wheat-belt up to 18-months in advance using artificial neural networks, in AI 2016: Advances in Artificial Intelligence, Eds. B.H. Kang & Q. Bai. DOI: 10.1007/978-3-319-50127-7_6

6. Abbot, J. & Marohasy, J. 2012. Application of artificial neural networks to rainfall forecasting in Queensland, Australia, Advances in Atmospheric Sciences, Volume 29, Number 4, Pages 717-730. DOI: 10.1007/s00376-012-1259-9

7. Soon, W. & Baliunas, S. 2003. Proxy climatic and environmental changes of the past 1000 years, Climate Research, Volume 23, Pages 89-110. DOI: 10.3354/cr023089

8. Curry, J. 2011. Hide the Decline, https://judithcurry.com/2011/02/22/hiding-the-decline/

9. Harde, H. 2014. Advanced two-layer climate model for the assessment of global warming by CO2, Open J. Atmospheric Climate Chang., Volume 1, Pages 1-50.

10. Lightfoot, H.D. & Mamer, O.A. 2014. Calculation of Atmospheric Radiative Forcing (Warming Effect) of Carbon Dioxide at any Concentration, Energy and Environment, Volume 25, Pages 1439-1454.

11. Lindzen, R.S. & Choi, Y-S. 2011. On the observational determination of climate sensitivity and its implications, Asia-Pacific Journal of Atmospheric Sciences, Volume 47, Pages 377-390.

12. Specht, E., Redemann, T. & Lorenz, N. 2016. Simplified mathematical model for calculating global warming through anthropogenic CO2, International Journal of Thermal Sciences, Volume 102, Pages 1-8.

13. Laubereau, A. & Iglev, H. 2013. On the direct impact of the CO2 concentration rise to the global warming, Europhysics Letters, Volume 104, Page 29001.

14. Wilson, D.J. & Gea-Banacloche, J. 2012. Simple model to estimate the contribution of atmospheric CO2 to the Earth’s greenhouse effect, American Journal of Physics, Volume 80, Pages 306-315.

Four Steps Needed To Restore Confidence In Bureau’s Handling Of Temperature Data

THE Minister for Environment and Energy, Josh Frydenberg, needs to immediately instigate the following four-step process to restore confidence in the recording and handling of historical temperature data.

Step 1 – Instruct the Bureau to immediately:

1. Lift any limits currently placed on the recording of minimum temperatures;

2. Make publicly available the dates on which limits were first set (e.g. minus 10.0 for Goulburn), and the specific weather stations for which limits were set;

3. Advise whether or not the actual measured temperatures have been stored for the weather stations where limits were set (e.g. Goulburn and Thredbo Top);

4. Make publicly available the stored values, which were not entered into the Australia Data Archive for Meteorology (ADAM) – known more generally as the CDO dataset;

5. Clarify, and document, the specific standard applied in the recording of measurements from the automatic weather station (AWS) equipment including period of the measurement (i.e. 1-second or 10-minute average), checks in place to ensure compliance with the standard, checks in place to monitor and correct any drift, and temperature range over which the equipment gives valid measurements.

Note to chart: Since the installation of an automatic weather station at Thredbo, there has been a reduction in the number of days each year when the temperature has fallen to, or below, minus 10.0 degrees Celsius – from an average of 2.5 days (1966 to 1996) to 1.1 days (1997 to July 2017). As a matter of urgency, the Bureau needs to explain when the limits were placed on the minimum temperature that could be recorded at this, and other, automatic weather stations.

Step 2 – Establish a Parliamentary Enquiry, through the House Energy and Environment Committee, with Terms of Reference that include:

6. When and why the policy of recording actual measurements from weather stations into ADAM was modified through the placement of limits on the lowest temperature that an individual weather station could record;

7. Scrutiny of the methodology used by the Bureau in the remodelling of individual temperature series from ADAM for the creation of ACORN-SAT that is used to report climate change trends;

8. Scrutiny of the complex area weighting system currently applied to each of the individual series used in ACORN-SAT;

9. Clarification of the objectives of ACORN-SAT, specifically to ensure public expectations are consistent with the final product;

10. Clarification as to why statistically-relevant uncertainty values generally increase, rather than decrease, with homogenisation.

Step 3 – Establishment of a formal Red Team*, set up independently of the Bureau, to formally advise the parliamentary committee mentioned in Step 2. In particular, the Red Team might:

11. Act to challenge, where appropriate, the evidence and arguments of the Blue Team (the Bureau);

12. Provide a genuinely open review environment so the parliamentarians (and public) can hear the counter-arguments and evidence, including how homogenisation may have corrupted the official historical temperature record – incorrectly suggesting that every year is hotter than the previous;

13. Suggest lines of argument for the parliamentary committee to consider, and questions to ask.

Step 4 – As a government committed to innovation, instruct the Bureau to consider alternative and more advanced techniques for the storage, quality assurance and reconstruction of historical datasets, in particular:

14. A two-day workshop be held at which the Bureau’s ACORN-SAT team (currently 2.5 people) be exposed to the latest quality assurance techniques and big-data methods – including the application of artificial neural networks for historical temperature reconstructions as an alternative to homogenisation.

In summary – this four-step process must be implemented as a matter of urgency. Incorrect historical temperature data currently underpins the theory of human-caused global warming that has resulted in government policies ostensibly to mitigate further global warming. These policies are costing the Australian economy hundreds of billions of dollars, and forcing up the price of electricity for ordinary Australian families and businesses.

___________
* Red Team versus Blue Team exercises take their name from their military antecedents. The idea is that the Red Team provides evidence critical of the Blue Team’s methodology (i.e. the Bureau’s temperature data handling and recording methods). The concept was originally applied to test force readiness in the military, and has since been applied to test the physical security of sensitive sites like nuclear facilities, and also information security systems.

This first appeared on Dr. Jennifer Marohasy’s blog.

IPA sparks national conversation on criminal justice reform

IPA Research Fellow Andrew Bushnell was all over the media yesterday – on TV, online and across 30 different radio stations – highlighting our latest report from our criminal justice program, which finds that Australians are spending more on criminal justice and getting worse results than most comparable countries, underlining the need for criminal justice reform across the country.

As Andrew explains to ABC News, Australian prisons are the fourth most expensive among OECD nations; for that expense we should be getting bang for our buck, yet we are not.

Listen to the exclusive report on ABC RN below.

Andrew explained to 774 Drive yesterday why criminal justice is important to the IPA, and offered solutions to the high cost and poor results of our prison system.

The report was featured as an exclusive to ABC, see the story online here.

The data shows we are spending more than $100,000 a year per prisoner.

As you can see by the data below, prison rates have skyrocketed since the early 1990s.

Andrew Bushnell also wrote an opinion piece for ABC online: The expensive problem with our prisons: Why spending more doesn’t make us feel safer. Read it here.

Bureau of Inconsistencies: Need for Urgent Independent Inquiry

Minister Josh Frydenberg was told by the Australian Bureau of Meteorology on, or about, Wednesday 5th July 2017 that limits had been placed on how cold temperatures could be recorded across mainland Australia.

This winter we have experienced record low temperatures. But only the keenest weather observers have noticed, because the Bureau has been changing the actual values measured by the automatic weather stations.

In particular, the Minister was told that while the Goulburn weather station accurately measured the local temperature as minus 10.4 at 6.30 am on Sunday 2 July, a smart card reader prevented this value from being recorded as the daily minimum on the Daily Weather Observations page.

The smart card reader had been pre-programmed to round up any value below minus 10 degrees Celsius. So, instead of entering minus 10.4 into the CDO dataset, the value of minus 10.0 was entered for 2nd July.

On 2nd July the value of -10.0 was entered into the CDO dataset, which is meant to be a record of actual temperature measurements at Goulburn. This value, however, represented the rounding-up of -10.4. The value of -10.0 was never actually recorded as the minimum for that day.
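
A minimal sketch of the behaviour described above, using a hypothetical function name: whatever the probe measures, anything below the minus 10.0 floor is entered as minus 10.0, so the measured minus 10.4 never makes it into the record.

```python
def recorded_value(measured_celsius: float, floor_celsius: float = -10.0) -> float:
    """Hypothetical illustration: return the value entered into the record."""
    # Anything below the floor is entered as the floor itself, not as measured.
    return max(measured_celsius, floor_celsius)

print(recorded_value(-10.4))  # entered as -10.0, not the measured -10.4
print(recorded_value(-8.7))   # values above the floor pass through unchanged
```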

This wrong limit of minus 10.0 was confirmed in an email from the Bureau sent to journalist Graham Lloyd, and also Griffith businessman Paul Salvestrin, on 4th July.

This was the advice from the Bureau on 4th July; then, on 28th July, the Bureau wrote to the Minister claiming the weather station was faulty, and that it never recorded minus 10.4 degrees Celsius.

No such limits are placed on how hot temperatures can be recorded.

While the Minister has had this advice – about the smart card readers and the limits on cold temperature recordings – for some weeks, he has claimed publicly that he has full confidence in the Bureau and resisted calls for an independent inquiry. Further, the Minister has supported the Bureau’s faux solution of replacing the automatic weather stations, initially at Goulburn and Thredbo, and more recently at many more sites across Victoria and Tasmania.

All the while, the Minister has known that the problem is limited to the smart card readers.

To be clear, the problem is not with the equipment; all that needs to be done is for the smart card readers to be removed so that after the automatic weather stations measure the correct temperature, this temperature can be brought forward firstly into the Daily Weather Observation sheet and subsequently into the CDO dataset.

Jennifer Marohasy visiting the Goulburn weather station on 31st July

David Jones is the Manager of Climate Monitoring and Prediction services at the Bureau and would probably have overseen the installation of the smart cards.  Jones is also on-record stating that “Truth be known, climate change here is now running so rampant that we don’t need meteorological data to see it.”

This first appeared at Dr Jennifer Marohasy’s blog

Politically correct: the new A+

I recently completed the final semester of my undergraduate degree at a relatively reputable North American university. To fulfil my philosophy minor, I was required to enrol in a certain number of courses offered by that department. And so, in January of this year, I found myself signed up for one in particular named ‘Critical Perspectives on Social Diversity’. An immediate red flag. After four years as a student at this institution, I had already experienced more than my fair share of political correctness. But now I saw myself faced with 12 weeks of sharing a classroom with its most aggressive campaigners.

And yet, I tried to remain optimistic. As a philosophy course (rather than a political science or gender studies course) I hoped that personal biases would be set aside. As a philosophy course, I assumed that discussions would focus on the validity and soundness of argument structure. I was naïve.

The assessment for this course included two equally-weighted ‘comment sheets’ that together comprised about 40% of the total grade. In each ‘comment sheet’ we were asked to critique a paper. The first time around, I wrote honestly. My critique concluded that certain aspects of the author’s proposal to give special rights to those suffering from “enduring injustices” were ambiguous and contradictory. Some sections even made enormous logical leaps, leaving large gaps in the overall flow of the argument. Guessing that I would be one of the only students to say anything negative about the paper, I dedicated an inordinate amount of time towards this task, prioritizing it above other pieces of assessment. I received a B+.

For experimental purposes, the second time around I aligned my opinion with that of my classmates. Despite the second paper suffering from failings similar to those of the first, I instead concluded that it was faultless. Through gritted teeth, I endorsed the author’s stance that testimonies of systematic sexism within minority cultures were “misleading and unrepresentative extrapolations”. I agreed that “cultural freedom” was more important than establishing a system of uniform rights for all. Compared to my first attempt, I dedicated only a fraction of the time and effort towards the task. Unquestionably my writing was of a lesser quality. I received an A+.

Now, I know that my experience in no way parallels the extremity of the atrocities recently carried out at Evergreen State College. Unlike Professor Bret Weinstein, I was neither mobbed nor sworn at for defending my opinion. However, I still experienced first-hand that, as a student, regardless of the quality of your writing, unless you choose to adopt the approved viewpoint you will never be ‘given an A’. And I believe that this is still representative of the increasingly-Orwellian nature of modern academia. Unfortunately, we are no longer being taught to think for ourselves but to regurgitate the rhetoric of our peers. And consequently, in a perverse way, all these attempts to ‘give the minority a voice’ are starting to silence the majority… simply because they are the majority.

Despite what was written on my syllabus, ‘Critical Perspectives on Social Diversity’ was not a course about arguments but was one about conclusions.

And so, fellow students, we are left with a decision – are we willing to sell our souls for the extra GPA points?

Bronwyn Allan is an intern at the Institute of Public Affairs