Taking the data seriously


I have had some interesting discussions since my last post on classroom disruption, and I have done a little more research since then.

Let’s start with the research. It comes from international surveys of teachers and students, and I find it interesting for two reasons. Firstly, it should represent a random sample of students of a particular age in each country (although we should perhaps bear in mind allegations that some countries manipulate the sampling). Secondly, such surveys provide a comparison, so that we can make inferences about what we should expect; what is typical. It is important to note that this kind of research is descriptive: it does not seek to identify cause and effect, because a survey would not be an appropriate method for doing so.

The first set of data comes from the Programme for International Student Assessment (PISA), run by the OECD, and it is the data that I shared in my last post. Based upon a survey of 15-year-old students, PISA has generated a worrying table using an ‘index of disciplinary climate in science classes’. The table is constructed from responses to survey questions in which students indicate how much they agree with statements such as ‘there is noise and disorder’ in their science classes. Australia sits near the bottom:

[Figure: Index of disciplinary climate with Australia highlighted]

It also doesn’t seem that this ranking is based purely on low-level disruption because, in Australia, there is a clear link between the index and incidents of bullying:

[Figure: Exposure to bullying and school’s disciplinary climate with Australia highlighted]

Bullying is also highlighted as an issue in the 2015 Trends in International Mathematics and Science Study (TIMSS). This time, students were surveyed in Grade 4 and Grade 8. You can find the relevant charts here and here but I’m not sure about copyright so I won’t reproduce them. Instead, I have made my own graph that seems to show a greater incidence of persistent bullying in Australia than is the general trend across the TIMSS countries:

[Figure: Incidence of reported bullying based on TIMSS 2015 data]

It is possible that this reflects different interpretations of what constitutes bullying in different cultures. Support for this view comes from the fact that New Zealand has figures that are close to, and slightly worse than, Australia. However, culturally similar England has figures that approach the TIMSS average and so I am inclined to think that a real effect is being captured and that New Zealand and Australia have similar bullying problems. The rest of the TIMSS data on behaviour doesn’t stand out much from the TIMSS average (if I’m reading it right).

Finally, we can try to triangulate the PISA and TIMSS data with the OECD’s Teaching and Learning International Survey (TALIS), the last of which took place in 2013. The stand-out finding for Australia was that:

“Almost 10% of Australian teachers work in schools where intimidation or verbal abuse of teachers and staff by students occurs on a weekly basis, and over a quarter work at schools where verbal abuse amongst students occurs frequently. This is considerably higher than the TALIS averages of 3.4% and 16% respectively.”

The Centre for Independent Studies (CIS) referred to the PISA and TIMSS data in its recent submission to the Review to Achieve Educational Excellence in Australian Schools, colloquially known as the ‘Gonski 2.0’ review, and it did so in order to support its call for better classroom management training for teachers. Dr Linda Graham dismissed this call, stating that, “The CIS has provided no credible evidence to support the claim that Australia has high levels of classroom misbehaviour.”

It is hard to understand why this data might not be considered credible evidence. On Twitter, Dr David Zyngier suggested that this was because this evidence is ‘anecdotal’ and based upon self-reports. Zyngier also suggested that teachers have always complained about the behaviour of students, citing a quote that turns out to have been misattributed to Plato.

Nevertheless, the objection to self-report data makes some sense. Relying on students’ and teachers’ responses to surveys is not completely reliable because people are prone to bias. However, it is hard to see why Australian students and teachers would be biased towards exaggerating misbehaviour in a way that students and teachers in other countries are not.

Ideally, we would objectively observe classrooms rather than rely on second-hand accounts. However, once we try to arrange observations, we need to select classrooms to observe. We are highly unlikely to obtain anything close to a random sample of classrooms in Australia, because only some schools, teachers and parents will be prepared to be involved in the research, and the reasons for this will interact directly with some of the factors we are trying to measure. So the PISA, TIMSS and TALIS data at least have the advantage of better sampling.
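To see why this selection problem matters, here is a toy simulation (entirely hypothetical numbers, not real data): if classrooms with more disruption are less willing to be observed, the average disruption we would measure from volunteers understates the true average.

```python
import random

random.seed(0)

# Hypothetical example: 1,000 classrooms, each with a "disruption" score
# drawn uniformly from 0-10 (higher = more disruptive).
classrooms = [random.uniform(0, 10) for _ in range(1000)]

# Assumed selection effect: the chance that a classroom agrees to be
# observed falls as its disruption score rises.
observed = [score for score in classrooms if random.random() < (1 - score / 10)]

true_mean = sum(classrooms) / len(classrooms)
observed_mean = sum(observed) / len(observed)

print(f"True mean disruption:     {true_mean:.2f}")
print(f"Observed mean disruption: {observed_mean:.2f}")
```

Under these assumptions, the observed mean comes out well below the true mean, which is the kind of systematic distortion that a properly drawn random sample avoids.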

I’m not even sure that the objection to self-reporting is a consistent one among those who seek to dismiss this evidence. For instance, just a few days ago, David Zyngier tweeted approvingly about a piece of research based upon self-reports.

Similarly, this piece of research about inclusion is based upon a survey of families and school staff in Australia. In fact, self-reports seem to be a mainstay of such studies and this can have powerful effects. In this case, I wonder how a disabled student or his or her parent might respond to a question about whether they have experienced, “Use of restrictive practices to manage behavioural challenges”? Such a question would seem to invite quite subjective responses given that pretty much any school will impose some kind of restrictions upon students’ behaviour, and this may be especially noticeable if a child’s disability is something like Oppositional Defiant Disorder. How should we then interpret the responses to this question? How should we then make sense of a claim of a high incidence of ‘restrictive practices’ that is based, in part, upon responses to this question? I’m not sure, but at least this demonstrates that problems associated with the use of self-reports are not restricted to data from international surveys such as PISA, TIMSS and TALIS.

If students taking surveys are prone to bias then so are academics. One of these biases is confirmation bias: the tendency to quickly dismiss evidence that you don’t like and readily accept evidence that you approve of. I wonder if this is what is going on here: self-reports are bad one minute but absolutely fine the next. Perhaps the data about behaviour in Australian schools is something that many academics simply don’t want to talk about, because it conflicts with their beliefs or because it draws discussion towards an agenda at odds with their own. Through no conscious intent, they may be ignoring something significant.

And I believe that it is significant because every child matters, even the quiet ones who don’t demand attention. No child deserves to have their learning disrupted by noise and disorder in the classroom. No child should have an education that is degraded because their school has a discipline problem. No child should live with the anxiety of unpredictable, dangerous or violent behaviour from their peers: children won’t learn maths very well if they’re scared. And no child should wake in the dead of night, tossing and turning about going to school the next day because of the bullies and what they might do. School should be a joyous place where learning is exhilarating, celebratory and, above all, safe. That’s why we should take this data seriously.


5 thoughts on “Taking the data seriously”

  1. I have trouble with the validity of the PISA data. I think the cultural bias in students responding to questions about discipline and bullying could be very significant. Also, from a statistical point of view, I wonder what the spread of the survey data is for the different countries and within each country. How big is the difference from top to bottom in real numbers? I often wish I knew more about statistics to be able to investigate such things.

    • The cultural argument is dealt with in the article, and you don’t acknowledge that, so it’s hard to take that point seriously.

      Your other point, absolutely! We often see meaning in numbers without any idea of the relevant standard deviation for the topic we’re looking at.

  2. Pingback: Orderly classrooms benefit the most disadvantaged children | Filling the pail

  3. Pingback: I see no behaviour crisis | Filling the pail
