PISA data contains a positive correlation

This is Part 3 of a sequence of posts. You can find Part 1 here and Part 2 here

I was prompted to start this investigation of the recent PISA “Ten questions…” report by the graph below:


Specifically, I wondered what had been plotted on the axes for memorisation and teacher-directed instruction so that I could compare them with PISA 2012 maths scores. It would have been really helpful if PISA had labelled the axes in reports like this – labelling axes is one of the first things we teach children to do in school science.

I first plotted PISA’s “index of memorisation” against 2012 mean maths scores, but the ranking was different to the y-axis above. I then tried plotting a different measure from the report data based upon the percentages of memorisation-type responses to the questions that PISA asked (although we must treat this construct with caution – see my first post). Yet again, this did not reproduce the ranking. I have now figured out that PISA used a ratio: they plotted the percentage of memorisation-type responses to questions divided by the percentage of elaboration-type responses. This explains why the axis runs from “more memorisation” to “more elaboration”.
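The calculation described above can be sketched in a few lines. Note that the country names and percentages below are invented for illustration – the real figures come from the PISA 2012 report data linked at the end of this post:

```python
# Hypothetical example data: percentages of memorisation-type and
# elaboration-type responses per country, plus mean maths scores.
# These numbers are made up for illustration only.
mem_pct  = {"A": 30.0, "B": 22.0, "C": 18.0, "D": 35.0}   # % memorisation-type responses
elab_pct = {"A": 15.0, "B": 20.0, "C": 25.0, "D": 12.0}   # % elaboration-type responses
maths    = {"A": 480.0, "B": 510.0, "C": 530.0, "D": 470.0}  # PISA 2012 mean maths score

countries = sorted(mem_pct)

# The ratio PISA appears to have plotted: memorisation / elaboration.
ratio = [mem_pct[c] / elab_pct[c] for c in countries]
score = [maths[c] for c in countries]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ratio, score)
```

Plotting `ratio` against `score` (e.g. as a scatter plot) reproduces the kind of chart discussed here; `r` gives a single-number summary of how strong the association is.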

If you plot this ratio against PISA 2012 mean maths score there is, again, little correlation:


The x-axis on the PISA graph is similarly a ratio of teacher-directed responses to student-orientation responses. This is where we find our first positive correlation with PISA 2012 maths scores:


So there you have it. Correlations do not necessarily imply causal relationships, but a higher ratio of teacher-directed activity to student orientation is clearly associated with better PISA maths performance. Again, it would be necessary to look at what these constructs actually measure.

Here it is with the PISA axis superimposed on it:


The PISA report does nothing to highlight this relationship.

Note: You can find all of the data for the three posts here.



12 thoughts on “PISA data contains a positive correlation”

  1. Pingback: PISA data gets curiouser and curiouser | Filling the pail

  2. Pingback: PISA data on maths memorisation | Filling the pail

  3. David says:

    Hi Greg – so do you think OECD played games with the data to produce a result they wanted, or that they just mishandled what they were looking at? I guess I wouldn’t be surprised by the former, given Andreas Schleicher’s exasperation over the adoption of ed tech…

  4. Pingback: PISA Mathematics Lessons: Why Zero-In on “Memorization” and Minimize Teacher-Directed Instruction? | Educhatter's Blog

  5. I am always curious to know what sensible people mean when they repeat the aphorism “correlation does not imply causation”. Do you mean (which is what you are saying) that a correlation between A and B does not imply that A causes B OR that B causes A OR that C causes both A and B?

    • I mean that this doesn’t show that being more teacher-directed *causes* a higher PISA mean score. Why?

      – the construct could be flawed and not actually measuring teacher-directedness

      – a third factor could cause countries both to do well on PISA and to score highly on teacher-directedness. If so, increasing teacher-directedness without increasing this third factor would not improve PISA results

  6. I appreciate that your caveat is clear, but I still think this is a bit dubious. Can I suggest you plot separately the Asian jurisdictions (such as the ones where 2nd generation performance in Australian schools suggested there may be a major cultural advantage), developed nations, and emerging economies? I think if you do that, it might suggest that a major change, e.g. in the UK to match Ireland’s teacher orientation, would produce a major shift on the x-axis but not on the y-axis. Another headline might be “UK matches balance of most high-performing jurisdictions” (and most commentators are pretty clear that Shanghai is skewed by a massive element of selection on SES). I’m very much in favour of lessons being, mostly, teacher-led, and I think the evidence is quite strong, but I don’t think this correlation adds to that. Fine job untangling the data, though. I think for the UK (and Australia), maybe that top graph is more informative. It looks like we are well out of step with the consensus on the appropriate proportion of memorisation required. I certainly feel that about the GCSE Science curriculum.

    • Actually, I’ve just read other parts of the blog. If the top graph’s validity is dubious then my comment on that bit may be garbage, although the wording of the questions that were asked did immediately make me think of Y11 interventions rather than embedded high-quality practice (although obviously PISA children won’t have been subject to Y11 yet).

  7. Pingback: How Andreas Schleicher Learned to Stop Worrying and Love Teacher-Directed Instruction – Filling the pail
