PISA data gets curiouser and curiouser

This is Part 2 of a sequence of posts. You can find Part 1 here and Part 3 here.

Yesterday’s post on the PISA ‘Ten questions…’ report left me with a nagging worry. I had plotted the PISA mean maths scores against the “index of memorisation” rather than the “average percentage across 4 questions” measure that I thought PISA had used in the graph that went around the internet (I no longer think they used this*). It didn’t seem likely, but perhaps there was more of a correlation if we used this latter measure instead.

[Figure: PISA mean maths score against memorisation average]

Nope.

The R-squared for “index of memorisation” was actually marginally higher at 0.0095.
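For readers who want to reproduce this kind of check, here is a minimal sketch of computing R-squared for a fitted line. This is not the author's actual script, and the numbers below are invented placeholder values, not real PISA data; only the method (a least-squares line and the ratio of explained variance) is what the post describes.

```python
# Sketch of an R-squared calculation for a scatter of country-level data.
# The values here are made-up placeholders, not real PISA figures.
import numpy as np

mean_maths = np.array([494.0, 511.0, 523.0, 485.0, 501.0])
memorisation_index = np.array([0.21, 0.35, 0.18, 0.40, 0.27])

# Fit a straight line (degree-1 polynomial) by least squares.
slope, intercept = np.polyfit(memorisation_index, mean_maths, 1)
predicted = slope * memorisation_index + intercept

# R-squared = 1 - (residual sum of squares / total sum of squares).
ss_res = np.sum((mean_maths - predicted) ** 2)
ss_tot = np.sum((mean_maths - np.mean(mean_maths)) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 4))
```

For a simple linear fit like this, the result equals the square of the Pearson correlation coefficient, which is why a value like 0.0095 indicates essentially no linear relationship.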

The data set that contained the average percentage measure also contained a number of other measures that were mentioned in the report. So I thought I would run these too. The first was the “average percentage across five questions” measure of whether teaching is “teacher-directed”. This seems to show a real, negative correlation with some interesting anomalies:

[Figure: PISA mean maths score against teacher-directed average]

I haven’t examined the questions that are used to establish this construct (or the others below) in the way that I have done for memorisation, so I am not sure exactly what this means.

However, one issue stands out as very odd indeed. PISA suggest that the opposite of teacher-directed is a “student-orientation”, but look what happens if you plot that:

[Figure: PISA mean maths score against student-orientation average]

This is by far the largest correlation that I have found. And yet it doesn’t seem to get much of a mention in the PISA report. It is extraordinary that the authors highlight memorisation and teacher-directedness when this elephant is occupying the parlour, stamping its feet and trumpeting the tune of La Marseillaise.

And look at Ireland. It’s a massive outlier in terms of student-orientation but just a little below average on teacher-direction. So what exactly is happening in Irish maths classes? Who is in charge? What do they do?

For completeness, below is a graph of the other measure available – the use of elaboration:

[Figure: PISA mean maths score against elaboration average]

So it seems that a higher score in any of the constructs – except for memorisation – is a bad thing, with a higher score on student-orientation clearly the worst.
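The comparison above amounts to computing a signed correlation for each construct and looking at both its sign and its size. A hypothetical sketch of that loop is below; the data values are invented for illustration and only the method mirrors the post's comparison.

```python
# Compare the sign and strength of each construct's correlation with
# mean maths score. All values below are invented placeholders.
import numpy as np

mean_maths = np.array([494.0, 511.0, 523.0, 485.0, 501.0, 478.0])
measures = {
    "memorisation": np.array([21.0, 35.0, 18.0, 40.0, 27.0, 33.0]),
    "teacher-directed": np.array([52.0, 44.0, 39.0, 60.0, 48.0, 63.0]),
    "student-orientation": np.array([30.0, 22.0, 17.0, 41.0, 25.0, 44.0]),
    "elaboration": np.array([45.0, 38.0, 36.0, 50.0, 41.0, 52.0]),
}

for name, values in measures.items():
    r = np.corrcoef(values, mean_maths)[0, 1]  # Pearson correlation
    print(f"{name}: r = {r:.3f}, r^2 = {r**2:.3f}")
```

The sign of r is what R-squared alone hides: two constructs can have similar R-squared values while one correlates positively and the other negatively with performance.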

Note: The data is available here.

*I now think they have used the ratio of teacher-directed to student-orientation. Some axis labels would be really helpful. This ratio is plotted in Part 3.



8 Comments on “PISA data gets curiouser and curiouser”

  1. On Ireland: they call it “Project Maths”. I’m a little fuzzy on the timing, but it seems to me that it came in after the point where it would have affected the 2012 PISA results. However, someone from Ireland might have more information on that point.

  2. Iain Murphy says:

    Starting to wonder if, beyond the mean score, the rest of the data is pretty useless. Most of what is being put on the x-axis requires an understanding of pedagogy and metacognition by students that most schools don’t teach. It looks like you are trying to find the link between pirate numbers and global warming.

    Would it not be better to analyse the nature of the tests and whether studying for and constantly doing such tests is producing good mathematicians or interested students, or reducing a beautiful craft to its most rote and dull aspects?

  3. […] PISA data gets curiouser and curiouser New teachers’ book list […]

  4. […] PISA data gets curiouser and curiouser → […]

  5. Pique Boo says:

    “this elephant is occupying the parlour, stamping its feet and trumpeting the tune of La Marseillaise.”

    Vive la révolution? 😉

    Interesting, good work, and yet another example to suggest that one of the more important C-words for living and working in the 21st century is “cynicism”.

  6. […] across countries. I found a weak, negative correlation between both of these and maths performance (here and here). Caro et. al. decided to try to fit curves to these relationships rather than straight […]

  7. […] demonstrated (although this seems to have been buried by the good people of OECD Education) that inquiry-based methods were equally ‘problematic’ in mathematics teaching.  But again, no one in the world of Irish education seemed […]

