*This is Part 2 of a sequence of posts. You can find Part 1 here and Part 3 here.*

Yesterday’s post on the PISA ‘Ten questions…’ report left me with a nagging worry. I had plotted the PISA mean maths scores against the “index of memorisation” rather than the “average percentage across 4 questions” measure that I thought PISA had used in the graph that went around the internet (I now no longer think they used this*). It didn’t seem likely, but perhaps there was more of a correlation with this latter measure.

Nope.

The R-squared for the “index of memorisation” was actually marginally higher, at 0.0095, than for the average-percentage measure.
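For anyone wanting to repeat this kind of check, the calculation itself is straightforward. Here is a minimal Python sketch; the country values below are made up for illustration and are not the actual PISA figures:

```python
import numpy as np

# Illustrative numbers only -- stand-ins for each country's PISA mean
# maths score and its "index of memorisation" (NOT the real data).
mean_maths = np.array([613.0, 554.0, 511.0, 494.0, 482.0, 453.0, 421.0])
memorisation = np.array([-0.3, 0.1, 0.4, -0.1, 0.2, 0.5, 0.0])

# The R-squared of a simple linear fit is the squared Pearson correlation.
r = np.corrcoef(mean_maths, memorisation)[0, 1]
r_squared = r ** 2
print(f"R-squared = {r_squared:.4f}")
```

An R-squared of around 0.01, as reported above, would mean the index accounts for roughly 1% of the variation in mean maths scores.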

The data set containing the average-percentage measure also included a number of other measures mentioned in the report, so I thought I would run these too. The first was the “average percentage across five questions” measure of whether teaching is “teacher-directed”. This seems to show a real, negative correlation, with some interesting anomalies:

I haven’t examined the questions used to establish this construct (or the others below) in the way that I have done for memorisation, so I am not sure exactly what this means.

However, one issue stands out as very odd indeed. PISA suggest that the opposite of teacher-directed teaching is a “student-orientation”, but look what happens if you plot that:

This is *by far* the largest correlation that I have found. And yet it doesn’t seem to get much of a mention in the PISA report. It is extraordinary that the authors highlight memorisation and teacher-directedness when this elephant is occupying the parlour, stamping its feet and trumpeting the tune of La Marseillaise.

And look at Ireland. It’s a massive outlier in terms of student-orientation but just a little below average on teacher-direction. So what exactly is happening in Irish maths classes? Who is in charge? What do they do?

For completeness, below is a graph of the other measure available – the use of elaboration:

So it seems that a higher score on any of the constructs – except for memorisation – is a bad thing, with a higher score on student-orientation definitely being the worst.
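The same comparison can be run across all four constructs at once. Again, this is a sketch with made-up values – the construct names follow the report, but the numbers are purely illustrative:

```python
import numpy as np

# Illustrative numbers only -- stand-ins for each country's PISA mean
# maths score and its score on each teaching/learning construct.
mean_maths = np.array([613.0, 554.0, 511.0, 494.0, 482.0, 453.0, 421.0])
constructs = {
    "memorisation":        np.array([-0.3, 0.1, 0.4, -0.1, 0.2, 0.5, 0.0]),
    "teacher-directed":    np.array([0.2, 0.0, -0.1, 0.3, 0.1, 0.4, 0.5]),
    "student-orientation": np.array([-0.4, -0.2, 0.1, 0.0, 0.3, 0.5, 0.6]),
    "elaboration":         np.array([0.1, 0.2, 0.0, 0.3, -0.1, 0.4, 0.2]),
}

# Correlate each construct with the mean maths score, reporting both the
# sign of the relationship and the R-squared.
r_squared = {}
for name, values in constructs.items():
    r = np.corrcoef(mean_maths, values)[0, 1]
    r_squared[name] = r ** 2
    print(f"{name}: r = {r:+.2f}, R-squared = {r ** 2:.2f}")
```

Ranking the constructs by R-squared, and checking the sign of r, is all that is needed to see which relationships are negative and which is the strongest.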

*Note: The data is available here.*

**I now think they have used the ratio of teacher-directed to student-orientation. Some axis labels would be really helpful. This ratio is plotted in Part 3.*



On Ireland: they call it “Project Maths”. I’m a little fuzzy on the timing, but it seems to me that it came in after the point where it would have affected the 2012 PISA results. However, someone from Ireland might have more information on that point.

Starting to wonder if, beyond the mean score, the rest of the data is pretty useless. Most of what is being put on the x-axis requires an understanding of pedagogy and metacognition by students that most schools don’t teach. It looks like you are trying to find the link between the number of pirates and global warming.

Would it not be better to analyse the nature of the tests, and whether studying for, and constantly doing, such tests is producing good mathematicians or interested students, or reducing a beautiful craft to its most rote and dull aspects?

I am just responding to, and remarking upon, PISA’s analysis.


“this elephant is occupying the parlour, stamping its feet and trumpeting the tune of La Marseillaise.”

—

Vive la révolution? 😉

Interesting, good work, and yet another example to suggest that one of the more important C-words for living and working in the 21st century is “cynicism”.
