PISA proves itself wrong… again.

A few days ago, I noted PISA’s definition of good teaching (thanks again to @cbokhove for the link to the working paper). Here’s a reminder:

“In its Analytical Framework (OECD, 2013), PISA defines the three dimensions of good teaching as: clear, well-structured classroom management; supportive, student-oriented classroom climate; and cognitive activation with challenging content (Klieme et al, 2009; Baumert et al, 2010; Lipowsky et al, 2009; Kunter et al 2008).”

I then went on to detail how PISA’s own measure of student-orientation shows that the greater the level of student-orientation, the worse a country’s PISA maths results. I also showed that greater student-orientation coincides with more low achievers and fewer high achievers.

It was the PISA “Ten questions…” report that prompted my initial investigation, and one chapter of it deals with the ‘cognitive activation’ that is also mentioned in PISA’s definition of good teaching. The chapter suggests that cognitive activation is a good thing. And well it might be. After all, ‘cognitive activation’ is just a fancy way of saying that the students are thinking, and I would propose that most people would view this as important.

However, the report makes a couple of claims that I can’t verify. It states that, within each country, the students who had greater exposure to cognitive activation strategies did better on PISA. Yet this data isn’t supplied – we can only see the average percentage response to each of the cognitive activation questions for each country. It also suggests that cognitive activation leads to better mean results ‘after accounting for other strategies’. Presumably, this is because heavy use of student-oriented strategies negatively impacts performance, and so you have to strip this out to see what effect cognitive activation has.

Again, I am worried about how PISA have tried to measure cognitive activation:

Items considered in the index of cognitive-activation instruction (PISA):

Thinking about the mathematics teacher who taught your last mathematics class: How often does each of the following happen?
• The teacher asks questions that make us reflect on the problem.
• The teacher gives problems that require us to think for an extended time.
• The teacher asks us to decide on our own procedures for solving complex problems.
• The teacher presents problems for which there is no immediately obvious method of solution.
• The teacher presents problems in different contexts so that students know whether they have understood the concepts.
• The teacher helps us to learn from mistakes we have made.
• The teacher asks us to explain how we have solved a problem.
• The teacher presents problems that require students to apply what they have learned to new contexts.
• The teacher gives problems that can be solved in several different ways.

For the index of cognitive-activation instruction, available responses were: “Always or almost always”, “Often”, “Sometimes” and “Never or rarely”. When answering these questions, students could give positive responses to as many of these teaching practices as they considered correct (no restriction on the number of possible responses).

These questions are clearly inspired by constructivism but I doubt they lead to a tidy response. For instance, the statement that, “The teacher presents problems for which there is no immediately obvious method of solution,” might be selected because it is a deliberate teaching tactic or because the teaching is so poor the student has no idea what is going on.

If you look at the responses to these individual questions, there is an issue: they don’t actually correlate very closely with each other. This is surprising, given that they are meant to measure the same construct. For instance, here’s the correlation between questions 1 and 2.


If we examine the average percentage response to these questions and map that against the PISA 2012 mean maths score, we again find a negative correlation. So greater use of these strategies is associated with poorer PISA performance:
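To make the comparison concrete, a country-level correlation of this kind can be sketched in a few lines of Python. The usage percentages and maths scores below are invented purely for illustration – they are not actual PISA 2012 figures – but they show the shape of the calculation:

```python
# Hypothetical illustration: correlating country-level use of
# "cognitive activation" strategies with mean PISA maths scores.
# The numbers below are made up for demonstration; they are NOT
# actual PISA 2012 figures.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# One pair per country: (average % positive response, mean maths score).
usage = [45, 50, 55, 60, 65, 70]
scores = [540, 530, 510, 500, 485, 470]

print(round(pearson_r(usage, scores), 3))  # → -0.997
```

A value near -1 is what a strong negative association looks like; the real PISA country-level data in my sheet is noisier, but the sign of the correlation is the point at issue.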


The difference between my result and PISA’s analysis might be explained if cognitive activation strategies are associated with a greater level of student-orientation, and it is this that PISA are stripping out.

Note: You can find my updated data sheet here


6 thoughts on “PISA proves itself wrong… again.”

  1. Chester Draws says:

    These questions are clearly inspired by constructivism

    I’m less certain of that.

    I see questions that are trying to distinguish between teachers who teach rote patterns for solving problems against those that teach a range of techniques and let the students choose which to use.

    I would consider all those things to be some part of how I teach — though I generally only ask how students attacked a question if they get stuck halfway through.

    Where I differ from Constructivists is that I would only give long and tricky questions after I’d taught all the theory and methods very thoroughly first. Those 10 PISA questions aren’t really focusing on that at all (plus nothing about estimation, guessing and checking etc that the progressives seem to love, but which explicit instruction types generally rank very lowly as strategies). I give questions in a range of contexts as I teach a new skill so that the students don’t get bogged down on only seeing it applied one way, precisely because I am not Constructivist. I want them to see clearly how it applies in a range of situations, not let them discover that for themselves.

    However I don’t get into alternative methods of solving problems with students who struggle, because teaching alternative methods to someone who can’t reliably solve it any way is rather pointless. So perhaps really full on Constructivists would have their students answer those PISA questions differently, because they would give long open ended questions even to their strugglers.

    (In my experience slow Maths students almost always prefer teachers who give one simple method to solve problems — so I am student oriented in giving those kids what they want?)

    • Iain Murphy says:

      Agree with you, Chester.

Can see Greg’s thinking, as some of the questions are trending in this direction (“the teacher asks us to decide on our own procedures for solving complex problems”), but many others address a real need in any classroom. Too many teachers ignore mistakes in the rush to get through content; the constructivist “allow them to find their own solution” and the explicit “they aren’t using the method taught them” are both flawed for many students.

My worry is that PISA seems to be assessing students’ metacognition and trying to draw comparisons across countries, which just seems doomed to failure, or results in convenient findings like the one they posted. I think Greg’s graph of Q1 and Q2 shows that the question results are going to be a scattergun of results.

  2. Pingback: PISA evidence for project-based learning in maths | Filling the pail

  3. Pingback: Inquiry learning – Filling the pail

  4. Pingback: The battle is over curriculum and assessment – Filling the pail
