PISA proves itself wrong… again.

Posted: October 23, 2016
In a previous post, I quoted PISA’s definition of good teaching:

“In its Analytical Framework (OECD, 2013), PISA defines the three dimensions of good teaching as: clear, well-structured classroom management; supportive, student-oriented classroom climate; and cognitive activation with challenging content (Klieme et al, 2009; Baumert et al, 2010; Lipowsky et al, 2009; Kunter et al 2008).”
I then went on to detail how PISA’s own measure of student-orientation shows that the greater the level of student-orientation, the worse a country’s PISA maths results. I also showed that greater student-orientation coincides with a greater number of low achievers and fewer high achievers.
It was the PISA “Ten questions…” report that prompted my initial investigation, and one chapter of this report is about the ‘cognitive activation’ that is also mentioned in PISA’s definition of good teaching. The chapter suggests that cognitive activation is a good thing. And well it might be. After all, ‘cognitive activation’ is just a fancy way of saying that the students are thinking, and I would suggest that most people view this as important.
However, the report makes a couple of claims that I can’t verify. It states that the students who had greater exposure to cognitive activation strategies within each country did better on PISA. Yet this data isn’t supplied – we can only see the average percentage response to each of the cognitive activation questions for each country. It also suggests that cognitive activation leads to better mean results ‘after accounting for other strategies’. Presumably, this is because a heavy use of student-oriented strategies negatively impacts on performance. So you have to strip this out to see what effect cognitive activation has.
Again, I am worried about how PISA have tried to measure cognitive activation:
Items considered in the index of cognitive-activation instruction (PISA):
Thinking about the mathematics teacher who taught your last mathematics class: How often does each of the following happen?
• The teacher asks questions that make us reflect on the problem.
• The teacher gives problems that require us to think for an extended time.
• The teacher asks us to decide on our own procedures for solving complex problems.
• The teacher presents problems for which there is no immediately obvious method of solution.
• The teacher presents problems in different contexts so that students know whether they have understood the concepts.
• The teacher helps us to learn from mistakes we have made.
• The teacher asks us to explain how we have solved a problem.
• The teacher presents problems that require students to apply what they have learned to new contexts.
• The teacher gives problems that can be solved in several different ways.
For the index of cognitive-activation instruction, available responses were: “Always or almost always”, “Often”, “Sometimes” and “Never or rarely”. When answering these questions, students could give positive responses to as many of these teaching practices as they considered correct (no restriction on the number of possible responses).
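To make the idea of an “index” concrete, you could imagine coding the four response options numerically and averaging across the nine items. PISA’s actual indices are built with more sophisticated scaling, so the sketch below, with its invented response codes and hypothetical student, is only an illustration of the general idea:

```python
# Naive sketch of turning the four response options into a per-student index.
# PISA's real indices use more sophisticated scaling; the 0-3 coding here
# is an assumption made purely for illustration.
RESPONSE_CODES = {
    "Never or rarely": 0,
    "Sometimes": 1,
    "Often": 2,
    "Always or almost always": 3,
}

def naive_index(responses):
    """Average coded response across the nine questionnaire items."""
    codes = [RESPONSE_CODES[r] for r in responses]
    return sum(codes) / len(codes)

# A hypothetical student's answers to the nine items above.
student = ["Often", "Sometimes", "Always or almost always",
           "Never or rarely", "Often", "Sometimes",
           "Often", "Always or almost always", "Sometimes"]
print(naive_index(student))
```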
These questions are clearly inspired by constructivism but I doubt they lead to a tidy response. For instance, the statement that, “The teacher presents problems for which there is no immediately obvious method of solution,” might be selected because it is a deliberate teaching tactic or because the teaching is so poor the student has no idea what is going on.
If you look at the responses to these individual questions, I think there is a further issue: they don’t actually correlate very closely with each other. This is surprising, given that they are meant to be measuring the same construct. For instance, here’s the correlation between questions 1 and 2.
If we examine the average percentage response to these questions and map that against PISA 2012 mean maths score, we again find a negative correlation. So the greater use of these strategies is associated with a poorer PISA performance:
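A country-level correlation of this sort is straightforward to compute. The sketch below uses invented figures (not the actual PISA data) purely to illustrate the calculation; because the contrived numbers fall on a straight line, the result is exactly -1:

```python
# Illustrative only: these are invented country-level figures, not PISA data.
# Each position pairs a hypothetical country's average % agreement with a
# cognitive-activation item against its mean PISA 2012 maths score.
pct_agree = [80, 75, 70, 65, 60, 55]
maths_score = [430, 450, 470, 490, 510, 530]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(pearson_r(pct_agree, maths_score))  # -1.0 for this contrived linear data
```

A negative coefficient here is what “greater use of these strategies is associated with a poorer PISA performance” means at the country level.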
The difference between my result and PISA’s analysis might be explained if cognitive activation strategies are associated with a greater level of student-orientation, and it is this that PISA are stripping out.
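“Accounting for other strategies” presumably means something like a partial correlation: remove the part of each variable that is explained by student-orientation, then correlate what remains. A minimal sketch of that idea, again with invented numbers rather than PISA data:

```python
import numpy as np

# Invented country-level figures, not PISA data.
cog_act = np.array([0.2, 0.4, 0.1, 0.5, 0.3, 0.6])  # cognitive-activation index
stu_ori = np.array([0.5, 0.6, 0.3, 0.8, 0.4, 0.9])  # student-orientation index
maths = np.array([510, 480, 530, 450, 500, 440])     # mean maths score

def residuals(y, x):
    """Residuals of y after removing its linear dependence on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation of cognitive activation with maths score,
# controlling for student-orientation: correlate the two sets of residuals.
r_partial = np.corrcoef(residuals(cog_act, stu_ori),
                        residuals(maths, stu_ori))[0, 1]
print(r_partial)
```

If cognitive activation and student-orientation tend to be reported together, the raw and partial correlations can differ in size or even in sign, which would reconcile my raw negative correlation with PISA’s adjusted claim.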
Note: You can find my updated data sheet here.