PISA comprehensively proves itself wrong

We saw yesterday that PISA defines good teaching as having a “student-orientated classroom climate”. In 2012, PISA asked students completing its tests four questions to see how student-orientated their teaching actually was.

The questions were:

How often do these things happen in your mathematics lessons?

• The teacher gives different work to classmates who have difficulties learning and/or to those who can advance faster.
• The teacher assigns projects that require at least one week to complete.
• The teacher has us work in small groups to come up with joint solutions to a problem or task.
• The teacher asks us to help plan classroom activities or topics.

I am not entirely convinced about this construct or the process of comparing different education systems in this way. However, PISA do this themselves and draw quite bold conclusions from the process. So, following their lead, I decided to compare the level of student orientation with PISA maths scores from the same year.

The measure of student orientation that I have used is one that PISA used in their recent report to calculate a ratio of student-orientated to teacher-directed instruction. It is simply the average score when students are asked the four student-orientation questions above. I’m not quite sure how the scoring works, but at least this is better than the memorisation construct, because the student-orientation construct includes a scaled response to each question: “never”, “occasionally”, “frequently” and so on.
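To make that concrete, here is a minimal sketch of how such an index could be computed from the raw survey responses. The column names, numeric coding and response labels are my assumptions for illustration; PISA’s actual published index is built with more sophisticated scaling than a simple average of coded responses.

```python
# Minimal sketch of the student-orientation index described above:
# code each response on a numeric scale and average across the four items.
# Column names and the exact response labels are assumed for illustration;
# PISA's own index uses more sophisticated scaling.
import pandas as pd

RESPONSE_CODES = {"never": 0, "occasionally": 1, "frequently": 2, "always": 3}

ITEMS = [
    "different_work_for_different_students",
    "week_long_projects",
    "small_group_work",
    "students_help_plan_activities",
]

def student_orientation_index(responses: pd.DataFrame) -> pd.Series:
    """Average the coded responses to the four items, one index per student."""
    coded = responses[ITEMS].apply(lambda col: col.map(RESPONSE_CODES))
    return coded.mean(axis=1)
```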


We have already seen the negative correlation between this measure of student-orientation and mean PISA 2012 maths scores:

[Chart: student-orientation index vs mean PISA 2012 maths scores]

PISA also report the percentage of low achievers and the percentage of high achievers in each country, so I thought I would examine how these compare with student-orientation. Given the graph above, you might expect that increased student-orientation correlates with an increased proportion of low achievers:

[Chart: student-orientation index vs percentage of low achievers]

However, what about high achievers? You might think they would benefit from more open-ended work. Apparently not:

[Chart: student-orientation index vs percentage of high achievers]

This last correlation is the least strong and there is some variation at the low end. If we don’t simply dismiss this kind of analysis as a nonsense, then there are many reasons why this might happen. I would suggest that the degree of alignment between each country’s curriculum and the types of questions used in PISA would at least cause some variation.
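For anyone who wants to reproduce this kind of country-level comparison, a minimal sketch is below. The file and column names are hypothetical; the real country summaries would need to be extracted from PISA’s published tables first.

```python
# Sketch of the country-level correlations plotted above, assuming a tidy
# table with one row per country. File and column names are hypothetical.
import pandas as pd

countries = pd.read_csv("pisa_2012_country_summary.csv")

for outcome in ("mean_maths_score", "pct_low_achievers", "pct_high_achievers"):
    r = countries["student_orientation"].corr(countries[outcome])  # Pearson r
    print(f"student orientation vs {outcome}: r = {r:.2f}")
```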

The key message is that PISA’s own data suggests that a “student-oriented classroom climate” – at least in the way that they measure it – does not appear to be a feature of good maths teaching. So they’re wrong about that. I wonder if we will hear this from them soon?


14 thoughts on “PISA comprehensively proves itself wrong”

  1. Hi Greg,
    There is a real problem with the reliability of PISA scores. In an article in Educational Researcher (Rutkowski, L., & Rutkowski, D. (2016). A Call for a More Measured Approach to Reporting and Interpreting PISA Results. Educational Researcher, 45(4), 252-257), Rutkowski and Rutkowski point out several problems: a problem with sampling, problems with measuring achievement and problems with measuring trends. The problems with measuring achievement are of great concern. A PISA score for any given student is not really reliable, because the student only answers a sample of the questions. The rest of the answers are imputed for the student using a statistical model. In itself, that could work, but the model partly depends on questions posed to the student about student (background) characteristics. Students do not always answer these questions correctly, resulting in an unreliable outcome.

    You can imagine what this means for any correlation computed on the basis of PISA scores.
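    The attenuating effect described here is easy to demonstrate with a toy simulation (all numbers are arbitrary, purely for illustration): noise in individual scores pulls an observed correlation towards zero.

    ```python
    # Toy simulation: measurement noise in individual scores attenuates any
    # correlation computed from them. All numbers here are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    true_ability = rng.normal(0.0, 1.0, n)
    outcome = 0.6 * true_ability + rng.normal(0.0, 0.8, n)  # true r = 0.6

    for noise_sd in (0.0, 0.5, 1.0):
        noisy_score = true_ability + rng.normal(0.0, noise_sd, n)
        r = np.corrcoef(noisy_score, outcome)[0, 1]
        print(f"noise sd = {noise_sd}: observed r = {r:.2f}")
    ```

    At the country level the picture is more complicated, since averaging across many students reduces noise, but the basic point stands: unreliable individual scores make any downstream correlation harder to trust.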

  2. Chester Draws says:

    • The teacher gives different work to classmates who have difficulties learning and/or to those who can advance faster.

    I do this consistently — some of my students might even reasonably answer this “always”. Yet I am totally opposed to student-directed learning and teach almost entirely through explicit instruction.

    Assigning different work to different students is as old as teaching itself. True, in my day the advanced students were in a different class, so the difference was hidden more, but it was still there.

    But I would score “never” consistently on all three other criteria. As in, I don’t think I have ever done them, ever, in my ten years of teaching.

    So am I 25% “student oriented”? Because that would be quite odd.

    • Iain Murphy says:

      Was looking at the questions and thinking exactly the same thing. Then I looked at the countries that do well on PISA and wondered which questions they might have had any score on (probably only no. 2, unless they were mandated to give certain answers).

    • Hello Chester. Now retired, I used to teach in much the same way as you describe, assigning different work to students with differing needs at any one time. That is one aspect of what I consider to be good teaching! Occasionally, I put students in groups to work together, but most of the time I used similar-ability groupings, so it was a variation on different work for different needs. Upon seeing the questions students were asked, I see problems with using that construct of student-oriented teaching. My students would probably answer “occasionally”. I am finding this discussion in Greg’s blog very enlightening and thought-provoking with regard to the PISA issue.

  3. Iain Murphy says:

    Greg, is there any way to separate out the question responses from the scores? It feels like about 10 different teaching styles have been mushed together to create these 4 questions.

  4. I hadn’t realised that PISA pushes certain teaching ideologies like this.

    And yet we have Ofsted telling us that it has no preference.

    But, more frustratingly, we have the DfE giving us a mixed message: we want to be more like the countries that do well in PISA, but we also want a return to traditional teaching.

  5. I think the ‘ten questions’ report is far too concise, a biased promotional pamphlet.

    Yet, although the underlying framework pits student-oriented against teacher-oriented instruction (a dichotomy I find unhelpful), chapter 2 does not really promote either student- or teacher-oriented teaching. Instead, it’s more about cognitive activation, elements of which sound very ‘discovery-like’. I’m saying this because, notwithstanding numerous big problems with the questions AND the underlying framework, I felt the report did suggest that a mix of strategies is ‘best’. What is annoying is that this nuanced picture is ‘translated’ into advice that is less nuanced; and even worse is when people ‘run’ with that advice to promote their hobby horses (memorisation bad, student-oriented is best, etc.).

    Some more depth is in this publication: http://www.sciencedirect.com/science/article/pii/S0191491X15300286

  6. Pingback: PISA proves itself wrong… again. | Filling the pail
