The media’s myopia on PISA

To me, the story was obvious. Whatever your view of the validity of the Programme for International Student Assessment (PISA), there was clearly one newsworthy finding: the use of inquiry learning is associated with worse performance in science.

This is newsworthy precisely because it is the opposite of what we often hear in the media. Peruse the many articles about how to improve Science, Technology, Engineering and Maths (STEM) performance and you will find that they invariably promote ways of better engaging students with science. And the solution for better engagement is usually some variation of inquiry learning.

Yet I saw no articles in the press that publicised the inquiry learning finding following the release of the PISA results. In fact, one piece in The Age actually promoted more inquiry learning as the solution to Australia’s PISA decline.

Professor Corrigan of Monash University is quoted as saying:

“While the facts of science are important, recognising the way that you do science is equally important and that leads to better performance in PISA, which is about scientific literacy.”

This is the opposite of what PISA found.

Why is this? What’s going on?

One clue lies in the experts quoted in The Age. Journalists turn to academics to interpret things like PISA results, and academics tend to hold to constructivist ideology in the face of all evidence to the contrary.

But aren’t journalists supposed to probe and expose faulty dogmas? Where is the challenge?

Laura McInerney perhaps revealed another clue when explaining why SchoolsWeek did not cover the inquiry learning story:

So McInerney thinks this story is only of interest to teachers and teachers don’t read SchoolsWeek.

This is intensely frustrating because the media at large tend to think that pro-inquiry stories are of interest to the wider population. So it seems that there is no way of challenging this narrative. The public must stay misinformed.

I have a hypothesis about what’s happening here. Many kids switch off from maths and science at school, but these subjects are also associated with being smart. There are a number of reasons students drop the sciences. Some simply find them really hard. It’s also true that doing well at science rarely involves getting everything right all the time, which doesn’t sit well with perfectionist students. There are also cultural factors around stereotypes: scientists are frequently portrayed as nerds who lack sexual desirability, and the stereotype is at odds with conventional notions of femininity.

I suspect that many journalists dropped science at school to pursue literature or history. The most comforting narrative to tell yourself – whatever the real reason – is that your teachers simply didn’t make it interesting enough. You were smart and resilient enough to learn science but the teachers were dull. It’s not your fault. Stories about engagement and inquiry learning are therefore very seductive.

It would certainly explain why science gets singled out for this kind of treatment. After all, any school subject can be really boring; it’s just that it’s more socially acceptable to hang this criticism on science and maths.


8 thoughts on “The media’s myopia on PISA”

  1. Well done — again — Greg!

    I wonder if you’ve seen the same tendency in your part of the world as we have seen here: with the roll-out of both the PISA and TIMSS reports there has been a spate of op-eds and other declarations about how useless, or irrelevant, the assessments are, how bad “standardized tests” are in general, and how the things measured are of no value. There is also that letter signed by a large number of “educational scholars” about PISA.

    I consider this a deliberate full-court press, timed evidently for this release, as damage control. It’s a pretty well-organized and big “Nothing to see here, folks” campaign.

    Now you know that if one squints and smooths out the data across Canada, things look pretty rosy — at least in Science and Reading. But when you dig into the details you find that a couple of large-population provinces are doing quite well (except in math, where it’s mediocre at best … and this except for Quebec, whose French-language schools are doing fine). And there is an overall downward trend in math. And the provinces that are doing well are biasing the Canadian data, masking the collapse of what used to be high-performing school systems elsewhere in the country.

    Anyway, I’d love to see a takedown of the arguments these guys are rolling out to target the international assessments (in particular) — the bit about “standardized tests” is more of an ongoing debate and there’s plenty of ammo on both sides already. I’d attempt this myself except that, first, I’m not a blogger, so I’d have to beg space on someone’s blog, and as you know I’m very shy and retiring, and not at all intrusive …

    But the reason is more this: Here in Canada it’s almost entirely the Teachers’ unions and those associated with them who are putting out this propaganda. I am not known as the … er … strongest union booster around, and if I wrote something it would be regarded as entirely political. As you know I’m not shy about my political position and anything I write is liable to be conflated with … other stuff.

    But I think, with all their warts, the international assessments have objective value that is resilient enough to stand up to these critiques, and someone with the chops to explain why, who could not be construed as acting out of unfriendly motives toward the unions, would likely get more traction than me.

    It’s just a suggestion.

  2. Tempe says:

    Yep, the only mention I saw was by Jennifer Buckingham. The articles on The Conversation seem to think that it’s down to politicians sticking their collective noses in or the idea that Asian students are very competitive and receive tutoring outside of school. The silence is deafening.

  3. Whilst I have grave concerns about inquiry learning, I also have concerns about the PISA research in this area. Aren’t the findings regarding the degree of inquiry, or non-inquiry, teaching going on in a country based upon students’ answers to survey questions? Isn’t this type of research questioning notoriously inaccurate? Is there other research out there highlighting that countries successful in the PISA tests do not use inquiry learning?

    • You are right to caution against over-interpreting PISA. However, we need to be clear about what we are discussing here. The PISA data show that *within any given country* – bar a couple – the students who report greater exposure to inquiry learning perform worse on PISA science.

  4. Pingback: The media’s myopia on PISA – Think Studio

  5. Richard says:

    Greg, you have to realise that constructivism in the teachers’ colleges is alive and well and is difficult to displace in academia. Constructivism eschews solid, empirically based evidence. I would argue that most of these education academics do not understand science, and with it the scientific approach for doing good, solid empirical research in education. They are all about “theorising”, whatever that means, and they find it difficult to see what is actually happening in front of their eyes. The end result: trainee teachers suffer, and when they graduate and are teaching in our schools, so do their students.

    I have taught in TAFE for 27 years, and when doing my Masters in education in 1974 I came across a teaching technique called “mastery learning.” It worked then and it is still an effective teaching strategy. But try getting the constructivist academics to teach it? You must be joking.

    So what do the science and PISA tell us? Explicit instruction works and enquiry-based learning is not very effective.

    Good luck, Greg, with your PhD.

  6. Pingback: Die Belehrungsspiele aus dem Haus der kleinen Forscher « Salman Ansari
