When the results of the 2018 round of the Programme for International Student Assessment (PISA) were published late last year, Australia and Finland cemented their long-term declines in performance.
This has been a bitter pill to swallow. Many in education and academia perhaps fear that these results will precipitate more government interference in the Australian education system and that they will face losses as a result. Others may have too much invested in Finland. Some travelled there in the early 2000s and wrote papers about the education system, so they have ego wrapped up in it. And in the early 2000s, Finland was wrongly presented as an example of how an education system could embrace progressivist education ideology and still be academically successful. There are those who still seek to make this case, but the decline in results is becoming harder to ignore.
One rationalisation to explain away any decline in the cases of both Australia and Finland is that the students sitting PISA do not try as hard as they used to. The convincing part of this argument is that PISA is a low stakes test for those students who take it – they don’t even find out their result. So why would students try hard? However, this point only matters if there has been a change in the level of effort over time. If students were as disengaged in 2000 as they were in 2018 then this cannot be a cause of the decline in performance.
The evidence people draw upon to make this case tends to be anecdotal. It is a variation of the old ‘the kids of today’ trope where young people get blamed for being lazier or more feckless than previous generations. Perhaps it all has something to do with mobile phones.
So I wanted stronger evidence to draw upon.
PISA has sometimes collected data on the level of effort students report making on the PISA test. They then present this on an ‘effort thermometer’. I am having trouble tracking through all the relevant data because the OECD do not make it easy, but I have found data from the same question asked in both 2003 and 2018. The consistent measure I have found is the average self-reported level of effort on a scale from one to ten (the OECD also ask students to report their effort relative to the effort they would put into a school test but they report that differently for the two years).
In 2003, Finnish students self-reported an average effort of 7.3 and Australian students 7.5. I had to get my ruler out and read these from a graph because they are not reported as figures, so they should be taken as approximations. For context, across all of the countries surveyed, the score ranged from about 6.2 to about 8.7. In 2018, Finnish students reported an average effort of 8.0 and Australian students 7.4. If anything, this would suggest that the effort of Finnish students has perhaps slightly increased over time while that of Australian students has remained pretty static.
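To make the comparison explicit, here is a minimal sketch that tabulates those approximate figures and computes the change for each country. The numbers are my rough readings from the OECD graphs, not official published values, so the output should be treated with the same caution.

```python
# Approximate self-reported effort on the PISA test (scale 1-10).
# Values are read off OECD graphs with a ruler, so they are rough
# estimates, not official statistics.
effort = {
    "Finland":   {2003: 7.3, 2018: 8.0},
    "Australia": {2003: 7.5, 2018: 7.4},
}

for country, scores in effort.items():
    change = scores[2018] - scores[2003]
    print(f"{country}: {scores[2003]} -> {scores[2018]} ({change:+.1f})")
```

On these figures, Finland's self-reported effort rises by about 0.7 of a point while Australia's barely moves, which is the opposite of what the declining-effort explanation would require.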
This is only one measure, of course, and it has its flaws. Some students might not even bother to fill in the self-report question, although again we would need to know whether the proportion doing this had changed over time for it to be relevant. And perhaps students interpret the question differently now because they work from a different baseline? You can pick holes in the data if you wish. But it should give us pause before we accept a blithe dismissal of these countries' declining PISA performance on the basis of a supposed decline in effort on the test.