Regular readers will be familiar with the phenomenon of Finland's continuing decline on international assessments such as the Programme for International Student Assessment (PISA) run by the OECD.
Commentators have sought to explain this decline in a number of ways. Pasi Sahlberg, a former policy advisor in Finland who is now with the Gonski Institute in Australia, attributes it to the distraction of digital devices (in August, I debated Sahlberg on the educational lessons we may draw from COVID-19, and you can catch that debate here). And in this post, I critically examined the suggestion that a decline in effort from Finnish students is the cause.
A newly published PhD thesis by Aino Saarinen sheds more light on the possible causes. Saarinen examined data on Finnish students from the 2012 and 2015 rounds of PISA with a few questions in mind: were scores correlated with self-directed learning practices, with the use of digital learning materials in schools, or with participation in an early learning programme? And was the variance in learning outcomes, which is apparently increasing in Finland, associated with any of these factors?
“Frequent use of self-directed teaching practices or digital learning materials at school were associated with students’ weaker learning outcomes in several knowledge domains. Instead, frequent teacher-directed practices were related to students’ higher learning outcomes. Moreover, frequent use of self-directed teaching practices or digital learning materials had more negative impact on students’ learning outcomes in students with (vs. without) risky background.”
Saarinen found little impact from participation in early learning.
Of course, these results do not directly address the question of what has caused the decline. It is possible, for instance, that there was even more self-directed learning in, say, 2006. However, this seems unlikely given the qualitative accounts we have of how Finnish education has changed over the last 20 years or so (see e.g. here and here) and the more recent turn toward approaches such as 'phenomenon-based learning'.
And there are limits to how much we can infer from the questionnaires that PISA uses to assess the practices students are exposed to. They strike me as a little eccentric, with potentially overlapping concepts often appearing in different constructs. But then, nothing is ever perfect. It would be great to run a nationwide randomised controlled trial, but that is not going to happen, and so we have to make our inferences as best we can from the data as it stands.
If we do consider the data as it stands, Saarinen's findings highlight, once again, that constantly pointing to Finland as an example of educational excellence is flawed. It is a sign of either ignorance of the last fifteen years of data, self-deception, or worse.