Evidence for Learning has been trialling a maths programme in South Australia known as Thinking Maths. Evidence for Learning is the Australian version of England’s Education Endowment Foundation. It licenses the Education Endowment Foundation’s toolkit for use in Australia and it has started to run randomised controlled trials. This is one of the first such trials to report its findings. It is an ‘efficacy’ study, which means, in the jargon, that its implementation was led by the developers of the programme.
True to current trends, its title and description position it as something to do with ‘metacognition’, a nebulous term that has become a favourite of the Education Endowment Foundation. The word features heavily in the project description:
“The program’s approach focuses on the following three areas for better teaching and learning of mathematics:
- Using quality task design. For example, clear learning intentions referenced to the Australian Curriculum and delivering engaging lessons with multiple entry and exit points to support students in linking mathematical ideas to solve problems
- Sequencing of conceptual development. For example, encouraging metacognition and growth mindset through the use of effective questioning and differentiating the curriculum by presenting tasks with multiple entry and exit points to cater for students with a wide range of mathematical experience and dispositions.
- Using research-informed effective pedagogies. For example, establishing a culture of learning, encouraging metacognition and conceptual understanding, engaging and challenging students in their learning, and opportunities for professional reflection and networking.”
The alert among you will, however, have noticed a few other buzzwords. We have ‘mindset’ but we also have ‘conceptual understanding’, ‘quality task design’, ‘multiple entry and exit points’ and ‘differentiating the curriculum’. Read further and we find that in the course of developing metacognition and conceptual understanding, teachers will be encouraged to:
“Change from ‘telling students’ to ‘asking students’, encourage students to talk about their thinking and develop their reasoning skills through purposeful questioning. Rather than re-explaining a concept, use questioning to get an insight into the nature of their misconceptions, guide them to expose an inconsistency and allow them to self-correct.” [My emphasis]
This is what is often referred to as ‘reform maths’ or ‘fuzzy maths’ by its detractors. It is the kind of discovery-oriented maths that has been implemented in Canada alongside a corresponding decline in maths scores on international and national assessments. It is the sort of maths teaching that gurus such as Dan Meyer and Jo Boaler exhort us to practise. The focus is on a mathematical task rather than on the explicit teaching of concepts and procedures. By designing the perfect task and by differentiating it appropriately, students are meant to figure out the maths for themselves.
It is a form of educational progressivism applied to maths teaching, and yet notice what is happening. It is speaking a new language of ‘metacognition’ and of ‘mindset’ along with some unquestioned assumptions about differentiation.
Surprisingly, perhaps, the South Australian randomised controlled trial found no effect for the programme when compared to business-as-usual. Why is this surprising? You might predict some kind of expectation effect similar to a placebo effect in a trial of this kind. In other words, teachers in the intervention group should spend more time thinking about their maths teaching, and teachers and students should both respond to something new and different. It also seems to have been the case that teachers allocated to the intervention tended to have more years of teaching experience.
If you examine just the results for primary schools involved in the trial, there is a small gain that just reaches statistical significance. This is offset by an almost exactly equivalent loss for secondary schools. In other words, participation in the intervention led to worse outcomes for secondary school students. It also seems to have led to a small but statistically significant rise in maths anxiety for those students involved in the intervention across the two phases. This is important because Boaler and others have made a big deal about maths anxiety, suggesting that reform maths is the antidote. It seems not.
The positive effect for primary school teachers may have been due to an apparent increase in relevant subject knowledge. The negative effect at secondary may have been due to implementation issues. However, if schools could not implement the programme correctly with the programme developers on hand, there seems little reason to think it would work at scale. So that’s the end of that.
I find it baffling that the developers would refer to ‘research-informed effective pedagogies’ in their description while designing a programme that is similar to ones that have failed time and again. We do not have definitive research on maths teaching, but the research that is available suggests that explicit teaching that ‘tells’ and involves ‘re-explaining’ is more effective than the reform maths alternatives. Moreover, we have a theoretical framework from cognitive science that explains why this is likely to be the case.
Perhaps Evidence for Learning should test a maths programme based on the principles of explicit teaching. I won’t hold my breath.