Australians should read Tom Bennett’s behaviour report

Australian schools are suffering a crisis of classroom behaviour. Survey evidence from the Organisation for Economic Co-operation and Development (OECD) places us well below the OECD average on disciplinary climate; in other words, our classrooms are more disrupted than most. Roughly a third of students in advantaged schools, and half of those in disadvantaged schools, report that in most or every class there is noise and disorder, students don’t listen to what the teacher says, and students find it difficult to learn. Add to this evidence from a different survey that 20% of Year 4 students and 9% of Year 8 students are bullied almost weekly and we have a very worrying picture.

A lot of the research and discussion among academics is not well placed to address these issues. The focus is placed exclusively on those students who misbehave and the educational effects on them of being excluded from classrooms. This is an important question to address but poor behaviour impacts on every student in a class. We need to think at a systems level. The old trope of blaming and shaming teachers for causing behaviour problems by teaching boring lessons has had its day. Yes, teachers should make efforts to make subjects interesting but we can’t keep students in a high state of excitement all the time. Boredom is part of life. Sometimes you have to do things that you don’t want to do. Get over it.

That’s why I recommend Tom Bennett’s new report to Australian teachers and school leaders. It is written for the British government and its focus is on U.K. schools, yet there is much to transfer to our own system. Politicians should pay attention to the policy recommendations and principals should take heed of the advice on how to intentionally create a positive school culture.

We can’t keep ducking this issue.

Bad times ahead for education in Wales

After my recent post about ‘Curriculum for Excellence’ in Scotland, a number of Welsh teachers contacted me to make me aware of the new curriculum proposals for Wales. In 2015 the Donaldson report was published, setting out a series of recommendations for changing curriculum and assessment arrangements. These are currently in the process of being implemented by the Welsh government, enjoying cross-party support.

In the report, Professor Graham Donaldson seeks to expand the standard definition of ‘curriculum’. For instance, there is a significant section on pedagogy (teaching methods). The report claims that, “To be clear, the recommendations of this Review do not imply an emphasis on any particular teaching approaches: decisions about teaching and learning are very context and purpose specific, and are best taken by teachers themselves.” However, this wears pretty thin, pretty quickly. For instance, collaborative learning is mandated, with the report stating that, “Good teaching and learning encourages collaboration.” “Authentic contexts” are also important, giving a hint of the constructivist philosophy that underpins Donaldson’s approach.

To be fair, Donaldson accepts that direct teaching has a role. However, he seems a little confused about what direct teaching is, suggesting that it is a ‘caricature’ to think of it as involving a whole class (why?) and suggesting that direct teaching implies the ‘scaffolding’ of learning. Yet scaffolding is usually used to describe the kinds of hints and guidance that constructivist teachers employ in investigative and problem-solving contexts.

Some of the intent behind Donaldson’s statements about good teaching becomes clearer when read in conjunction with the proposed curriculum. For instance, Donaldson wants to get rid of conventional subjects, amalgamating them into ‘Areas of Learning’ such as ‘Health and Wellbeing*’, ‘Humanities’ and ‘Science and Technology’. The ‘Mathematics and Numeracy’ area, “…provides strong support for the development of wider skills, particularly critical thinking and problem solving, planning and organisation, and creativity and innovation.” There is a whole section on these wider skills and the curriculum is designed as a way to deliver them. Yet these skills are not ‘wider’; they are highly domain dependent. So attempts to develop them at a general, transferable level are doomed to fail.

As ever, science is seen as a thing that people do rather than a body of knowledge. Students, “…learn to generate and test ideas, gather evidence, make observations, carry out practical investigations, and communicate with others.” This is a good example of the fallacy of assuming that the way that professional scientists do science is the best way to learn it.

Donaldson wants to add further complexity by mandating cross-curricular themes of literacy, numeracy and ‘digital competence’. He wants students to have lots of choice over activities and experiences, even though research suggests that the choices students make are not optimal for learning. To Donaldson, knowledge is interchangeable in the service of delivering wider skills: “…the spacing of the steps at three-yearly intervals allows for a measure of choice, for example in topics for research, within these intervals if the school sees that as appropriate.” It really doesn’t matter what bit of history you are messing about with as long as you learn to think critically.

If you are still unsure that the intent is to move decisively away from conventional subjects, then Donaldson offers a number of vignettes. Here is a description of a primary school curriculum sequence under his proposals:

“The study of a local river, for example, may be rooted in the Humanities Area of Learning and Experience. However, it opens up wide-ranging opportunities across other areas. It might connect with the Expressive arts Area of Learning and Experience through listening to music, such as Smetana’s Vltava, and composing music or creating visual interpretations or dance or dramatic performances to express the river’s journey from its source to the sea. It offers opportunities to use factual and creative language purposefully to create brochures or poems and to apply mathematical and scientific skills to observe and investigate natural phenomena and measure depth and speed. It enables children and young people to improve their health and well-being by appreciating the joy of fresh air and walking safely in the hills to seek the source of their local stream and using map skills to follow all or part of its journey.”

Very romantic.

In secondary school, he favours project-based learning:

“For example, a school could provide a Year 7 programme for a significant part of the school week that develops a wide range of skills through a themed approach, thereby aiding continuity with primary practice. This approach could involve a series of projects to cover the year, and use the thinking skills methodology of ‘plan, develop and reflect’ as the organising structure. Projects would cover all subjects, although specialist teaching could be provided for literacy, numeracy and areas such as modern foreign languages and PE. The projects could be based on a range of interesting topics that develop different skills and subject areas, for example on topics such as ‘sustainability’ and ‘innovation’. Teams of staff drawn from all subjects would design and deliver the curriculum, while timetabling based on multiple lessons would allow both the flexibility to create larger or smaller teaching groups as well as team teaching.”

This is the approach that failed so dramatically in a recent Education Endowment Foundation trial: Many of the project-based learning schools actually dropped out of the trial and in those schools that were left, project-based learning had a possible negative impact on some groups of students. So it either doesn’t work or it’s hard to do. Regardless, it is not a promising approach.

Alongside these curriculum changes, Donaldson proposes assessment changes. He wants to rely more on unreliable teacher assessment at the same time as making assessment far more complex and reducing the accountability of schools. This means that the negative effects of the new curriculum will take longer to spot. The first clear indications are likely to come from PISA data some way down the track.

My hope is that the Welsh government starts to pay attention to the effects of similar reforms in Scotland and has a rethink before this new curriculum can do too much harm.

*This is going to sound very dated, very quickly

Why test times tables?

Writing in the Times Educational Supplement in London (the TES), David Boorman questions plans to introduce a timed test of times tables to English schools.

Boorman’s first concern is that of ‘maths anxiety’. He worries that timing tests in this way will lead to more anxious students. I have no doubt that timed tests can make students anxious – although the evidence is hard to pin down – but I also see it as part of a teacher’s job to allay those fears and present such tests as a normal part of school life.

Boorman can’t see the need for a time limit, asking, “How long did Shakespeare take to write his plays? Does anyone care?”. I think this misses a key point: We want to test whether students have retained times tables as facts in their long-term memory rather than checking whether they can work them out using working memory. If we ensure that time is limited then we are more likely to test the former than the latter. This is important because in higher level maths, students are rarely asked to simply work out a single multiplication. Instead, such skills will be embedded in a larger problem. Working memory is severely constrained and so it is important to free it up to focus on the problem in hand. If students memorise times tables then they release working memory capacity to focus on other aspects of the problem.

Boorman’s second point is a concern about a ‘times table check mindset’. He points out that many students might be able to find 5 x 7 but then be unsure of the answer to 7 x 5; something I am quite prepared to accept. But he uses this example to set times tables knowledge in opposition to an understanding of multiplication.

We often see this argument in the rhetoric about early mathematics education. And yet I have seen no convincing evidence to suggest that knowing facts somehow gets in the way of understanding. Yes, knowing times tables facts is not sufficient but I don’t think anyone ever claimed that it was. Ideally, students know their facts and also have a deeper appreciation of multiplication. In the case of the 5 x 7 example, the student would need to be taught about the commutative property of multiplication e.g. by considering a rotating rectangle. Why can’t we teach times tables and also teach commutativity?
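The rotating rectangle argument can be written out in a line. Under the area model of multiplication, a 5-by-7 array of dots turned a quarter turn becomes a 7-by-5 array, yet no dots are gained or lost:

```latex
% Commutativity via the area model: same array, two ways of counting it.
5 \times 7 \;=\; \underbrace{7 + 7 + 7 + 7 + 7}_{5\ \text{rows of}\ 7}
\;=\; 35 \;=\; \underbrace{5 + 5 + 5 + 5 + 5 + 5 + 5}_{7\ \text{rows of}\ 5}
\;=\; 7 \times 5
```

Teaching this alongside the memorised facts costs little and answers Boorman’s 5 x 7 versus 7 x 5 worry directly.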

Finally, Boorman demonstrates quite a limited view of why we want students to learn times tables. He complains that they are not useful in everyday life:

“I’m often told by friends and colleagues that they can help at the shops. However, I’m sceptical at best. Take six times eight, for example. When and why would one purchase 48 of an item? Of course, it could be six items priced at 8p each, but then what items are priced at 8p?”

Boorman’s colleagues have obviously got the wrong end of the stick. What is it about maths that makes people demand that every individual skill needs to have some mundane, everyday use? We don’t do that with other subjects. Nobody goes into a primary school class, observes a session of clay modelling and exclaims, “But when will students need to be able to do this at the supermarket!?” And nobody stops children from writing stories on the basis that they will never need to write stories in real-life; that only professional authors need to be able to write stories. And yet you hear these arguments about maths all the time.

Times tables are not particularly useful on their own and they are not intended to be. Maths is not a flat subject; it’s hierarchical with basic skills feeding into more complex skills. As we have seen, a facility with times tables frees up working memory to solve other aspects of a problem. For instance, an important skill in senior maths is to be able to factorise quadratic equations and this is much easier to do if you know your times tables. If students don’t know their tables then this limits their access to higher level maths. You can do all the investigations, critical thinking activities and genius hours you like but if students can’t do maths then you are effectively shutting the door on most STEM careers.
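To make the factorising point concrete, here is a hypothetical example of the kind students meet in senior maths. For a student who knows their six and seven times tables, spotting the factor pair is a single retrieval; for a student who doesn’t, it is a working-memory-hogging search:

```latex
x^2 + 13x + 42 \;=\; (x + 6)(x + 7)
\quad\text{since}\quad 6 \times 7 = 42 \ \text{and}\ 6 + 7 = 13.
```

The multiplication fact is not the point of the exercise; it is the infrastructure that makes the exercise tractable.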

Perhaps most worrying of all, David Boorman is a lecturer in primary education and so he has the opportunity to promote these views about times tables to the next generation of primary school teachers.

Learning lessons from the failure of Scotland’s “Curriculum for Excellence”

In 2011, I attended a presentation by Dylan Wiliam in Melbourne. As ever with Wiliam, it was a lively and provocative session that made me think. I distinctly remember the discussion of ‘Pareto Improvements’ which represent a great way of thinking about improving a school or education system.

At that time, Scotland had just rolled out its ‘Curriculum for Excellence’. Wiliam mentioned this in order to make a point that it wasn’t the curriculum that would deliver excellence but the quality of teaching and learning, something that could be enhanced by the use of formative assessment. A good teacher with a bad curriculum is likely to do better than a bad teacher with a good curriculum. So Scotland could name their new curriculum a ‘Curriculum for Excellence’ but excellence is not necessarily what they would get as a result.

I was convinced by this argument at the time. However, in the intervening six years I have changed my mind. Teaching is important but it is difficult to untangle from the curriculum. Some of the effects we attribute to teachers are actually effects of teacher+curriculum. If you don’t teach kids their multiplication tables then they won’t learn them, no matter how good a teacher you are. A good curriculum helps teachers and a bad curriculum really messes everything up.

Scotland’s Curriculum for Excellence (CfE) was developed as a result of an inquiry set up by the then Labour government in Scotland. The inquiry reported its findings in 2004. However, the Scottish National Party (SNP) came to power in 2007 and it was up to the new government to oversee implementation. This represented something of a perfect storm. The SNP is effectively a single-issue political party dedicated to gaining Scotland’s independence from the United Kingdom. That’s what motivates their membership and their politicians. You can just imagine the new government’s relief at being presented with an off-the-shelf solution for improving education, drawn up by the experts. Who could reproach them for following expert advice? After all, we should leave politics out of education, right?

Unfortunately, the path of CfE has not been smooth. Its detractors are pulling their hair out while even its fans admit that it needs a bit of work. John Swinney, Scotland’s education minister, has brought in reforms that include the introduction of more assessment.

These reforms are the result of a review conducted by the Organisation for Economic Co-operation and Development (OECD) and commissioned by the Scottish government. The OECD noted:

“Curriculum for Excellence represents an ambitious departure seeking to develop a coherent 3-18 curriculum around capacities and learning, rather than school subjects, with a different approach to assessment from that in place before. It is complex as it is organised around four capacities (covering 12 attributes and 24 capabilities across the four); five levels, from early to senior; seven principles, six entitlements, ten aims, and four contexts for learning; eight curriculum areas and three interdisciplinary areas; and several hundreds of Experiences and Outcomes.”

I think there are two linked aspects of this statement that are critical: the departure from a system based upon school subjects and the complexity of the system that replaced it.

No doubt, the calls for reform of CfE have been bolstered by the latest round of PISA results that show a decline in Scotland’s performance since 2009. The comparison with neighbouring England is illuminating:

So what has gone wrong?

Firstly, subject-areas exist for a reason. They represent well-connected bodies of knowledge and understanding; the kind of thing you might be able to draw on a (very big) concept map. They have an internal logic that I suspect matches the way that we construct schema in our long-term memories.

If you can work your way through the complexity of CfE (and take a look at these Key Links/Documents if you want to appreciate just how complex it has become) then you find that a focus on ‘learning’ actually means a focus on inquiry and investigation. For instance, the draft numeracy and mathematics benchmarks essentially force these teaching methods on teachers by defining them as outcomes. In student-friendly language, we read statements such as, “I have experimented with everyday items as units of measure to investigate and compare sizes and amounts in my environment, sharing my findings with others,” and, “I have investigated how whole numbers are constructed, can understand the importance of zero within the system and can use my knowledge to explain the link between a digit, its place and its value.” We read the statement, “I use practical materials and can ‘count on and back’ to help me understand addition and subtraction, recording my ideas and solutions in different ways.” This is constructivist maths involving the use of practical investigation and multiple strategies.

[As an aside, why didn’t anyone involved consider the absurdity of writing in student-friendly language, “I am developing a sense of size and amount by observing, exploring, using and communicating with others about things in the world around me”? A student working towards this benchmark can’t yet count so how is it remotely possible that he or she will understand this statement?]

You would think that if the experts were suggesting the adoption of a curriculum based upon inquiry learning and constructivist principles then inquiry learning and constructivism would be supported by the best available evidence. And yet they are not. The evidence actually stacks up against these methods. So why do they make it into a curriculum? They are there because they fit the ideology of the educationalists involved; educationalists who would rather question the applicability of scientific evidence to education than question their own deeply held beliefs.

I am not sure whether the complexity of CfE is an unavoidable consequence of abandoning subject disciplines and embracing ephemeral inquiry and interdisciplinary principles or whether the complexity arises out of the need to give sufficient prominence to the menagerie of learning theories sponsored by different educationalists. In a sense it doesn’t really matter. We have seen a very similar pattern arise in Canada with the adoption of a constructivist approach to teaching mathematics and in Queensland, Australia, with Productive Pedagogies. These reforms didn’t work because they can’t work and teachers end up imploring the authorities to, “just tell me what to teach.”

It is the kind of mess you create if you take politics out of education and leave it to the experts.

Note: Back in 2011, Dylan Wiliam probably set out a more nuanced view of the role of curriculum than I have given him credit for. At the time, I had not researched these areas as much as I have now and so I was only able to take on fairly simple messages.

Should you use test scores to help you choose a school for your child?

Last week, a piece was published in The Conversation by Stewart Riddle. It raised some interesting points about the factors parents should consider when choosing a school. Riddle argued that results from the Australian National Assessment Program – Literacy and Numeracy (NAPLAN) that are available to parents via the MySchool website do not offer a good guide. This is because most of the difference in student performance can be attributed to socioeconomic background. Parents should instead visit a school and determine if it feels right for them.

I took issue with Riddle’s claim that, “Test results say nothing about teaching quality,” because this seemed like quite an extreme position. Clearly, test results must be influenced by teaching quality, even if that influence is compounded by other factors. Otherwise, we are in a position of denying teachers and schools any agency over academic outcomes; it’s all down to fate. That doesn’t seem like a reasonable position to hold.

A day later, a new analysis of the 2015 PISA and TIMSS results was released by the Australian Council for Educational Research (ACER). ACER’s Director of Educational Monitoring and Research, Dr Sue Thomson, made the following comment:

“It also matters which school a student attends. PISA shows that the school a student attends has an impact on outcomes. Disadvantaged students in average socioeconomic level schools, for example, are almost a year of schooling higher than those in disadvantaged schools. Similarly, disadvantaged students in advantaged schools are more than one year of schooling higher than those in average socioeconomic level schools.”

One of the reasons for this difference may be the different disciplinary climate between more and less advantaged schools. Within an overall context where Australia has a pretty dismal disciplinary climate:

“About one-third of the students in advantaged schools, and about half of those in disadvantaged schools, reported that in most or every class there was noise and disorder, students didn’t listen to what the teacher said, and students found it difficult to learn.”

So it’s not just about the feels on open day. The choice of school that a parent makes for his or her child can have a significant impact on academic outcomes. This seems to be at odds with the narrative that all schools are effectively the same. Socioeconomic background is just one factor that varies between schools but other school characteristics are also likely to matter. Real-world data is inherently messy but we can certainly point to examples of schools that are performing way above their socioeconomic destiny.

I have experience of working in a school in a socioeconomically disadvantaged part of London for seven years, during which time test scores rose dramatically. This was accompanied by a clear improvement in disciplinary climate and this was no accident: improving behaviour was a key part of our strategy to improve results. I can’t prove that one thing caused the other without running an experiment, but if we can’t make schools better then I’m not sure what it is that teachers and school leaders are trying to do.

And there is another interesting point to note about the ACER analysis. If test scores really do only tell us about the level of social advantage in a school population and nothing else then they still act as a good guide for parents because the data suggests that children generally do better in more socially advantaged schools. If, as seems more likely, they tell us about school quality more generally then they are also a good guide.

This is an analysis of just one metric. Parents choose schools on a range of factors such as arts provision, pedagogical approaches or a sense of duty to support the local school. Nevertheless, test scores are hardly irrelevant to the decisions they make.

Understanding the PISA 2015 findings about science teaching

I have shared the following graphic a few times. It shows that frequent use of enquiry-based learning, as defined by the Organisation for Economic Co-operation and Development (OECD), is associated with worse scores on the science component of the Programme for International Student Assessment (PISA). It is based upon surveying students about their experiences in science lessons and then matching this to PISA science performance:

I often point this out to advocates of inquiry learning. Nobody likes cognitive dissonance and so the response I usually receive is, “Well yes, everything done to the extreme is a bad idea. This just tells us about those students who are exposed to inquiry learning in all or most lessons.”

So it is worth going to another chart. This chart shows how an ‘index’ of enquiry-based learning affects overall results. The index isn’t just about enquiry in all or most lessons but about the relative amount. As you can see, more ‘enquiry-based learning’ is associated with worse science PISA results.

You will note that one of the factors that is associated with better PISA science scores is the ‘index of teacher-directed instruction’. Another is the ‘index of adaptive instruction’. This latter term is explained by PISA in the following quote:

“PISA asked students how frequently (“never or almost never”, “some lessons”, “many lessons” or “every lesson or almost every lesson”) the following happens in their science lessons: “The teacher adapts the lesson to my class’s needs and knowledge”; “The teacher provides individual help when a student has difficulties understanding a topic or task”; and “The teacher changes the structure of the lesson on a topic that most students find difficult to understand”. The index of adaptive instruction combines these three questions to measure the extent to which students perceive that their science teachers adapt their instruction based on students’ needs, knowledge and abilities.”

So students seem to do best with explicit teachers who respond to feedback. This is not surprising given the wealth of research on explicit instruction that shows exactly this. And all of this is aided by a positive disciplinary climate where students are not hindered by the behaviour of others. Who would have thought it?

If you want to see what the ‘index of enquiry-based learning’ pattern looks like between countries then I have created the following chart from the available data:

That’s quite a strong negative correlation.
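For readers who want to reproduce this kind of chart themselves, the calculation behind a claim like “strong negative correlation” is just Pearson’s r over the country-level (index, score) pairs. The sketch below uses made-up placeholder numbers, not the actual PISA 2015 data, purely to show the mechanics:

```python
# Sketch: country-level correlation between an enquiry-based-learning
# index and mean PISA science scores. The data below are HYPOTHETICAL
# placeholders, not the published PISA 2015 figures.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical countries: higher enquiry index, generally lower score
enquiry_index = [-0.4, -0.2, 0.0, 0.3, 0.6]
science_score = [540, 505, 515, 495, 480]

r = pearson(enquiry_index, science_score)
print(round(r, 2))  # ≈ -0.91 for these made-up numbers
```

With the real country-level data from the PISA volumes, the same few lines (or `numpy.corrcoef`) produce the coefficient behind the chart.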

It’s worth reminding readers that PISA is a test of application and not of simple recall. Many of the test items require students to evaluate experimental designs and so on. This is why they use the dreadful term ‘scientific literacy’ to describe it. Andreas Schleicher, who heads the programme, is not an educational traditionalist.

All data and graphics were obtained from Volume 1 and Volume 2 of the reports that can be found here.

Knowledge debate at the Global Education and Skills Forum

Regular readers of this blog will be interested in the following livestream of a Varkey Foundation debate involving Nick Gibb, Andreas Schleicher, Daisy Christodoulou and Gabriel Zinny:

At this point, it would be really good to be able to link to a curated list of my PISA blog posts but I can’t because I don’t have one. I think that needs to go on the to-do list.