Nothing to prove (but I will, anyway…)

It is generally acknowledged that standard classroom teaching – with the notable exception perhaps of early primary education – is usually a variant of explicit instruction. We are not necessarily talking about the most effective forms of explicit instruction here. Much of my early teaching, although explicit, didn’t make sufficient use of opportunities to collect and give feedback.

I suspect that forms of explicit instruction have persisted because they represent a balance between effectiveness and effort, for both teachers and students. It’s what our teachers did and so it’s our default setting. In fact, this cause and effect is a common complaint of those who agitate for revolutions. I have observed many student teachers over the years and, despite what they are told in college, it is instinctive for them to want to stand at the front of the room and explain things. I even suspect that this is what our ancestors did many thousands of years ago when sharing their ideas. This is why such approaches are considered ‘traditional’.

Therefore, if you propose a change to this default setting then it is you who carries the burden of proof. You will need to demonstrate that your method is superior to business-as-usual. And it’s no good just showing that your method, enacted under the most favourable possible conditions, is better than the default approach. You need to show that your method enacted on a cold Thursday afternoon by an ordinary teacher is more effective than the default approach under the same circumstances.

I reckon that Dylan Wiliam has managed to do this, just about. I am convinced by his argument that use of formative assessment strategies improves upon standard instruction and that these gains are scalable and capable of being enacted by normal teachers with full timetables. In fact, some of his proposed measures represent efficiencies – finding out now that your students don’t understand something is far better than waiting three weeks until the end-of-unit assessment and then laboriously writing the same piece of feedback on each paper; feedback with no chance of being acted upon. So, not only does Wiliam present empirical evidence, he weaves it into a story that makes logical sense. We have a theory here.

Proponents of ‘constructivist’ approaches to teaching such as inquiry learning, problem-based learning or project-based learning have signally failed to do this. Note the number of names that I had to trot out there. Devotees will expand upon the differences between them but they share some essential similarities. They are broadly in the ‘progressive’ tradition of education that sees learning as a naturalistic process and that emphasises the need for students, at least in part, to find stuff out for themselves rather than simply have things explained to them.

Advocates certainly see this as a change to what is typical in classrooms. In his famous TED Talk, Dan Meyer presents a broadly constructivist position under the imperative that mathematics lessons need a ‘makeover’. This therefore places the burden of proof firmly with him and with those who are arguing for such changes.

And yet the debates that I get involved in tend to result in constructivists trying to shift the burden of proof on to me. I am somehow supposed to prove that their particular method doesn’t work under any circumstances. If not, they feel perfectly justified in promoting it far and wide. Often, there is negative evidence but, at this point, the constructivist will quibble that I haven’t shown that there are no circumstances at all in which it might work (which I obviously can’t show). Sometimes they will say that the research measured the wrong outcomes and that if it had measured the right outcomes then it would have shown a different result. They rarely present evidence of these special cases where the method does work or where the right outcomes were measured.

The constant name-changes are also problematic. The ‘maker-movement’, for instance, is clearly a constructivist-inspired pedagogy and yet it hasn’t been around long enough to have had its effectiveness researched. No doubt, if I were to offer a critique, I would in turn be criticised for not presenting appropriate evidence. All the evidence against constructivist approaches in general would, presumably, be set aside because this is completely different. Similarly, I am not aware of any studies that test the effectiveness of ‘Mantle of the Expert’ and yet it bears enough similarity to constructivist strategies that I would want to see strong evidence before advocating its adoption by schools. I, and many others, suspect that the evidence thing is one of the reasons for such fluid nomenclature.

If constructivists offer any justification at all then it is often an appeal to ‘theory’, such as by citing Piaget. However, developmental psychologists no longer accept Piaget’s ideas. His ideas therefore do not exemplify the scientific meaning of the word ‘theory’, which stands for something consistent with the known evidence. I know that there are those who do not like medical analogies but imagine you were to go to the doctor’s and be offered a new therapy called ‘water treatment’ in which you were required to drink a cup of water at specified times of day in order to ensure that your ‘humours’ were in ‘balance’, based upon the ‘four humours theory’. Imagine if, when you questioned the evidence for this approach, you were asked for your evidence that it doesn’t work.

Of course, this would never happen because, unlike education, medicine takes its approach to evidence seriously.

Nevertheless, I think it worth stating some of the evidence for explicit instruction and against constructivist approaches. So, here’s my list.

1. Kirschner, Sweller and Clark reviewed a number of studies and the literature on cognitive load theory whilst critiquing constructivist approaches.

2. Barak Rosenshine reviewed the evidence from process-product research and found that more effective teachers used approaches that he called ‘direct instruction’ and which I would call ‘explicit instruction’ in order to distinguish it from the more scripted Direct Instruction programmes developed by Engelmann and others (such as DISTAR). Most of this research is paywalled but he did write a piece for American Educator.

3. Project Follow Through, the largest experiment in the history of education, is generally considered to have demonstrated the superiority of Engelmann’s Direct Instruction (DI) programmes over other methods, including those based upon constructivism. It is important to note that DI was not just the best on tests of basic skills; it also performed at, or near, the top on problem solving, reading comprehension and measures of self-esteem.

4. An analysis that compared students’ maths and science scores on the TIMSS international assessment showed a correlation between higher performance and the teacher adopting a ‘lecture style’.

5. An RCT from Costa Rica showed that an ‘innovative’ constructivist-based approach produced worse performance than the business-as-usual control.

6. A meta-analysis found a small effect size for ‘guided discovery learning’ over business-as-usual conditions and a negative effect size for pure discovery over explicit instruction. Whilst this might be seen as evidence for guided discovery learning, it is worth bearing in mind that the studies included were not generally RCTs and so the experimental conditions would have favoured the intervention (which is why Hattie sets a cut-off effect size of d=0.40). The definition of guided discovery learning also included the use of worked examples which are generally considered to be characteristic of explicit instruction.

7. An analysis of the introduction of a more constructivist approach to teaching mathematics in Quebec showed an association with a decline in test scores.

8. One of my favourite studies ran a constructivist maths intervention against an explicit one (as well as business-as-usual) and found the explicit intervention was superior.

9. Klahr and Nigam found that the minority of students who were able to discover a scientific principle for themselves didn’t understand it any better than students who were taught it.

10. Studies of teacher expertise are broadly consistent with the earlier findings from process-product research as described by Rosenshine.

11. Findings on the best way to teach cognitive strategies (such as reading comprehension) also echo the findings of the process-product research i.e. that an explicit approach is more effective. (You may, as I do, still question the value of teaching such strategies or, at least, the time devoted to it). [Paywalled]

12. Classic worked-example studies show the superiority of learning from studying worked examples over learning by solving problems for novice learners. Worked examples are a feature of explicit instruction whereas problem solving (without prior instruction) is a feature of constructivist approaches.

[There are others – I’ll add to this list as I remember them]


Teaching math in the 21st century: A review

I have not known what to think about the U.S. Common Core initiative for some time now. When first mooted, it seemed like a good idea. I subscribe to E. D. Hirsch Jr’s view of the importance of knowledge. Part of that argument is that we should identify a common curriculum that students follow in order to ensure systematic exposure to ideas. This is something that many high-performing states already have in place and, in my view, a positive feature of the U.K. and Australian systems. Don’t mistake me, there are many and various problems with the Australian Curriculum as currently composed and yet it is far better to have something to argue about than nothing at all. Well, that’s what I thought.

In the U.S., for some reason that I still do not understand, the idea of a common ‘curriculum’ is politically impossible. ‘Standards’, however, appear more palatable, even though standards and a common curriculum pretty much amount to the same thing (the U.K. and Australian curriculums are actually expressed in terms of standards with plenty of flexibility in how these are interpreted on the ground). Despite the window-dressing, Common Core has been hugely divisive. Given the polarisation of U.S. politics, I couldn’t quite decide whether the concerns were legitimate or whether they were those peculiarly American ‘concerns’ about the ‘government’ and how it wants to take away people’s God-given right to carry automatic weapons and shoot lots of people when they feel sad. But I digress…

Thanks to Barry Garelick and his book, “Teaching math in the 21st century,” I now have a window into the world of Common Core mathematics. And it’s not looking too healthy. Michael Gove, erstwhile U.K. education minister, utilised the title of the movie ‘The Blob’ to describe the workings of the education establishment and how, whatever you did to fight it, it would ooze back in around you and finish you off. The emotive nature of this description is problematic but it is an eerily accurate picture of the capture of Common Core described in Garelick’s book.

Common Core mathematics was clearly intended to raise the standard of maths teaching in the U.S. to bring it more in line with higher performing countries. That was and is the plan. As a set of standards, it wasn’t supposed to imply a particular teaching method. Yet we hear the tale of Calvin who is struggling a bit in maths and has been diagnosed with ADHD. His counsellor, Teresa, hails the coming dawn of Common Core, “So no more ‘here’s the assignment from the book, and there’ll be a test on the material next week. It’s more about understanding.” This would be a good thing, she claims, because Calvin won’t have to memorize procedures. Calvin objects that he likes memorizing procedures and has memorized the quadratic formula. Garelick – Calvin’s maths teacher – offers praise for this feat but Teresa chides that this won’t be required under Common Core. With understatement, Garelick echoes my thoughts: “How a student could be deemed to understand the quadratic formula without knowing it was puzzling.”

Similar doublethink is on display when Sally from the district turns up for one of the Monday morning meetings. A grizzled old teacher mentions that some of the ‘below grade’ students have gaps in their knowledge. “Some of them don’t know the basic math facts, or how to do basic operations,” he complains.

“That’s because they haven’t been taught how to think,” is Sally’s utterly ridiculous answer. I am reminded of Carl Bereiter’s quip that trying to teach children how to think is like trying to teach them how to digest.

The grizzled old teacher bites back but it goes nowhere. The power lies with Sally.

You might ask what Garelick is doing all this time. Surely, he’s issuing hard truths and fighting the good fight? Well, no. Garelick is mainly trying to stay employed. He is a late entrant into the teaching profession after a full career in the real world and nobody seems to want to give him a permanent contract. This is why he ends up with various long-term cover positions. He is a model of diplomacy, trying his best to follow the plans of the regular class teachers. You can tell that he’d rather keep his counsel and stay in work doing a job he enjoys.

And it’s clear that he enjoys it even if, like most new teachers, some of his classes make him nervous. Much of the book consists of anecdotal tales of his interactions with students; of his frustrations and his palpable excitement when he hits on a way of making progress with them. He confides his scepticism about Common Core in us, his readers, but you can see that he is genuinely trying to do his best. Like any new teacher, he is trying to figure out an effective way to run a class – one day, teachers will actually be trained in such things. In his struggles, he has to confront a discipline system that is almost impossible to use. “There are a variety of methods one can use to discipline students: detentions, referrals, sending the student outside of class, contacting the parents. I was confused about using them.” Of course you were, Barry. Can an experienced old fox let you in on a little secret? You are meant to be confused about them because you are not supposed to use them. Put up a motivational poster instead.

Garelick gives Common Core more credence than I probably would at this stage in the process. He knows it’s daft but you can detect a sense of guilt; he’s not quite sure if he’s right. And so he has to keep proving to himself that kids really do understand things better if they practise and master basic procedures. “See!” he pleads with us, “It really works. Will anybody listen?”

And that is the charm of the book. Self-effacing and humble, Garelick just wants to take us to the places he has been and check his own thinking.

Yet mixed with the naivety of someone new to the profession, there are some flashes of wisdom that can only come from someone who has seen a lot of life outside the classroom. At one point, he makes me think in a new way about a common trope. Common Core as enacted prioritises, as one of those things we fluffy non-mathematicians imagine real mathematicians do, the ability to solve ‘non-routine’ problems, otherwise known as ‘problems that students haven’t been taught how to solve’.

“What I find inauthentic is the prevailing group-think which holds that judging math ability should be based on how well students in K-12 are able to apply prior knowledge to problems that are substantially different than what they have seen before. In the working world (which the education establishment tries to emulate by insisting that students be given “real-world” problems) most people employed in technical fields are expected to apply their skills to variants of well-studied problems. For those who need to solve problems of a substantially new nature, it takes weeks, months and years.”

Quite right, Barry. This fluffy kind of Common Core is never, I’m afraid, going to bring the U.S. up to the same standard as the Far East with its rigour and procedural fluency. The strange obsession with eccentric definitions of ‘understanding’ and the commitment to problem-solving as some kind of generic, trainable skill will misdirect teachers from what is important. What is more, in those states that had good standards prior to its implementation, Common Core may represent a dumbing-down.

I fear for the future of maths in America. You should read this book and see whether you agree.

Note: The author was kind enough to send me a review copy of his book.


How is reading being taught in the wild?

Around the turn of the century, a U.S. panel reported on the evidence about how best to teach children to read. They were crystal clear: a systematic phonics programme was best. This was seen by many as the definitive end of the ‘reading wars’ that pitted whole-language advocates against promoters of a phonics-based approach. Whole-language was a theory of learning to read that emphasised whole words, ‘real’ books and students ‘constructing’ their own meaning. As such, it aligned with ‘constructivist’ views of teaching that remain fashionable in schools of education.

However, like Fukuyama’s declaration of the ‘end of history,’ hopes for an end to the debate on reading represented a false dawn. Whole-language advocates rebadged their approach as ‘balanced literacy,’ implying that phonics was now a part of it, but only one component. Many people have come to accept their rhetoric that spending five hours per day doing nothing but decoding and perhaps a little maths would be harmful to primary school children; perhaps the greatest straw-man argument in education. The idea that phonics proponents have no interest in comprehension and only care about training children to ‘bark at print’ has held enormous rhetorical power, despite the fact that many phonics experts would subscribe to the ‘simple view of reading’, in which decoding and comprehension go hand-in-hand in understanding a text. What they reject are alternative ways of trying to decode words, such as rote memorising lists of whole words – ‘sight words’ – or guessing what a word might be from a picture or perhaps from the first sound in the word. They would claim that this is a lost opportunity to practise phonetic decoding and represents a danger to students who rely on these methods when later, more complex texts simply cannot be decoded in this way. This is consistent with the science, which finds little support for ‘multi-cuing’ strategies.

I have mentioned anecdotally that I suspect that there is still a great deal of learning of sight words and instruction in multi-cuing going on in schools. Despite the supposed ‘balanced’ approach to literacy, I also think insufficient phonics instruction takes place and, when it does, it’s a little haphazard. These views reflect my own experience and the experiences of those with whom I’ve been in contact over social media in a range of countries, as well as the sorts of arguments that you see playing out amongst teachers and in the media. There are plenty of academics out there involved in training teachers who remain deeply sceptical about phonics. Others tip their hats to phonics but emphasise that it is not sufficient, or mount ad hominem attacks on people with phonics programmes to sell. They often highlight the differences between students and suggest varied instruction to cater for these differences, whereas evidence suggests that successful approaches tend to work across the range of students.

However, I have been challenged by Linda Graham to provide stronger evidence for my position. Short of going out into the wild and surveying hundreds of schools, it is hard to tell exactly what the situation is. Even if we were to conduct such a study, there would likely be an experimenter effect where teachers would present the strategies that they thought we were looking for – a key problem with lesson observation. The inference is easier if you already accept the efficacy of phonics approaches – if phonics were being taught well, surely we would not be seeing a decline in reading scores. However, to a phonics sceptic this is begging the question.

When I asked my contacts about this, my attention was drawn to the 2005 Australian government inquiry into the teaching of reading. The comments made are pertinent to teacher training but I would suggest that we can certainly draw inferences about the likely impact on school instruction. To quote from the executive summary:

“The evidence is clear, whether from research, good practice observed in schools, advice from submissions to the Inquiry, consultations, or from Committee members’ own individual experiences, that direct systematic instruction in phonics during the early years of schooling is an essential foundation for teaching children to read. Findings from the research evidence indicate that all students learn best when teachers adopt an integrated approach to reading that explicitly teaches phonemic awareness, phonics, fluency, vocabulary knowledge and comprehension. This approach, coupled with effective support from the child’s home, is critical to success…

Much curriculum design, content, teaching and teacher preparation seems to be based, at least implicitly, on an educational philosophy of constructivism (an established theory of knowing and learning rather than a theory of teaching). Yet the Inquiry found there is a serious lack of supporting evidence for its effectiveness in teaching children to read. Further, too often emphasis is given to the nature of the child’s environment or background rather than on how a teacher should teach, resulting in insufficient attention being given to both ‘what’ and ‘how’ teachers should teach children to read and write. Whereas the ‘starting’ levels of children from less advantaged backgrounds is lower than those from more advantaged backgrounds, findings from a large body of evidence-based research consistently indicate that quality teaching has significant positive effects on students’ achievement progress regardless of their backgrounds…

The Inquiry found that the preparation of new teachers to teach reading is uneven across universities, and that an evidence-based and integrated approach including instruction in phonemic awareness, phonics, fluency, vocabulary knowledge and text comprehension needs to be adopted. The Inquiry also found that systematic support for classroom teachers to build the appropriate skills to teach reading effectively, is clearly inadequate…”

In the body of the report, we can find data to support these conclusions. For instance, they surveyed the 34 higher education institutions that provide teacher training in Australia. Asked about the proportion of Bachelor of Education course credits that were devoted to the teaching of reading, they found that:

“…this share varies considerably across the 34 teacher education institutions, from a low of less than two per cent to a high of over 14 per cent. All but three institutions devoted less than 10 per cent of total credit points to the teaching of reading, and half of all institutions devoted five per cent or less of total credit points to this activity.”

They review various studies and note that:

“Drawing on quantitative data of teachers’ perceptions of the quality of teacher preparation in Australia, Louden et al. (2005b) conclude that, on the whole, beginning primary teachers are not confident about teaching some specific aspects of literacy, namely viewing, spelling, and grammar as well as phonics. Moreover, barely a third of senior staff in schools thought that beginning teachers were prepared to teach literacy. A further report based on the perceptions of some school principals and experienced teachers also concluded that new teachers are graduating without sufficient specific strategies to improve literacy or numeracy standards (Parliament of Victoria Education and Training Committee, 2005).”

They conducted focus groups and found:

“The literacy competence of student teachers was raised as an issue in all focus group discussions. Participants reported that many students lacked the literacy skills required to be effective teachers of reading and needed help to develop their foundational literacy skills. The literacy of student teachers is assessed in some way in most courses, and some participants indicated that the students who do not have appropriate levels are required to undertake specific remedial course work. This approach seems to be ad hoc, with no national approach to determining entry standards in literacy.”

These statements would seem to provide some indirect evidence for my view that insufficient phonics instruction takes place in schools. If trainee teachers have poor literacy skills themselves, lack knowledge of phonics and don’t spend a great deal of time learning about reading then this is an inevitable result. Whatever you think about phonics, you have to acknowledge that a systematic approach to teaching it requires a command of the detail. It is no surprise that teachers would turn to alternative rote-learning and guessing strategies if they lack this knowledge.

Of course, you could argue that things have significantly changed in the ten years since this report was compiled. However, if you were to do this then I think the onus would be on you to supply the evidence of this change.

If things haven’t changed in the last ten years then I think this also raises serious policy issues (to be honest, rereading this report to look for this evidence was quite a shocking experience for me). Instead of a counsel of complacency, education schools need to rethink their programmes. They need to stop criticising phonics experts for their attempts to make a living and perhaps collaborate with them to run substantial courses for their students. This would be my preferred option because I would want university education to remain a critical part of teacher training.

If this doesn’t happen then policymakers have a couple of other alternatives. They could, for instance, expand programmes where teachers train predominantly in schools. This has already been done in the UK but you would have to question whether the schools are any better placed to teach phonics than the universities. Still, the competition might drive up standards. The teacher registration boards also have a potential role to play. If they start requiring a certain level of phonics knowledge in order to register teachers then the universities will soon have to change their ways. Nobody will want a teaching degree that does not enable you to become registered as a teacher at the end of it.


Irrational faith in markets

I wish to address an interesting question. Can the introduction of markets, quasi-markets and choice into public education make it more effective? Firstly, you have to accept the premise of the question; you need to agree that there are more and there are less effective ways of going about teaching things to children. This seems obvious to me but there are eccentric characters out there who will question it.

And what do I mean by the introduction of such markets? I am talking about efforts to make state schools look more like businesses in the private sector, subject to competition but with increased freedom around how to operate. This includes initiatives such as charter schools in the U.S., academies and free schools in the U.K. and independent public schools in Australia. Other possible arrangements in the future could include transferable education vouchers that parents can spend on a school of their choosing.

I do believe that these arrangements have some benefits. I was never convinced about the consultants who worked for local authorities in the U.K. and who came into schools to further particular agendas. And I also think that schools should be able to recruit their own staff; something that many state-run systems centralise and do on behalf of schools.

These market-style initiatives also allow the opportunity for ‘proof of principle’ schools to arise. Many good ideas simply cannot get off the ground or are quickly stifled in monolithic systems if they don’t accord with the prevailing orthodoxy. It is hard to imagine anything like the Ark chain of academies or Michaela Community School in the old U.K. system and, whatever you think of it, KIPP required the advent of charter schools in the U.S.

However, specific cases do not demonstrate that such policies work to improve the whole system. In fact, I think that too much faith has been placed in these organisational measures. Without winning the argument about methods, we are doomed to repeat the failures of the past under new and shiny logos.

Why is this? Well, it is assumed that if parents have a choice then they will choose more effective schools over less effective ones. Let us set aside, for a moment, the question of whether parents really do have much of a choice under these models. This matter would make a blog post of its own. 

I think parental choice should work well for issues of behaviour management. If a school has poor behaviour then this will be manifest and obvious. Generally, this should drive better behaviour policies. However, you will also get behaviour sink schools as the parents of disruptive students remove them from mainstream schools that have what they perceive to be unreasonable behaviour policies and send them to the more permissive ones which will no doubt spring up to cater for this demand.

However, how is a parent to judge teaching standards? Many teachers don’t know what effective teaching looks like so how can parents tell? An education takes about 12 years and so the results of failure are not immediately obvious. Transparency over results will help but these are not perfect measures and can be spun.

After all, there’s a sucker born every minute. The theory of choice assumes parents have the information and understanding to make the right choices. But we see the failure of markets such as this all the time. There are expensive cosmetic products with sciencey-sounding ingredients that clearly enough people buy to make them worth marketing. I’ve mentioned before that my local pharmacy stocks homeopathic products right next to proper drugs and that they are packaged in a similar way. They cannot work, by any known scientific laws and yet, presumably, people buy them. The educational equivalent, of course, will be schools spouting psychobabble that parents will not have the requisite knowledge to properly analyse.

Our friends on the right are likely to interrupt at this point and say, ‘So what? Parents should have the freedom to choose; the right to use their money as they wish, even if we think they’ve made the wrong choice. It is not up to a nanny state to impose these choices upon them.’

This is a respectable ideological stance. However, it does not actually address the question that I asked at the start of this post. It is quite beside the point. The question that I want answered is whether all of this will make schools in general more effective. Freedom may be desirable but these ideas have been sold as a way to make schools better.

And let’s just hang on a minute before we get all Adam Smith about it. A parent choosing a government-funded school is spending the taxpayer’s money, not their own. Are we really saying that it is ethical to take money from a 25-year-old, single nurse and then give it to a parent to spend on fluffy nonsense?

I do think that these new models of school will be part of our future and I do see advantages to the freedoms that they may bring. Good regulation can mitigate some of my concerns. However, it is quite naive to think that by tinkering about with administrative arrangements you can fix education and avoid the carnage of the pedagogical battlefield. 

There are many wars still to fight so keep your swords sharp.


Dismissing Direct Instruction (DI)

I am not a cheerleader for Direct Instruction (DI) programmes. To be clear here, I am using the convention of capital letters to refer to the curricula developed by Siegfried Engelmann and colleagues. DI units are a set of scripted lessons that teachers are meant to deliver pretty faithfully. Personally, I have always viewed the planning of lessons – at least in collaboration with others – as part of a teacher’s role and so I struggle a little with this idea. However, this may just be enculturation. As proponents point out, we don’t expect pilots to design the plane and we can recognise the talent of an actor even if she is reading lines written by someone else.

So I have a sense of ambivalence. However, I am also aware of the powerful evidence for DI programmes and the almost visceral hatred they arouse in their critics. There are dodgy analyses that attempt to substantiate the extraordinary claim that participation in an early years DI programme causes criminal behaviour in adolescence. And then there are attempts to obscure or flatly ignore the evidence from the largest education experiment in history: Project Follow Through.

Earlier this week, I tweeted a link to an article in The Australian reporting the initial results of a trial of DI curricula in Cape York, Australia. The schools involved serve disadvantaged students. Noel Pearson is a community leader who has introduced DI in an effort to raise educational attainment. This is far from a conventional approach and has attracted much criticism in Australia.

In response to my tweet, I was soon being sent links to such criticism of DI. The first was an article from Chris Sarra and it is a tour de force of persuasive writing. Sarra characterises DI as you might expect: a stifling programme that restricts teacher autonomy. Fair enough; this is relevant to my own ambivalence. However, I also recognise that the key question is whether it works. After all, education systems exist so that kids learn stuff, not to promote the cause of teacher autonomy. So this argument offers no resolution.

In fact, a killer blow is missing throughout. Despite asserting that, “If [Noel] Pearson is serious about having his views seen as worthy in reputable education dialogue, his energies are best spent on highlighting what is good about Engelmann’s Direct Instruction as this will require some effort,” little evidence of harm or ineffectiveness is produced. Moreover, the substantial evidence in favour of DI is not really dealt with or critiqued. Indeed, Sarra concedes, “To be fair, and not wanting to cherry pick the data, Hattie does rate positively the effectiveness of Direct Instruction.” And he even expects positive evidence to emerge from Cape York. “The data will of course show some improvement and this should not surprise us.” With this statement, Sarra allows us no way in which to prove him wrong (an unfalsifiable position should always trigger alarm bells).

Sarra makes the point that DI programmes talk about wolves, for example, and yet students in Cape York will not have encountered any wolves. Is this a problem? I don’t know how the participating schools are structured but I suppose that if local culture and conditions are not a part of the school curriculum then this really would be a problem. Does Engelmann’s DI fill up all of the school day? Even if there are no wolves in Cape York, isn’t it still useful for students to know the meaning of the word ‘wolf’ in order to engage in world news, literature and culture? However, I do take the broader point that a DI programme better tailored to Australia would be preferable. Is Sarra arguing for funding to be made available to create one? No.

In fact, Sarra makes dark allusions to the fact that DI is proprietary and that the materials are expensive (as an example of cost, the teacher’s guide for a writing program that I recently investigated is about $150 US). There are two important points to make here. Firstly, someone has to plan lessons. The alternative to buying in a programme is for teachers to plan lessons themselves and, whichever way you look at it, this incurs a cost. Freed from such planning, teachers could do something else with the time. Or maybe they could just do less in total and spend more time with their families.

Secondly, do we really think it wrong that we have to pay for it? How else could Engelmann develop the programme given that government agencies – the only source who could perhaps create something like this and make it available to schools for free – tend to be full of educationalists who disapprove of DI in a similar way to Sarra? Yes, I have to pay money for hayfever pills and I would rather be given them for free. However, the fact that I pay for them doesn’t mean that they don’t work. Do we think that Engelmann should take a regular job and put his curricula together at weekends as some kind of altruistic hobby?

You sometimes see a similar argument made about Systematic Synthetic Phonics (SSP) programmes. Excluded from universities for their views, proponents of SSP often have to work in the private sector and market their materials to schools. This is then used against them in ad hominem attacks implying that you can’t trust a word they say because they have something to sell. Let us perhaps look at the evidence instead.

However, it seems that Sarra thinks proponents of DI aren’t the kind of rational people who weigh the evidence, “Engelmann DI advocates are not like most quality educators. They are zealots convinced they have the one true faith and the rest of us are heretics.” That doesn’t sound very nice. Who’d want to be a Zealot? I’ll get my coat…

Substantive evidence is also lacking from Allan Luke’s piece, “Direct Instruction is not a solution for Australian schools.” It reads as a cry of anguish from the education establishment.

The article actually starts pretty well, making a useful distinction between DI and other forms of explicit instruction; a distinction that I am always keen to make when arguing for the latter. It states the usual criticisms of the restrictiveness of the approach, which are fair game. However, it is let down in my view by the section, “Does DI improve students’ achievement and participation levels?”. Firstly, it does not mention Project Follow Through, even to dismiss it. It is hard to conceive of a discussion of the evidence base for DI that makes such an omission. Instead, we have an ungenerous, highly qualified and debatable statement about the limited effectiveness of the programme:

“Reading the research, I have little doubt that DI – and other approaches based on explicit instruction – can generate some performance gains in conventionally-measured basic skills of early literacy and numeracy.”

Do you see what he’s done there? There are ‘some’ gains in ‘conventionally-measured’ ‘basic’ skills. What does the qualified ‘some’ signify? How can it be quantified? Why does it matter that these skills are ‘conventionally-measured’? Presumably, there are indications that alternatives to DI are better on unconventional measures or this statement would not be relevant. What are these measures? Where is this evidence? It is certainly not what Follow Through showed. How about the notion of gains being on skills that are merely ‘basic’? They sound a bit low-level and unimportant and yet, without basic skills, it is pretty much impossible to develop any sophisticated ones. Even so, this statement is misleading. In Follow Through, DI demonstrated the greatest gains of any programme on both basic skills and more sophisticated processes such as reading comprehension and mathematical problem solving. Hmmm…

Again, Luke mentions cost and notes that the dastardly Engelmann has copyrighted his materials.

However, Luke is generally more measured than Sarra and he does offer us crazy zealots a ray of hope. His view is potentially falsifiable. He notes – in reference to a previous report on Cape York – that, “In Australia, the recent ACER report on the Cape York implementation of DI does not provide any clear scientific evidence that DI delivers generalisable cohort achievement gains.” So there is a possibility, just a possibility, that as this evidence accumulates, Luke might change his mind.


The innovative school

Anyone who follows discussions about education in traditional media, social media or even academia will see innovation and risk-taking being presented as particularly good things that we should have more of. This idea is part of the zeitgeist and is no doubt influenced by idealised notions of thrusting Silicon Valley startups that spurt innovation all over the place.

However, startups have a relatively high rate of failure. Now, it probably doesn’t matter too much if the world misses out on another dancing-cat app. However, in our line of business, the failure of the harvest is far more significant. These are people’s lives that we are talking about.

Early this year, I was fortunate enough to attend the ICSEI conference in Cincinnati. One of the highlights was hearing David Reynolds discussing the High Reliability Schools project of which he is a part – interestingly, he was sharing a platform with Robert Marzano who is involved in a U.S. project of the same name but where the details seemed quite different.

Reynolds suggested that perhaps a better metaphor for a school is an airline. Airlines obsess about safety. That is their primary concern. Yes, they want to serve you palatable food, get your bags to the right destination and give good customer service, but sometimes they fall a little short on these criteria. Why? Because the whole organisation is geared around safety.

What would an ‘innovative’ airline look like? It would perhaps have a new model for selling seats or it might provide a new form of in-flight entertainment. Perhaps it would open up a new route. However, it certainly wouldn’t involve the pilots deciding on innovative flight-paths or making modifications to customised planes. The core business of airlines is safely getting passengers from A to B and they are extremely conservative about this. New aeroplanes have to undergo extensive testing over many flight hours in order to demonstrate their safety, i.e. their effectiveness as measured against an airline’s main objective.

However, the same does not seem to be true of innovative schools. The closest analogy for not crashing the planes is schools not allowing children to fail to learn to read. We have powerful evidence on how best to teach reading. The use of systematic, synthetic phonics is endorsed by three separate national inquiries into the issue from the US, UK and Australia. None of these support alternatives such as the use of multiple cues (guessing from context, from the picture or from the sound of the first letter). Yet, even in this area, there is a reluctance among teachers to use the most reliable methods.

Response to intervention is a complementary approach that actively seeks out the students who are falling behind in learning to read. In this sense, it can tell us early if one of our planes is likely to crash. It is similar to the kinds of checklists that airlines will follow to ensure safety. Why do more schools not adopt such a model?

Instead, we have innovative schools that tend to be places where strange pedagogical fads flourish; ordinary lessons are abandoned on Fridays in favour of cross-curricular projects, every student is given an iPad in the expectation that this will lead to miracles occurring, or perhaps administrators turn up at lessons with stopwatches to ensure that no teacher talks for more than two minutes.

We have a lot of references to ‘deeper’ learning or learner engagement but the central point that advocates of these approaches need to demonstrate – that these approaches are effective – is just assumed and debating it usually doesn’t get much air-time. Surely, everyone would want deeper learning, wouldn’t they? Who could possibly be against student control of learning? If you’re unconvinced then I have this motivational poster…

I have a view as to why this innovation narrative thrives. The dominant education theories are largely lacking in evidence of effectiveness. Educational theorists indirectly absorb early twentieth-century ideas from thinkers such as John Dewey and William Heard Kilpatrick and then talk themselves into disastrous schemes like whole language learning. When you have no evidence to support the approach that you are pushing then the language of innovation and modernity can be deployed instead. Come on guys, take a risk!

It is worth remembering that whole language was perpetrated on an entire generation of young people from the 1970s to the 1990s and its influence is still significant today in balanced literacy programmes and interventions such as Reading Recovery. This, despite the fact that numerous studies reviewing all of the available evidence – such as those cited above – have found little to no empirical support for it. It therefore stands as an argument against accepting any product of educational theory without strong empirical evidence.

A similar innovation saw Canada, since around the year 2000, move away from traditional forms of maths instruction in favour of more ‘constructivist’ approaches. The learning of times-tables was downplayed and students were encouraged to find multiple ways to solve problems, inventing their own strategies rather than learning the standard methods; precisely the sort of maths promoted in this video. It was all coordinated under the auspices of the Western and Northern Canadian Protocol. The adoption of this approach has coincided with significant declines in Canadian students’ maths performance on international tests relative to their own performance at the start of the century, with the Canadian media laying the blame at the door of the new curricula.

It is hardly surprising. You may not care for the kinds of lab-based tests that have demonstrated the worked-example effect. However, even in ‘ecologically valid’ classroom intervention studies, constructivist approaches underperform explicit teaching.

Never mind. Who wants to worry about crashing the planes when you can be innovative?


Why Michaela Community School must be really rubbish

Now, we all knew that MCS would be rubbish, right? I mean, it’s a free school run by the wrong sorts: you know, folks who aren’t like us. But now we have evidence to prove it’s rubbish:

1. People keep visiting, writing blogs and saying how impressed they are with it. You can’t get more damning than that.

2. The kids walk through the corridors in silence from lesson to lesson, purposefully. I mean, that’s just not right, is it? What are they, the Borg or something? And when do we ever expect adults to do that? (Don’t answer – it was rhetorical and I don’t have the time.) Everyone knows that effective schools have noisy corridors and stuff. Everyone knows.

3. Er…


A proposal to make Star Wars more interesting

If the logic of education applied more broadly:

It is well known that movies are intrinsically boring. This is because they lack the element of participant choice. Once you have decided to watch a particular film, your options are taken away and you simply have to sit and listen as you fulfil your role as the receiving end of a one-way communication. It is a form of ‘transmission’ that does not draw upon the experiences of the movie-goer or recognise the movie-goer’s vocation to become more human. 

This lack of options is compounded in the example of a family or dorm-room movie night where a viewer may be compelled to watch a film chosen by others. In this scenario, movie-goers are coerced into watching something that may have no relevance at all to their own lived experiences. Country-dwellers may be forced to watch a movie about life in the city or in a foreign state. This distantification amplifies ambivalence and a sense of alienation, providing a wellspring for anger and the embrace of extremist perspectives.

Well, here is my modest suggestion. Let us take a typically boring movie such as ‘Star Wars’. Let’s face it, Star Wars suffers greatly from a relevance problem given that no potential viewer can relate to life in a galaxy far, far away. Setting this obvious flaw aside, what could be done to relieve the tedium?

Well, self-determination theory offers us a possibility. In Star Wars there is a scene where the characters have to destroy the Death Star. Instead of allowing the movie to simply tell viewers how this is done – a typically transmissive model – there is an opportunity here to workshop solutions amongst the audience. We could pause the film, give out big sheets of paper and some coloured pens and ask the viewers to get into small groups and suggest their own strategies for Death Star destruction.

A suitable facilitator may then move around the various groups, nodding sagely, asking questions and provoking interactions before the session concludes with a gallery walk and the opportunity for facilitator and participants alike to voice a variety of non-committal and vague thoughts about what has been presented.

Of course, this is likely to take far longer than intended and so the end of the movie will never be shown. Perhaps this is for the best.