Welcome
Posted: August 4, 2013 | Filed under: Uncategorized

This is the homepage of Greg Ashman, a teacher living and working in Australia. Nothing that I write or that I link to necessarily reflects the view of my school.
I have recently written an article for The Conversation:
I also blog and write articles about education for the Times Educational Supplement – the TES. Here are some links:
Master the mysterious art of explanation
Taking a critical look at praise
Great Scott! Let’s push the brain to its limits
The science fiction that doing is best
Make them sit up and take notice
For great rewards, sweat the small stuff
Stand-out teaching – minus differentiation
I also have some posts on the researchED website workingoutwhatworks.com
Dismissing Direct Instruction (DI)
Posted: July 24, 2015 | Filed under: Uncategorized

I am not a cheerleader for Direct Instruction (DI) programmes. To be clear here, I am using the convention of capital letters to refer to the curricula developed by Siegfried Engelmann and colleagues. DI units are a set of scripted lessons that teachers are meant to deliver pretty faithfully. Personally, I have always viewed the planning of lessons – at least in collaboration with others – as part of a teacher's role and so I struggle a little with this idea. However, this may just be enculturation. As proponents point out, we don't expect pilots to design the plane and we can recognise the talent of an actor even if she is reading lines written by someone else.
So I have a sense of ambivalence. However, I am also aware of the powerful evidence for DI programmes and the almost visceral hatred they arouse in their critics. There are dodgy analyses that attempt to substantiate the extraordinary claim that participation in an early years DI programme causes criminal behaviour in adolescence. And then there are attempts to obscure or flatly ignore the evidence from the largest education experiment in history: Project Follow Through.
Earlier this week, I tweeted a link to an article in The Australian reporting the initial results of a trial of DI curricula in Cape York, Australia. The schools involved serve disadvantaged students. Noel Pearson is a community leader who has introduced DI in an effort to raise educational attainment. This is far from a conventional approach and has attracted much criticism in Australia.
In response to my tweet, I was soon being sent links to such criticism of DI. The first was an article from Chris Sarra and it is a tour de force of persuasive writing. Sarra characterises DI as you might expect: a stifling programme that restricts teacher autonomy. Fair enough; this is relevant to my own ambivalence. However, I also recognise that the key question is whether it works. After all, education systems exist so that kids learn stuff, not to promote the cause of teacher autonomy. So this argument offers no resolution.
In fact, a killer blow is missing throughout. Despite asserting that, “If [Noel] Pearson is serious about having his views seen as worthy in reputable education dialogue, his energies are best spent on highlighting what is good about Engelmann’s Direct Instruction as this will require some effort,” little evidence of harm or ineffectiveness is produced. Moreover, the substantial evidence in favour of DI is not really dealt with or critiqued. Indeed, Sarra concedes, “To be fair, and not wanting to cherry pick the data, Hattie does rate positively the effectiveness of Direct Instruction.” And he even expects positive evidence to emerge from Cape York. “The data will of course show some improvement and this should not surprise us.” With this statement, Sarra allows us no way in which to prove him wrong (an unfalsifiable position should always trigger alarm bells).
Sarra makes the point that DI programmes talk about wolves, for example, and yet students in Cape York will not have encountered any wolves. Is this a problem? I don't know how the participating schools are structured but I suppose that if local culture and conditions are not a part of the school curriculum then this really would be a problem. Does Engelmann's DI fill up all of the school day? Even if there are no wolves in Cape York, isn't it still useful for students to know the meaning of the word 'wolf' in order to engage in world news, literature and culture? However, I do take the broader point that a DI programme better tailored to Australia would be preferable. Is Sarra arguing for funding to be made available to create one? No.
In fact, Sarra makes dark allusions to the fact that DI is proprietary and that the materials are expensive (as an example of cost, the teacher's guide for a writing program that I recently investigated is about $150 US). There are two important points to make here. Firstly, someone has to plan lessons. The alternative to buying in a programme is for teachers to plan lessons themselves and, whichever way you look at it, this incurs a cost. Freed from such planning, teachers could do something else with the time. Or maybe they could just do less in total and spend more time with their families.
Secondly, do we really think it wrong that we have to pay for it? How else could Engelmann develop the programme given that government agencies – the only bodies that could perhaps create something like this and make it available to schools for free – tend to be full of educationalists who disapprove of DI in much the same way as Sarra? Yes, I have to pay money for hayfever pills and I would rather be given them for free. However, the fact that I pay for them doesn't mean that they don't work. Do we think that Engelmann should take a regular job and put his curricula together at weekends as some kind of altruistic hobby?
You sometimes see a similar argument made about Systematic Synthetic Phonics (SSP) programmes. Excluded from universities for their views, proponents of SSP often have to work in the private sector and market their materials to schools. This is then used against them in ad hominem attacks implying that you can’t trust a word they say because they have something to sell. Let us perhaps look at the evidence instead.
However, it seems that Sarra thinks proponents of DI aren’t the kind of rational people who weigh the evidence, “Engelmann DI advocates are not like most quality educators. They are zealots convinced they have the one true faith and the rest of us are heretics.” That doesn’t sound very nice. Who’d want to be a Zealot? I’ll get my coat…
Substantive evidence is also lacking from Allan Luke's piece, "Direct Instruction is not a solution for Australian schools." It reads as a cry of anguish from the education establishment.
The article actually starts pretty well, making a useful distinction between DI and other forms of explicit instruction – a distinction that I am always keen to make when arguing for the latter. It states the usual criticisms of the restrictiveness of the approach, which are fair game. However, it is let down in my view by the section, "Does DI improve students' achievement and participation levels?". Firstly, it does not mention Project Follow Through, even to dismiss it. It is hard to conceive of a discussion of the evidence base for DI that makes such an omission. Instead, we have an ungenerous, highly qualified and debatable statement about the limited effectiveness of the programme:
“Reading the research, I have little doubt that DI – and other approaches based on explicit instruction – can generate some performance gains in conventionally-measured basic skills of early literacy and numeracy.”
Do you see what he’s done there? There are ‘some’ gains in ‘conventionally-measured’ ‘basic’ skills. What does the qualified ‘some’ signify? How can it be quantified? Why does it matter that these skills are ‘conventionally-measured’? Presumably, there are indications that alternatives to DI are better on unconventional measures or this statement would not be relevant. What are these measures? Where is this evidence? It is certainly not what Follow Through showed. How about the notion of gains being on skills that are merely ‘basic’? They sound a bit low-level and unimportant and yet, without basic skills, it is pretty much impossible to develop any sophisticated ones. Even so, this statement is misleading. In Follow Through, DI demonstrated the greatest gains of any programme on both basic skills and more sophisticated processes such as reading comprehension and mathematical problem solving. Hmmm…
Again, Luke mentions cost and notes that the dastardly Engelmann has copyrighted his materials.
However, Luke is generally more measured than Sarra and he does offer us crazy zealots a ray of hope. His view is potentially falsifiable. He notes – in reference to a previous report on Cape York – that, “In Australia, the recent ACER report on the Cape York implementation of DI does not provide any clear scientific evidence that DI delivers generalisable cohort achievement gains.” So there is a possibility, just a possibility, that as this evidence accumulates, Luke might change his mind.
Social justice and knowledge
Posted: July 20, 2015 | Filed under: Uncategorized

In my previous Labour Teachers post, I argued that taking a political stance did not necessarily imply views on how to teach. However, if our priority is social justice then there are a number of arguments of which left-leaning teachers should be aware.
Continues here at Labour Teachers
The innovative school
Posted: July 19, 2015 | Filed under: Uncategorized

Anyone who follows discussions about education in traditional media, social media or even academia will see innovation and risk-taking being presented as particularly good things that we should have more of. This idea is part of the zeitgeist and is no doubt influenced by idealised notions of thrusting Silicon Valley startups that spurt innovation all over the place.
However, startups have a relatively high rate of failure. Now, it probably doesn’t matter too much if the world misses out on another dancing-cat app. However, in our line of business, the failure of the harvest is far more significant. These are people’s lives that we are talking about.
Earlier this year, I was fortunate enough to attend the ICSEI conference in Cincinnati. One of the highlights was hearing David Reynolds discuss the High Reliability Schools project of which he is a part – interestingly, he was sharing a platform with Robert Marzano, who is involved in a U.S. project of the same name, although the details seemed quite different.
Reynolds suggested that perhaps a better metaphor for a school is an airline. Airlines obsess about safety. That is their primary concern. Yes, they want to serve you palatable food, get your bags to the right destination and give good customer service, but sometimes they fall a little short on these criteria. Why? Because the whole organisation is geared around safety.
What would an 'innovative' airline look like? It would perhaps have a new model for selling seats or it might provide a new form of in-flight entertainment. Perhaps it would open up a new route. However, it certainly would not involve the pilots deciding on innovative flight paths or making modifications to customised planes. The core business of airlines is safely getting passengers from A to B and they are extremely conservative about this. New aeroplanes have to undergo extensive testing over many flight hours in order to demonstrate their safety, i.e. their effectiveness as measured against an airline's main objective.
However, the same does not seem to be true of innovative schools. The closest analogy for not crashing the planes is schools not allowing children to fail to learn to read. We have powerful evidence on how best to teach reading. The use of systematic, synthetic phonics is endorsed by three separate national inquiries into the issue from the US, UK and Australia. None of these support alternatives such as the use of multiple cues (guessing from context, from the picture or from the sound of the first letter). Yet, even in this area, there is a reluctance among teachers to use the most reliable methods.
Response to intervention is a complementary approach that actively seeks out the students who are falling behind in learning to read. In this sense, it can tell us early if one of our planes is likely to crash. It is similar to the kinds of checklists that airlines will follow to ensure safety. Why do more schools not adopt such a model?
Instead, we have innovative schools that tend to be places where strange pedagogical fads flourish; ordinary lessons are abandoned on Fridays in favour of cross-curricular projects, every student is given an iPad in the expectation that this will lead to miracles occurring, or perhaps administrators turn up at lessons with stopwatches to ensure that no teacher talks for more than two minutes.
We hear a lot of references to 'deeper' learning or learner engagement, but the central point that advocates of these approaches need to demonstrate – that the approaches are effective – is simply assumed, and debating it rarely gets much air-time. Surely, everyone would want deeper learning, wouldn't they? Who could possibly be against student control of learning? If you're unconvinced then I have this motivational poster…
I have a view as to why this innovation narrative thrives. The dominant education theories are largely lacking in evidence of effectiveness. Educational theorists indirectly absorb early twentieth century ideas from thinkers such as John Dewey and William Heard Kilpatrick and then talk themselves into disastrous schemes like whole language learning. When you have no evidence to support the approach that you are pushing then the language of innovation and modernity can be deployed instead. Come on guys, take a risk!
It is worth remembering that whole language was perpetrated on an entire generation of young people from the 1970s to the 1990s and its influence is still significant today in balanced literacy programmes and interventions such as Reading Recovery. This is despite the fact that numerous reviews of the available evidence – such as those cited above – have found little to no empirical support for it. It therefore stands as an argument against accepting any product of educational theory without strong empirical evidence.
A similar innovation saw Canada, from around the year 2000, move away from traditional forms of maths instruction in favour of more 'constructivist' approaches. The learning of times tables was downplayed and students were encouraged to find multiple ways to solve problems, inventing their own strategies rather than learning the standard methods – precisely the sort of maths promoted in this video. It was all coordinated under the auspices of the Western and Northern Canadian Protocol. The adoption of this approach has coincided with significant declines in Canadian students' maths performance on international tests relative to their own performance at the start of the century, with the Canadian media laying the blame at the door of the new curricula.
It is hardly surprising. You may not care for the kinds of lab-based tests that have demonstrated the worked-example effect. However, even in ‘ecologically valid’ classroom intervention studies, constructivist approaches underperform explicit teaching.
Never mind. Who wants to worry about crashing the planes when you can be innovative?
Motivation in Science and Mathematics
Posted: July 19, 2015 | Filed under: Uncategorized

I created a graphic so that I can quickly address a question that arises often.
Why Michaela Community School must be really rubbish
Posted: July 14, 2015 | Filed under: Uncategorized

Now, we all knew that MCS would be rubbish, right? I mean, it's a free school run by the wrong sorts: you know, folks who aren't like us. But now we have evidence to prove it's rubbish:
1. People keep visiting, writing blogs and saying how impressed they are with it. You can’t get more damning than that.
2. The kids walk through the corridors in silence from lesson to lesson, purposefully. I mean, that's just not right, is it? What are they, like the Borg or something? And when do we ever expect adults to do that? (Don't answer – it was rhetorical and I don't have the time.) Everyone knows that effective schools have noisy corridors and stuff. Everyone knows.
3. Er…
A proposal to make Star Wars more interesting
Posted: July 13, 2015 | Filed under: Uncategorized

If the logic of education applied more broadly:
It is well known that movies are intrinsically boring. This is because they lack the element of participant choice. Once you have decided to watch a particular film, your options are taken away and you simply have to sit and listen as you fulfil your role as the receiving end of a one-way communication. It is a form of ‘transmission’ that does not draw upon the experiences of the movie-goer or recognise the movie-goer’s vocation to become more human.
This lack of options is compounded in the example of a family or dorm-room movie night where a viewer may be compelled to watch a film chosen by others. In this scenario, movie-goers are coerced into watching something that may have no relevance at all to their own lived experiences. Country-dwellers may be forced to watch a movie about life in the city or in a foreign state. This distantification amplifies ambivalence and a sense of alienation, providing a wellspring for anger and the embrace of extremist perspectives.
Well, here is my modest suggestion. Let us take a typically boring movie such as 'Star Wars'. Let's face it, Star Wars suffers greatly from a relevance problem given that no potential viewer can relate to life in a galaxy far, far away. Setting this obvious flaw aside, what could be done to relieve the tedium?
Well, self-determination theory offers us a possibility. In Star Wars there is a scene where the characters have to destroy the Death Star. Instead of allowing the movie to simply tell viewers how this is done – a typically transmissive model – there is an opportunity here to workshop solutions amongst the audience. We could pause the film, give out big sheets of paper and some coloured pens and ask the viewers to get into small groups and suggest their own strategies for Death Star destruction.
A suitable facilitator may then move around the various groups, nodding sagely, asking questions and provoking interactions before the session concludes with a gallery walk and the opportunity for facilitator and participants alike to voice a variety of non-committal and vague thoughts about what has been presented.
Of course, this is likely to take far longer than intended and so the end of the movie will never be shown. Perhaps this is for the best.
A response to Dan Meyer
Posted: July 12, 2015 | Filed under: Uncategorized

Following my previous post, I have been engaged in a discussion on Twitter with Dan Meyer. I like Dan. He comes across as a nice guy and has the rare ability to stick to the issues without making things personal. There are a few people out there who could learn from that.
If you are unfamiliar with Dan then you should check out his blog and his TED talk. He is a big deal.
Dan has taken issue with me about a few points on Twitter and so I thought that I would take this opportunity to expand on my thoughts a little.
Lecturing
I do not generally describe explicit instruction as 'lecturing', so why not? Well, lecturing implies a lack of interaction. It is clear to me that effective instruction involves ensuring students' attention. In my own teaching, I attempt to achieve this by peppering any 'lecturing' with questions. I decide which student will answer which question (I rarely ask for volunteers) and so all of my students know that they could be called upon at any time. I find that this concentrates the mind. If I am presenting something new then I will ask students to tell me how to do the non-new bits, e.g. the linear algebra that drops out at the end of a transformations problem.
This looks very different from a classical university lecture where the size of the audience militates against this kind of interaction. The problem with describing explicit instruction as 'lecturing' is that people then quote studies to me showing that university-level lecturing with interactive clickers or with short review breaks is more effective than straight lecturing, thus demonstrating that explicit instruction is flawed (see papers here, here, here and here that have been forwarded to me periodically by Doug Holton – note that a common problem in interpreting these studies is trying to figure out what the "interactive" condition actually means). Clearly, explicit instruction can be a highly interactive approach and so this evidence tells us little of relevance.
"Lecturing" also does not seem to capture the phases of a lesson where students are practising independently, whereas the way that explicit instruction is defined does cover this. Explicit instruction has a long track record and is often also referred to as 'direct instruction'. However, the latter term tends to be confused with the highly scripted programmes developed by Siegfried Engelmann and others, which are basically one particular type of explicit instruction. This is why I avoid the term.
Barak Rosenshine is something of an expert in explicit instruction. He was involved in analysing the process-product research of the 1950s to 1970s that aimed to uncover the differences between more and less effective teaching. His concept of explicit instruction / direct instruction was drawn upon by Engelmann in developing his own approach. Rosenshine provides an excellent description of explicit instruction in this AFT article, although he doesn't name it as such, preferring simply to discuss 'principles of instruction'. He goes into more detail here, this time referring to 'direct instruction' (paywalled).
Why do teachers seek alternatives to explicit instruction?
If explicit instruction is as effective as I claim then why do teachers seek out alternatives?
I spent 13 years feeling guilty about the way that I taught, feeling that I should be making greater use of whistles and bells. This is because the dominant view in education asserts as much. Of course, in reality, many teachers will use forms of explicit instruction because the alternatives are often unworkable. I rapidly figured out that if I taught in certain ways then I'd have to teach the stuff again later. But I thought this was a flaw in me.
Universities and teacher training materials instruct teachers that 'constructivist' teaching practices are more effective, even though the evidence does not really support this. Consultants, school leaders and Dan Meyer himself are all resources that teachers could be expected to consult if they want to improve their practice, and they will get a broadly similar message from each. Where could a teacher even find out about explicit instruction and its effectiveness? Well, I am trying to do my bit in a small way but it hardly compares.
The sadness is that this means that teachers often miss out on training in how to make their default explicit instruction much more effective. One light in the darkness is the work of Dylan Wiliam around formative assessment (which was basically my route into a different way of thinking).
Indeed, teaching seems to suffer from the "How Obvious" problem that Greg Yates points to in his classic paper. When presented with the findings of research, teachers tend to declare them obvious. However, when student teachers are asked to identify the attributes of effective teaching that come from this research, they generally cannot. "Not a single student cited the effective teacher's ability to articulate clearly, or to get students to maintain time engagement."
BORING!
I have briefly mentioned that the alternative to explicit instruction may be described as 'constructivist' teaching. I don't want to become bogged down in this – I am aware that constructivism is actually a theory of learning and not of teaching, and I have no problem with it in this regard; we link new knowledge to old, etc. If it is true then, no matter how we teach, our students will learn constructively. However, some educationalists clearly do see implications for how we should teach.
Over the years, similar approaches have been described in many ways; discovery learning, guided discovery, problem-based learning, project-based learning (see William Heard Kilpatrick for an early description), inquiry learning and so on. Many of these date from a time before the constructivist theory of learning and so reflect a broader current in educational thought, epitomised by the progressive education movement of the early 20th century. Constructivism should really be seen as part of this tradition.
As every age invents a new name for it, so a new enthusiasm develops and, without much in the way of supporting evidence, armies of evangelists go out into the world and proclaim the new 'effective practices'. A recent example can be seen in Canada. Around the year 2000, constructivist maths was pushed heavily in schools via public policy (see the WNCP) and consultants. Since this time, Canadian results in international maths tests such as PISA and TIMSS have generally declined (and I don't mean relative to other countries, I mean in absolute terms). It is only a correlation, but if this new approach were so effective then should we not have seen the reverse trend?
I am generally cautious about comparing different countries on these measures but I do think it significant when a single participant such as Canada or Finland declines relative to itself.
On the basis of quite thin evidence (see Kamii and Dominick – and a nice replication from Stephen Norton which finds the precise opposite result – or look at this poorly controlled study), teachers have been urged to abandon standard algorithms or to avoid drilling students in multiplication tables and number bonds. Cognitive load theory – of which I am a student – would predict this to be a disastrous move (see my slides from prior to starting my PhD).
So, if there is a lack of evidence then what justification is there for people to continue to support constructivist approaches? One example can be seen in the various reactions to Project Follow Through and goes something like this, “OK so Direct Instruction may be good for rote memorisation of basic facts but our kind of instruction does more intangible things over a greater period of time that cannot be measured”. Or, “Direct Instruction causes criminality”. Neither of these is true (see here and here).
Another approach is to say that people who favour explicit instruction neglect motivation; that motivation is key to learning and explicit instruction is just, like, really boring, man!
Firstly, if motivation really is so important to learning and if explicit instruction is so demotivating then surely this would render explicit instruction ineffective. The evidence suggests otherwise.
For my second point, I need to be careful. The constructivists have set a rhetorical trap here that I might fall into. I will freely admit that entertainment is not my top priority when planning lessons. My top priority is that students should learn. However, this does not mean that I want my students to be bored. If they can learn something and I can make it interesting then I would always want to do both.
It is not clear to me why explicit instruction is intrinsically more boring than any other method. Yes, it can be boring, but so can problem-based learning or anything else. I've observed students completing one of Dan's activities who were not particularly turned on by it. However, I would not conclude that it was therefore intrinsically boring. Perhaps it was pitched at the wrong level or the teacher hadn't introduced it correctly. Motivation is a complicated thing.
I have written before in the context of science and asked which activity is more boring: completing an investigation to test the strength of different wet paper towels or a whole-class discussion of whether aliens exist? Advocates of alternative methods often make life harder for themselves by also insisting that all learning be nailed to mundane aspects of everyday life. For instance, I have noted that in David Perkins' recent book he suggests, "students plan for their town's future water needs or model its traffic flow." Yawn! Boring! (See, I can do it too.)
I have a little proof that I like to use to show that 0.999… = 1. Every year, it provokes heated debate. What do I do? I show the students the proof, on the board, at the front of the room and then we discuss it. I suppose that I could ask the students to get into groups and try to come up with their own proof. I suspect that most would not and would also find the activity a bit boring.
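For readers who haven't seen it, one common version of the argument – a sketch of the standard algebraic demonstration, and not necessarily the exact proof I put on the board – multiplies by ten and subtracts:

```latex
\begin{align*}
\text{Let } x &= 0.999\ldots \\
\text{Then } 10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots \\
9x &= 9 \\
x &= 1, \quad \text{so } 0.999\ldots = 1
\end{align*}
```

The subtraction in the third line is where the apparent paradox dissolves: the infinite tails cancel exactly, and that is usually what sparks the discussion.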
The point is that nobody owns motivation. If your explicit instruction is boring then why not make it more interesting? Why does this problem have to imply a change of method? If that’s all constructivism’s got then it’s time for a rethink.
