It was probably around 2005. I was driving to work in London listening to news on the car radio. They were running a piece on Antisocial Behaviour Orders (ASBOs) and, as it progressed, I became irritated.
ASBOs were effectively restraining orders sought by local authorities and issued by the courts. They forbade those subjected to them from committing certain acts on the basis that these acts were antisocial. ASBOs were issued for things such as vandalism or abusive behaviour.
ASBOs attracted a great deal of criticism, much of it justified, but I was baffled by the argument I was hearing on the radio. Apparently, ASBOs did not ‘work’ because around 60% were breached. This appeared to be accepted uncritically by the presenters.
If the 60% figure was true, I did not think this conclusion was justified. It seemed that in 40% of cases, ASBOs were successful in halting the targeted behaviour. I don’t expect miracle cures for human behaviour and so this seemed pretty effective to me.
I get a similar feeling when someone gestures towards a real or notional detention hall and says, “Detentions don’t work. Look, it’s the same kids in here week after week.”
There are two important points to make. Firstly, in considering kids who are already in detention, we are selecting on the dependent variable. If detentions are intended to have a deterrent effect then we have no way of knowing how many students are not there because they were deterred. To attempt to answer this question, we would need to investigate the effect of detentions on the whole school population, not just those in detention.
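The selection problem can be made concrete with a toy calculation. The numbers below are invented purely for illustration; the point is the shape of the argument, not the figures:

```python
# Toy numbers (invented for illustration) showing why 'look at the
# detention hall' selects on the dependent variable.

would_misbehave = 1000      # students who would misbehave with no sanction
deterrence_rate = 0.6       # suppose detentions deter 60% of them

deterred = int(would_misbehave * deterrence_rate)   # 600 never appear in detention
in_detention = would_misbehave - deterred           # 400 'same kids every week'

# Judging only by who we can see, detentions look like a total failure:
# everyone in the hall is, by definition, someone who was not deterred.
# Judged against the whole population, they stopped 60% of cases.
success_rate = deterred / would_misbehave

print(f"{in_detention} students visible in detention; "
      f"{deterred} invisible successes ({success_rate:.0%})")
```

The 600 deterred students never show up in the hall, so they never show up in the anecdote either.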
The second point is that we should not have the same students in detention every week. Something else needs to kick in at this stage: a new layer of intervention. This could be punitive, such as a day in isolation, but it doesn’t have to be. Sometimes explicit teaching of organisational or anger management strategies might be the best way forward. Sometimes parental involvement may help. For instance, I’ve asked parents to shadow their child for a day, especially if I think the child is presenting a distorted picture of the school to his or her parents. Sometimes, there are managerial options around seating arrangements or where students eat lunch that can stop problems before they arise.
And all of this needs to sit within an overarching philosophy. You may not care for Michaela Community School’s pyramid of behaviour – being ‘top of the pyramid’ is when you behave well because it is ‘who I am’ – but a model like this provides a common reference and common language for a school community.
If you are looking for a one-size-fits-all miracle cure for all human behaviour then no, detentions don’t work. However, as part of a thoughtful and strategic approach to managing whole-school behaviour then they can certainly have a part to play.
This week, Pauline Hanson stood up in the Australian Senate and expressed an ill-informed opinion about the inclusion of kids with autistic spectrum disorders in regular classrooms. This is nothing new. Hanson has form for all sorts of wacky views such as the ones she has expressed on vaccines.
Hanson is the leader of One Nation, a fringe right-wing populist party. She’s like a female version of Donald Trump but without the intellectual grunt. While some commentators have calmly and proportionately rebutted her claims, others have been whipped up into a feeding frenzy. This is unfortunate because an attack on Hanson by the establishment is only likely to increase her profile among those who would consider voting for her. Hanson is trolling us.
The context for Hanson’s remarks was a debate about a new funding deal for Australian schools; a bill that has now passed the parliament. The deal is complicated because Australian schools are supported through a mix of federal and state funding. In addition, the federal government also funds a proportion of the costs of non-government schools. This reduces fees so that, for instance, regular families are able to send their children to Catholic schools. So although it’s a contentious arrangement, no politician is going to throw away votes by removing this support.
The new deal increases funding overall while cutting the amount going to some non-government schools. The system will be simplified to a needs-based model that supersedes the mess of separate deals that currently exist. The Labor opposition have criticised the deal mainly on the basis that they would have spent more money.
This all provides plenty of fodder for policy wonks with big calculators. However, potentially the most significant long-term impact of the new deal has hardly had an airing this week.
Simon Birmingham, the education minister, wants to ensure that any extra funding improves outcomes. He seems frustrated with an Australian context of increasing educational expenditure coupled with the kind of declining performance that can be seen in the results of international assessments such as PISA and TIMSS.
Later this year, a review will commence into the best (and worst) ways to spend money in the education system. I am hopeful that this will be driven by evidence rather than educational theory. If so, this could represent a quiet revolution. We might be able to move on from balanced literacy, constructivism and the project-based learning fad in favour of teaching methods that actually work. Let’s hope so.
I was intrigued when I saw the headline, ‘Australia better at teaching English than the English’ in an article for the West Australian. You might be intrigued too.
You might be wondering how such a judgement could be made. You might think that the researcher in question, Paul Gardner, had his hands on literacy assessment data from both countries. You may further imagine some kind of statistical analysis demonstrating that any differences were down to the teaching and not other factors. You might expect this analysis to be contentious and open to discussion.
What are you, a POSITIVIST!!??
You need to free your mind from always asking for ‘evidence’ and ‘data’ and be more respectful of research. Go read Biesta or someone like that and you’ll see why. It’s not up to me to educate you. Suffice it to say that education is far more complex than healthcare because it involves human beings and so you can’t use positivist paradigms based on deficit thinking.
No, Gardner came to his conclusion by reading the curriculum documentation of the two countries. Having done so, he reckoned that the Australian curriculum documentation was better.
This is all solidly based in theory. Gardner conducted a ‘discourse and content analysis’. He applied ‘Cox’s five models of English’ and ‘Kalantzis et al.’s four paradigms of literacy’. In doing so, Gardner found that the curriculum in England is really narrow and didactic whereas the Australian one includes broader sociolinguistic views of language. Which obviously means it’s better.
And this, of course, is why we shouldn’t have a phonics check in Australia like they do in England.
So now you know.
A draft programme has been released for the researchED Melbourne conference taking place on Saturday 1st July at Brighton Grammar. As if to prove the point that researchED is a grassroots, shoestring initiative, there are still a few typos, including in the blurb for my talk:
If you would like to purchase a ticket then you can do so here:
I will be discussing the history of education. In particular, I will propose that educational ideas that are presented as shiny and new are often nothing of the sort and have a rich lineage. Moreover, many of these ideas have been proven wrong repeatedly by successive generations and are inconsistent with up-to-date understandings of cognition.
If you want to get a feel for my arguments then read this post.
I love that researchED is a conference organised by teachers, for teachers. As such, it feels different from conferences run by academics or commercial interests. I value conferences about education organised by any interested group but I do believe that researchED adds something a little different and special, particularly in the mix of teachers, academics and policy specialists that it manages to attract. If you’re familiar with traditional education conferences then you might hear something at researchED that surprises you.
You may have noticed that I have tweeted a few graphs comparing the PISA results of two different countries.
I created a Google Sheet in order to do this. I have made this available so that you can have a play with this sheet too. You will need a Google login.
First, you will need to download a version from this link:
If you are signed in to your Google account then you will be asked if you want to make a copy. If you are not signed in then you will first be prompted to do so.
Once you have your own copy, you should be able to change the two countries from the United States and Scotland to whichever two countries you like. Just select from the drop-down lists in the yellow boxes. Make sure you select the country with the most complete PISA record in the first yellow box because it is this country that defines the years that will appear:
You will then generate three graphs: one for each country and one combined graph that looks like this:
A few points to note
I grabbed the average scores from Wikipedia because these were in the most user-friendly format. I cross-checked a few of these with PISA’s own reports and found no mistakes, but it is possible that some scores are wrong if the Wikipedia entries were wrong.
I have added England and Scotland at the end of the list of countries. I would eventually like to include all of the regions and provinces that form part of a larger entity but that have their own results e.g. Alberta. However, I have found this data much more difficult to scrape off the internet – it requires trawling through many reports that are often in different formats. If anyone wants to take on this work then that would be great.
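For anyone who would rather work in code than in a spreadsheet, the sheet’s basic logic can be sketched in a few lines of Python. The country names and scores below are placeholders, not real PISA results; substitute the real averages before drawing any conclusions:

```python
# Hypothetical scores for two placeholder countries; replace with real
# PISA averages before comparing anything.
pisa_maths = {
    "Country A": {2009: 500, 2012: 495, 2015: 490},
    "Country B": {2009: 480, 2012: 492, 2015: 496},
}

# As in the sheet, the first country defines which years appear,
# so put the country with the most complete record first.
first_country = "Country A"
years = sorted(pisa_maths[first_country])

# Print one row per year, with a dash where a country has no score.
for year in years:
    cells = [f"{c}: {scores.get(year, '-')}" for c, scores in pisa_maths.items()]
    print(year, *cells, sep="  ")
```

To reproduce the combined graph, feed `years` and each country’s scores to a charting library such as matplotlib, plotting one line per country.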
I am starting to put together something similar for TIMSS.
The concept of a ‘learning profile’ is critical to the model of differentiated instruction developed by Carol Ann Tomlinson and colleagues. It is this model that is often cited by researchers, alongside the diagrams-of-brains approach known as ‘Universal Design for Learning’ or ‘UDL’. For this reason, it is probably worth exploring the concept of learning profile more fully and to ask whether we should differentiate our lessons in response to these profiles.
According to Tomlinson, learning profiles consist of four elements: learning style, intelligence preference, gender and culture. I find Tomlinson difficult to pin down on the first two of these. Sometimes, she seems to conflate learning styles with multiple intelligences, yet at other times she is at pains to point out the distinction between them.
Learning styles theories are conceptually different to multiple intelligences. The former suggest that particular students learn best in particular ways. For instance, the VAK learning styles scheme suggests that some students are visual learners, others are auditory learners and still others are kinaesthetic learners who like to learn through movement. Multiple intelligences theories hypothesise that intelligence is made up of a number of components and that students may possess more of some of these components than others. The most famous multiple intelligences theories are probably those proposed by Howard Gardner and Robert Sternberg.
Tomlinson is aware of the controversy around the idea of taking learning styles into account when planning instruction. In a recent presentation, she goes as far as to summarise the arguments and evidence against the idea. This includes large-scale reviews that show no advantage in teaching students according to their preferred learning style. You might think this would have caused Tomlinson to alter her theory in the light of this new evidence, perhaps by restricting the concept of learning profile to gender and culture alone, even if this might still raise other issues.
But no, learning profiles persist in their original formulation and, instead, we are presented with an argument that I found confusing. While learning style is still part of a learning profile, a learning profile is not a synonym for learning style. We are advised: “Do not assign students to work based on learning style, intelligence preference, gender, or culture,” and, “Do teach in multiple modes.”
This is sound advice. The findings of cognitive science suggest that it is valuable to make use of multiple modes when teaching. For instance, a diagram accompanied by a verbal explanation is likely to be superior to an explanation alone due to the finding that we are able to process visual and auditory information simultaneously.
What I really don’t understand is what this has to do with a model of differentiation. Differentiation implies that different students, to some extent at least, be treated differently. An exhortation to use multiple modes is just as relevant to a teacher using the dreaded one-size-fits-all teaching method as it is to differentiation. And yet Tomlinson still places the concept of learning profile at the heart of her theory about differentiated instruction.
It seems to me that learning profiles are something of a zombie. At one time, reasonable people might have believed that learning styles and multiple intelligences were relevant to classroom practice. However, the verdict is now in and we know that they are not. Rather than killing off this part of the theory, learning profiles lumber on through a series of rationalisations.
So I don’t think we should be differentiating according to learning profiles. My concerns about the kind of differentiation that Tomlinson’s theory represents remain unchanged. There is little empirical evidence to support it and by labelling students and making assumptions about what is best for them, we potentially limit what they may achieve. I would rather start with an open mind than a box to place students in.
There’s a lazy form of news story that crops up in the U.K. from time to time. A headteacher, usually new to the post, will attempt to turn a school around by enforcing some of its rules. This will annoy a few students and parents. A journalist then writes up these complaints, naming the school and publicly shaming the head.
There is a problem of balance here. As a public servant, the headteacher is in no position to rebut or add context to the specific claims that are made. So the claims just hang there. Obviously, a responsible journalist should seek out other parents and students to triangulate the views of a few, but that rarely happens.
In a recent case, a headteacher had announced plans to leave his post at a school and move on to another school. A local journalist saw some negative comments on Facebook and decided that there was a story in this. Apparently, a small group of parents and former students were unhappy with the headteacher enforcing uniform rules.
The piece was originally titled, “Parents and former pupils say ‘good riddance’ to former headteacher…” In a twist to the usual tale, it provoked a massive backlash in the paper’s own comments section and on social media. Since the backlash, the paper in question has closed the comments (huh?), renamed the article and edited it slightly.
Despite being happy to name and shame a public servant in the local rag, the journalist in question seems uncomfortable with the public criticism it has provoked. This is, perhaps, a double standard.
I am not aware of any online criticism that was abusive towards the journalist but if there was any then I strongly condemn it. I have seen a lot of comments that fairly and reasonably criticise the article and voice support for the headteacher concerned.
Bear in mind that the school in question was marked as ‘requires improvement’ by Ofsted when this headteacher took over and has improved since. So there are plenty who are prepared to speak up on his behalf.
I think this episode demonstrates the emerging relationship between old and new media. The tired, easy journalism of the past is now open to public scrutiny in a way not previously seen. You can’t silence critics by simply closing your comments section.