Concept maps are rubbish

For many years, I have given the following advice to students who are preparing for an exam.

“There are two ways to revise; answering questions or turning information from one form into another. Both mean that you have to process information in some way. On the other hand, reading your notes or just copying them out can be done without really thinking and so these aren’t very effective.”

I don’t know what I based this advice upon, other than my own intuition. I was quite late to education research, as I suspect many teachers have been. So I was particularly interested when, in 2013, Dunlosky et al. published their findings about the most effective study strategies. Testing was in there – and that was the main message that seems to have come out from the various articles and blogs that have been written on the back of this paper (as an aside, it is also worth mentioning the recent debate as to whether testing works equally well for complex items as it does for simpler ones).

Rereading and similar strategies, such as highlighting, were found to be ineffective. However, the report is more ambiguous when it comes to my advice about “turning information from one form into another.” By this, I meant making revision notes, flashcards or posters, and Dunlosky et al. found that ‘summarising’ had low utility because many students don’t do it very well.

One information processing strategy that is particularly popular amongst educational researchers and some teachers, and that is not directly addressed in the Dunlosky et al. study, is the notion of constructing ‘concept maps’.

Concept Map for Energy produced by NASA – Public Domain

Concept Map for Electricity produced by NASA – Public Domain

I speculate that concept maps are popular because they apparently mirror the ‘schema’ that psychological theories propose for the way that information is organised by the mind. You might expect that constructing a concept map of an area of study would require you to rehearse the different relationships between the concepts and so remember and retrieve that information better.

Which all sets the scene for a fascinating piece of research conducted in 2011 by Jeffrey “Testing Effect” Karpicke and Janell Blunt. They compared testing-based study strategies with concept map construction. In all of their experiments, testing was superior. This was true even when the final test – the “post-test” which we analyse to determine the effect of the experiment – required students to construct a concept map. So concept mapping is not even the best preparation for concept mapping.

This is quite a clever experimental variation because there is a general principle in education research that students tend to learn what you teach them. So, with a cunning experimental design, you can bias the results. If you read much research you start to see it everywhere. For example, imagine you were comparing an inquiry learning approach to gravity, where students perform experiments with balls on ramps, make hypotheses and so on, with an explicit approach that teaches them about gravity. If your post-test focuses on hypothesising, results tables and so on then you’ll get a different outcome than if your post-test contains physics questions.

So the fact that Karpicke and Blunt found that testing beat concept maps under conditions most suited to concept maps is significant. However, one objection was raised: the students in the study had little experience in concept mapping and so perhaps they were not doing it very effectively.

Fortunately, we now have a 2015 replication of the Karpicke and Blunt study which also deals with the experience issue. Lechuga et al. conducted their study in Spain and essentially found the same as Karpicke and Blunt. Crucially, they also assessed students who had prior experience with concept mapping. Retrieval was still more effective than concept mapping for this group, although by a smaller margin.

This illustrates how educational research should work. We have a replication and we have variations to the experimental design to take account of different explanations and hypotheses.

On this basis it is clear that investing time and resources in concept mapping is the wrong choice. It may be better to advise your students to test themselves and answer trial questions instead.


Fuzzy maths jumps the shark

By September 1977, Happy Days was an enormously popular TV show. Initially a supporting character, Fonzie had begun to dominate and, in an infamous plot line, he is seen performing a water-skiing jump over a confined shark. It is debatable as to whether this really was the point at which the show began its long decline; it continued in production for another seven years. But it did demonstrate a high point of absurdity for a program that was originally about a romanticised version of 1950s American family life. And so the phrase “jump the shark” has entered the lexicon to represent such a tipping point.

I wonder whether we are now reaching the high water mark of fuzzy maths: the movement launched in 1989 by the National Council of Teachers of Mathematics (NCTM) in the U.S. that eschews what it sees as ‘rote’ memorisation of maths facts and procedures in favour of prioritising understanding. Fuzzy maths seems to have taken over much of North American maths education. Despite efforts to make the new Common Core State Standards pedagogically neutral, there is evidence that they are being used to pursue a fuzzy maths agenda. In Canada, large-scale implementation of fuzzy maths is associated with a parallel decline in test scores.

Given the current popularity of fuzzy maths, let me nominate a candidate for the jump-the-shark moment.

The Telegraph is reporting on an image of a Year 3 maths test that was posted on Reddit. It shows a question:

“Use the repeated addition strategy to solve: 5 x 3”

The answer given by the student is correct, 15, but it is marked as incorrect due to the way the child has worked it out. He or she has written “5 + 5 + 5” and the teacher indicates that it should instead be “3 + 3 + 3 + 3 + 3”.

This is obviously barking mad. But it demonstrates the kind of hole that fuzzy maths sucks us into. Clearly 5 + 5 + 5 = 3 + 3 + 3 + 3 + 3 = 15. It is all the same. This demonstrates a key property of multiplication: that it is commutative, i.e. 5 x 3 = 3 x 5.

However, it seems as if the teacher does not want the student to know this yet and that the student is meant to strictly interpret “5 x 3” to mean “five lots of three.” This is a reasonable interpretation. However, the commutativity of multiplication has so suffused our culture that allowing only this interpretation is quite unreasonable. If I gave you a shopping list that had “tin of beans x 3” on it, you would not interpret this to mean “tin of beans lots of 3”; you would interpret it to mean “3 tins of beans”.

Similarly, it is quite legitimate to interpret “5 x 3” to mean “five, three times”.

And so, in the name of ‘understanding’, we head into being both confusing and wrong. Never mind the fact that we really don’t want students to have to work out 5 x 3 using a repeated addition strategy. It is essential that such basic maths facts are memorised so that precious working memory resources may be devoted to higher level aspects of problem solving. The correct answer should be sufficient in this case. And what message is this poor student getting about maths?

The next question on the paper is equally bizarre. Asked to draw an array, the student draws it the wrong way around. Yes, the array might have 24 elements and the answer might be 24 but, for some strange reason, the teacher wants 4 rows of 6 columns and not 6 rows of 4 columns.

You might just put this down to one teacher being idiosyncratic. You may suggest that fuzzy maths does not really require this sort of thing and this particular teacher is operating under a misconception. You may think that nobody would defend this, even those who are committed to fuzzy maths.

Not so.

The NCTM defended the marking of the paper. Diane Briars of the NCTM commented, “We want students to understand what they’re doing, not just get the right answer.”

Funnily enough, this would seem to achieve the precise opposite.

UPDATE: It has been brought to my attention in the comments below that the defence that is attributed by The Telegraph to Diane Briars is identical to statements that she is reported to have made in this news report from May 2014. So perhaps she did not defend it after all. Can anyone shed light on this?


The contradictory world of AITSL

“We have a whole rhetoric about discovery learning, constructivism, about learning styles that has got zero evidence for them anywhere.”

Professor John Hattie, now Chair of AITSL, Keynote address to AITSL, November 2011

I feel a certain amount of discomfort in addressing this issue. From time to time, The Australian Institute for Teaching and School Leadership (AITSL) releases a new set of illustrations to demonstrate how teachers can meet the Australian Professional Standards for Teachers; a set of standards that it curates.

These illustrations generally consist of a video where a dedicated and probably quite proud teacher talks about and demonstrates his or her practice. It is not my wish to hold these committed professionals up to criticism and so all I intend to do is link once to the whole body of science and mathematics illustrations. If you wish to track through them then you will find the specific items that I will refer to.

The purpose here is to point out that AITSL is still releasing new illustrations that are totally at odds with the expressed views of its Chair, John Hattie; illustrations that are also at odds with the evidence more broadly on effective teaching methods.

For instance, one exemplar school is described in the following way in the illustrations:

“Through this approach [The School] seeks to develop student skills in collaboration, team-work, compromise, creativity and problem-solving, and to accentuate more activity-based discovery learning, project-based learning, and genuine team-based inquiry which are all important in developing STEM knowledge and skills.” [my emphasis]

The video demonstrates how a teacher uses a range of strategies in an attempt to teach difficult concepts which seem more related to computer science than the maths teaching that the video is supposed to be about. Perhaps this is the ‘computational thinking’ that Labor leader Bill Shorten recently took to Twitter to promote. The teacher also sticks notes under the students’ chairs as a way of allocating them to different groups. It is unfortunate that graduate teachers will see this and think it is what experienced teachers do – we’ve all performed these kinds of tricks when being observed but they are time-consuming to prepare and there are far more efficient ways of arranging students in groups.

In another new video, a physics teacher explains that, “I don’t really want to give them more explicit instruction than that; I would rather hold back a bit of information and then have to come in later and give some more help, than to lay it all out in front of them.” Which is quite in keeping with a discovery learning, constructivist approach. Is this good practice? Is this the standard that all Australian teachers should meet? If so, why?

In the “Learning Context” section of another new maths video we are reliably informed that;

“These innovations are based on or derived from the understanding that students learn best through discovery, research & development processes and realistic applications.” [my emphasis]

And in a new science video, the teacher tells us that the students, “need to explore a little bit so that they can build some of that knowledge themselves before we go into a more detailed explanation.” A statement entirely consistent with constructivist approaches to teaching.

In general, the science and mathematics sections seem to focus on what you might call ‘higher order thinking skills’ or things that aren’t really skills at all like ‘collaboration’ or ‘creativity’. In fact, simply doing lots of group work in order to encourage ‘collaboration’ is unlikely to be effective due to the problem of social loafing. Yet these illustrations promote group work and don’t explain how this problem can be dealt with (see here for some suggestions if you’re really that committed to group work). Instead, a new teacher is likely to conclude that group work = good.

The science illustrations generally focus on scientific inquiry with lots of hypothesising and experimenting and not much actual science teaching taking place. This confuses epistemology with pedagogy.

And we must bear in mind that all of the illustrations that I have mentioned here have been released since John Hattie took over as Chair.



Public Money

Teachers who work in government schools have their salaries paid from public money. In Australia, even independent schools get a significant amount of funding from the government. And yet I think we sometimes lose sight of who we are working for.

For instance, it is perfectly valid for any taxpayer to comment on the education system. Yet dismissing the views of non-teachers is common.

And when teachers argue against standardised testing, they need to answer a simple question: how else do you propose to be accountable to the people who are picking up the bill?

Because, morally, we are accountable.

Standardised testing is not perfect. There are aspects of Australia’s NAPLAN tests that I dislike and the VCE in Victoria has its idiosyncrasies, having been steered by those with particular agendas. But I would far rather have external measures of this kind than no external measures at all.

Evidence shows that teachers’ own assessments are biased and can act to the disadvantage of the most vulnerable groups in the school population. Standardisation acts as a check on this.

Sadly, the idea that we should do away with external checks and simply be trusted as a profession to get on with things is untenable. This is a profession that has embraced learning styles and does things because of Marxist critical theory. 

Yet there are hopeful signs. In recent years we have seen the emergence of teacher-led movements such as researchED that challenge woolly notions and ask for evidence. Social media has created a forum for an influential subgroup of teachers to engage with research and argument in a way that previously did not exist.

Over time, if this reflectiveness develops then I think we will grow up as a profession and, amongst other things, we will see those standardised test results start to shift.

Once we are there, the bloke down the pub will no longer feel compelled to tell us how to do our jobs.


A tale of two swimming classes

I had a horrible experience of learning to swim.

My memory is perhaps unreliable but I was clearly slow to pick up the skills. My grandfather was paying for the swimming classes; something that he tended to point out. I remember my instructor making a big deal about us jumping in to the pool during one lesson. I was full of excitement to share this new achievement with my parents. Until I realised that they were distinctly unmoved.

To compound matters, my mum’s best friend had a son my age and we’d meet up a lot to play together while our parents socialised. The best friend’s son was an exceptional swimmer and it seemed to me that discussing this fact was one of the most popular topics of conversation.

When it came time for my two little girls to learn to swim we had a decision to make. There was the class on the industrial estate that was run by a former Olympic swimmer and there was the class at the health centre. My sister-in-law sent her kids to the one on the industrial estate and gave the impression it was a bit of a sink-or-swim place. One of her children had cried about going there.

So, conscious of my own childhood experience, we sent the girls to the class at the health centre. 

And they absolutely loved it.

They were excited to go every Friday. They used ‘noodles’ in the class for buoyancy – long tubes of foam. They engaged in behaviour that looked very much like swimming and the instructors were friendly and encouraging. At the end of each lesson, the girls were given a lolly.

There was a catch. After about twelve months the girls really could not swim. They had a great attitude towards swimming; they just couldn’t actually do it.

With deep misgivings, we enrolled them at the school on the industrial estate.

It was a different world. From the moment they entered the water the girls were given explicit instruction. It wasn’t unfriendly; it was businesslike with no time to waste. A supervisor – often the former Olympic athlete – kept an eye on the whole pool and would spot things and intervene. Sometimes, the supervisor would pull an instructor aside and have an animated discussion.

The girls were getting lots of explicit feedback and so were the instructors. After about three lessons, both girls were swimming short distances unaided.

This gave them a sense of achievement. Soon they were moving up a class and were given certificates to mark this; a source of some pride. Now they loved swimming and were gaining a sense of achievement from it. No tears. 

Of course, this little vignette does not prove anything. Perhaps the first class had readied the girls for the second one. But I know what I think.

As a child, I eventually learnt to swim. My mum took me to a different, more intensive class and I picked it up quite quickly. So I was probably a casualty of poor instruction. 

Unfortunately, by then I had decided that I was a bad swimmer; that there was something wrong with me; that it was my fault. So I’ve never been able to enjoy swimming. I don’t swim recreationally as an adult and even the smell of a chlorinated pool makes me feel anxious. Of course, I get over myself for the sake of my girls and I swim with them. It’s just that there’s no love there.

I wonder if this is some people’s experience of learning to read.


Improving critical thinking

A few months back, I jumped into a discussion of this article in The Conversation about teaching critical thinking skills. If you read it and then read the comments it is worth noting two things.

Firstly, there is a certain irony that a professor of critical thinking who has written an expository article will not provide evidence for specific claims and insists that he is not there to educate me.

Secondly, it is evident that I am thinking critically about the article. Did I apply some stepwise ‘analysis’ or ‘evaluation’ procedure in order to do this? No. What I read conflicted with what I (thought I) knew and so I responded. I suspect that this happens automatically; we are hard-wired to compare new information with old. In fact this is a tenet of constructivist learning theory. If I am right then it is not something we need to be taught how to do. Instead, we need to build a body of knowledge to compare new information with.

Today, Paul Bruno tweeted out a link to a summary of a new meta-analysis of whether college enhances critical thinking (paper available here $).

The study found that it does but that this comes just as much from studying regular subjects as from specific critical thinking courses. And standard courses double up: you learn literature and develop critical thinking by studying a literature course. There is no extra gain in studying critical thinking and you also don’t learn any literature. From the highlights:

“Students are learning critical-thinking skills, but adding instruction focused on critical thinking specifically doesn’t work. Students in programs that stress critical thinking still saw their critical-thinking skills improve, but the improvements did not surpass those of students in other programs.”

So, case closed? 

Maybe not. I think the error here is that critical thinking courses assume that, because a word such as ‘analysis’ is used in different contexts, the actual thinking in these contexts must be very similar and so a general skill of analysis exists which can be trained. This seems at odds with the evidence.

However, certain powerful kinds of knowledge could indeed aid critical thinking. Most critical thinking courses teach logical fallacies and give examples. Such examples can be used by students to then compare with statements that they meet in the future – this is a useful enhancement of the knowledge base.

It might also be useful to give specific examples of faulty scientific thinking such as the rationale behind alternative medicines like homeopathy. If a student is later presented with the notion of magnetic pain relief, for instance, she might recognise some similarities.

And perhaps a little history of pyramid investment schemes would be helpful. In fact, the great hope of history is that we can learn from it and avoid repeating failures.

Ultimately, however, it is knowledge of deeper principles that we need. I have no time for homeopathy because I understand that it is scientifically absurd.


The new Australian Curriculum is profoundly flawed

I have been worrying for some time about the results of the Australian Curriculum review. Despite encouraging signs, the end result is deeply depressing in at least one aspect: the new collapsed “HASS” curriculum involves teaching children hardly any worthwhile content. In this post, I raised the following concern:

“We need to be aware just how horrible “Humanities and the Social Sciences,” could be in the wrong hands. We could have Dewey-inspired approaches that start with the child and their place in the world etc. rather than learning about the Romans or the Egyptians or about the countries of the world.”

Well, that’s what we’ve got.

A quick search of the HASS document finds no mention of the Romans or Egyptians until Year 7. Instead, in an almost complete adoption of the APPA’s submission to the review, we have a vague, inquiry-based curriculum, light on content.

Year 1 is typical. Here students will be, “given opportunities to explore how changes occur over time in relation to themselves, their own families, and the places they and others belong to.” The history ‘knowledge and understanding’ is described as follows:

“The content in the history sub-strand provides opportunities for students to develop historical understanding through key concepts including continuity and change, perspectives, empathy and significance. The content for this year focuses on similarities and differences in family life over recent time (continuity and change, perspectives) and how people may have lived differently in the past (empathy). Students’ understanding is further developed as they consider dates and changes that have personal significance (significance). As students continue to explore the past and the present, they begin to speculate about the future (continuity and change).”

‘Empathy’ as content.

Students will be investigating the Inquiry Questions, “How has family life changed or remained the same over time? How can we show that the present is different from or similar to the past? How do we describe the sequence of time?”

And of course, this is all written out in a confusing and unusable way, complete with loads of little icons to indicate general capabilities such as ‘critical and creative thinking’ that the more sensible submissions to the review sought to do away with due to the fact that they cannot really be taught.

So a complete win there for Deweyan ideology; the idea that social studies have to start with the child’s immediate universe and work outwards. This, in turn, represents a total rejection of the scientifically-based Core Knowledge argument that knowledge of the world aids reading comprehension. So it also potentially hobbles any gains available through the new curriculum’s much vaunted emphasis on phonics.

Moreover, it is a boring and woolly curriculum for young children to follow. As a father of two young daughters, I can confirm that they are hungry for knowledge of the world, ancient and modern, near and far, and that their imaginations are not limited to their immediate vicinity. It is hard to even comprehend such a narrow perspective except through the lens of ideology. Of course, children with middle class parents will still gain a lot of world knowledge from home and so the curriculum will act to aggravate inequality.

What a complete waste of everyone’s time.

Australia adopts the thinking of John Dewey from circa 1900



The effect of Reading Recovery

Earlier this year, Horatio Speaks wrote a blog post about Reading Recovery and its derivative, ‘Switch-On Reading’. I didn’t pick this up at the time but it has come to my attention due to the subsequent discussion. Stephen Gorard, prior to making points about anonymous bloggers that I would reject, made a valid argument about effect size. This is something that keeps coming up and so I’d like to address it.

Basically, most education research is badly designed. Controls are poor, there are high attrition rates, a lack of random sampling and so on. Rather than reject pretty much all of the research on these grounds – the ‘What Works Clearinghouse’ strategy – John Hattie made the case in his 2009 book for taking it into account, but at the same time setting a reasonably high bar for the magnitude of any effects. From quantitative studies it is usually possible to calculate an ‘effect size’. A size of 0.0 means no effect and a size above 1.0 would be most extraordinary. Hattie sets the bar at 0.4 for effects worth considering.

The problem is that Hattie treats both poorly controlled studies and well designed studies in the same way. This means that the effect size of worked examples at 0.57 is not really comparable with other effect sizes because the worked example effect experiments were proper randomised controlled trials (RCTs). So the ranking of effects that Hattie generates is dubious (I won’t get into the more general debate about the usefulness and validity of effect sizes here).

When the Education Endowment Foundation conduct proper RCTs of various interventions, it is therefore a little unfair to insist that the effect sizes should be above 0.4. Anything above 0.0 is worth considering in this case, a point that Gorard makes. An additional four months of progress for students in a reading intervention compared to their peers is worth having. This was the finding of the Switch-On Reading study which generated an effect size of 0.24.
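For readers unfamiliar with where a figure like 0.24 comes from, an effect size of this kind is typically the difference in mean post-test scores between the intervention and control groups, divided by a pooled standard deviation (Cohen’s d). The sketch below illustrates the calculation; the numbers are invented for illustration and are not taken from the Switch-On Reading study.

```python
import math

def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Standardised mean difference (Cohen's d) using a pooled SD."""
    pooled_sd = math.sqrt(
        ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    )
    return (mean_a - mean_b) / pooled_sd

# Invented example: intervention group scores 2.4 points higher than
# control on a test where both groups have a standard deviation of 10.
d = cohens_d(mean_a=52.4, sd_a=10.0, n_a=150,
             mean_b=50.0, sd_b=10.0, n_b=150)
print(round(d, 2))  # 0.24
```

With equal-sized groups of equal spread, the pooled SD is just the common SD, so a 2.4-point gap on a scale with an SD of 10 yields an effect size of 0.24 – which gives some feel for how modest such an effect is in raw-score terms.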

However, before you rush out and sign your school up for Switch-On Reading, you might want to consider that the study that was conducted was a complete waste of time.

Reading Recovery style interventions have been evaluated many times in a broadly similar way and so the results could quite easily have been predicted. Yet this does not mean that Reading Recovery is effective. Far from it.

The problem is the control group. We virtually always see Reading Recovery compared with no intervention at all. It seems plausible to me that any series of 20 minute one-to-one reading sessions with a capable other would have some effect on reading. And such sessions could be quite cheap and easy for schools to arrange.

When I was at primary school I was involved in a type of intervention like this. I can’t recall the name of it and so I’m unable to search the literature for the evidence. I was in about Year 4 or 5 and a group of us gave tuition to students who must have been in Year 1 or 2. The little ones would read to us. When they got stuck we simply told them the word; we were absolutely forbidden from helping them sound it out. The horror! Imagine that, we could have killed the love of reading that these struggling, disengaged readers had.

I wouldn’t be surprised if this intervention had an effect size of about 0.24. However, it would be much cheaper than Reading Recovery with its requirement for specially-trained teachers.

What we really need to know about Reading Recovery is whether it has any effect over and above that of any other kind of one-to-one tuition. If not, we can dispense with it and just go for the tuition.


The 7 hipster beards of education

I was walking through town earlier today with my family when we saw a hipster fall off his unicycle.

It brought to mind an outing to see a band a few months ago with a friend. I looked around the bar and realised that I was the only man in there who was not sporting a beard. I think such beards have something to tell us about education.

Now before you ask me to check my privilege and stop oppressing hirsute organic-coffee drinkers or darkly mutter that I am perhaps in cahoots with Gillette and Remington, I would like to stress that I have absolutely nothing against beards. A good friend of mine has had a neat and mildly attractive goatee since at least 1999. Hipster beards are a current fashion and this is exactly how such cultural phenomena are supposed to work, even if they leave me a little bemused.

So exactly why am I all caught up in beards, like a crumb of potato chip or some egg? Well, because I don’t think that education should be subjected to fashions in the same way as men’s faces and yet I think there are quite a few of these around. Such as…

1. Blaming children for having the wrong mindset

Carol Dweck wrote a useful book summarising her research on the affective side of learning. Unfortunately, we have morphed this into yet another way for adolescents to feel insecure and inadequate. There are posters spelling out just how bad you are if you have a ‘fixed mindset’, drawing a stark contrast with those sorted individuals who have ‘growth’ mindsets. A child does badly on a test – possibly because she hasn’t been taught very well – and feels bad about it. What do we conclude? That she’s got a fixed mindset.

And it is a ‘she’, isn’t it? It’s all about these smart girls. What’s their problem, eh? They should be satisfied with being taught badly just like all the lazy boys are.

2. Saying “I don’t teach content, I teach children”

What can this possibly mean? Taken one way, it is trivially true but then, when you think about it for a bit, it’s manifestly false. It’s a classic deepity. Exactly what are you teaching these children?

3. The Maker Movement

Kids need to make things out of toilet rolls, plastic and some wires. For at least one hour per week. Because innovation.

4. Common Core means that you’ll have to start teaching in a very specific way and I can provide training on that

The fact that Common Core doesn’t apply anywhere outside of the U.S. makes the constant barrage of this stuff particularly irritating. The fact that people are using Common Core maths to push dodgy problem-solving strategies makes it doubly so.

5. Motivational posters and infographics

Please stop.

6. Randomly Skyping a class full of kids in a different country

I honestly can’t work this one out. I suppose it’s because technology.

7. Flipping Classrooms

To be fair, this has been going on for a while but I just don’t see the logic. Lecturing works best when it’s interactive, peppered with questions and the lecturer can read the responses of the audience; ‘they look bemused – I might try explaining that again.’ Suggesting that we can parcel this off into a homework task where a kid watches a video seems a little implausible. What if they don’t watch the video or don’t really follow much of it?

It is an odd mix of devaluing explicit instruction – classes are freed up for all those wonderful questions – whilst insisting that students receive the worst possible kind of explicit instruction. Incidentally, this is why MOOCs don’t work.

A bar full of beards

When I was at university, we used to say that having a beardy bloke in the bar was a sign of good luck. Nowadays there are beards everywhere. See if you can spot some educational hipster beards yourself. Feel free to add them to the comments.


By Anna reg (own work) [GFDL or CC BY-SA 3.0 AT], via Wikimedia Commons