Concept maps are rubbish

For many years, I have given the following advice to students who are preparing for an exam.

“There are two ways to revise: answering questions or turning information from one form into another. Both mean that you have to process information in some way. On the other hand, reading your notes or just copying them out can be done without really thinking and so these aren’t very effective.”

I don’t know what I based this advice upon, other than my own intuition. I was quite late to education research, as I suspect many teachers have been. So I was particularly interested when, in 2013, Dunlosky et al. published their findings about the most effective study strategies. Testing was in there – and that was the main message that seems to have come out of the various articles and blogs written on the back of this paper (as an aside, it is also worth mentioning the recent debate as to whether testing works equally well for complex items as it does for simpler ones).

Rereading and similar strategies, such as highlighting, were found to be ineffective. However, the report is more ambiguous when it comes to my advice about “turning information from one form into another.” By this, I meant making revision notes, flashcards or posters, and Dunlosky et al. found that ‘summarising’ had low utility because many students don’t do it very well.

One information processing strategy that is particularly popular amongst educational researchers and some teachers, and that is not directly addressed in the Dunlosky et al. study, is the construction of ‘concept maps’.

Concept Map for Energy produced by NASA – Public Domain

Concept Map for Electricity produced by NASA – Public Domain

I speculate that concept maps are popular because they apparently mirror the ‘schema’ that psychological theories propose for the way information is organised by the mind. You might expect that constructing a concept map of an area of study would require you to rehearse the different relationships between the concepts and so help you to remember and retrieve that information better.

Which all sets the scene for a fascinating piece of research conducted in 2011 by Jeffrey “Testing Effect” Karpicke and Janell Blunt. They compared testing-based study strategies with concept map construction. In all of their experiments, testing was superior. This was true even when the final test – the “post-test” which we analyse to determine the effect of the experiment – required students to construct a concept map. So concept mapping is not even the best preparation for concept mapping.

This is quite a clever experimental variation because there is a general principle in education research that students tend to learn what you teach them. So, with a cunning experimental design, you can bias the results. If you read much research, you start to see it everywhere. For example, imagine you were comparing an inquiry learning approach to gravity, where students perform experiments with balls on ramps, make hypotheses and so on, with an explicit approach that teaches them about gravity directly. If your post-test focuses on hypothesising, results tables and so on then you’ll get a different outcome than if your post-test contains physics questions.

So the fact that Karpicke and Blunt found that testing beat concept maps under conditions most suited to concept maps is significant. However, one objection was raised: the students in the study had little experience in concept mapping and so perhaps they were not doing it very effectively.

Fortunately, we now have a 2015 replication of the Karpicke and Blunt study which also deals with the experience issue. Lechuga et al. conducted their study in Spain and essentially found the same result as Karpicke and Blunt. Crucially, they also assessed students who had prior experience with concept mapping. Retrieval was still more effective than concept mapping for this group, although by a smaller margin.

This illustrates how educational research should work. We have a replication and we have variations to the experimental design to take account of different explanations and hypotheses.

On this basis it is clear that investing time and resources in concept mapping is the wrong choice. It may be better to advise your students to test themselves and answer trial questions instead.


It’s a bit of fun

Please vote in my poll (yes, we are going to be so tired of polls pretty soon):

[Embedded Twitter poll]


Fuzzy maths jumps the shark

By September 1977, Happy Days was an enormously popular TV show. Initially a supporting character, Fonzie had begun to dominate and, in an infamous plot line, he is seen performing a water-ski jump over a confined shark. It is debatable whether this really was the point at which the show began its long decline; it continued in production for another seven years. But it did mark a high point of absurdity for a program that was originally about a romanticised version of 1950s American family life. And so the phrase “jump the shark” has entered the lexicon to represent such a tipping point.

I wonder whether we are now reaching the high water mark of fuzzy maths: the movement launched in 1989 by the National Council of Teachers of Mathematics (NCTM) in the U.S., a movement that eschews what it sees as ‘rote’ memorisation of maths facts and procedures in favour of prioritising understanding. Fuzzy maths seems to have taken over much of North American maths education. Despite efforts to make the new Common Core State Standards pedagogically neutral, there is evidence that they are being used to pursue a fuzzy maths agenda. In Canada, large-scale implementation of fuzzy maths is associated with a parallel decline in test scores.

Given the current popularity of fuzzy maths, let me nominate a candidate for the jump-the-shark moment.

The Telegraph is reporting on an image of a Year 3 maths test that has been posted on Reddit. It shows the question:

“Use the repeated addition strategy to solve: 5 x 3”

The answer given by the student is correct, 15, but it is marked as incorrect due to the way the child has worked it out. He or she has written “5 + 5 + 5” and the teacher indicates that it should instead be “3 + 3 + 3 + 3 + 3”.

This is obviously barking mad. But it demonstrates the kind of hole that fuzzy maths sucks us into. Clearly 5 + 5 + 5 = 3 + 3 + 3 + 3 + 3 = 15. It is all the same. This demonstrates a key property of multiplication: that it is commutative, i.e. 5 x 3 = 3 x 5.
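To make the arithmetic completely explicit, here is a minimal illustrative sketch in Python (the variable names are mine, chosen only to label the two readings):

```python
# Both "repeated addition" readings of 5 x 3 land on the same total,
# which is exactly what commutativity guarantees.
five_lots_of_three = 3 + 3 + 3 + 3 + 3   # "five lots of three"
three_lots_of_five = 5 + 5 + 5           # "five, three times"

assert five_lots_of_three == three_lots_of_five == 5 * 3 == 15
print(five_lots_of_three, three_lots_of_five)  # prints: 15 15
```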

However, it seems as if the teacher does not want the student to know this yet and that the student is meant to strictly interpret “5 x 3” to mean “five lots of three.” This is a reasonable interpretation. However, the commutativity of multiplication has so suffused our culture that allowing only this interpretation is quite unreasonable. If I gave you a shopping list that had “tin of beans x 3” on it, you would not interpret this to mean “tin of beans lots of 3”; you would interpret it to mean “3 tins of beans”.

Similarly, it is quite legitimate to interpret “5 x 3” to mean “five, three times”.

And so, in the name of ‘understanding’, we head into being both confusing and wrong. Never mind the fact that we really don’t want students to have to work out 5 x 3 using a repeated addition strategy. It is essential that such basic maths facts are memorised so that precious working memory resources may be devoted to higher level aspects of problem solving. The correct answer should be sufficient in this case. And what message is this poor student getting about maths?

The next question on the paper is equally bizarre. Asked to draw an array, the student draws it the wrong way around. Yes, the array might have 24 elements and the answer might be 24 but, for some strange reason, the teacher wants 4 rows of 6 columns and not 6 rows of 4 columns.

You might just put this down to one teacher being idiosyncratic. You may suggest that fuzzy maths does not really require this sort of thing and this particular teacher is operating under a misconception. You may think that nobody would defend this, even those who are committed to fuzzy maths.

Not so.

The NCTM defended the marking of the paper. Diane Briars of the NCTM commented, “We want students to understand what they’re doing, not just get the right answer.”

Funnily enough, this would seem to achieve the precise opposite.

UPDATE: It has been brought to my attention in the comments below that the defence that is attributed by The Telegraph to Diane Briars is identical to statements that she is reported to have made in this news report from May 2014. So perhaps she did not defend it after all. Can anyone shed light on this?


The contradictory world of AITSL

“We have a whole rhetoric about discovery learning, constructivism, about learning styles that has got zero evidence for them anywhere.”

Professor John Hattie, now Chair of AITSL, Keynote address to AITSL, November 2011

I feel a certain amount of discomfort in addressing this issue. From time to time, the Australian Institute for Teaching and School Leadership (AITSL) releases a new set of illustrations to demonstrate how teachers can meet the Australian Professional Standards for Teachers, a set of standards that it curates.

These illustrations generally consist of a video where a dedicated and probably quite proud teacher talks about and demonstrates his or her practice. It is not my wish to hold these committed professionals up to criticism and so all I intend to do is link once to the whole body of science and mathematics illustrations. If you wish to track through them then you will find the specific items that I will refer to.

The purpose here is to point out that AITSL is still releasing new illustrations that are totally at odds with the expressed views of its Chair, John Hattie; illustrations that are also at odds with the evidence more broadly on effective teaching methods.

For instance, one exemplar school is described in the following way in the illustrations:

“Through this approach [The School] seeks to develop student skills in collaboration, team-work, compromise, creativity and problem-solving, and to accentuate more activity-based discovery learning, project-based learning, and genuine team-based inquiry which are all important in developing STEM knowledge and skills.” [my emphasis]

The video demonstrates how a teacher uses a range of strategies in an attempt to teach difficult concepts which seem more related to computer science than the maths teaching that the video is supposed to be about. Perhaps this is the ‘computational thinking’ that Labor leader Bill Shorten recently took to Twitter to promote. The teacher also sticks notes under the students’ chairs as a way of allocating them to different groups. It is unfortunate that graduate teachers will see this and think it is what experienced teachers do – we’ve all performed these kinds of tricks when being observed, but they are time-consuming to prepare and there are far more efficient ways of arranging students in groups.

In another new video, a physics teacher explains that, “I don’t really want to give them more explicit instruction than that; I would rather hold back a bit of information and then have to come in later and give some more help, than to lay it all out in front of them.” Which is quite in keeping with a discovery learning, constructivist approach. Is this good practice? Is this the standard that all Australian teachers should meet? If so, why?

In the “Learning Context” section of another new maths video we are reliably informed that:

“These innovations are based on or derived from the understanding that students learn best through discovery, research & development processes and realistic applications.” [my emphasis]

And in a new science video, the teacher tells us that the students, “need to explore a little bit so that they can build some of that knowledge themselves before we go into a more detailed explanation.” A statement entirely consistent with constructivist approaches to teaching.

In general, the science and mathematics sections seem to focus on what you might call ‘higher order thinking skills’ or things that aren’t really skills at all, like ‘collaboration’ or ‘creativity’. In fact, simply doing lots of group work in order to encourage ‘collaboration’ is unlikely to be effective due to the problem of social loafing. Yet these illustrations promote group work and don’t explain how this problem can be dealt with (see here for some suggestions if you’re really that committed to group work). Instead, a new teacher is likely to conclude that group work = good.

The science illustrations generally focus on scientific inquiry with lots of hypothesising and experimenting and not much actual science teaching taking place. This confuses epistemology with pedagogy.

And we must bear in mind that all of the illustrations that I have mentioned here have been released since John Hattie took over as Chair.




Public Money

Teachers who work in government schools have their salaries paid from public money. In Australia, even independent schools get a significant amount of funding from the government. And yet I think we sometimes lose sight of who we are working for.

For instance, it is perfectly valid for any taxpayer to comment on the education system. Yet dismissing the views of non-teachers is common.

And when teachers argue against standardised testing, they need to answer a simple question: how else do you propose to be accountable to the people who are picking up the bill?

Because, morally, we are accountable.

Standardised testing is not perfect. There are aspects of Australia’s NAPLAN tests that I dislike and the VCE in Victoria has its idiosyncrasies, having been steered by those with particular agendas. But I would far rather have external measures of this kind than no external measures at all.

Evidence shows that teachers’ own assessments are biased and can act to the disadvantage of the most vulnerable groups in the school population. Standardisation acts as a check on this.

Sadly, the idea that we should do away with external checks and simply be trusted as a profession to get on with things is untenable. This is a profession that has embraced learning styles and does things because of Marxist critical theory. 

Yet there are hopeful signs. In recent years we have seen the emergence of teacher-led movements such as researchED that challenge woolly notions and ask for evidence. Social media has created a forum for an influential subgroup of teachers to engage with research and argument in a way that previously did not exist.

Over time, if this reflectiveness develops, I think we will grow up as a profession and, amongst other things, we will see those standardised test results start to shift.

Once we are there, the bloke down the pub will no longer feel compelled to tell us how to do our jobs.


A tale of two swimming classes

I had a horrible experience of learning to swim.

My memory is perhaps unreliable but I was clearly slow to pick up the skills. My grandfather was paying for the swimming classes, something that he tended to point out. I remember my instructor making a big deal about us jumping into the pool during one lesson. I was full of excitement to share this new achievement with my parents. Until I realised that they were distinctly unmoved.

To compound matters, my mum’s best friend had a son my age and we’d meet up a lot to play together while our parents socialised. The best friend’s son was an exceptional swimmer and it seemed to me that discussing this fact was one of the most popular topics of conversation.

When it came time for my two little girls to learn to swim we had a decision to make. There was the class on the industrial estate that was run by a former Olympic swimmer and there was the class at the health centre. My sister-in-law sent her kids to the one on the industrial estate and gave the impression it was a bit of a sink-or-swim place. One of her children had cried about going there.

So, conscious of my own childhood experience, we sent the girls to the class at the health centre. 

And they absolutely loved it.

They were excited to go every Friday. They used ‘noodles’ in the class for buoyancy – long tubes of foam. They engaged in behaviour that looked very much like swimming and the instructors were friendly and encouraging. At the end of each lesson, the girls were given a lolly.

There was a catch. After about twelve months the girls really could not swim. They had a great attitude towards swimming; they just couldn’t actually do it.

With deep misgivings, we enrolled them at the school on the industrial estate.

It was a different world. From the moment they entered the water the girls were given explicit instruction. It wasn’t unfriendly; it was businesslike with no time to waste. A supervisor – often the former Olympic athlete – kept an eye on the whole pool and would spot things and intervene. Sometimes, the supervisor would pull an instructor aside and have an animated discussion.

The girls were getting lots of explicit feedback and so were the instructors. After about three lessons, both girls were swimming short distances unaided.

This gave them a sense of achievement. Soon they were moving up a class and were given certificates to mark this; a source of some pride. Now they loved swimming and were gaining a sense of achievement from it. No tears. 

Of course, this little vignette does not prove anything. Perhaps the first class had readied the girls for the second one. But I know what I think.

As a child, I eventually learnt to swim. My mum took me to a different, more intensive class and I picked it up quite quickly. So I was probably a casualty of poor instruction. 

Unfortunately, by then I had decided that I was a bad swimmer; that there was something wrong with me; that it was my fault. So I’ve never been able to enjoy swimming. I don’t swim recreationally as an adult and even the smell of a chlorinated pool makes me feel anxious. Of course, I get over myself for the sake of my girls and I swim with them. It’s just that there’s no love there.

I wonder if this is some people’s experience of learning to read.


Improving critical thinking

A few months back, I jumped into a discussion of this article in The Conversation about teaching critical thinking skills. If you read it and then read the comments it is worth noting two things.

Firstly, there is a certain irony that a professor of critical thinking who has written an expository article will not provide evidence for specific claims and insists that he is not there to educate me.

Secondly, it is evident that I am thinking critically about the article. Did I apply some stepwise ‘analysis’ or ‘evaluation’ procedure in order to do this? No. What I read conflicted with what I (thought I) knew and so I responded. I suspect that this happens automatically; we are hard-wired to compare new information with old. In fact this is a tenet of constructivist learning theory. If I am right then it is not something we need to be taught how to do. Instead, we need to build a body of knowledge to compare new information with.

Today, Paul Bruno tweeted out a link to a summary of a new meta-analysis of whether college enhances critical thinking (paper available here $).

The study found that it does, but that this comes just as much from studying regular subjects as from specific critical thinking courses. And standard courses double up: you learn literature and develop critical thinking by studying a literature course. There is no extra gain from studying critical thinking directly, and you don’t learn any literature either. From the highlights:

“Students are learning critical-thinking skills, but adding instruction focused on critical thinking specifically doesn’t work. Students in programs that stress critical thinking still saw their critical-thinking skills improve, but the improvements did not surpass those of students in other programs.”

So, case closed? 

Maybe not. I think the error here is that critical thinking courses assume that, because a word such as ‘analysis’ is used in different contexts, the actual thinking in those contexts must be very similar and so a general skill of analysis exists which can be trained. This seems at odds with the evidence.

However, certain powerful kinds of knowledge could indeed aid critical thinking. Most critical thinking courses teach logical fallacies and give examples. Students can then compare these examples with statements that they meet in the future – a useful enhancement of the knowledge base.

It might also be useful to give specific examples of faulty scientific thinking such as the rationale behind alternative medicines like homeopathy. If a student is later presented with the notion of magnetic pain relief, for instance, she might recognise some similarities.

And perhaps a little history of pyramid investment schemes would be helpful. In fact, the great hope of history is that we can learn from it and avoid repeating failures.

Ultimately, however, it is knowledge of deeper principles that we need. I have no time for homeopathy because I understand that it is scientifically absurd.