Some comments on explanations in maths

Barry Garelick and Katharine Beals wrote an excellent piece in The Atlantic on their scepticism about requiring students to write explanations for maths problems. Dan Meyer then wrote a response and I, along with many others, jumped in on the comments. It’s a great discussion but I wonder whether a couple of points touched on by Garelick and Beals have been missed along the way. They are:

1. Understanding is latent and cannot be measured directly

2. There is no reason to think that prose explanations are better at exposing understanding than correct or incorrect solutions

It seems that many people – let’s call them the ‘explanationists’ – think explanations are important because they give us direct access to what the student understands. Well, they might. In class, I ask my students to explain their thinking all the time. However, in the numerous examples that people tend to post online (this is my favourite), these explanations sit in assessments and the debate in the US seems to centre around Common Core ‘aligned’ tests.

The issue raised by the explanationists is that perhaps students have ‘rote’ learnt a procedure. If this is the case, they will be able to get the right answer without understanding why. Even showing the correct mathematical steps – the ‘workings’ as I would call them – is not sufficient for the explanationists because a student might have ‘rote’ learnt these too. I am quite sceptical about this elevation of ‘understanding’ as the primary aim of maths and maybe we could leave it at that, as some more traditionally-minded teachers are inclined to do.

Yet strangely, we may take the argument of the explanationists and actually use it against explanations. Who is to say that these cannot be ‘rote’ learnt either? Have these teachers never taught any other subjects? I do. I teach VCE physics and I teach lots of explanations. For instance, if asked to explain the role of a split-ring commutator in a D.C. Motor, I tell students to write:

“The split ring commutator reverses the direction of the current through the coil every half turn, thereby keeping the torque in the same direction and therefore the coil rotating in a constant direction.”

Alternatively, I could rely on them to construct a response from their own understanding. However, this would be fraught. It’s a pretty tricky thing to explain: there is a chance that they won’t cover all of the points, and they might say something like “keeping the torque the same” rather than “keeping the torque in the same direction”. This would be technically wrong for the kind of motor that students are typically asked about. And so I teach them an answer that I’ve derived from examiners’ reports.

I therefore cannot really tell whether students understand the principle of the split-ring commutator by analysing responses to this question, although I suspect it is far easier to memorise an answer if you have a good understanding of what it means.

You may disagree with the principle of me teaching this explanation in this way. You might hiss that I am “teaching to the test”. Perhaps, but that’s not what this post is about. The point is that it is quite possible to memorise an explanation. If we think students might be motivated to ‘rote’ memorise a procedure for a test then the same motivation might make them ‘rote’ memorise an explanation. Of course, a skilful questioner will vary the questions to avoid some of this. This is why externally set tests are so valuable – you don’t know what’s going to come up. But a good question will expose a misunderstanding just as easily as asking students to write an explanation.

For instance, I am going to draw heavily on Dylan Wiliam here and propose the following question:

[Image: fractions question (multiple choice)]

A student who selects ‘B’ is likely to have the misunderstanding that the larger the denominator, the smaller the fraction. This is a classic maths misconception because it results from overgeneralising something that is true. A student who picks ‘C’ is very likely to have the correct understanding. Of course, we can’t be sure. It could be chance. Or, the student might have been trained in a procedure for answering this kind of question without really understanding how it works (I’m not sure what that would look like). Even something as simple as seeing a question like this before might prompt a student to pause, remembering that it wasn’t as straightforward as they had first thought, rather than just writing down ‘B’.

Judicious use of such questions is at least as likely to expose a student’s understanding as requiring written explanations.

Of course, everyone irrationally hates multiple-choice questions. Perhaps we should be assessing students using real-world investigations that are marked against rubrics? Would such an extended response solve the problem of memorising procedures and/or explanations and ensure that we are assessing true understanding? No. Talk about this with your colleagues who teach English or history and you will realise that assessment through rubrics is no picnic. Rubrics are just as gameable as any other form of assessment:

How rubrics fail - Greg Ashman

For some students, maths is a respite from literacy-based activities: reports, investigations, researching stuff on Google. If we require written explanations in maths tests then we are making maths performance contingent on literacy ability. This will disadvantage those who can do the maths and understand the maths but have low literacy, such as those who are still learning English. It means that our test is not valid; it is not measuring the thing that it is supposed to be measuring.

Given that the language of maths – the symbols and operators – has been developed over time to precisely describe notions that are often quite difficult to put into words, it also seems a little perverse to insist on such backward translation from maths into English as evidence of understanding when there is a good chance that it is nothing of the sort.


14 thoughts on “Some comments on explanations in maths”

  1. Chester Draws says:

    Many of the best Maths students will look at you blankly if you ask for “an explanation”. They have internalised the meaning of the symbols to the point where they implicitly understand them and struggle to put them into words. I personally find it really hard to explain what absolute value means, for example, even though I understand it perfectly. It is what it is, and explaining it doesn’t make it any better.

    If we have to think about how we are using a language, we are doing it wrong. It is only when we get to the point that we use a language fluently, without consciously thinking of each step, that we have really mastered it.

  2. Tempe says:

    I just asked my year 6 daughter to answer the fractions question. This was a child who received D grades up until yr 4, when she started Kumon. With the help of Kumon she bumped up to a B grade, because now she not only understands but can actually do the maths. She answered the fractions question correctly, almost instantaneously. I put this largely down to her having just completed the Kumon fraction section. This meant she has spent the last couple of months simplifying, adding & subtracting fractions with like & unlike denominators, etc. In addition (as I home school) I taught her – as best as I was able – some conceptual stuff, but she spent far more time on just practicing working with them. Lastly, she spent some time on IXL sorting fractions. Do I think she understands? Yep, it was very clear to me that she did…

  3. One of the clearest signs of understanding that we mathematicians watch for in student work is that it is well-organized and sensible, so that the arrangement on the page shows intelligent design and that intangible “elegance” that is hard to pin down but you “know it when you see it”. When I look at the samples of what many educationists consider effective “explanations” that “show understanding” in mathematics I often see the opposite: Numbers are blown apart and arranged on the page in an arbitrary fashion … connecting lines clutter things up supposedly showing lines of thought but they cluster randomly like unmatched socks in a drawer. Explanations are random, and often follow no logical sequence, and avoid standard words as if knowing how to follow templates shows “rote” learning and no original thought. Actually we don’t look for original thought as much as we look for well-disciplined thought. The latter often reflects greater understanding, especially when the subject is highly technical.

    It is rare nowadays for students to be coached by teachers to learn to present their work well. This means that students must then use valuable working memory resources just to keep track of the least important aspects of complex calculations … bookkeeping … and focussing attention on meaning at the wrong level altogether.

  4. At the crux of this is the fear that there is ‘shallow’ learning occurring, which appears to deliver performance in the short term without the depth required for future study. It is true that there are different ways in which things might be ‘learnt’ for a test and that some ways seem to deliver short-term performance with poorer performance in the long term. I think you are right to point out that requiring ‘explanations’ is not really a remedy for this.

    However, I do believe you can test the depth of understanding beyond simply whether the answer given was right or wrong. You could achieve this by assessing ‘certainty’ as a proxy for depth of understanding. The mark scheme uses negative marking to motivate pupils to be honest (i.e. over-stating is discouraged by higher stakes for higher certainty) – see the sketch after this comment. The results would give a good insight into whether the understanding was deep or shallow.

    For more info have a look at: http://www.ucl.ac.uk/lapt/

    PS This is my pet topic. If anyone wants to experiment with it, I’m keen to hear about it.
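
To make the mechanism concrete, here is a minimal sketch of how a certainty-weighted, negatively-marked scheme could work. The specific mark values are illustrative assumptions, loosely inspired by the certainty-based marking approach behind the UCL LAPT link above, not the official scheme.

```python
# Illustrative certainty-based marking: the student states a certainty level (1-3)
# with each answer. Higher certainty earns more when right and loses more when wrong.
# The values below are assumptions chosen for illustration only.
MARKS = {
    1: (1, 0),   # low certainty: small reward, no penalty
    2: (2, -2),  # medium certainty: moderate reward, moderate penalty
    3: (3, -6),  # high certainty: large reward, large penalty if wrong
}

def mark(certainty: int, correct: bool) -> int:
    """Return the mark for one answer given the stated certainty (1, 2 or 3)."""
    reward, penalty = MARKS[certainty]
    return reward if correct else penalty

# Over-stating certainty is costly: a confidently wrong answer scores -6, so a
# student who has merely memorised a procedure has an incentive to say so.
print(mark(3, True), mark(3, False))  # prints: 3 -6
```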

  5. I love Nic’s comment. I recommend, again, Superforecasters, where they go in depth into ways of scoring answers and describe the Brier score, which is correctness plus confidence, integrated over time.

    This solves a huge part of the concerns about multiple choice. It is also more real-world: in the real world, you do math when you don’t have someone to tell you the answer. Confidence in correctness, times correctness, is the true measure of anyone’s answers.
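
For reference, the standard Brier score is just the mean squared difference between the stated probability of being right and the actual outcome (1 for correct, 0 for wrong) – one simple way of folding confidence and correctness into a single number. A minimal sketch:

```python
def brier_score(confidences, outcomes):
    """Mean squared difference between stated probability of being right and the
    actual outcome (1 = correct, 0 = wrong). Lower is better; 0 is perfect."""
    return sum((p - o) ** 2 for p, o in zip(confidences, outcomes)) / len(confidences)

# Two correct answers: one given with 90% confidence, one with only 60%.
# The hesitant correct answer makes the score worse (higher).
print(brier_score([0.9, 0.6], [1, 1]))  # prints: 0.085
```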

  6. Glad you enjoyed the article. You are correct about the possibility of memorizing the explanation itself. This is addressed in the article:

    “How do we know, for example, that a student isn’t simply repeating an explanation provided by the teacher or the textbook, thus exhibiting mere “rote learning” rather than “true understanding” of a problem-solving procedure?”

    I often refer to this as “rote understanding” which seems to get people who were initially angry at me even angrier.

    Barry Garelick

  7. Pingback: Good writing Part 1: Mark schemes warp teaching and assessment | David Didau: The Learning Spy

  8. Pingback: Germane load: The right kind of mental effort? | Evidence into Practice

  9. Pingback: The Golden Mean: Aristotle and KS3 History | to learn is to follow

  10. Pingback: Assessment and progression in history – Musings of a history teacher
