Why the Scientific American article on maths education doesn’t add up

PISA recently released a report about the data that they have collected on maths teaching and learning strategies. I analysed some of this data and related it to the claims that PISA made. The report was quickly followed by an article in Scientific American.

The Scientific American article focused on one area of the PISA report in particular – the rate at which students report using “memorisation” strategies. In the working paper used as a basis for this report, the measure used to quantify memorisation is explained. Students were asked the following questions:

“For each group of three items, please choose the item that best describes your approach to mathematics.

Labels (not shown in the questionnaire): (m) memorisation (e) elaboration (c) control

a) Please tick only one of the following three boxes.

1 When I study for a mathematics test, I try to work out what the most important parts to learn are. (c)

2 When I study for a mathematics test, I try to understand new concepts by relating them to things I already know. (e)

3 When I study for a mathematics test, I learn as much as I can off by heart. (m)

b) Please tick only one of the following three boxes.

1 When I study mathematics, I try to figure out which concepts I still have not understood properly. (c)

2 When I study mathematics, I think of new ways to get the answer. (e)

3 When I study mathematics, I make myself check to see if I remember the work I have already done. (m)

c) Please tick only one of the following three boxes.

1 When I study mathematics, I try to relate the work to things I have learnt in other subjects. (e)

2 When I study mathematics, I start by working out exactly what I need to learn. (c)

3 When I study mathematics, I go over some problems so often that I feel as if I could solve them in my sleep. (m)

d) Please tick only one of the following three boxes.

1 In order to remember the method for solving a mathematics problem, I go through examples again and again. (m)

2 I think about how the mathematics I have learnt can be used in everyday life. (e)

3 When I cannot understand something in mathematics, I always search for more information to clarify the problem. (c)”

I am not convinced that these memorisation options represent actual memorisation strategies. Moreover, the questions are asked in a way that forces a discrete choice. The accepted practice in psychology is to ask for a degree of agreement with each statement (e.g. on a Likert scale). Without this, we have a validity and reliability problem. For instance, a student might partly agree with all three responses to question a), but once forced to select one response, this will be recorded as 100% agreement with that option and 0% agreement with the alternatives. This is the same reason why the Myers-Briggs personality test is invalid and unreliable.
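To make the information loss concrete, here is a minimal sketch, with invented agreement levels, of what the forced choice does to a student who partly agrees with all three options to question a):

```python
# Hypothetical agreement levels for one student on question a),
# on a 1-5 Likert-style scale (values invented for illustration).
likert = {"control": 4, "elaboration": 4, "memorisation": 5}

# The forced choice keeps only the strongest option...
choice = max(likert, key=likert.get)

# ...and records it as total agreement, with zero for the rest.
recorded = {option: int(option == choice) for option in likert}
print(recorded)  # {'control': 0, 'elaboration': 0, 'memorisation': 1}
# A 5-4-4 profile and a 5-1-1 profile produce identical records.
```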

It is therefore hardly surprising that I could find no correlation between the “index of memorisation” that PISA derive from these responses and a country’s PISA mean maths score. These questions probably do not reliably measure the use of memorisation.
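For anyone who wants to repeat the check, the calculation itself is simple. This sketch uses made-up country-level figures rather than the real PISA data:

```python
# Country-level check: correlate the index of memorisation with mean
# maths score. The eight values below are invented placeholders; the
# same calculation works for the percentage of low achievers.
from scipy.stats import pearsonr

index_of_memorisation = [0.21, 0.34, 0.28, 0.41, 0.19, 0.36, 0.25, 0.30]
mean_maths_score = [523, 481, 537, 494, 554, 470, 511, 500]

r, p = pearsonr(index_of_memorisation, mean_maths_score)
print(f"r = {r:.2f}, p = {p:.3f}")
# 'No correlation' shows up as r close to zero with a large p-value.
```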

Yet the Scientific American article makes a number of claims about memorisation on the basis of this data. Unfortunately, the authors provide no references and they seem to be in possession of data that is not presented in the PISA report (if either author reads this post then I would be grateful for this data). Nevertheless, I think some of these claims are highly unlikely and I wonder whether the authors may have made an error.

I will list these claims below and then comment on them.

1. In every country, the memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.

I cannot tell how a “memoriser” is defined from the PISA report. For instance, is it a person who gives an (m) response to all four of the questions above, or to three of them, or two? Similarly, data on the number of such memorisers in each country is not provided.
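To show how much any headline figure could depend on that undisclosed definition, here is a sketch with invented response patterns:

```python
# Four hypothetical students' answers to the four questions above.
# The report does not say how many (m) responses make a "memoriser",
# so try every possible cut-off and see how the count changes.
students = [
    ("m", "m", "m", "m"),
    ("m", "m", "c", "e"),
    ("m", "c", "c", "e"),
    ("c", "e", "c", "e"),
]

for threshold in (1, 2, 3, 4):
    memorisers = sum(1 for s in students if s.count("m") >= threshold)
    print(f"at least {threshold} (m) responses: {memorisers} memorisers")
# The headline number ranges from 3 down to 1 depending on the cut-off.
```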

I would not be surprised to find out that, in any given country, these memorisers are the lowest achievers but I am not sure what this would tell us. As Robert Craigen points out in a comment on a previous post, memorisers might have resorted to some of these strategies due to poor teaching. They may also have less understanding of, or interest in, the survey questions.

However, I find it highly unlikely that the countries with high numbers of memorisers are also those with high proportions of teens doing poorly on the PISA maths assessment. Presumably, countries with higher numbers of memorisers will have a higher overall index of memorisation. If not, this would require the remaining non-memorisers to use far fewer memorisation strategies than the overall mean. And if you plot the percentage of low-achieving maths students against the index of memorisation then there is no correlation:

[Chart: percentage of low-achieving maths students plotted against the index of memorisation, by country]

2. Further analysis showed that memorizers were approximately half a year behind students who used relational and self-monitoring strategies.

3. In no country were memorizers in the highest-achieving group, and in some high-achieving economies, the differences between memorizers and other students were substantial.

Again, I would like to see the data here but I can believe it.

4. In France and Japan, for example, pupils who combined self-monitoring and relational strategies outscored students using memorization by more than a year’s worth of schooling.

Why select just these two countries? Again, I don’t have the underlying data but, even if I did, a two-country comparison wouldn’t tell us much. It is fraught enough to try to make comparisons across many education systems of different sizes and with different cultures. At least if we include all of them then we might pick up some general trends. I’m sure it would be possible to prove almost anything with just two examples.

5. The U.S. actually had more memorizers than South Korea, long thought to be the paradigm of rote learning.

Again, we would need to know the definition of a “memoriser”.

6. Unfortunately, most elementary classrooms ask students to memorize times tables and other number facts, often under time pressure, which research shows can seed math anxiety. It can actually hinder the development of number sense.

I would love to see this research. Victoria Simms recently reviewed a book by one of the authors of the Scientific American article and found a similar claim:

“Boaler suggests that reducing timed assessment in education would increase children’s growth mindsets and in turn improve mathematical learning; she thus emphasises that education should not be focused on the fast processing of information but on conceptual understanding. In addition, she discusses a purported causal connection between drill practice and long-term mathematical anxiety, a claim for which she provides no evidence, beyond a reference to “Boaler (2014c)” (p. 38). After due investigation it appears that this reference is an online article which repeats the same claim, this time referencing “Boaler (2014)”, an article which does not appear in the reference list, or on Boaler’s website. Referencing works that are not easily accessible, or perhaps unpublished, makes investigating claims and assessing the quality of evidence very difficult.”

7. In 2005 psychologist Margarete Delazer of Medical University of Innsbruck in Austria and her colleagues took functional MRI scans of students learning math facts in two ways: some were encouraged to memorize and others to work those facts out, considering various strategies. The scans revealed that these two approaches involved completely different brain pathways. The study also found that the subjects who did not memorize learned their math facts more securely and were more adept at applying them. Memorizing some mathematics is useful, but the researchers’ conclusions were clear: an automatic command of times tables or other facts should be reached through “understanding of the underlying numerical relations.”

This claim does at least provide a clue as to where to find the evidence, although it is a little odd. The neuroscience part of the claim is essentially irrelevant to teachers: why care which ‘brain pathways’ are used? Teachers generally have no opinion on this. We need to focus instead on the quality of learning.

I think I have found the paper. Unusually, it includes both a neuroimaging study and a behavioural study of the quality of learning, as the Scientific American claim suggests. The participants were 16 university students or graduates. They completed a series of trials in which they were given two numbers, A and B. In the ‘strategy’ condition, students were given a formula to apply, such as ((B − A) + 1) + B = C, in order to work out the answer, C. In the ‘drill’ condition, they were given A, B and the response C simply to memorise. Surprisingly, the memorisers did pretty well on a later test but, wholly unsurprisingly, they could not extend this to transfer tasks involving new values for A and B. This is entirely consistent with the findings of cognitive load theory, where problem solving so occupies our attention that we cannot infer the underlying rule. The strategy condition is much more like following a worked example.
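As I read the paper, the two conditions amount to something like the sketch below; this is a loose analogy, and the number triples in the drill dictionary are invented examples:

```python
def strategy(a, b):
    # 'Strategy' condition: apply the taught rule to any A and B.
    return ((b - a) + 1) + b

# 'Drill' condition: memorise specific (A, B) -> C triples.
drill = {(3, 8): 14, (2, 5): 9}

print(strategy(3, 8))   # 14, and strategy(4, 9) works just as well
print(drill[(3, 8)])    # 14, but drill[(4, 9)] raises a KeyError:
                        # memorised answers do not transfer to new values
```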

However, none of this bears much relationship to the memorisation strategies in the PISA report. Is anyone attempting to teach students all of the possible questions that they might be asked, together with all of the possible numerical answers? In fact, the use of formulas like the one in the ‘strategy’ condition above is often criticised as the ‘rote’ learning of formulas, and I imagine that this is what maths memorisers – if well-defined – would be trying to memorise.

This research does not seem to apply to the learning of basic maths facts such as multiplication tables. Teachers attempt to teach these to the point of memorisation, but the underlying rule is not withheld. Tables are built up from counting patterns, from arguments about groups of the same size and so on. Patterns, like those in the 9 and 11 times tables, are highlighted, and a few remaining facts, such as 7 × 8 = 56, are committed to memory through practice. But these are very simple operations, nothing like the contrivance ((B − A) + 1) + B = C. In fact, the benefit of knowing simple multiplication results ‘by heart’ is that you can then attend to the other elements of a complex operation.
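To illustrate the kind of pattern meant here: in the 9 times table, the tens digit is one less than the multiplier and the digits sum to nine, a rule the student can see and verify rather than having it withheld. A quick check:

```python
# The 9 times table pattern up to 9 x 10: the tens digit is one less
# than the multiplier and the digits always sum to nine.
for n in range(1, 11):
    tens, units = divmod(9 * n, 10)
    assert tens == n - 1 and tens + units == 9
    print(f"9 x {n:>2} = {9 * n:>2}")
```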

8. Timed tests impair working memory in students of all backgrounds and achievement levels, and they contribute to math anxiety, especially among girls.

This is partially a repeat of claim 6 but also adds the claim that timed tests impair working memory. Again, it would be good to see the evidence to support this.


5 thoughts on “Why the Scientific American article on maths education doesn’t add up”

  1. Oggi says:

    I keep hearing that self-reporting is not reliable, and then I come across questionnaires like this one. We keep telling children that memorization is a lower-order skill and that other activities are more worthwhile and more valuable. Then we ask them if they memorize to prepare for exams. What a surprise that only the lower-scoring kids admit to doing this lower-order thing (ignorant and proud of it!). It reminds me of a child I volunteered with: because he had been told that he got bored easily in lessons because he was very intelligent, he kept complaining of being bored, even when touring London on a double-decker and seeing the beautiful view of the Thames for the first time.

  2. David says:

    It’s much simpler than all that analysis. Those of us who were forced to memorize (or loved it) are being told by “experts” who do not teach children each day, but who, themselves, went through the forced memorization or loved-it memorization, that memorization is a bad and unnecessary strategy. I was forced to eat veggies, but I’m going to let my children eat whatever they determine is tasty. I was forced to brush my teeth, but my kids can choose.
    Try teaching grade three math (disclaimer again: US teacher with no knowledge of how these things play out in other English speaking countries) to children who lack the benefit of numeracy, yet can draw several diagrams to prove that 2 + 1 = 3, call number bonds “math mountains,” have no clue why PEMDAS, and have never heard the term “addend,” because the text calls them “number partners.”

  3. I believe the report was timed to support and provide an authoritative voice for the Boaler piece in Scientific American. Get the implication — it’s SCIENTIFIC … and it’s citing EVIDENCE from a big data set. That is supposed to close the debate. I find it shocking that the SA folks just caved to this, but … well, we know how the Prog Ed folks roll.

    This supposedly happens independently, but Boaler was citing this “report” almost a year ago. She knew ages before it came out what would be in it. I suspect she was very much a part of the shaping of that report but insisted “for appearances” that her name not appear except in one place where she’s cited as an authority. So this is supposed to suggest experts “around the world” are totally coming to the same conclusions as JB.

    I can’t prove that theory with anything I know. But I’m pretty sure there’s some truth to it just because of the sequence of events … when the article & report came out (and when that article must have first been submitted) and when she started citing the report … and how conveniently it is worded to match up with her desired conclusions. That and the way, as Greg documents, the report’s conclusions don’t really draw from anything particularly notable in the data aggregated. It’s almost like the conclusions were written, and the data superimposed as an afterthought.

    Anyway, I had to say that. I’d be content to be shown otherwise, but I think the circumstantial case for believing this is pretty strong; it’s a simple piece of sleight of hand.

  4. “1. In every country, the memorizers turned out to be the lowest achievers, and countries with high numbers of them—the U.S. was in the top third—also had the highest proportion of teens doing poorly on the PISA math assessment.”

    Not only is there no correlation but there is absolutely nothing remarkable about the U.S. on EITHER of these scales. As your chart shows, the U.S. is absolute vanilla — right in the middle of the spread on both scales. On what basis do they single the U.S. out for comment here? There’s nothing to see there. Did they even actually … you know … look at the data?

    But if there were something worth commenting on in this case — so what?

    To draw any conclusions about the proportion of the group performing very poorly (relative to the rest of the population) is to magnify some of the most irrelevant variables, namely the degree of societal homogeneity, stress to minority cultural communities and the relative proportion of those alienated and disadvantaged.

    It would be completely unremarkable if the U.S. stood out in 2012 by any measure based on such an observation, since (at least until the recent migrant crisis in Europe) the U.S. has been proportionally the largest destination for refugee and illegal migrant populations. This means that the U.S. societal system has been disproportionately burdened by destitute people of extremely difficult backgrounds who, for various reasons, have fallen between the cracks.

    Add to this the crumbling family structure and social decay in the black (and to a lesser extent some other ethnic) subpopulations. Now many countries display the latter characteristic, but it is of notable proportions in the U.S. To draw conclusions about pedagogy and learning strategies based on an observation highly likely to rely more sensitively on these variables, singling out the U.S. as if it were an ideal showcase of the effects of a single educational variable is … ludicrous. Try that in a country with a more homogeneous cultural and civilizational structure. Like Japan, perhaps. Or Switzerland. But not the U.S. That’s just asking for confounding variables.

    Jumping into a noisy population with volatile extraneous factors would be like choosing the worst basketcase schools possible to run a pedagogical experiment to prove that some approach is better because it “improves” education relative to a “control” group.

    Yeah, pick schools so bad that filling classrooms with jello and playing loud electronic pop music all day is likely to improve scores. Anything being better than the status quo. And lo and behold your garbage system looks pretty good in comparison. I can’t think of anyone dumb enough to think educational experiments like that would be taken seriously … er … uh …

    Oh yeah.

    Actually … I can think of three.

    Boaler’s Phoenix Park experiment.

    Boaler’s Railside experiment.

    Kamii and Dominick’s “The harmful effects of algorithms” experiment.

    And all three were taken seriously by North American educationists.

    Gonna have to recalibrate my assumptions there …

  5. Jos Linssen says:

    @Robert Craigen

    “Boaler was citing this “report” almost a year ago. She knew ages before it came out what would be in it.”

    Boaler was actually involved in this research. On the Youcubed site you can read:

    “I am excited to share with you today the release of a Scientific American article I wrote with Pablo Zoido on the dangers of memorization approaches in maths. Pablo was an analyst for the OECD / PISA and together we analyzed data from 13 million students worldwide. We found that students who take a memorization approach are the lowest achieving students in the world. ”

    Jos (Dutch maths teacher. Maths in the Netherlands has been destroyed by the Freudenthal Institute, which is a big supporter of Boaler and uses the same ‘research’ methods.)

