# School maths versus real-world maths

**Posted:** April 12, 2016

**Filed under:** Uncategorized | 21 Comments

The constructivist or reform mathematics movement tends to set up a distinction between the kind of mathematics that is often taught in school and the mathematics that is used in real life. Three kinds of research are used to support this.

- Studies that show that children invent their own mathematical strategies if required to do so in order to earn a living. These include studies of Brazilian street vendors.
- Studies that show that adults who can answer an arithmetic question in a real-life context are less successful when answering similar questions in a formal test.
- Studies that show that children who are taught a standard mathematical procedure will apply it in a nonsensical way and/or generate bizarre answers.

Put together, this evidence is used to support a narrative that there is something artificial about school mathematics. Instead, we should teach students mathematics through involving them in authentic, real-life experiences of *using* maths. By developing their own strategies for solving such problems, they will have a better understanding of what they are doing and also be able to apply maths in a range of different contexts.

But it is worth analysing these ideas from a different perspective.

The Brazilian street vendors seem to show that it is possible to learn through discovery, although it seems likely that mathematical strategies are communicated between children or between adults and children, and so participants may simply be using procedures that someone else has shown them. It’s also worth noting that these students could have been at school until the second grade and still be included in the study.

A *possibility* of learning through discovery is not the same thing as discovery being the *most effective* way of learning. For instance, we know that many children will learn to read through a whole language approach. The problem with whole language is that fewer children will be successful this way, and those who do learn may have later problems with spelling or pronouncing unfamiliar words.

The second set of studies relates to a failure of transfer. Transfer is notoriously difficult to achieve. Dan Willingham explains this as an inability of novices to tell the difference between the surface (i.e. unimportant) features of a question and the deep structure.

A fairly traditional way to teach for transfer is to start by teaching a concept in a particular context or in the abstract and then gradually work outwards to more diverse contexts. I also think activities that compare and contrast approaches can help students apprehend the deep structure, something that I wrote a little about here. It is interesting that contrasting cases appear a great deal in the ‘productive failure’ and ‘preparation for future learning’ literature, although not always in the explicit instruction control conditions that are used.

It seems likely that students who learn maths through a particular real-life context may have their learning stuck to that context and may struggle to transfer their learning just as much as those students who learnt ‘school mathematics’. Moreover, the very artificiality of school maths is often about stripping away surface features in order to make the deep structure more visible – this is the point of maths being abstract. Students who have learnt school mathematics may well struggle to apply it to a novel measurement problem but those who learnt the same mathematics through solving money problems may struggle even more.

Finally, what of the students who, when taught standard mathematical procedures, use them to give bizarre answers or answers that don’t fit the context? That was the result of the Kamii and Dominick study that I linked to above. They found that students who were taught to use the standard procedure often produced answers that were a long way from the real answer, whereas even when students who invented their own strategy were wrong, their wrong answers were a good estimate of the actual answer. It was not a true experiment and, when a similar study was conducted by Stephen Norton in Queensland, a very different result was obtained.

There were two key differences between Kamii and Dominick and Norton: Norton classified children based upon their answers – had they used the standard algorithm or not? – whereas Kamii and Dominick tracked students who had been taught in particular ways. Secondly, the calculations that students did in the Norton paper were generally more complex, involving larger numbers. Norton found that students who used the standard approach fared better.

Bearing this in mind, it is worth reviewing an expert panel report produced in 2004 for the Ontario government in Canada. It states:

“It is known that, initially, most children come to school as enthusiastic, curious thinkers, whose natural inclination is to try to make mathematical sense of the world around them. This natural curiosity can be nurtured in a problem-solving approach that begins with, and fosters, students’ own ideas and methods. For example, Carpenter, Ansell, Franke, Fennema, and Weisbeck (1993) found that two-thirds of Kindergarten and Grade 1 students in mathematics programs focused on problem solving were able to solve the following problem: If a class of 19 children is going to the zoo and each car can take 5 children, how many cars are needed? When asked whether all the cars were full, they said: “No, there is an extra seat in one car” or “Yes, because I’m going too!” They were making sense of the question. Contrast these findings with test results of Grade 8 students in non-problem-solving programs who were asked the same type of question, but with larger numbers: An army bus holds 36 soldiers. If 1,128 soldiers are being bused to their training site, how many buses are needed? Two thirds of the 45 000 students tested performed the long division correctly. However, some wrote that “31, remainder 12” buses were needed, or just 31 – lopping off the remainder. Only one-quarter of the total group gave the correct answer of 32 buses (O’Brien, 1999). For those students, learning “school mathematics” (Fosnot & Dolk, 2001b) meant carrying out procedures without making sense of what they were doing. Is there evidence that Ontario students stop making sense of what they are doing in mathematics as they progress through school?”

The clear implication is that learning standard approaches harms students’ ability to make sense of maths problems: algorithms are *harmful*. Yet, when we look at the examples used to make this case, two things are changing at once; the familiarity with standard procedures *and* the complexity of the problems (setting aside the fact that we are talking about very different cohorts of students). Would students unfamiliar with the standard approaches have fared better on the soldiers problem? I think the Norton research suggests that they might not.
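To be clear about the arithmetic at issue: the soldiers problem is a ceiling division, where 1,128 ÷ 36 gives 31 remainder 12, so a 32nd bus is needed for the leftover 12 soldiers. A minimal Python sketch of that reasoning (the helper name is mine, not from any of the studies discussed):

```python
import math

def buses_needed(soldiers, capacity):
    # Integer division gives the number of full buses; any remainder
    # means one more (partly empty) bus is required.
    full, remainder = divmod(soldiers, capacity)
    return full + (1 if remainder else 0)

# The NAEP item: 1,128 soldiers, 36 per bus.
print(buses_needed(1128, 36))      # 31 remainder 12, so 32 buses
print(math.ceil(1128 / 36))        # equivalent one-liner: 32
```

The mechanical step (the `divmod`) is exactly what two-thirds of the tested students performed correctly; it is the final contextual step, adding one for the remainder, that was so often omitted.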

I think I know what is going on here. For most students, carrying out a long division problem is sufficient to fill up the working memory, so much so that there is little capacity left over to attend to other aspects of the question. This is why students make supposedly ‘silly’ mistakes in class. The solution is to practise long division to the point that it is more automatic and requires less conscious attention. This way, students can focus more on the context of the question and the reasonableness of their answers.

This is quite the opposite conclusion to the constructivist prescription for more real-life, experiential learning.

Reblogged this on The Echo Chamber.

The bus type problem (1,128 soldiers, one bus holds 36 soldiers, how many buses?) is very popular in realistic math education (RME), a Dutch variant of fuzzy math (see PISA math test items to get the idea). The particular example is from NAEP (see also Alan H. Schoenfeld (2007). What Is Mathematical Proficiency and How Can It Be Assessed? In Alan H. Schoenfeld (Ed.) (2007). Assessing mathematical proficiency (59-73). Cambridge University Press. p. 69-70. http://library.msri.org/books/Book53/files/05schoen.pdf ) Some 45 000 students answered that question: 30% miscalculated, 18% rounded to 31, 29% answered ‘31 rest 12’, and 23% gave the answer that fuzzy math professors want to hear: 32 buses.

If ‘32 buses’ is the one and only ‘correct’ answer, then evidently to the psychologist that I am, this item belongs in an intelligence test. ‘32’ is the only correct answer by fiat only, not because of the math involved.

This item is part of a math test. Well, why would ‘31 rest 12’ be wrong, then? It is the mathematically perfect answer. Also, ‘31 buses’ is not that unreasonable! After all, the paper and pencil test itself is a highly artificial situation. I can’t imagine a truly ‘realistic’ situation where 12 soldiers would be left behind because someone ordered the wrong number of buses.

Another take on this item’s qualities would be Dan Willingham’s: well, there might be some ‘inflexibility’ in the answers of a large group of students, so what?

“Would students unfamiliar with the standard approaches have fared better on the soldiers problem? I think the Norton research suggests that they might not.”

Kids working with the real-world, investigational, constructivist approach, whatever it’s to be called, are learning about the nature of division, not learning a method (algorithm). Faced with the soldiers and buses problem they would all get out their calculators.

Is the real reason for teaching the “standard algorithm”, of which there are many, that it is going to be “on the test”?

Thanks for covering this, Greg.

Here in Ontario our Ministry of Ed has been working hard to address the controversy around these topics. Well actually no, you won’t find any mention from them that any expert has any doubt about any of it. See their latest at

http://www.edugains.ca/newsite/21stCenturyLearning/index.html

The simple fact that Canada has organizations such as wisemath.org highlighting the concerns of university profs, and that these issues are not even mentioned, let alone addressed, suggests something is up with what passes for critical thinking and research-based policy at the ministry.

It’s really quite ironic when you look at what they want for future students. All sorts of critical thinking competency and the ability to gather information from diverse sources and form well-founded opinions. Yet while Greg can find a diversity of opinion from education researchers on every topic, our ministry just finds the folks who are saying ‘Yes, Minister’. Maybe I’m wrong, but I don’t think that show was intended as an exemplar for how to run things.

We need a name for the phenomenon whereby advocates of critical thinking fail to display it. It happens a lot. I might ask Twitter.

A generous take would be that it is an effort at paradoxical intention.

I suspect that pupils tend to disengage from the actual “real life” element of the questions, simply as a matter of time efficiency. They know they have to extract a mathematical problem from the verbiage, and they learn to do that as quickly and efficiently as possible. Most questions are in a format that allows this. The fact that occasionally one will come up that has a “trick” aspect, and they will thus get it wrong, will be accepted by most as a price worth paying. Only the most conscientious will take the time to consider every aspect of the “real life” part of the problem.

But I fear it is more likely paradoxical inattention.

FYI, Math will soon enter the “why learn it when you can look it up” dynamic that we get in history. There are several new phone apps, which one of my colleagues in our math department was showing me, where students can take a picture of a problem and the app will solve it and show the steps. See this one as an example: https://itunes.apple.com/us/app/photomath-camera-calculator/id919087726?mt=8

Heh. That won’t even solve the bus problem, as given to them.

History has managed to weather the “look it up” phenomenon that arose once encyclopedias became available. I’m sure Maths will get by.

My Year 12 class all have graphics calculators that solve most of their problems already. But they have to get them into the appropriate form first, and that is where the Maths comes in.

So long as math in schools is seen as a bunch of methods for solving simple problems it doesn’t matter how it is done. It is largely a waste of everybody’s time. Trouble is, that is the easily testable stuff.

If this is a concern, then write problems that are more complex but that only use lower level skills in combination.

However, that’s not why we focus on simple problems. We do simple problems at junior school because you can’t do the hard skills until you have the easy ones mastered. You can’t do Calculus if you are uncertain what a negative number is, and what division does.

But the issue isn’t testing at all. It is the alleged need to teach such skills in a “real world” context to make them relevant. The two are unrelated, despite protestations (without evidence). You can set tests that are not just “solve problem A, then problem B” without having to go all discovery learning on them.

Nor should you teach in a context just because you are going to test in one. You teach the skills as a stand alone, then introduce them into contexts, not the reverse.

Hola Greg ! I just came across this:

http://ww2.kqed.org/mindshift/2014/02/03/math-and-inquiry-the-importance-of-letting-students-stumble/

You are going to hate it !!!!!

Thanks but I’m trying to stick to rule 6.

http://www.openculture.com/2013/05/philosopher_daniel_dennett_presents_seven_tools_for_critical_thinking.html

I sometimes fail.

When you assert that fewer children learn to read with the whole language approach, do you mean read or word-call?

I meant what I wrote. And I linked to evidence so it’s not just an assertion.

In New Zealand, where the stress is much more heavily on giving answers in context than in most countries, the normal kids get the bus problem right. That is because, … wait for it …, they have been *taught* to do it correctly. But that is not evidence for discovery Maths or “real world” Maths. Just a change to the marking schedule that emphasises the importance of answering in context.

Interesting idea, but I am really sceptical of working-memory arguments in cases like this. (see e.g. http://garydavies.org/2016/04/13/i-dont-know-my-times-tables/ )

The big problem here is that students don’t reason with their answer and sanity check it.

In this case it is quite obvious that students have decoupled the algorithmic part (doing the division or multiplication) from the reasoning part (deciding whether your answer is sensible).

The reason this is important is because you can quite easily take your brain out of the algorithm part (by using a calculator, or a computer, for example).

So I imagine the problem is the focus on teaching the algorithmic part rather than on the reasoning part. You need both to come to a sensible answer.

Gary, I think you should click and read the “Dan Willingham explains” link:

http://www.aft.org/periodical/american-educator/winter-2002/ask-cognitive-scientist

That you should teach students to reason, to read the question, and to think about what a sensible answer is has nothing to do with whether it is useful to learn the times tables.

These are not either/or questions. Anyone who brings that into the debate about learning the times tables should be poked with a hot false dichotomy.

In a response on your blog I suggest it doesn’t matter that a lot of real-world quadratics are not solvable with mental arithmetic. Working with quadratics that can be solved using mental arithmetic is a useful skill for doing high school math, and doing this is useful for getting your head around how quadratics behave. You also split the work into two steps and offload working memory by writing down the results of the first. I certainly stopped doing that once I had a good handle on the mechanics.

My experience tutoring students that struggled with math was that arming them with a reliable memory of the times tables was the best way to help them. Suddenly one stumbling block was removed and they were able to start keeping pace with the class.

The jumpmath.org tipsheet suggests spending about 10 minutes a day plus some practice time to learn the tables (http://www.jumpmath.org/jump/sites/default/files/TipSheet_Times.Tables-01.pdf).

Sure, it’s use it or lose it, but again it is minutes a year of practice for students not to lose it.

No one should take my anecdotes or yours as conclusive, but this is such an easy topic to test that you would think all those saying there is no need would be busy doing experiments to close the issue. Like the ones the jumpmath guys produce, which suggest what they recommend is on the right track, even if it doesn’t isolate this issue.

I posted that link to highlight the “working memory” fallacy in these kinds of calculations.

Rote learning the long division algorithm, which I think we can all agree is a good thing, doesn’t help you evaluate the meaning of your answer.

In the bus case, they do the algorithmic part but then don’t take a step back and ask themselves “Does it make sense to have 31 remainder 12 buses?” or “What is remainder 12 of a bus?”

You can solve this problem by doing more real-world examples, where students intuitively learn to think about what things can logically have a remainder and what things can’t (doing calculations involving people is a good one, since it’s impossible to have half a person etc.)
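The contextual reasoning being described here can be made concrete with the zoo problem quoted earlier (19 children, 5 per car). A minimal Python sketch, contrasting three readings of the same division; the variable names are mine, purely for illustration:

```python
import math

# One division, 19 / 5, read three different ways depending on
# what the context allows the answer to be.
children, per_car = 19, 5

cars_needed = math.ceil(children / per_car)   # people can't be split: round up
full_cars = children // per_car               # cars that are completely full
exact_quotient = children / per_car           # a pure number, no context

print(cars_needed, full_cars, exact_quotient)  # 4 3 3.8
```

The calculation is trivial; the skill at issue is knowing which of the three readings the situation calls for.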

The path to expertise and understanding is directed practice, after all.

Success or failure at this task really has nothing to do with working memory.

Gary,

And I pointed out that in the case of quadratics you are mistaken. Only your decision to write down the factors before proceeding made it not a CLT issue. You can define a method that includes writing down items to make any CLT problem go away.

The same goes for long division. It is quite possible to do long division in your head up to some level. Again the less you have to think about while doing this the easier it is.

I am sure the more innate capability you have, the easier it is to get away without memorizing the tables. You may not be the best sample on which to base a conclusion about its value.