Those of you who are familiar with this blog will be aware that I have highlighted differences between my position and that of Dan Meyer. In particular, I have criticised some of the ideas that Meyer presented in his 2010 TED talk. The claims that he made can be typified by this extract from the transcript:
“So 90 percent of what I do with my five hours of prep time per week is to take fairly compelling elements of problems… from my textbook and rebuild them in a way that supports math reasoning and patient problem solving. And here’s how it works. I like this question. It’s about a water tank. The question is: How long will it take you to fill it up? First things first, we eliminate all the substeps. Students have to develop those, they have to formulate those. And then notice that all the information written on there is stuff you’ll need. None of it’s a distractor, so we lose that. Students need to decide, “All right, well, does the height matter? Does the side of it matter? Does the color of the valve matter? What matters here?” Such an underrepresented question in math curriculum. So now we have a water tank. How long will it take you to fill it up? And that’s it.”
Meyer goes on to talk of his use of video to create engagement with the problem. It is clear that he sees motivation as important. He also speaks as if these are ideas that he has developed and evaluated himself, in his own classroom – “It’s been obvious in my practice, to me.” – rather than anything that is the product of educational research. He does not refer to evidence that his is a more effective form of teaching. This would be fine if he were simply presenting ideas for us to contemplate but, instead, he is making a strong claim: that we need a general change to the way that we teach mathematics.
I have queried this lack of evidence before and I have suggested that it is a good example of the problem with how we talk about education. It is hard to imagine this kind of discussion about the practice of any other profession (if we can class teaching as a profession – this might disqualify it).
In my view, Meyer’s ideas are at odds with cognitive load theory. The type of instruction he describes might be effective for students who are already relatively expert but the evidence suggests that it is unlikely to work with novices who haven’t had much instruction in the topic. A lot depends upon the enactment. If the open-ended task comes at the end of an extended period of instruction, or if it is brief and is followed by comprehensive, explicit instruction, then this might work reasonably well. If we hit relative novices with such problems, then we would seem to be providing too little guidance.
In a recent exchange with Meyer on Twitter, I suggested that the difference between us was that I had evidence for my position and that he did not have evidence for his. He did not agree with this and when I asked for the evidence to support the claims in the TED talk, he offered the following:
I thought that I should look these up. They are not full references and so I entered the terms into Google Scholar and read the first paper that seemed relevant. I expected the evidence to be about problem-based learning because that is the way that I would broadly categorise Meyer’s method.
I was familiar with Mayer’s work on multimedia and it seems that this supports Meyer’s suggestions about using multiple forms of presentation, although it must be pointed out that textbooks with diagrams also fit the definition of ‘multimedia’. It seems sensible to use arresting graphics, visuals and video as long as we pay heed to cognitive load theory and its implications which, according to Mayer, include:
“(1) the presented material should have a coherent structure and (2) the message should provide guidance to the learner for how to build the structure. If the material lacks a coherent structure — such as being a collection of isolated facts — the learner’s model-building efforts will be fruitless. If the message lacks guidance for how to structure the presented material, the learner’s model-building efforts may be overwhelmed. Multimedia design can be conceptualized as an attempt to assist learners in their model-building efforts.”
This would not seem to support the idea of using distracting information with novice learners and I would argue that many textbook questions have substeps precisely in order to assist learners in their model-building efforts. [I will also take this opportunity to add that Mayer has written an excellent paper on the repeated failure of discovery learning.]
The paper that I found by David and Roger Johnson on ‘constructive controversy’ doesn’t seem to have much to do with the effectiveness of maths teaching, focusing largely on variations of social studies lessons. It does have some data and reports effect sizes. However, many of the dimensions on which constructive controversy is assessed are things like attitudes and motivation. When academic performance is reported, constructive controversy seems to lead to ‘higher-level reasoning strategies’. I think it would be important to look at exactly how this is measured, particularly since much of the discussion is about understanding the position of opponents. It is not obvious how this might apply in maths.
It also seems that constructive controversy is not compared with explicit instruction in these studies (the comparison conditions are group ‘concurrence-seeking’, ‘debate’, where a judge decides on winners and losers, and ‘individualistic efforts’). I am willing to concede that controversy may make something memorable and interesting but I am not sure that we have to follow the Johnsons’ model in order to introduce it or that explicit forms of instruction cannot make use of controversy.
And so, finally, I reviewed an article by Kasmer and Kim on ‘The nature of student predictions and learning opportunities in middle school algebra.’ I can see that prediction might be a component of problem-based learning and that it might help to activate prior knowledge or give knowledge-gap experience to students. Yet the study in question seems to have no control group and so we can’t really establish the extent of the advantages of predictions over any other type of instruction (I did briefly look for controlled studies but couldn’t find any):
“In order to investigate the value of using prediction, prediction questions were developed and posed in one middle school mathematics classroom throughout one school year. The target content area was algebra: linear and exponential relationships.”
Let us assume that there are significant advantages to making predictions. This still does not imply the use of problems with distracting information and substeps removed. Predictions could as easily be incorporated into a period of explicit instruction as one of problem-based learning. We could ask students what they think will happen before demonstrating the correct method. In fact, I do this a lot.
In my view, the papers that I have presented do not provide compelling evidence that maths teachers should follow the approach to maths teaching that Meyer describes. Instead, there seems to be a much stronger case for using explicit instruction. You can find some of the evidence to support my view here and I intend to have an ebook available soon where I will discuss this at greater length.
15 thoughts on “The evidence for Dan Meyer’s TED talk”
Reblogged this on The Echo Chamber.
“I will also take this opportunity to add that Mayer has written an excellent paper on the repeated failure of discovery learning.”
I don’t have any interest in defending discovery learning or problem-based learning or attacking explicit instruction. I can see how the talk could be viewed that way, but it is a particularly partisan viewing. The teacher is present throughout these tasks. For instance:
“So I want to put to you that if we can separate these in a different way and build them up with students …”
These tasks (which emerged from that talk) are, in fact, impossible without the teacher’s guidance. They begin without a question and without information. How would students complete the task if not for the guidance of their teacher?
So when you say:
“Predictions could as easily be incorporated into a period of explicit instruction as one of problem-based learning.”
I shrug and say, sure, why not. Your quarrel, not mine.
The case for these tasks from the standpoint of cognitive load is, first, that temporarily subtracting information increases intrinsic load, not extraneous load. Nowhere in traditional textbooks do students receive an in medias res context and consider what questions could be asked and what information would and wouldn’t be relevant for those questions. These tasks do increase load. But that load is intrinsic. That’s a normative claim, of course, not an empirical one. I’m saying, “Math should look like this.” Reasonable people can disagree.
Second, print media impose heavy extraneous load on students. They combine illustrations, questions about those illustrations, information relevant to those questions, and substeps for answering those questions, all in the same visual space. They take advantage of very few of Mayer’s recommendations for multimedia learning while violating several others. That’s a response to the limitations of the print medium. The entire task has to be printed in the same space. Digital media, by contrast, allow teachers to lower that extraneous load and sequence those steps.
Your quote from Mayer is adequate for my purposes. Specifically, this:
Problems like this one are exactly that – sentence after sentence of isolated facts, each interacting, each stored in working memory. Their point isn’t clear until the student reaches the end of the task and deciphers the question. That load is extraneous.
Making the problem over, based on the work of Mayer, Kasmer & Kim, David & Roger Johnson, and others, makes it more interesting for students, more productive for students, and, normatively, more important for students.
“The case for these tasks from the standpoint of cognitive load is, first, that temporarily subtracting information increases intrinsic load, not extraneous load. Nowhere in traditional textbooks do students receive a media res context and consider what questions could be asked and what information would and wouldn’t be relevant for those questions. These tasks do increase load. But that load is intrinsic”
The distinction between intrinsic and extraneous load depends on the objective. Your claim is true if the objective is to teach students how to decide what information is relevant to a question or how to structure a solution. This might be appropriate late in training. However, early in training, an appropriate objective would be to teach students how to solve the problem. Your strategy, if used early in training and with novice learners, would lead to cognitive overload. See the worked example studies of Sweller.
It is as if Mayer were to claim that multimedia presentations should be full of distracting and conflicting information so that students can learn how to cope with cutting through it. In this case, such information would present intrinsic load. He doesn’t say this because he doesn’t consider that to be the objective of such a presentation.
I see no reason to think that your approach is more interesting than the alternatives. The evidence you supply does not demonstrate it.
Great, so do it late in training. I’m not making a claim here about when students should model with mathematics. I’m claiming that they don’t, and should.
I cited research earlier that specifically concerned the elision of information (which is precisely my project) and you didn’t engage. I don’t think you’re persuadable here.
I engaged with your blog post in the comments and I wrote my own post in response:
I don’t think the study that you cite proves very much but I think that’s a fair comment and not a failure to engage.
If your claim is that students should be given open-ended problems later in training then I don’t see how that differs much from traditional practices. That is not the message that I took from your TED talk or from your subsequent work, which seems like a version of problem-based learning with the open-ended problem posed at the outset.
Great, so do it late in training.
I go to your site regularly, Dan, because it is a good place to find ideas, and sometimes to challenge my own thinking against someone who has thought through his position.
However, I approach it from the point of view of a person who teaches explicitly until the material is at least partly learned, and then uses more open-ended questions to keep students from getting bored with endless repetition. Your activities are helpful for me that way.
However, I get no feel from your site that this is how you think we should teach. All the activities shown involve modelling or other open-ended questioning. There is no statement of what should precede any activity to make it work. You say now “so do it late”, but no such advice to use it late accompanied the activities when they were delivered.
I bet most of the people who visit your site think you are advocating discovery learning as the primary means of teaching. I know that isn’t your position, but I have to remind myself that, because that’s the overwhelming feel I get.
There’s not a whole lot of love in the comments for explicit teaching, that’s for sure.
Right, Chester. All we need is for Dan and Greg to collaborate on some guidelines for the range of readiness for switching from explicit to open, problem-based instruction and a lot of the noise goes out of the debate.
Even if they can’t pinpoint it, having a discussion about when, rather than one versus the other, would make it much more interesting.
I went to a workshop you ran in Melbourne in 2013 (or maybe 2014) where you introduced arithmetic sequences through stepping up and down one step, then two steps, and so on. You made it clear that this is how you wanted people to introduce the topic, spending a lot of time on building the question with the kids, asking them what information was needed, etc. – starting off with a high cognitive load. Has your position on the ‘when’ changed since then?
I don’t know if there’s much of an appetite here for the distinction between a modeling task and a preparation for learning task. As I read Greg, you’re either teaching explicitly or you’re doing discovery learning and students are twiddling their thumbs on one of David Perkins’ projects – no shades in between.
To the extent that there is an appetite:
My TEDx talk considers modeling tasks and I am unopinionated about their order in the curriculum. Modeling just doesn’t happen in the curriculum. It should.
Preparation for learning tasks should happen before instruction, consonant with Schwartz & Martin and How People Learn. They alert students to salient features of new mathematical structures and ready them to learn from instruction. These tasks are characterized by a light cognitive load.
Here are the tasks from Super Stairs which fattenthepig calls “high cognitive load”:
• “Watch this video.”
• “Write down any questions you have.”
• “Write down your teacher’s question.”
• “Make a prediction.”
• “Decide what information would be helpful / unhelpful to know.”
Then the teacher explains what to do with that information.
I’ve seen students enduring excessive cognitive load. This isn’t that.
Below is a good article on preparation for future learning, especially the Q&A at the end. As Sweller points out, these studies do not appear to be well controlled:
Click to access Constructivist_Assessments_Final.pdf
Sweller wrote “Can the authors of this chapter throw any light on why the obvious control groups seem never to be used?” but he doesn’t specify any of those control groups. Any insight?
I wouldn’t want to speak for Sweller but they do seem to violate the change-one-thing-at-a-time principle and this is something that the authors seem to acknowledge in their response. A fair test is pretty basic to the scientific method.
There may well exist some preparation for future learning or productive failure effect. I just don’t think the current studies provide the evidence to support this. They certainly don’t substantiate the assertions in How People Learn.
It seems to me that constructivism is slowly evaporating, finding smaller and smaller niches to cling on to where the research is still fuzzy. Since the 60s, we’ve gone from the full-on discovery learning of Bruner right the way down to ‘there might be some point in a bit of open-ended exploration before we hit them with explicit instruction’. There might be, but it’s like the god-of-the-gaps argument. And all this whilst the evidence for explicit teaching continues to amass.
It is interesting that these models still require explicit instruction at some stage. I wonder how many elementary maths teachers or visitors to popular maths teaching blogs would be aware that this now represents the strongest position that constructivism holds.
Pingback: The skill of ignoring distracting information | Filling the pail
Hmm – more generally, the use of evidence helps: the Jan 2016 report by the National Council on Teacher Quality, ‘Learning About Learning’, supports both arguments. It’s not either/or; what is clear is that textbooks per se are not the answer.
Pingback: Evidence and Integrity | Filling the pail