Busting the number one myth of problem-based and inquiry learning

It is a cold Saturday morning in winter. You have lots to do but the covers are warm and soft and your bed is welcoming. Your children are quiet – they’re probably watching something inappropriate on TV but, crucially, they’re not bothering you. A few thoughts flit through your mind: you need to pay the water bill; you need to post a birthday present to your great aunt Penelope before the post office shuts at 12.00 pm; you need to get your car cleaned so that you don’t have to wash your hands every time you open the boot; you need to get out there and do something significant with your life before you die.

Your better half enters the bedroom with a steaming mug of milky tea: your favourite. ‘How agreeable!’ you think, but before you can voice your appreciation, your partner speaks.

“The kitchen sink is blocked and it smells like rotten cabbage.”

According to many proponents of problem-based learning, you should now immediately leap out of bed crying, ‘thank you, thank you,’ whilst skipping to the kitchen to have a sniff of it yourself. Why? Well, being presented with problems is intrinsically motivating, and authentic, real-life problems – such as a blocked sink that smells of putrefying brassicas – are doubly so. No, you don’t want to simply be told a solution.

I think this myth warrants specific challenge because it represents the single wheel on a barrow that some people have been pushing for far too long.

It signals a retreat from claiming that problem-based learning produces greater academic gains. It seems that it does not. And so the territory has shifted; problem-based learning is more intrinsically motivating instead.

I have noted before that this is an odd claim. Increased motivation should result in increased application to learning tasks and therefore increased academic gains. Although it might be true that these won’t show up in short studies, they should appear in longitudinal and correlational studies.

So I offer two more pieces of evidence. The first study, by Tornare et al. (2015), found that students felt more negative after being asked to solve mathematical problems than they did beforehand. The most important factor seems to be how well students feel they performed on the problems, and this is related to self-concept, i.e. how good they think they are at maths. Actual performance was not related to their emotional state (apart, perhaps, from feelings of ‘hopelessness’). It seems evident to me, and probably to many others without needing to read research papers on the subject, that we need to set children up for success by teaching them how to solve the problems. This will make them feel more competent and will positively affect how they feel about problem-solving.

Instead, we have a narrative where children need to struggle because it’s good for them and, if they don’t like it, that’s because they’ve got a fixed mindset or something, so we need to put up some motivational posters. I paraphrase.

I dislike talking about definitions, but it is necessary to highlight here that the concept of problem-based learning tends to overlap with the idea of inquiry learning; the two share many features. In medical training and mathematics teaching, educators tend to refer to quite well-defined problems, whereas science education tends to adopt the language of ‘inquiry’ because people link this to the scientific method – indeed, many courses describe learning about the scientific method as developing scientific inquiry skills. However, both approaches involve presenting a problem or question to resolve without giving instruction on a solution, or the solution to a similar task, upfront.

I therefore want to mention a second study, by Hushman and Marley (2015), which I think has something to tell us. It is similar to the famous Klahr and Nigam experiment, in which students were either explicitly taught the control of variables strategy (CVS) or were facilitated in discovering the principle for themselves (I always find this an ironic choice of subject, given that so many educational studies are badly controlled). However, in this case, the researchers used three instructional conditions, and these are worth examining.

The first condition was called “direct instruction” and is worth quoting in detail:

“The experimenter read a definition to the students of three types of variables (independent, dependent, and control variables). The relationship between variables was verbally illustrated using an example of two levels of ramp steepness as the independent variable, the distance the ball rolled as the dependent variable, and the surface area as the control variable. The explanation was delivered without soliciting responses from the participant. Next, two examples were given… During the presentation of the examples, each type of variable was highlighted by the experimenter. After each example, participants were asked if they could clearly tell the effect of steepness on the distance the ball rolled to induce cognitive engagement (Klahr & Nigam, 2004). After the student answered, regardless of their answer, an explanation as to why the example was or was not unconfounded was given by the experimenter.”

If you are a regular reader of my blog then you will know that I prefer the term ‘explicit instruction’ to ‘direct instruction’ to avoid confusion with Engelmann’s ‘Direct Instruction’ programs. However, I do not recognise the above as a description of explicit instruction because it is profoundly non-interactive. Rosenshine unpacks the various confusions around this term well in an article that you really should read if you have the patience. In short, I think this represents a worst-case kind of direct instruction.

The other two conditions were a minimally guided, student-centred inquiry condition called ‘minimal instruction’, similar to that in the Klahr and Nigam study, and a ‘guided instruction’ condition which, confusingly, reads a lot like my understanding of explicit instruction and seems to involve no student problem-solving or inquiry prior to the presentation of examples (unless we count the presentation of some initial prompt-to-reflection questions – something that I often use in my own teaching).

“Guided instruction was delivered through the use of leading questions prompting reflection during the example phase of the session (Mayer, 2004). Students in this treatment received the same instruction on the type of variables as those students in the direct instruction treatment. While the same experimental examples were used, the participants in the guided instruction treatment were asked questions prompting explanation of the parts of the experiment and whether the experiment was unconfounded. They were asked to verbalize what the independent variable was in the example and to elaborate on how they knew. Then they were asked the same questions regarding the dependent variable and the control variables. When wrong answers were given, the facilitator encouraged the participant to try again. Finally, the student was asked if he or she could clearly tell the effect of steepness on the distance the ball rolled, followed by questions asking the student to provide an explanation as to why he or she could or could not clearly make a conclusion. In contrast to the direct instruction condition, the facilitator offered no explanations during the presentation of the examples.”

Why have I gone to so much trouble to describe an experiment of this kind in a post about motivation? Well, sometimes you have to go digging for gold. Perhaps without realising it, these experimenters have run a trial of interactive explicit instruction against student-led inquiry. And what makes this interesting is that they measured the students’ self-efficacy.

Before I get to that, it is also worth remarking that both ‘direct instruction’ and ‘guided instruction’ outperformed ‘minimal instruction’ on most learning measures. Interestingly, although these differences were significant, there was often no significant difference between the ‘direct instruction’ condition and the ‘guided instruction’ condition. I would have expected more differences in favour of the latter, given that it involved more student interaction.

When students’ feelings about their success in science – their self-efficacy – were examined, the pattern changed. Gains in self-efficacy were significantly greater for ‘guided instruction’ than for ‘direct instruction’ and ‘minimal instruction’, with the latter two not being significantly different from each other.

So what does this show? If you explicitly teach students stuff and ask them questions while you’re doing it then they will learn more and feel like they’re better at the subject than if you just lecture them or let them solve problems without much guidance.

And I suspect this is going to be motivating.


Comments

  1. Nick says:

    This is a bit of a straw man argument, isn’t it? The real power of inquiry/problem-based learning is obviously its ability to slow down time.

    “But we have witnessed and documented dozens of examples, from grades K-12 and in all subject areas, that something about time and its passage shifts in such matters and that, in a measured way, time slows down and teachers and students accomplish mandated curriculum requirements with time to spare.” (My emphasis of where she seriously says this… and it is on our government’s website, p. 29.) http://education.alberta.ca/media/1087278/wncp%2021st%20cent%20learning%20%282%29.pdf

    When you have documented dozens of examples of how explicitly instructing students slows down time, you might have something to actually add to the debate.

    • Roger says:

      My efforts to produce the next generation of all-conquering accounting professionals included many project-based assignments. They read about Batonne and Deyonne and argued their relative positions like lawyers. They created businesses and discovered that items of value are called assets and that once you subtract liabilities you’ve got equity. Best yet, they built bridges with toothpicks while completing order forms, filling out transaction analysis sheets, writing cheques and turning in balance sheets at the end of every day. For these kids, time stood still in one long orgy of learning, accountancy and, most importantly, fun. As for me, time staggered along SO slowly I got chest pains. And what did this all produce? Kids who asked me when they could have more fun like before and why we needed to pick up the pace. Sigh.

  2. R. Craigen says:

    Great post, and the initial example practically makes the point by itself. I agree with your little misgiving about the Klahr and Nigam/Hushman and Marley experiments being a bit of a divergence from your main theme. They don’t appear to track the argument you began at the outset, although they make a good point about these experiments. In this and the college-level lecture-versus-student-exploration studies, it does seem that what is pitched as “direct instruction” is specifically engineered and/or chosen to fail, and is at best a caricature or limited instance. Comparisons of live instances of the actual teaching methods by practitioners under controlled conditions tend consistently to favour direct instruction, in any case. But your piece is ostensibly about motivation, and the thread is lost in this digression.

    I think you could have contrasted the obvious lack of motivation in being faced with problems (as in your amusing story – segue into more realistic classroom versions) with the genuine motivation associated with steady learning and development of skills. I have come to believe that the *single* most powerful motivator for small kids in education – perhaps even ranking above the desire to please or impress adults – is the feeling of accomplishment and mastery: the feeling that one’s learning has progressed. And I believe that, on this score, much (not all) discovery or inquiry-based learning with novices is demotivating, and much good direct/explicit instruction – particularly if it has a rapid feedback cycle of the right kind – can be extremely motivating. One needn’t look much further than the old films of Engelmann doing DI with kindergarten kids to see this in action … but you find it almost universally in classrooms where the focus is on children attaining, and recognizing their own, significant achievement.

    Here we have an accelerated out-of-school math enrichment program called Spirit of Math, and you see that same drive and lust in students’ eyes as they “get hungry” for more learning, seeing themselves leaping ahead in understanding and skill. SOM is largely a mastery program that involves a highly interactive process and focusses intently on skill as a wedge to raise understanding. A child does not spend five minutes in their sessions without learning something – the pace of learning is very impressive. And each learning outcome is solidly reinforced, immediately and thereafter. The kids continually see their own growth, and this puts fire in their bellies.

    Incidentally, my interaction with SOM kids came through my involvement in the mathematics contests. The kids going through SOM are unusually adept at open-ended problem solving of the kind required to succeed in the contests. I won’t say they teach those skills (I don’t think any program effectively does this), but in the course of their rapid learning there is much provision of opportunities to exercise these skills and, perhaps more important, the automatic skills, having been finely honed, present no barrier and the kids can concentrate most of their mental power on the problem itself – a huge advantage. And that, too, is deeply motivating.

    • gregashman says:

      Perhaps the thread is a little tenuous and perhaps I have yielded to the temptation to shoehorn in papers that I happened to find interesting. To my mind, the second paper shows that explicit instruction is best for boosting students’ sense of themselves as scientists which, if applied to maths, the first paper suggests would be more likely to make problem solving pleasant. Which I reckon would be more motivating.

