I have had a paper that I co-wrote with my PhD supervisors accepted for publication in *Educational Psychology Review*. Having studied the copyright statement, I am clear that I am able to publish something called the Author’s Accepted Manuscript (AAM) on this personal blog. The link to the AAM version is at the bottom of this post. The link to the published version is here, and you may read it online (but not download or print it) here.

Briefly, the purpose of the study was to investigate the competing predictions of cognitive load theory and productive failure in the context of relatively high element interactivity learning materials. Productive failure research suggests that, provided certain conditions are met such as the task being intelligible to non-experts, there is a benefit to open-ended problem solving prior to receiving explicit teaching. However, cognitive load theory suggests that this is likely to overwhelm working memory and so explicit teaching prior to problem solving should be more effective.

The learning task was drawn from middle school / upper primary science and involved determining which was the most efficient light globe, given data on energy use. There were two experiments in total that both took place in a lecture theatre and had the same basic design. Half of the students were randomly assigned to complete problem solving while the other half completed a reading filler task. All the students then received interactive explicit teaching at the same time in the hall – all students had a calculator and were asked to perform the calculations at the same time as the teacher before holding their calculators aloft. The tasks were then reversed, with the students who initially completed the problem solving task completing the filler task and vice versa.

In their next science lesson, the students then completed a post-test consisting of two parts. The first part closely replicated the questions in the problem solving booklet – ‘similar questions’ – and the second part replicated the same problem structure but with varying contexts or information – ‘transfer questions’. For instance, one of the transfer questions was about electric fans, another provided redundant data, and so on.

In the first experiment (N=64), students who received explicit teaching first performed better than those who completed problem solving first on similar questions, but there was no significant difference on transfer questions. In the second experiment (N=71), an additional step was introduced for the students to complete, increasing the element interactivity. This time, students who received explicit teaching first performed better on both sets of questions.

One strength of this design is that all students received exactly the same explicit teaching, in the same place and at the same time. This reduced the possibility of the teacher unconsciously altering the teaching for the different groups. A weakness is that there is a time lag between the two conditions. Students who received the explicit teaching first, for instance, may have been more cognitively fatigued by the time they reached the problem solving task. A weakness of the study as a whole is that, due to the school timetable, the post-tests for the two experiments took place after different delays.

Another point worth noting is that we chose to assess underlying understanding through transfer questions rather than by attempting to assess it directly, as many similar studies have done. Assessing by transfer may be a more robust measure of understanding. For instance, in some maths studies, understanding of the principle of equivalence is directly assessed by asking students to explain what the “=” sign means in an equation. However, it is possible to teach students such a definition directly – a fairly low element interactivity learning task – and this may not therefore always represent a flexible understanding that can be applied in different situations. By using transfer, flexibility of understanding is required. We discuss this point in the paper.

*Update: A few people have expressed confusion about exactly what students were asked to do. They were asked to decide which light globe was the most ‘energy saving’. The canonical approach to doing this is to calculate efficiency. Here is an example question.*
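To make the update above concrete, here is a minimal sketch of the kind of calculation involved. The globe names and energy figures are invented for illustration – they are not taken from the study – and it simply assumes efficiency is the fraction of input energy converted to useful light output:

```python
# Illustrative only: hypothetical globes with made-up energy figures.
# Efficiency = useful light output / total energy input.
globes = {
    "Globe A": {"light_output_J": 12.0, "energy_input_J": 60.0},
    "Globe B": {"light_output_J": 9.0, "energy_input_J": 40.0},
    "Globe C": {"light_output_J": 11.0, "energy_input_J": 42.0},
}

def efficiency(light_output_j, energy_input_j):
    """Fraction of the input energy that becomes light."""
    return light_output_j / energy_input_j

for name, data in globes.items():
    eff = efficiency(data["light_output_J"], data["energy_input_J"])
    print(f"{name}: efficiency = {eff:.1%}")

# The most 'energy saving' globe is the one with the highest efficiency,
# NOT the one with the highest total light output.
most_efficient = max(
    globes,
    key=lambda n: efficiency(globes[n]["light_output_J"],
                             globes[n]["energy_input_J"]),
)
print("Most energy saving:", most_efficient)
```

Note that in this made-up data, Globe A produces the most light in total but is the least efficient – the distinction between total output and relative efficiency that comes up in the comments below.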

Nice study Greg!

Thanks

While hardly riveting reading, that paper was the clearest and most logically laid out bit of research I have read in a while. Brutally on point. I particularly liked the well balanced theoretical overview, which explained why the experiment was important.

As an aside, you may wish to add to your blog what the extra element interactivity was. The calculation of light output was especially evil, as it would encourage a potential misconception of efficiency, which they were subsequently asked to calculate (total output rather than relative efficiency). It would help people who only read the blog to see why that would make the questions much harder, while providing a concrete example of element interactivity. Being able to spot that misconception and avoid it is deep learning, aka expertise.

Most people won’t bother to read the paper, and this detail was the most important distinction.

Thanks – I’m wary of putting too much detail into the blog in case it makes it a bit clunky.

I figured that. Though by mentioning it and not explaining it I did find it distracting. Made me read the paper though.