Group work and the Education Endowment Foundation

Posted: May 28, 2017
But there are genuine and serious reasons to be concerned about the kind of meta-analysis that the EEF curates. Following my last post, someone pointed me to the EEF toolkit, which includes a review of collaborative learning. At face value, it looks as if adding collaborative learning to whatever you are doing already will mean that your students make an extra five months of progress. How extraordinary!
When you dive into the detail, there are a few caveats. You can’t simply implement any old group work in order to obtain the advantages. You need to “get the detail right” and use “structured approaches with well-designed tasks.” So you can throw out any plans you had of implementing unstructured group work with poorly designed tasks. The EEF also note that, “There is some evidence that collaboration can be supported with competition between groups, but this is not always necessary, and can lead to learners focusing on the competition rather than the learning it aims to support.”
This point is really interesting. I was surprised when I looked into the group work evidence to find that at least some of it is based upon experiments that manipulate different systems of rewards (e.g. here). In these studies, rewards are handed out either individually or to groups, and the evidence suggests that the students earning group rewards learnt the most (at least from completing assignments that seem to involve little direct teaching). This is probably why Robert Slavin emphasises the need for group goals in order for collaborative learning to be successful. It seems like the EEF are aware of this but don’t really like the finding.
I haven’t examined the papers in the EEF’s technical appendix but I note that some of them appear to be similar to Slavin’s sources. If you are interested in looking into these papers then I suggest examining the methods closely. What are the comparison groups? Is collaborative learning being tested against individually studying from a worksheet or book or is there actual teaching involved? What are the age ranges of the students involved? What are the outcomes?
Such an analysis is not the purpose of this post. According to the EEF, five months of progress equates to an effect size of about d=0.44 which, even if you still believe in effect sizes, is neither here nor there (see my take on effect sizes here). Moreover, the EEF state that, “The effects vary… with pooled effects between 0.09 and 0.78 but no clear explanation of what explains the variation.” That, too, is neither here nor there.
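For readers unfamiliar with the statistic, Cohen’s d is simply the difference between two group means divided by their pooled standard deviation. A minimal sketch of the calculation, using entirely hypothetical numbers (not drawn from any of the EEF’s sources), shows how a modest raw difference translates into an effect size of around 0.44:

```python
def cohens_d(mean_treatment, mean_control, sd_treatment, sd_control,
             n_treatment, n_control):
    """Cohen's d: difference in means over the pooled standard deviation."""
    pooled_var = (((n_treatment - 1) * sd_treatment ** 2 +
                   (n_control - 1) * sd_control ** 2) /
                  (n_treatment + n_control - 2))
    return (mean_treatment - mean_control) / pooled_var ** 0.5

# Hypothetical test scores: a 4.4-point gain on a scale with SD of 10
# gives d = 0.44, the size of effect the EEF translates into 'five months'.
print(round(cohens_d(54.4, 50.0, 10.0, 10.0, 100, 100), 2))  # 0.44
```

The point of the sketch is that d depends entirely on the spread of scores in the samples studied, which is one reason pooling such effects across very different studies is questionable.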
And what of the absurdity of treating group work in this generic way? Are we meant to assume that we would gain five months of progress by adding group work to drama lessons? It’s hard to think of drama lessons without it. Are the findings equally valid for science and maths and reading and art?
Nevertheless, the influence of the EEF toolkit is expanding. We now have an Australian version based on the same data set. Busy school leaders will reach for it as a source of ‘evidence’.
What’s the harm? Well, imagine an excellent maths teacher who instructs a whole class in an interactive way. Imagine that teacher being told to incorporate more group work because the evidence supports it. That’s the potential harm.