The article that England’s Chartered College will not print

The following article was submitted to Impact, the trade journal of England’s Chartered College of Teaching. This is the version that was revised in response to the first three peer review reports. You can read about the process here. I will follow up by discussing the peer reviews. Unfortunately, I don’t have permission from the College to print these in full. For now, read the article and see what you think:

If you are a teacher or a school leader and you visit the Education Endowment Foundation’s (EEF) online toolkit, you will notice one ‘strand’ that stands out from the rest. Implement ‘meta-cognition and self-regulation’ in your school and you can expect your students to make an additional eight months of progress. But wait, it doesn’t stop there. Implementation is low cost. And there’s more. The evidence for its effectiveness is even stronger than the evidence supporting the use of feedback. It’s a no-brainer then. Off you go and do it!

What is that, you say? You are not sure what ‘meta-cognition and self-regulation’ is? Perhaps Kevan Collins, Chief Executive of the EEF, can help. According to Collins, ‘Meta-cognition is getting beyond – above the actual thing – to have a better sense of it’ (Collins, 2017). Does that help?

It did not help me much, and so I decided to look at the studies that sit behind the EEF figures. What I discovered caused me to question the headline claims. It seems as if, like the mythical chimera, the category has been stitched together from a range of different beasts. Moreover, the outcomes we should expect vary greatly, depending on what kind of approach we select.

The EEF produce a ‘technical appendix’ for each of their toolkit strands and so I consulted the technical appendix prepared for meta-cognition and self-regulation (Education Endowment Foundation, 2016). It lays out two sources of evidence. The first is a range of meta-analyses conducted by different education researchers that seek to draw together and synthesise the findings from many different studies. The second is a set of individual studies, mostly conducted as randomised controlled trials by the EEF itself.

An ‘effect size’ is calculated for each of the meta-analyses and studies and these are combined by the EEF, in a further layer of meta-analysis, to produce an overall effect size which is then used to generate the headline figure of eight months of additional progress. Combining effect sizes through meta-analysis is controversial because the conditions of a study can influence the effect size. For instance, the age of the subjects and whether the outcome measure is designed by the researchers can both influence effect sizes (Wiliam, 2016). In the case of meta-cognition and self-regulation, this issue is compounded by the fact that the outcome measures vary widely from reading to maths to critical thinking.
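To make concrete what this pooling involves, here is a minimal sketch in Python. The numbers are invented for illustration and the functions are simplified stand-ins, not the EEF’s actual method, but they show how a standardised effect size is computed from two groups and how several effect sizes are then averaged by inverse-variance weighting, regardless of whether the underlying outcomes are comparable:

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardised mean difference between two groups,
    using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n_treat - 1) * sd_treat ** 2 +
                           (n_ctrl - 1) * sd_ctrl ** 2) /
                          (n_treat + n_ctrl - 2))
    return (mean_treat - mean_ctrl) / pooled_sd

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted average of study effect sizes,
    as used in a simple fixed-effect meta-analysis."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Three hypothetical studies measuring quite different outcomes
# (say, reading, maths and critical thinking). The pooling
# returns a single number regardless.
effects = [0.62, 0.15, 0.40]    # invented effect sizes
variances = [0.02, 0.01, 0.05]  # invented sampling variances
print(round(fixed_effect_pool(effects, variances), 2))  # -> 0.32
```

Nothing in the arithmetic objects to combining a reading effect with a maths effect; any objection has to come from the analyst.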

Moreover, we might expect effect sizes to be influenced by the quality of the study design, and the technical appendix seems to support this conclusion: the effect sizes from the more rigorous EEF randomised controlled trials are generally lower than those from the meta-analyses.

Such problems have led Dylan Wiliam, Emeritus Professor of Educational Assessment at the UCL Institute of Education, to conclude that ‘…right now meta-analysis is simply not a suitable technique for summarizing the relative effectiveness of different approaches to improving student learning…’ (Wiliam, 2016). It is therefore not clear that meta-analysis is an appropriate way of evaluating educational interventions at all, and it certainly calls into question the EEF’s approach of attempting to derive an overall effect size from multiple meta-analyses.

If we focus only on the randomised controlled trials conducted by the EEF, the case for meta-cognition and self-regulation seems weak at best. Of the seven studies, only two appear to have statistically significant results. In three of the other studies, the results are not significant and in two more, significance was not even calculated. This matters because a test of statistical significance tells us how likely we would be to obtain results at least as extreme as these if the intervention really had no effect. If results are not statistically significant then they could well have arisen by chance.
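As a rough illustration of why this matters, the following simulation uses invented numbers, not data from any EEF trial. It estimates how often a completely ineffective intervention would still appear to deliver a gain of 0.2 standard deviations in a trial with 50 pupils per arm:

```python
import random
import statistics

random.seed(1)

def chance_of_spurious_gain(n_per_arm=50, sims=10_000, observed_gain=0.2):
    """Both 'arms' are drawn from the same distribution, so the
    intervention truly does nothing. Count how often the treatment
    group nevertheless beats control by at least `observed_gain`
    standard deviations."""
    hits = 0
    for _ in range(sims):
        treat = [random.gauss(0, 1) for _ in range(n_per_arm)]
        ctrl = [random.gauss(0, 1) for _ in range(n_per_arm)]
        if statistics.mean(treat) - statistics.mean(ctrl) >= observed_gain:
            hits += 1
    return hits / sims

print(chance_of_spurious_gain())  # roughly 0.16 under these assumptions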

Furthermore, the diversity of approaches sitting under the EEF label of meta-cognition and self-regulation is astonishing. In Philosophy for Children, for instance, teachers use stimulus material to initiate class discussions around concepts such as truth. This supposedly has an impact on their maths performance (Gorard et al., 2015) although the way that this is meant to happen seems spookily mysterious and the lack of a test of statistical significance does not fill me with confidence.

In contrast, Improving Writing Quality is an intervention where students are explicitly taught how to plan, draft, edit and revise their writing. This was one of the two EEF studies with a statistically significant result and it was the one with the largest effect size. This is hardly surprising because explicit writing interventions have repeatedly been shown to be effective at improving students’ writing (Torgerson et al., 2014). Moreover, in contrast to Philosophy for Children, the way that it works is highly plausible.

What do these approaches have in common and what do they have in common with a science intervention such as Thinking Doing Talking Science (Hanley et al., 2015) or a growth mindset intervention? True, they all involve students in thinking, but then so does every other educational activity.

One intervention, Let’s Think Secondary Science, has not yet made it into the data pool for meta-cognition and self-regulation and I’m not clear as to why, although it may just be due to the timing of the study. It is based on the Cognitive Acceleration in Science Education projects of the late 1980s and early 1990s and has similarities to Thinking Doing Talking Science, yet when tested by the EEF it was found to have no effect on learning (Hanley et al., 2016).

It therefore matters greatly what type of intervention we select and what outcomes we are intending to improve by selecting it. By stitching together explicit writing interventions with philosophical discussions, the EEF have created a monster; a chimera that hinders our ability to understand what works best in schools. Teachers and school leaders would be wise to read the underlying studies in the meta-cognition and self-regulation strand and draw their own conclusions. For its part, the EEF should get, ‘beyond – above the actual thing – to have a better sense of it,’ and then break it apart.

References

Collins K (2017) Sir Kevan Collins on Metacognition. Vimeo. Available at: https://vimeo.com/225229615

Education Endowment Foundation (2016) Technical Appendix: Meta-cognition and self-regulation. Education Endowment Foundation. Available at: https://educationendowmentfoundation.org.uk/public/files/Toolkit/Technical_Appendix/EEF_Technical_Appendix_Meta_Cognition_and_Self_Regulation.pdf

Gorard S, Siddiqui N and Huat See B (2015) Philosophy for Children: Evaluation Report and Executive Summary. Education Endowment Foundation. Available at: https://educationendowmentfoundation.org.uk/public/files/Support/Campaigns/Evaluation_Reports/EEF_Project_Report_PhilosophyForChildren.pdf

Hanley P, Slavin R and Elliott L (2015) Thinking Doing Talking Science: Evaluation Report and Executive Summary. Education Endowment Foundation.

Hanley P, Böhnke JR, Slavin R, Elliott L and Croudace T (2016) Let’s Think Secondary Science: Evaluation Report and Executive Summary. Education Endowment Foundation. Available at: https://educationendowmentfoundation.org.uk/public/files/Projects/Evaluation_Reports/Lets_Think_Secondary_Science.pdf

Higgins S, Hall E, Baumfield V and Moseley D (2005) A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In: Research Evidence in Education Library. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.

Torgerson D, Torgerson C, Ainsworth H, Buckley HM, Heaps C, Hewitt C and Mitchell N (2014) Improving Writing Quality: Evaluation Report and Executive Summary. Education Endowment Foundation. Available at: http://educationendowmentfoundation.org.uk/uploads/pdf/EEF_Evaluation_Report_-_Improving_Writing_Quality_-_May_2014_v2.pdf

Wiliam D (2016) Leadership for Teacher Learning. Moorabbin, Victoria: Hawker Brownlow Education.


19 thoughts on “The article that England’s Chartered College will not print”

  1. Thanks, Greg, a very detailed and clever piece. I like the twist at the end.

    As Karl Popper said,

    ‘Those amongst us unwilling to expose their ideas to the hazard of refutation do not take part in the scientific game.’

  2. It is a real shame, more than that actually, that this was not published. It needs a proper response from the EEF and others to try to counter your claims if they can. Without the challenge that this article presents, many teachers and teacher leaders will be left thinking that the EEF provides answers where your article suggests it may not.

  3. Well done for your investigation of metacognition. Do you (as I do) see significant cross-over between this argument over metacognition and the wider (and perhaps more prominent) argument over Assessment for Learning? For Dylan Wiliam (guru as he may be on the subject of meta-analyses) metacognition and self-regulation are important elements in his account of AfL, and I am sure you are familiar with Randy Bennett’s critique of AfL on the basis that no-one can be sure what it is (https://www.skolverket.se/polopoly_fs/1.126741!/formative_assassement.pdf). This strikes me as a collection of nested onion-skins of poorly defined concepts, and Wiliam’s dismissal of Bennett’s arguments in his recent book “Leadership for Teacher Learning” seems to me peculiarly unconvincing.

    My second take-away is that we should stop thinking that the EEF is the panacea for all poor research (which is surely how it was intended). It seems to me to be repeating all the same mistakes that everyone else in educational research has committed. I think this suggests that the problems are not down to incompetence or misguided ideology but are systemic. I would be very interested in your response to my case that we will never solve the educational research riddle until we start using digital technology effectively – see my https://edtechnow.net/2015/09/10/red15/.

    Thanks, Crispin.
