Evidence in education

I am unashamedly in favour of using evidence to inform what we do as teachers. This is the reason why I am completing an empirical PhD in an area that I believe to have great potential to guide our practice: Cognitive Load Theory.

However, I also recognise that no matter how good educational research may be, it is never going to tell us how to deal with the boys in the lunch queue. Some commentators will point to this complexity and declare a plague on all educational research. They might even start mentioning ‘critical realism’ or other such verbose and obscure social theories. But there really is no need. My position is a consistent one.

Where there is strong and consistent evidence then we should really be guided by that evidence. Where no such evidence exists then we should use our craft knowledge, underpinned by our humanistic principles. We can do both of these things.

And this is not about Randomised Controlled Trials (RCTs). These can be incredibly useful, but they are generally quite expensive and so there are not that many to draw upon. They can also be badly designed to the point where they can only deliver a desired result and can tell us little or nothing about the strategies supposedly being tested.

And it is not about effect sizes. The more I have read, the less convinced I have become that effect sizes work as some kind of across-study comparison measure. An effect size of 0.3 from a well-designed RCT may be far more meaningful than an effect size of 1.2 from a badly designed, poorly controlled trial with a dodgy test at the end of it. It simply does not all come out in the wash in the way that is sometimes supposed.

Instead, we should be looking for a volume of replicated studies where the strategies that are tested are well-designed. An example of such evidence is the evidence supporting systematic synthetic phonics (SSP) as an approach to teaching reading. Not all of these studies are RCTs. Yet the findings are so consistent that we now have three national panels from the US, UK and Australia all confirming that the weight of evidence supports SSP.

It really would be perverse to ignore it.

8 Comments on “Evidence in education”

  1. suecowley says:

    Teachers in England can’t ignore SSP, because it is mandated via the QTS standards (the only method that is) and this mandate is then enforced via the phonics test. Do you agree with the government that mandates should be used in teaching? Or would you prefer a focus on the outcomes achieved, rather than the approaches used to achieve them, as seems to be the current drive in reforms to Ofsted?

    • It’s rather naive to say that because something is mandated through official requirements, it can’t be ignored. For any programme to be properly carried out, the hearts and minds of classroom practitioners need to be changed. We have a long haul ahead to do that, after decades of progressive indoctrination.

    • Bob says:

      I think the answer probably is that this should not be prescribed by Ofsted; after all, that was exactly the criticism of Ofsted before (e.g. Peal’s book). Unless, of course, it was more a case of not liking *that* specific judgment by Ofsted while not minding judgments one agrees with. Which would be very opportunistic.

      • gregashman says:

        Mandating bad ideas is, indeed, bad. However, it does not necessarily follow that mandating good ideas is also bad. But it could be.

    • gregashman says:

      This post is not really about mandation. That would require a more detailed response. I certainly think that mandating bad ideas is a bad plan. However, I am not so sure about proven approaches such as SSP. Regardless of mandation, I doubt from personal experience that the majority of primary teachers use SSP, although I am willing to reconsider this if evidence exists. The phonics check is an interesting experiment. In one sense, it measures outcomes and so perhaps this is a focus on outcomes.

  2. […] Evidence in education. […]

  3. Dick Schutz says:

    we should be looking for a volume of replicated studies where the strategies that are tested are well-designed
    Amen! The thing is though, the evidence is that we don’t now have these replicated studies. What we have is a body of educational research that can be used to support any claim about schooling that anyone cares to make. It really would be perverse to ignore the evidence, wouldn’t it?

    Don’t get me wrong. I fully agree with your views about Cognitive Load Theory, RCTs, Effect Size, research, and the blog in general. But much of the current EdLand buzz for “evidence,” “evidence-based decision making,” “research,” and such is akin to alchemy and ignores the “on the ground” status. It puts teachers in an impossible bind, and although well-intended is generally counterproductive.

    Although Schooling Intelligence (in the sense of Military Intelligence and Business Intelligence) is currently very weak, strengthening it is a very tractable matter. For example, the "best evidence" regarding reading instruction is in the Natural Educational Experiment currently in progress in England. At a national level, the data are conclusive that the UK "treatment" is interocularly significant (the difference hits you between the eyes with no statistical processing required) compared to the control groups of the United States and Australia.

    However, as-yet-unexamined variability in the SSP treatment remains at the local-authority, school, and classroom levels. The data indicate that the variability is by-and-large not a function of student biosocial characteristics, but of how schools and teachers choose to interpret "SSP." (SSP is "the law," but the majority of schools and teachers report using "mixed methods" to "augment phonics." The majority of UK opinion conforms to that in the US, where the term "balanced literacy" is used to describe the same instructional practice.)

    The Experiment is not recognized as such in England and has for all practical purposes escaped notice in other parts of the UK and in countries outside the UK.

    My point is that opportunities for experimentally generating "replicated studies" using tried-and-true scientific and technological methodology abound in precollegiate schooling. Each year a fresh cohort of "subjects" enters the schooling pool. Their instruction and nurturance are still a black box, but at this point in time there are no scientific or technical obstacles to rapidly improving Schooling Intelligence.

