Project based learning – further analysis 

I’ve now looked at the Education Endowment Foundation (EEF) report on project-based learning (PBL) in some detail. It gets even more fascinating.

There was a high attrition rate: half of the schools that initially adopted the intervention and paid the £10,000 start-up cost dropped out before the end of the study. This affects the security of the evidence.

But why did the schools ditch PBL? I imagine that they felt it wasn’t working for them. So if they had stayed in the project then it is plausible that the results would have been even more negative. This attrition itself also strikes me as further evidence of the failure of PBL. Why?

As John Hattie has suggested, the vast majority of education interventions ‘work’ because people have expectations of success, invest in them financially and with their own pride and are backed by support from consultants. If something doesn’t work under such favourable conditions then that’s a bad sign.

Another interesting dimension is the way that the project changed during the study. It was initially intended to last two years but that was reduced to one because nobody would commit more funding. And they initially intended to measure outcomes in English, maths and science:

“The original protocol only included the academic outcome measures of literacy, maths and science. The piloting revealed that schools were not including maths as part of project based learning and that science was included only in one school. It was therefore decided to retain the literacy focus but to introduce an ‘engagement’ secondary outcome as this seemed more aligned with the intervention’s aims.”

So they ditched maths and science because the students weren’t learning any (remember that this took up 20% to 50% of the timetable) and added engagement because they thought this might generate a more positive outcome. Yet at the end of the study there was no evidence even for an improvement in engagement.

The EEF acknowledge that there is no strong evidence for PBL from previous studies:

“The existing evidence for a causal link between PBL and attainment outcomes seems to be weak. Most of the reviewed studies did not involve random allocation of participants to control and experimental groups and, as a result, a causal link between project based learning instruction and positive student outcomes has not been established.”

The PBL intervention was designed and led by The Innovation Unit, and they reiterate this point in a blog post responding to the EEF study:

“We know that it can transform academic and personal outcomes for students, so why is it so hard to find out if that’s true in general? In particular, why hasn’t a major study funded by a serious organisation been more conclusive, one way or the other?”

I’m not sure that it can transform academic and personal outcomes and I suspect that this is why no major study by a serious organisation has been more conclusive.

At best, PBL might work well as an enrichment task for those students who already have knowledge and lots of resources – eg support from home – to draw upon. This is why I have dubbed it a ‘pedagogy of privilege‘. 

Interestingly, the consultant who recently ran PBL training in New South Wales is British and is a Senior Associate at The Innovation Unit. So we might expect that the PBL he is promoting in Australia has many similarities to the program that failed in the EEF study.


9 Comments on “Project based learning – further analysis ”

  1. Queen's English says:

    I liked this: “The process evaluation, which was based on classroom observations and feedback from headteachers, project leads, teachers, and pupils in the schools, as well as the Innovation Unit delivery team, did provide evidence of positive benefits from doing PBL, in particular in terms of developing oracy, communication, team working, and research skills.”

    So, in other words, ‘We FELT that there was a benefit to oracy, communication, team working and research skills.’

    That’s a bit thin, after all the time, energy and money spent on it, particularly considering the other negative outcomes.

  2. pricedav says:

    Greg: Never let the truth get in the way of a good personal bias.

    Where does it say in the evaluator’s report that they ditched maths and science because they weren’t learning any? The decision to concentrate on literacy was made BEFORE the trial even began.

‘Analysis’ is a bit rich when you say “I imagine they felt it wasn’t working for them.” This is simply speculation, which, for an education researcher, isn’t very rigorous. Especially since the evaluator’s report makes it quite clear why schools withdrew: some did so in protest at the EEF not funding a second year, another because they believed you can’t compare a PBL innovation of one year against traditional programmes that have been long running. Others had been judged ‘Requires Improvement’ and you will know what that means: a non-stop diet of maths and English to boost test scores.
    The schools in the intervention group were not similar to those in the control group: “8 of 11 (intervention) schools in the study were ‘Requires Improvement’ (an OFSTED categorisation indicating cause for concern) or worse (national average is 1 in 5)…The control group were stable in comparison, with 8 of 12 schools Good or Outstanding” .
In other words, the control group’s students’ command of literacy was likely to be quite a bit better than that of those in the intervention group, even without the introduction of PBL.
Anyone who read the evaluator’s report would have seen the conclusion: “it is not possible to conclude with any confidence that PBL had a positive or negative impact on literacy outcomes”.

In other words, you’re putting words into the mouths of the researchers from Durham and York who did the evaluation. If you can’t be bothered to read the full report, at least read this: http://engagedlearning.co.uk/lies-damn-lies-and-conscious-misrepresentation-of-evidence/

As for your final comment – that’s more speculation, I’m afraid.

    Let’s see if you approve of this comment….

    • gregashman says:

      Thank you for taking the time to respond. I appreciate the fact that you have avoided launching personal attacks. I think we all gain from discussing the issues.

      In terms of maths and science, I quoted from the evaluation report itself and then paraphrased it.

      I am sceptical that the EEF themselves know exactly why schools dropped out of the trial. At best, they know what the schools wanted to tell them. I have a right to speculate.

      You are absolutely right about the disparity in Ofsted ratings between the experimental and control groups. The trouble with the kind of matched randomisation that was done here is that if you match schools closely for one set of factors then they are likely to diverge for another set of factors. It might have been better to have a larger sample or maybe randomise by classes rather than schools.

      If I were leading a school that inspectors had deemed “requires improvement” then I would be looking to implement interventions that would lead to improvements. Obviously, project-based learning was not such an intervention in the judgement of these headteachers. And I agree with them. There is a lack of evidence in general for PBL and the way that it acts has previously led me to suggest it is a ‘pedagogy of privilege’ that does the most harm to the most disadvantaged – something that seems to have been tentatively confirmed by this research.

      Perhaps we can agree that, whether it is effective or not, project-based learning takes more than a year to get going and is probably not suited to schools facing challenging circumstances? Heads should treat it with extreme caution?

      In terms of the literacy data, my understanding is that they controlled for KS2 results so if the students in the experimental schools were more disadvantaged then this would have been taken into account. But I might have misinterpreted this.

  3. pricedav says:

Well, it took us some time, but we finally did manage to agree on something! It ISN’T suited to schools in UK-type challenging circumstances, yet there are 100+ Expeditionary Learning schools, mostly in areas of high deprivation, which have near 100% college graduation rates with PBL as a central plank, versus 12 schools in the trial. So I don’t think it’s fair to call it a pedagogy of privilege.
    Nice phrase, though.

  4. Janet Downs says:

    The EEF ‘ditched’ the evaluation of maths and science before the main trial not during it. Maths and science were replaced by ‘engagement’ after the pilot scheme ended.

You ‘imagine’ the schools dropped out because PBL wasn’t working for them. This is supposition. Both intervention schools and control schools (which weren’t using PBL) dropped out. The Innovation Unit, which developed the REAL projects, said the schools involved were ‘disproportionately facing challenging circumstances’ such as changes in leadership and academy sponsor. This made it difficult, if not impossible, to cope with structural changes at the same time as a major trial. The Unit said the high drop-out rate underlined the ‘challenges’ of introducing innovation in schools and evaluating such interventions. http://www.innovationunit.org/blog/201611/education-endowment-fund-report-published-today

    • gregashman says:

      “The EEF ‘ditched’ the evaluation of maths and science before the main trial not during it. Maths and science were replaced by ‘engagement’ after the pilot scheme ended.”

      I’ve not said otherwise. I quoted from the report and that quote mentions the piloting.

      “You ‘imagine’ the schools dropped out because PBL wasn’t working for them. This is supposition.”

      Yes – I assert my right to speculate. See my response to David Price.

      “Both intervention schools and control schools (which weren’t using PBL) dropped out.”

      The report states that all 12 control schools completed both the attitudinal survey and outcome assessment.

      “The Innovation Unit which developed the REAL projects said the schools involved were ‘disproportionately facing challenging circumstances’ such as changes in leadership and academy sponsor. This made it difficult, if not impossible, to cope with structural changes at the same time as a major trial. The Unit said the high drop out rate underlined the ‘challenges’ of introducing innovation in schools and evaluating such interventions.”

      See my response to David Price.

      • Janet Downs says:

Re drop-out of schools taking part: apologies. The EEF summary said, ‘The amount of data lost from the project (schools dropping out and lost to follow-up) particularly from the intervention schools…’. This ambiguous statement led me to think both groups had problems with drop-out, when in fact this was a particular problem with the intervention group.

        I have now read the full report and see that all control schools completed the trial.

        However, the EEF noted that ‘a number’ of schools in the control group had adopted PBL or similar strategies. This makes the trial data even less secure.

        Your reply to David Price said: ‘I am sceptical that the EEF themselves know exactly why schools dropped out of the trial. At best, they know what the schools wanted to tell them.’

        This isn’t the case although someone unfamiliar with what’s going on in England would not know that. When a school decides to convert to academy status (or is forced to convert following a poor Ofsted judgement), it becomes common knowledge. Stakeholders are informed (this is called a ‘consultation’, but the decision’s already been made). The Department for Education lists which schools are in the process of converting. Such information is, therefore, in the public domain. The EEF didn’t have to rely on being told by the school.

        A change to academy status means a change in governing body and frequently a change of head. It is also often accompanied by other changes: name, uniform, ethos. All this is disruptive.

There is no doubt that PBL is challenging to implement. But it does not follow that finding something challenging means it is a failed method. I accept that it needs to be implemented properly. School21 (mentioned by the EEF) is an example of a school which bases its whole curriculum on PBL. It was judged Outstanding by Ofsted. And a little-known annual project by the National Gallery encourages project-based learning based on a picture from the Gallery. The current display is here: http://www.takeonepicture.org.uk/exhibition/2016/index.html

