Bucking the trend – Part II


It would be pretty silly, would it not, to point at a society very different to Australian society and say, “These guys do well in the Programme for International Student Assessment (PISA) so let’s copy what they do.” Unfortunately, that’s about as sophisticated as much of the discussion about Finland has been. We simply cannot know whether anything we identify about Finnish education, or anything that Finnish educators highlight as a cause of their success, is the reason for the relative difference in performance of Finland and Australia.

And while the example of Singapore aligns far better with my own particular biases about what good education looks like, I would fault any similar approach that swapped out Finland as our object of affection and replaced it with Singapore (like that would even happen, but, you know).

Instead, the better comparison is to examine trends and variations within systems (and PISA have worked hard to make such data available, as I have written about before). This is what is so interesting about the graph in Part I of these two posts. Yes, I can see there is a trend and that’s what makes those schools that are bucking that trend so interesting. Perhaps we can learn something from them.

However, in order to learn from outlying schools, there needs to be some variation. If every single school in a state requires teachers to issue learning styles assessments and differentiate according to learning styles, we can draw precisely no inferences about the utility of this approach from a comparison of the performance of these schools. Fortunately, we have other evidence to draw upon in this case. The problem arises when we wish to draw inferences about the kinds of complex real-world approaches that schools tend to adopt – those large, messy policies that bridge research, experience, local conditions and inspiration.

That’s when introducing controlled variation to a system can help. No, it will never prove a cause-and-effect relationship, but it can enable us to make a few more tentative inferences. This is aided when performance information is made publicly available. In Australia, we have the MySchool website, set up for the very specific purpose of informing parents about local schools. The UK government makes data analysis easier. For instance, at the ‘compare school performance’ website, you can download spreadsheets full of progress data. This has not yet been updated for 2019.

However, you still need the variation. You need schools to be pursuing different approaches so that you have a chance of learning something about those approaches. In the UK this seems to have been aided by greater school autonomy and the Free Schools movement. In his recent post for the campaigning group Parents and Teachers for Excellence, Mark Lehain notes that a pattern is starting to emerge where those schools that combine a ‘warm-strict’ approach to behaviour with a knowledge-rich curriculum are appearing as outliers in the data. Once the 2019 data is available, we can assess this more systematically.

I would like Australian State and Federal politicians to reflect on how we may introduce more planned variation into our own state education systems and how we might learn from the natural experiment being conducted in England. After all, England and Australia have far more in common with each other than either does with Finland or Singapore, so there is a good chance that promising approaches identified by variation within England will also be promising here.


Bucking the trend – Part I

There were a number of responses on Twitter when Pasi Sahlberg posted a Tweet apparently quoting Diane Ravitch:

At a basic level, the statement contradicts itself. If a student taking the same test at different times will get different results then that student’s results cannot be ordained by their family income and parents’ education. Nevertheless, being charitable, there is a sense in which Pasi and Ravitch are right and we will return to that later.

Looking at the broader picture, there is irony in Sahlberg coming out against standardised testing, if that is what he is doing. Sahlberg is currently professor of education policy at the Gonski Institute for Education at the University of New South Wales in Sydney, but he is originally from Finland. He rose to prominence as an authority on Finnish education, writing books on what the world can learn from Finland. And the world is keen to learn these lessons. Why? In the early rounds of the Programme for International Student Assessment (PISA), Finland gained some of the highest results and this sparked a rush to find out how they did it. But PISA, of course, is a set of standardised tests.

Since those early days, Finland’s PISA performance has significantly declined and the factors often cited as the cause of its early success are likely to be wide of the mark. It still performs relatively well compared to other countries, but this has never been a valid comparison. Countries differ on a variety of factors from the homogeneity of the population, all the way down to the home language and how regular and easy it is to learn. This means that the direction of travel of a particular state or country tells us more than any comparison between different countries.

So in what sense are Sahlberg and Ravitch right about standardised testing? Well, it certainly correlates strongly with family background. In Australia, schools are given an ICSEA score that measures educational advantage. School students also sit standardised NAPLAN assessments in English and mathematics. The correlation between the two is striking (thanks to Julian Rossi, @julianvrossi):

If Sahlberg and Ravitch made the claim that standardised tests are unreliable at the student level but the aggregate scores correlate with family background at the school level, then the claim is more justified. No assessment is ever completely reliable at the student level and therefore accepting their point about reliability depends on how much variation you are prepared to tolerate.

Why would standardised test scores correlate to family background? If all else is equal, it makes sense that children from financially stable homes whose parents are highly educated would do better than those who lack this background, even if there is a fierce argument about how much of this advantage is nature versus nurture. However, in Rossi’s graph, we can clearly see that all things are not equal. There’s a school with an ICSEA score just over 800 that is far outperforming many schools with ICSEA scores above the average of 1000. I wonder which school this is and what they are doing.
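Spotting such schools need not be done by eye. A minimal sketch, using invented numbers in place of the real MySchool data: fit a trend line of scores against ICSEA and flag the school sitting furthest above it.

```python
# Sketch only: all school names, ICSEA values and scores below are invented.
# Fit NAPLAN ~ ICSEA by simple least squares, then rank schools by residual,
# i.e. how far each sits above (or below) the fitted trend line.
schools = {
    # name: (ICSEA, mean NAPLAN-style score)
    "North PS":   (1100, 560),
    "South PS":   (1000, 500),
    "East PS":    (900, 450),
    "Outlier PS": (810, 540),  # low ICSEA, high score: bucking the trend
}

xs = [v[0] for v in schools.values()]
ys = [v[1] for v in schools.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Residual = observed score minus the score the trend line predicts
residuals = {name: y - (intercept + slope * x) for name, (x, y) in schools.items()}
print(max(residuals, key=residuals.get))  # the school furthest above the trend
```

With real data, the schools with the largest positive residuals are exactly the ones worth visiting and asking questions of.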

That’s the advantage of having standardised test scores to consult. We can now ask these questions. Even better, NAPLAN assessments take place at Years 3, 5, 7 and 9 and so, if we wish, we can examine which schools have students who make the most progress.

For instance, Blaise Joseph of the Centre for Independent Studies used NAPLAN data to identify primary schools that are bucking the trend, visited them and described some of the common themes such as strong behaviour policies, explicit teaching and evidence-informed reading instruction. Ideally, it would be good to compare this with a control group of less effective schools*, but nonetheless, this kind of analysis is useful to schools and policymakers who want to know how to improve.

In Part 2, I will examine another context where some schools are bucking the trend.


*You can see why this might be difficult to do. It’s relatively straightforward to call a high-performing school and ask if you can visit them and find out about why they are successful. It is harder to call a low-performing school and ask if you can visit them and find out why they are unsuccessful. Nevertheless, it is feasible that some schools would want to cooperate in order to find ways to improve and this kind of research should be a focus of university education faculties.

Why so many education-related freedom of information requests in the UK?

I will never forget the UK election of 1997. The UK had been governed by the Conservatives for as long as I could remember and there was a sense that the party was a spent force with no fresh ideas and riven by bitter infighting over Europe. In contrast, Tony Blair’s Labour party offered the hope of change and renewal. This was epitomised by a credit-card sized ‘pledge card’ listing five priorities that Labour would act upon as soon as they were elected (this piece of history was later repeated as farce in 2015 in the form of Ed Miliband’s tombstones). Although not on the pledge card, a key early commitment of the elected Blair government was to implement a Freedom of Information bill.

Although 1990s Britain was a democracy, it was a common perception at that time that it was a particularly secretive democracy. Although most people could see the need for government to keep certain secrets, there was a view that it erred on the side of blanket secrecy. Fear of embarrassment did not seem like a good reason to fail to disclose information to the public that was, after all, collected and generated at the public’s expense. I still believe that this is a fine democratic principle and one that is worth defending.

However, in the intervening years, something decidedly odd has happened to Freedom of Information requests. As seems fair, all such requests are publicly available on the http://www.whatdotheyknow.com website [Correction – not all requests are made through this site – thanks to Laura McInerney for pointing this out]. A number of educationalists have been lodging Freedom of Information requests related to education and then publicising the results on Twitter. Some of these requests appear, at face value, to be rather strange.

Take this request, for instance, by Ross McGill, the former teacher behind the TeacherToolkit Twitter brand. McGill sought, “an anonymised sample of 2017/18 performance management appraisal targets (set by line managers) for the senior members of OfSTED [The English schools’ inspectorate] staff, ending 31st March 2018.” This request was declined on the basis that such information could not feasibly be anonymised.

McGill also lodged another odd request. For context, Nick Gibb, an education minister in the UK, occasionally Tweets out links to blog posts he finds interesting. For instance, here is Gibb Tweeting a link to one of my posts:

McGill sought the following ‘information’:

“Nick Gibb MP – aka @NickGibbMP appears to be quoting various teachers and academics on Twitter as sources of evidence and/or recommendations as blogs, all teachers must-read.

What is the rationale for Nick selecting:

– which teachers to follow
– which blogs to read
– who tweets from his Twitter account
– the rationale for blog selection
– can any teachers email Nick their blog links?
– if any, what criteria do you use?”

Most of this is not really information at all. The Department for Education responded with:

“The department does not hold information relating to your request. The Minister runs his own Twitter account without involvement from the department.”

There have also been quite a few Freedom of Information requests about free schools and academy chains, asking about specific approaches and policies.

Sue Cowley, another UK educationalist, has lodged such requests, alongside a request that Ofsted, “…provide me with the name or names of the person or people who wrote [a speech by the Chief Inspector] and their qualifications and experience in the field of early childhood education/the EYFS.”

Ofsted did not disclose this information, stating that, “we are not disclosing this information to you as we consider it to be personal data,” before pointing out the following to Cowley in their response:

“…the Information Commissioner (ICO), listing the ‘do’s and don’ts’ for making information requests. Amongst other things, the ICO advises requestors not to:

“Use requests as a way of ‘scoring points’ against an authority (…)”

“Submit frivolous or trivial requests; remember that processing any information request involves some cost to the public purse. (…)”

“Level unfounded accusations at the authority or its staff” or

“Make personal attacks against employees.”

If you decide to make further requests to Ofsted in the future we would kindly ask for you to take into account the advice above”

It is interesting to put this in the context of the UK government pursuing policies on free schools and academies and on the curriculum that are against the progressivist orthodoxy favoured by many educationalists. I am not sure exactly what is going on, but this is not where I thought Freedom of Information would lead us.

Do progressivist teachers dislike the subjects they teach?


Last weekend, I examined the case of those who wish to mush mathematics into an inductive, subjective kind of a thing, sacrificing what makes mathematics special in the first place (see here and here). In the comments on one of these posts, the blogger Andrew Old wondered, “if the appeal of progressive education is to people who just don’t like the subject they are meant to be teaching.” Let’s examine this suggestion.

Very briefly, educational progressivism is one of the two main currents in educational thought. It is debatable how much impact it has made in practice, but it has certainly captured our schools of education and many of the other associated education bureaucracies. Some of the common themes of progressivism are a focus on the individual over the collective – often expressed in terms of being ‘child-centred’ rather than ‘teacher led’ – and a commitment to more natural, implicit forms of learning where students figure things out for themselves through play or other self-directed activities. Indeed, poor behaviour is often explained by progressivists as being caused by unnatural learning experiences that are not engaging enough. Although it might share the term ‘progressive’ with left-of-centre politics, there is no reason to conflate the two, with proponents of educational progressivism coming from both the left and right.

The central problem for progressivism is that unlike learning to speak or hunt or cooperate socially, we have not evolved to naturally acquire academic knowledge and skills because they are relatively recent inventions. So the hope of acquiring them in the same way is a forlorn one. Instead, we have traditionally grouped academic subjects semantically, according to the knowledge and skills they draw upon. Often, different academic fields represent different ways of establishing truths about the world – they have different ‘epistemologies’. My key concern about losing sight of the nature of mathematics is that we potentially lose an important way of thinking along with it.

There is therefore a clear tension between progressivism and subject disciplines. This is why we constantly hear the call to a revolution that will break down subject silos, either through project-based learning or cross-curricular topics. The trouble is that when you try to make ideas cohere around arbitrary projects and themes, you make the semantic links less clear. I suggest this affects students’ ability to form coherent schemas of related concepts, and that is another reason why progressivism has a track record of failure.

Progressivism is also the reason why subject disciplines are constantly being diluted from within by calls to humanise maths or to devote English lessons to studying the lyrics of the latest ephemeral pop star rather than boring and irrelevant Shakespeare. History and geography become social studies. Science becomes climate change and pollution. Physical Education becomes healthy lifestyles. Foreign languages become how-to-get-by-on-your-holidays.

And if you devalue the actual content of the curriculum, you need to create a new purpose for education. Hence the nebulous, ill-defined generic skills that our universities are apparently so keen to pursue. That’s no surprise.

If there is an inbuilt bias against subject disciplines in progressivism, this poses an interesting question: Which comes first? Are progressivists against subject disciplines because of their ideology or is it a dislike for subject disciplines that causes people to adopt progressivism in the first place?

Anecdotally, many of us once accepted key progressivist doctrines. It is hard not to when this ideology is largely assumed across vast swathes of the educational landscape and, not least, in schools of education. So this suggests that progressivism may be a cause. Alternatively, most of the people I can remember in my career who were comfortable with blowing up subject disciplines such as mathematics and science were school leaders from outside these subjects who appeared to hold something of a grudge. These are not people who don’t like the subject they are teaching, but they certainly don’t like the subject. So perhaps progressivism then becomes a convenient tool – a hammer to smash the idols of the old religion.

Yet going back to the original point: Are there art teachers who don’t like art? Are there mathematics teachers who don’t like maths?

I’m not sure, but if such teachers really are out there, what are they doing with their lives?

A lot of people don’t seem to know what mathematics is


Since my previous post, I have received a number of comments from people that suggest they don’t understand what mathematics is. Although there is no single, accepted definition, I think it’s important for maths teachers and educationalists to have a fairly clear idea of what maths is otherwise we risk innovating our way out of teaching it.

In short, my claim is that mathematics uses deductive reasoning whereas areas such as science and history use inductive reasoning. This means that mathematical results are certain in a way that scientific or historical truths are not. Instead, the latter are probabilistic. Here is a useful primer on inductive versus deductive reasoning. Critically, the certainty that mathematics provides has a pretty big flaw – truths are only true if the axioms they are based upon are true and mathematics cannot be used to prove its own starting axioms.

To give an example of deductive reasoning:

God favours the English

This woman is English

Therefore, God favours this woman

The logic of the syllogism is complete and flawless, but this does not mean that God favours the English. Such a question is obviously outside the realms of deductive reasoning (note that I am not claiming that all deductive reasoning is mathematics – I am using these syllogisms because they often make the point more clearly than a mathematical example).
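The separation between a valid form and true premises can even be checked mechanically. A toy sketch of my own: enumerate every truth assignment and confirm that whenever both premises hold, the conclusion holds too – which says nothing at all about whether the premises are actually true of the world.

```python
from itertools import product

# E: 'this woman is English'; F: 'God favours this woman'.
# Premise 1 ('God favours the English') is encoded as the implication E -> F.
# The form is valid if, in every row of the truth table where both premises
# hold, the conclusion holds as well.
valid = all(
    f                          # conclusion: God favours this woman
    for e, f in product([True, False], repeat=2)
    if (not e or f) and e      # premises: (E -> F) and E
)
print(valid)
```

The check passes for every possible assignment of truth values, which is precisely what makes the reasoning deductive rather than empirical.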

This is similar to the supposedly subjective bits of maths that people point me towards, such as an assumption about a base-rate probability or whether a data point is an outlier. These questions are scientific, in the broadest sense. For instance, computing p-values for psychology experiments has become controversial in recent years. Nevertheless, the deductive logic involved in computing them remains the same. The controversy lies in the inferences we draw on the basis of these values and that, again, is a scientific question. We cannot mathematically prove the validity or otherwise of the p-value.

To give another example, suppose we have a sample of 200 people and we know that 5 have influenza. We can then do plenty of maths. We can determine the standard deviation, work out confidence intervals and so on. The axioms we use come from the data and from assumptions that sit behind standard deviations and confidence intervals. We cannot mathematically prove the value of confidence intervals and we cannot mathematically prove any scientific conclusions. However, these values may be used to inform scientific, inductive reasoning.
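Worked through concretely, as a sketch using the textbook normal-approximation interval (whether that approximation is even appropriate with only five cases is itself one of those non-mathematical judgement calls):

```python
import math

n, cases = 200, 5
p_hat = cases / n  # sample proportion: 0.025

# Deductive part: given the data and the standard assumptions, the arithmetic
# of the normal-approximation 95% confidence interval is completely fixed.
se = math.sqrt(p_hat * (1 - p_hat) / n)
lower, upper = p_hat - 1.96 * se, p_hat + 1.96 * se
print(round(lower, 4), round(upper, 4))

# Inductive part (not maths): deciding whether 1.96 standard errors is the
# right yardstick, or whether this sample generalises, is a scientific question.
```

Any two people running this calculation must get the same interval; what they then infer from it is where the arguing starts.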

Essentially, applied maths makes use of maths in answering other kinds of questions. There is a maths part and a non-maths part to this process. If you are a trained scientist in the heat of problem solving, you may not see or even care about the distinction, but there is one.

Clearly, there are plenty of questions that cannot be addressed by mathematics and there are paradoxes within mathematics where mathematical reasoning breaks down (an example of a paradox is that you cannot prove the truth or otherwise of the statement ‘this statement is false’ using deductive reasoning). However, neither of these introduces matters of opinion into mathematical reasoning and neither falsifies my claim that mathematics is right or wrong. That’s like suggesting the claim that ‘a car is a form of transport’ is false because cars sometimes break down or cars cannot transport minke whales.

Similarly, Gödel’s Theorem does not prove the subjectivity of mathematics, despite what some have claimed. Ironically for those who attempt to make this argument, Gödel’s theorem is itself a beautiful piece of deductive reasoning. It demonstrates that whatever formal system of mathematics we devise, there will be mathematical truths that cannot be demonstrated to be true within that system. This provides a theoretical limit on what we can prove within any given formal system. It does not introduce subjectivity.

The beauty of maths is that it’s right or wrong


One way of understanding the subject disciplines of an academic curriculum is to see them as representing different ways of thinking; different and powerful ways. In this schema, mathematics represents deductive logic. It is iron-clad. If x=3 then 2x=6. No question. Mathematics has none of the fuzziness of the inductive logic characteristic of science and, to a certain extent, history. And it has none of the raging ambiguity afflicting the interpretist arts such as English, where you can just make up words if you want to and where I am informed by domain experts that although the word ‘literally’ literally means ‘literally’, we should be relaxed about people using it to mean ‘really’ because those two contradictory meanings can exist in some kind of lexical superposition.

Mathematics is a refuge of certainty and I am sure that this is part of the appeal for many who fall in love with the subject.

No matter, there is no object in the world so beautiful that someone out there won’t want to deface it and there are folks who are set on defacing mathematics.

Apparently, there is a move afoot to ‘humanise’ mathematics. You may wonder what is so inhuman about deductive logic, but the term seems to originate in the narrower sense of the humanities disciplines. In other words, we are talking about a movement seeking to make maths more like history.

This clearly misses the point that mathematics is so powerful precisely because it represents a different way of thinking than other disciplines.

Writing on his blog, maths education pundit Dan Meyer proclaims that:

“Math is only objective, inarguable, and abstract for questions defined so narrowly they’re almost useless to students, teachers, and the world itself.”

Instead, we need to focus on questions such as how many bricks there are in a pile of bricks. These remind me of those occasional fundraisers where you are asked to guess how many jellybeans are in a jar; a mild distraction that would rapidly become tedious as a curriculum.

According to Meyer, we should:

“Ask students to make claims that demand to be argued and interpreted rather than evaluated by an authority for correctness… If our students leave our classes this year without understanding that they have had made unique and original contributions to how humans think mathematically, we have defined “mathematics” too narrowly.”

No. We already have humanities subjects. We don’t need to dress mathematics up as something that it is not. It serves no purpose and it misunderstands precisely what it is about mathematics that makes it valuable. Moreover, mathematics won’t work very well as a humanity, kids will see through it and check out. This is precisely what happened in the U.S. when history was redefined according to Deweyan doctrine to become an ugly form of social studies that everyone dislikes.

Sometimes people approach me via social media and plea for nuance. They seek to build bridges. They ask why I cannot make my peace with the proponents of fuzzy maths. What’s wrong with a few rich tasks here and there? I cannot see a clearer and more definitive divide than between those who want to respect the discipline of mathematics and pass this on to future generations, and those who want to pulp it into something else that involves endless windbaggery.

There is no room for compromise here.

New study shows an element interactivity effect but little support for productive failure


Ouhao Chen, Endah Retnowati and Slava Kalyuga* have just published a new study consisting of two experiments.

In the first experiment, students studied the formula for finding the areas of various shapes, such as the area of a circle. Half the students were randomly allocated to a group that studied worked examples before generating the formulas and the other half of the students generated the formulas before studying the worked examples. The tasks were low in element interactivity – that is to say that in order to learn that the area of a circle is π x r x r, for example, you need to understand what π, r and multiplication mean, but little else.

However, the process was then repeated for tasks that were higher in element interactivity. These tasks involved finding the areas of composite shapes. Given that the students were relative novices, they had no relevant schema to draw upon and had to process all of these new, dependent elements in working memory.
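To give a flavour of the difference (my own illustration, not the study’s actual materials): one formula involves only a few interacting elements, whereas a composite shape forces several intermediate results to be held in mind and related to one another at once.

```python
import math

# Low element interactivity: one formula, few interacting elements
def circle_area(r):
    return math.pi * r * r

# Higher element interactivity (illustrative): a rectangle with a semicircle
# removed. Each part must be found AND the parts must be related -- the
# subtraction is itself an extra element to coordinate in working memory.
def composite_area(width, height, r):
    rectangle = width * height
    semicircle = circle_area(r) / 2  # reuses the first result...
    return rectangle - semicircle    # ...and relates it to the second

print(round(circle_area(3), 2))
print(round(composite_area(10, 4, 2), 2))
```

For an expert who already holds these routines as schemas, the composite task feels like one step; for a novice, every element competes for working memory.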

In the first set of tasks, there was no statistically significant difference between the two conditions. However, for the second set of tasks, studying worked examples first was found to be superior. This is consistent with the prediction of cognitive load theory that problem solving would overload working memory and lead to less learning, but in conflict with the predictions of productive failure, which suggest that problem solving first would be superior because it activates relevant schemas or makes students aware of their knowledge gaps.

In the second experiment, a similar process was followed, but with much more knowledgeable students. This time, there was no advantage to studying the worked examples first.

This illustrates a key point about the concept of element interactivity – it is not a property of the materials alone. The number of interacting elements you need to process in working memory depends both on the learning materials and the expertise of the learner. This is because knowledge held in schemas in long-term memory can be deployed relatively effortlessly and is not subject to the same constraints as new knowledge that has to be processed in working memory.


*Kalyuga is one of my PhD supervisors