Teaching children how to think etc.

Another decade, another article in The Conversation illustrating the irony that advocates of critical thinking seldom think critically about it.

There are two interlinked but distinct concepts that we have to consider about critical thinking. Firstly, how general are these skills and, secondly, how transferable are they? They may sound like the same thing, but they are not. And they both have real, practical consequences.

Firstly, let’s consider generality. Those who tend to buy into the view that critical thinking is a general skill (or a ‘general capability’ as it is termed in the Australian Curriculum) can fall into two main traps. The first is to view the contexts for thinking critically as interchangeable. This is what we see when science lessons become projects involving cardboard boxes and LEDs. The precise science covered is not considered important because students are developing their skills of creativity, critical thinking, problem solving and so on. However, if you view these skills (or ‘virtues’ as Carl Bereiter calls them) as being highly domain specific, then these students have merely developed the ability to think critically, creatively and so on in the domain of cardboard boxes and LEDs. The value of this to the student then becomes debatable, especially when contrasted with actually learning some science.

The second trap is to think that, because these skills are general, you can teach a discrete critical thinking course or bolt a critical thinking module onto the curriculum that deals with these strategies in the abstract or in model contexts.

When we start to consider the kinds of strategies that may be more generally applicable, we often alight on maxims such as ‘look at the problem from different perspectives’ or the kinds of rules-of-thumb embedded in logical fallacies. I tend to think that there is a trade-off: the more widely applicable such a strategy or rule-of-thumb is, the less useful it tends to be. One example is perhaps the maxim that ‘correlation is not causation’:

This maxim is applicable to a number of fields where we seek correlational evidence, from the sciences, through social science and into the humanities. There are even philosophers who will go on about it a bit. And you may think it has high utility. But let’s imagine you wade into a discussion between two public health officials about the health impacts of smoking with your pithy observation that ‘correlation is not causation’. You are likely to get short shrift. Why? Because they have a lot more contextual, domain-relevant knowledge. They can probably discuss the extent of the correlational evidence and the tipping point that signifies acceptance of a causal relationship. If you try to apply this maxim when reading an article written by one of these officials, it may lead you into error.

Now consider someone making claims that school exclusions cause knife crime based upon statistics showing rising levels of both. Here, the maxim that ‘correlation is not causation’ may point you in the right direction. It is plausible, for instance, that the same thing that causes students to be excluded from school is causing the knife crime. Or that knife crime causes students to be excluded from school. But how can you tell the difference? Armed only with your maxim, how will you know when to apply it and when not to apply it? The only way you can figure that out is by learning lots about the evidence behind smoking or knife crime. In other words, the way you ultimately establish a reasoned position is by learning about the domain in question. The maxim alone could lead you as often into error as to the truth.
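The confounder scenario above can be made concrete with a toy simulation. The sketch below is purely illustrative: it invents a hidden variable (labelled `disadvantage` here, a hypothetical name) that drives both outcomes, with no causal link at all between exclusions and knife crime, and then shows that the two series are nonetheless substantially correlated.

```python
import random

random.seed(0)

# Hypothetical illustration: one shared hidden cause drives both outcomes.
# There is, by construction, zero causal effect of exclusions on knife crime.
n = 10_000
exclusions = []
knife_crime = []
for _ in range(n):
    disadvantage = random.gauss(0, 1)            # shared hidden cause
    exclusions.append(disadvantage + random.gauss(0, 1))   # outcome 1 + noise
    knife_crime.append(disadvantage + random.gauss(0, 1))  # outcome 2 + noise

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand to avoid dependencies."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(exclusions, knife_crime)
print(f"correlation = {r:.2f}")  # substantial, despite no causal link
```

The point of the toy model is that the correlation is real and reproducible; nothing in the statistics alone tells you whether one variable causes the other, the causation runs the other way, or a third factor causes both. Distinguishing those cases requires domain knowledge, not the maxim.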

So what of transfer?

Even if we accept that ‘correlation is not causation’ has some general utility, it is not certain that people will apply it generally. Odd as it sounds, there is quite a lot of evidence that when people learn strategies that do work across different domains, the strategy somehow becomes locked to the domain in which they learnt it and so people fail to apply it to other situations where it would also add value. Dan Willingham highlights this in his discussion of two problems that involve a tumour and a fortress.

Latterly, proponents of critical thinking appear to have become more aware of the common criticisms and will make statements such as, ‘of course, knowledge is important to critical thinking’. However, you cannot have it both ways. If critical thinking just represents the highest levels of performance within a traditional subject discipline, then we do not need critical thinking courses or a special focus on critical thinking. We just need to teach our subjects really well. However, if we believe that it deserves special treatment, why would we think that unless it has a more general impact?

Peter Ellerton, the author of the piece in The Conversation, proposes that we consider the example of Philosophy for Children, a ‘program that involves teaching the methodology of argument and focuses on thinking skills’. So it is a discrete critical thinking course. Ellerton also thinks it will have a general impact because he states that ‘Studies involving a Philosophy for Children approach show children experience cognitive gains, as measured by improved academic outcomes, for several years after having weekly classes for a year compared to their peers.’ In other words, deliver Philosophy for Children and you will produce gains in wider academic outcomes. I am not convinced by the study Ellerton cites. It is not randomised and there is high attrition in the experimental versus control groups (28% versus 9%).

A perhaps more thorough study of the benefits of Philosophy for Children is the Education Endowment Foundation randomised controlled trial that found that children who had the intervention made greater gains in reading than those who did not. However, this was an odd trial which I have discussed at great length on this blog. The scale-up study is scheduled to report early next year and then we will see more evidence supporting or refuting the extraordinary idea that discussing whether it is OK to hit your teddy bear impacts on reading ability.

Until then, proponents of critical thinking, although they may be welcomed at conferences, have a credibility gap. They need to demonstrate two things: firstly, that these skills are generally useful and, secondly, that students will transfer their learning of these skills to the various different domains where they apply. In the meantime, we should ask, as Carl Bereiter does:

“Nobel Laureates, captains of industry, cabinet ministers, school superintendents – any one of them is likely to end a commencement address or a discourse on the current crisis by declaring that schools have got to ‘teach students how to think’. The words roll easily off the tongue and the speakers show not the slightest doubt that the words mean something. But do they?”


6 thoughts on “Teaching children how to think etc.”

  1. Kevin says:

    Decade? For a maths guy I’d have thought that counting to 10 was easy: 2020 is the last year of the decade that started with 2011 (like 1 is the first egg in a count of 10 eggs). Never mind. Great article. I had a recent discussion with nephew’s teachers about their ‘project learning’ infliction on students to develop ‘critical thinking’. Happily nephew’s team decided it was a CWOT (complete waste of time) and refused to submit anything, noting it did not contribute to overall marks.

    • Chester Draws says:

      So the “twenties” run from 2021 to 2030, but don’t include 2020? No. Just no.

      Incidentally, we often count from zero in Maths. If I ask students to plot the first five points on a positive Cartesian grid, I expect the first point to be at zero.

  2. Castro says:

    G’day Greg,

    I just wanted to congratulate you publicly on Clarendon College’s excellent results as published in the Age during school holidays.

    I have no doubt that the rapid rise in the school’s ranking is, in some part, due to your hand on the tiller. The way that you cut through the cant and the platitudes has obviously inspired your colleagues to focus on what actually works in education.

    I am not on Twitter, so I don’t know if these results have been discussed on that forum. I hope that they have been, in order to dampen the enthusiasm of those who attack you personally for your views. If not, I think that you, or one of your supporters, should post the results.

    All the best and see you at the next ResearchEd.

    Cheers!

  3. alandtapper1950 says:

    Hi Greg: I am one of those “proponents of critical thinking”. You raise some good questions. Here are my replies. I distinguish between lower case “critical thinking” = “the ability to think well”, from upper case “Critical Thinking” = “courses designed to teach that ability”. The first I’ll call “ct”, the second “CT”.

    1. You comment: “If critical thinking just represents the highest levels of performance within a traditional subject discipline, then we do not need critical thinking courses or a special focus on critical thinking. We just need to teach our subjects really well.” About this I agree, “ct” could be taught well without special courses on “CT”. But, as I see it, this approach could only succeed if the teacher had considerable knowledge of CT. Do teachers have this knowledge? My experience is that they do not. Hence the need for teacher PD or for standalone CT courses for students.

    2. You contend: “However, if you view these [“ct”] skills … as being highly domain specific, then these students have merely developed the ability to think critically, creatively and so on in the domain of cardboard boxes and LEDs. The value of this to the student then becomes debatable, especially when contrasted with actually learning some science.” People who advocate CT courses, such as myself, are not defending replacing science teaching with projects involving cardboard boxes and LEDs. That’s got nothing to do with the issue. A CT course is a course on reasoning and argument skills.

    3. You take as your example the maxim that “correlation is not causation”. Here you are talking about real CT. Every CT course that I know of teaches that maxim. Students get tested on whether they can recognise as fallacious the inference from a strong correlation to a causal claim. It is always fallacious. You seem to doubt this! You say: “If you try to apply this maxim when reading an article written by one of these [public health] officials, it may lead you into error.” But public health experts know that more is needed than mere correlation before it can be proven that smoking causes lung cancer or emphysema. Generally, we need also a causal direction and evidence of a mechanism, in addition to correlational evidence, before causality can be claimed. Correlations are not sufficient. Students should be taught these basic argument steps. CT courses teach and test this sort of issue.

    4. You note the 2007 Trickey and Topping study that Peter Ellerton cites. In response, you say: “I am not convinced by the study Ellerton cites. It is not randomised and there is high attrition in the experimental versus control groups (28% versus 9%).” You make a number of mistakes here. The attrition rate was 28% in the control group, not the experimental group. On that point, the authors say “Given the pattern of sample attrition, the group difference seems likely to be underestimated.” In other words the effect may have been underestimated by the study. Given Keith Topping’s eminence in the field of educational research, I’d trust his views on this. (https://www.dundee.ac.uk/people/keith-topping)

    5. In addition, the study is not really relevant to whether CT should be taught in schools. It was a study of Philosophy for Children (P4C), called “collaborative philosophical inquiry” in that study. CT and P4C are different things. They may overlap but they are essentially distinct. I can explain further if you wish. It was P4C that Trickey and Topping found had transferable benefits.

    6. It may be of interest that Topping, Trickey and Cleghorn have very recently published “A Teacher’s Guide to Philosophy for Children” (Routledge). I haven’t got a copy yet, but I recommend you order it for your school library.

    Alan Tapper

    • Chester Draws says:

      Alan, you say:

      “Correlations are not sufficient. Students should be taught these basic argument steps. CT courses teach and test this sort of issue.”

      Yes, I’m sure they do. And I’m also sure the students get it right. But the moment they leave the CT course, they will continue to confuse correlation and causation.

      We see this all the time with clever people, who make idiots of themselves the moment that they leave their field of expertise. A scientist who is highly critical of other scientists inside their own field, and who knows how much error there is, will often be completely uncritical of scientists in other fields and believe any old rubbish.

      We see this all the time with students unable to apply what they learn in Physics to what they learn in Maths — despite doing courses simultaneously, they just don’t apply what they learn in harmonic motion to what they learn in sine curves. Most people do not apply their knowledge and skills across boundaries, no matter how many times people tell them they should.

      CT courses cannot make people apply ct.

      It’s not like students aren’t taught that correlation and causation cannot be linked without other evidence in Maths classes. Yet they apparently need to be taught it again in CT!

  4. Pingback: The gritty truth, nearly all middle school teachers stressed and more in the news roundup — Psych Learning Curve
