What’s your theory?

I’ve read a couple of interesting posts recently about the role of theory in education. The first was by Naomi Barnes for the AARE blog site. Barnes starts by making an interesting observation:

“My student teachers often question the value of educational theory in their initial teacher education. Also often early career teachers tell me that the theory they were taught at university holds no value in their day-to-day practical lives.”

Yet she goes on to address this issue by discussing a theory of history rather than education; an argument that reminds me of the old Schools (Council) History Project in the U.K. (see this paper for an interesting discussion).

The second post was by John Andrews who again makes the case for educational theory. But this time, it is a theory I don’t recognise because it seems to be set in opposition to the, “tangible, the measurable, the calculable, the secure and testable and the practically applicable.”

As a scientist by training, this is not my understanding of what a theory is. Theory provides a conceptual framework to explain the measurable and, critically, a theory is testable. Specifically, a theory is something that is falsifiable: You can conceive and conduct experiments or observations that have the capability of proving the theory wrong. In fact, in science, we don’t even call something a theory until it has survived a number of such attempts. Until then, all we have is an hypothesis.

Some educational theories are not falsifiable. I’m pretty sure that productive pedagogies fits into this category. Its proponents don’t seem able to imagine any kind of data that we could collect that would prove it wrong. Instead, its truth seems to be a starting assumption.

Other educational theories have actually been falsified but oddly remain undead. Piaget’s stage theories spring to mind.

When I mention the learning theories that I am interested in, I’m often told that these are not educational theories at all. Cognitive Load Theory is one of them, yet apparently it doesn’t pass muster.

Cognitive Load Theory is not intended to be a theory of everything. For instance, it has little to say about motivation. And, as with any nascent theory, it’s had teething problems. For a while, it too was heading for unfalsifiability.

But I am attracted to Cognitive Load Theory because it makes some useful predictions about what happens when we learn and, even if it doesn’t survive in its current form, it has enough similarities to other promising theories – such as Mayer’s Cognitive Theory of Multimedia Learning – to suggest that there’s probably a core of truth to it.

Yet I am deemed to be atheoretical. Perhaps it’s because I’m interested in the wrong theories.


The rights and wrongs of testing

‘Testing’ is a divisive term even if most teachers accept the need for ‘assessment’. I thought it might be worth discussing some of the issues surrounding testing with a view to avoiding the pitfalls.

Testing times

There are a few teachers and educationalists who have funny ideas about assessment. For instance, one fashionable view is that all assessment should be ‘authentic’ and involve the creation of complex projects or presentations. This is actually quite a bad idea, for reasons that I will mention later.

However, I suspect that most teachers take a less dogmatic view and see assessment as a part of everyday teaching. And the evidence suggests that it should be. The Testing Effect – now re-branded as ‘retrieval practice’ – is well established. Asking students to recall information and concepts actually enhances learning. So we have a third purpose of assessment to add to those of benchmarking against a standard and of providing information to feed into the next round of teaching. Now, we have summative, formative and instructional assessment.

Nevertheless, these positive results sit in a context of some teachers and parents arguing for boycotts of ‘standardised tests’ and such testing generally being seen as a kind of evil conspiracy.

Standardised tests

I think standardised tests are really valuable. Most of the assessment that I conduct as a teacher is formative and often looks nothing like the kinds of questions you see on standardised tests. Yet the very fact that these tests are standardised is useful. It enables schools to benchmark their progress against other schools in the state.

Yes, we need to take into account contextual factors – and this is often not done well – but relative performance is still worth knowing. Superimposed on the general trend will be those schools that face difficult circumstances yet somehow manage to do very well, as well as those schools with pretty advantaged students that are doing badly. If nothing else, there is an argument that a system that attracts so many tax dollars needs to be open and accountable to taxpayers.

However, the way that many schools approach preparation for standardised tests is deeply troubling and this is what causes the problems.

Practice, practice, practice

If you don’t actually know how to teach something then your fallback position might be to endlessly ask children to do it in the hope that something rubs off. This seems to be the strategy employed by schools that run students through endless rehearsals of the standardised test.

Students certainly should complete a practice test, especially if the test has unfamiliar kinds of items on it or it takes place in a different room such as an exam hall. Yes, you can quietly pass off past test items as ordinary questions in lessons and this will help with familiarity. But there is nothing like having a go at the whole thing in the right conditions. Done well, such a practice test will help to allay students’ concerns.

However, if you endlessly cycle students through practice tests then you communicate to them that this test really matters. You transfer some of your own anxiety as a teacher onto the students. And you will be feeling anxious, because the only reason you are following this strategy is that you can’t think of anything else to do.

Components

One of the most important tasks in teaching is to identify the component parts that make up a complex performance. If you ask students to produce a piece of persuasive writing – the typical writing task in Australia’s NAPLAN assessments – then you are asking them to synthesise motor skills, spelling and grammar skills, and understandings about paragraphs, topic sentences and the use of evidence, all while generating interesting ideas.

This places students’ working memories under a lot of strain. If a student forgets to break her response into paragraphs, this could be because she doesn’t have a good conception of paragraphing, or it could be because paragraphing dropped off the list of things that she was capable of simultaneously paying attention to. You don’t know. So what should you write in your paragraph of written ‘feedback’ at the end of her argument? Well, in a sense it doesn’t matter, because she is only likely to be able to take on a couple of pieces of advice and you’ve probably already written three bullet points about other aspects of her performance.

This is not a good way to go about things. Instead, it may be better to target something, work on that and then assess it. For instance, you could teach your students about topic sentences, explain what they are and model the creation of a few live on the board. You could then assess whether they can identify topic sentences through a multiple-choice quiz before asking them to generate their own topic sentences and assessing that.

Yes, there are an awful lot of different aspects of writing that you can teach and assess like this and that’s why you need a systematic plan for which ones to address and when. You also need a plan for bringing these components together once you know that students can handle them in isolation. The beauty of assessing in this way is that you can directly associate the assessment evidence with the episode of teaching. It gives you agency.

Don’t be afraid

Testing can be stressful and, sometimes, it should be, but we don’t need to place students under as much pressure as we sometimes do at the moment. Schools have to take responsibility for this, just as they have to take responsibility for teaching the kids. Stress is often the result of a feeling of helplessness and so the best way to fight it is often to have a plan and take control: teachers and students alike.


Apply neuroscience in your classroom!

Making Learners Extraordinary™

I recently visited Barry Rubiou over at West Bay University’s Cognition Lab. There, they are working on a brand new form of instruction guided by the latest findings in neural imaging. They call this ‘Neuro-Scientific Pedagogy’ or NSP for short and it offers the potential to totally revolutionise the work of schools.

[Image: different sections of the brain highlighted in different colours. Caption: You can do this yourself with some brain clip art.]

Brain scanning studies conducted in Rubiou’s lab have demonstrated that we learn less efficiently when under extreme duress, when in physical pain or when intoxicated. We never knew this before and it totally explains why traditional forms of instruction are completely ineffective.

Rubiou’s team have also identified that our brains actually grow when we make a mistake! This happens even if we are not aware that we have made a mistake. It also happens before we have even made a mistake. This has massive implications. For instance, teachers should…



When should we provide guidance to students?

Regular readers will know that I often link to a paper by Kirschner, Sweller and Clark to support my argument for explicit instruction. It’s a great paper but it is sometimes dismissed by critics due to its name, ‘Why minimal guidance doesn’t work’. It turns out that nobody will own the concept of ‘minimal guidance’: critics don’t recognise it in their own approach, which they always insist contains loads of guidance.

This is a shame because the argument in the paper actually sets out the need for full guidance by providing worked examples or other forms of explicit instruction. Perhaps this is why, when they rewrote their article for an audience of teachers, the authors discussed the case for ‘fully guided’ instruction.

Many teachers and academics are against full guidance and so the argument applies to their methods. For instance, a form of mathematics instruction known as ‘Cognitively Guided Instruction‘ withholds explanations: 

“…teachers would not show the children how to solve these problems. In fact, teachers who use CGI usually tell the children to solve the problems any way they can… 

These teachers know that children are able to solve story problems without direct instruction on strategies, because children naturally direct model story situations about which they have informal knowledge.”

Interestingly, a number of CGI fans on Twitter have been arguing that this is not a form of discovery learning. If CGI is not a form of discovery learning then I don’t know what is. I think this indicates the strength of the argument against discovery learning: people would rather pretend it doesn’t apply to their methods than address this argument directly.

We tend to be attracted to discovery learning because we think it somehow leads to students learning things better. We imagine deeper kinds of learning. This idea was tested in one of the most misunderstood experiments in this field. Klahr and Nigam taught students a key scientific principle – the control of variables – either by explicit instruction or discovery learning. More of the students in the explicit instruction condition learnt the principle. But this is not the point. Of those who did learn the principle, students who learnt it by discovery were no better at evaluating science fair posters for control of variables than those who learnt by explicit instruction.

Following the publication of the Kirschner, Sweller, Clark paper and the fallout from it, Richard Clark wrote a chapter where he identified what he sees as the key difference in the way people view guidance:

“Guidance advocates suggest that learners must be provided with a complete demonstration of how to perform all aspects of a task that they have not learned and automated previously. So even if a learner could solve a problem with adequate mental effort, guidance advocates provide evidence that it is more effective and efficient to provide a complete description of “when and how”.”

Clark contrasts this position with that of those who would only provide guidance if it becomes clear that a student cannot solve a problem unaided.

I agree with Clark that there is evidence to support the position held by guidance advocates. So let’s debate that contention rather than the meanings of ‘minimal’ and ‘guidance’.


Dumbing-down NAPLAN numeracy? The plot thickens

Since my initial post on this topic, ACARA have added a note to their website to explain the changes. The main change to the NAPLAN numeracy assessment involves moving from two papers consisting of 32 questions each, one of which was a non-calculator paper, to a single paper with 48 questions. This single paper has a non-calculator section that only contains eight questions.

According to ACARA:

“…the test continues to cover all sub-domains of numeracy, allowing students to demonstrate performance across a range of numeracy skills. The reduction will not affect either the reliability or validity of the test.

Students in Years 7 and 9 will continue to answer calculator and non-calculator questions, and the number of questions requiring mental calculation (without the aid of a calculator) remains the same as in previous years – there is no reduction in the number of questions of this type.”

You may ask how it is possible that there has been no reduction in the number of ‘mental calculation’ questions when we have gone down from a 32-question non-calculator paper to just eight such questions. Well, there is some logic to this. A proportion of the questions on the non-calculator paper involved things like mentally rotating shapes, constructing expressions or reading graphs. A calculator would be of no benefit for these questions. However, it seemed unlikely to me that such items would constitute 24 of the 32 questions.

So I did a check. I looked at the 2016 Year 7 non-calculator paper. I was able to identify 18* questions out of 32 that involved some form of calculation that a student could have completed with a calculator if one were available. That’s more than eight. It also represents 28% of the 64 questions across the two old papers, whereas 8 questions out of 48 represents just 17%.
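
Spelled out, using only the question counts above, the arithmetic behind those two percentages is:

\[
\frac{18}{64} \approx 28\% \qquad\qquad \frac{8}{48} \approx 17\%
\]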

Personally, I don’t think 28% of questions requiring a mental or pen-and-paper calculation is enough, particularly given the widespread concern about Australia’s continued decline in international assessments such as TIMSS and PISA and specifically in the science and maths subject areas.

Arguing – as I am sure the maths subject associations would – that there is no need for students to be able to do manual calculations in an age of calculators misses a number of key points. Maths is not purely functional – it’s not just about getting a result. The functional argument is like arguing that we shouldn’t teach children how to draw because we have cameras. As well as consolidating knowledge of maths facts, mental arithmetic is likely to support all sorts of activities such as proportional reasoning, factorisation and so on that lead into higher levels of maths. Even if we did accept the functional argument, a calculator user with no mental arithmetic will struggle to spot when he or she has made an error.

It’s worth pointing out that the suite of NAPLAN papers consists of five assessments. Only one of these is a numeracy assessment, with the other four assessing different aspects of literacy. Now, the numeracy element is going to be reduced in size and contain a smaller proportion of non-calculator questions.

*Questions 2, 3, 9, 11, 13, 14, 17, 19, 20, 23, 24, 25, 26, 27, 28, 29, 30, 31


The state of Victorian physics

Last week, I attended a physics conference organised by the Science Teachers Association of Victoria (STAV). It was a curious affair – as all physics teacher conferences are – and I was, as ever, left with the feeling that I had just attended a revival meeting for a religion I don’t quite believe in.

Don’t get me wrong, I love physics. And that’s the problem.

You see, Victoria has recently rewritten its senior physics curriculum to make it groovier and funkier. We no longer have unit titles that describe what the unit is about. Instead, we have facile questions such as ‘How do things move without contact?’ or ‘How fast can things go?’

The logic of such a change is obvious. Physics is really dull, right? So by changing the titles of units into questions we’ll make it more engaging. Students will find the learning irresistible. It’s all about inquiry. Drama and history classrooms will be mothballed as kids flock to a new, funkier, mutton-dressed-as-lamb physics.

You see, to take this attitude, you have to have both a pretty low view of physics and a predilection for constructivist teaching methods. Both stances are wrong.

Which is why it was such a breath of fresh air to read what Tom Alegounarias of the New South Wales Education Standards Authority had to say about the new physics syllabus that has just been published in that state. According to the Sydney Morning Herald:

“He [Alegounarias] said there would be more focus on the topic rather than the context. Instead of studying ‘moving about’ in Physics, students would learn ‘kinematics and dynamics’.”

Three cheers for that man.

You see, I distinctly remember the suggestion that one reason for the changes now affecting Victoria was to make physics more like the now-defunct New South Wales course, with its emphasis on ethics and the social impact of physics. So this new turn is most welcome.

While at the physics conference, we also spent much time discussing the requirements of the new practical investigation.

Physics students in Victoria have always had to complete an investigation but it seems as if some schools may have been squeezing this requirement in order to teach the students more physics.

So the latest Victorian physics syllabus beefs up this requirement with a series of more explicit regulations and insists that students must design and conduct an experiment themselves.

This is at odds with what we know from cognitive science – Year 12 students don’t have the expertise to make this a worthwhile activity – and is essentially an imposition of inquiry learning on all schools.


Is NAPLAN numeracy being dumbed-down?

Today, we heard through a Victorian Curriculum and Assessment Authority bulletin of changes to the format of national NAPLAN numeracy tests for Years 7 and 9.

Previously, there were two papers: a calculator paper and a non-calculator paper, worth a total of 64 marks. Now there is only going to be one paper worth 48 marks, of which only 8 are non-calculator. This seems incredible.

Yet I can’t find any information online at present from ACARA, the authority in charge of NAPLAN. And I also don’t recall any consultation.

On the face of it, this certainly looks like dumbing-down: a victory for the who-needs-to-know-maths-because-calculators party.

Watch this space.