I remember when we used to have computer rooms. Students would go along to them and do a bit of IT: word-processing, Excel spreadsheets, looking things up on the internet and so on. The general consensus was that this didn’t really work very well and that IT needed to be integrated into the curriculum. So science labs began to house four or five desktops at a bench along one side. If you wanted to do anything with them, the students had to take turns.
In science, we were convinced that the future was all about datalogging. This was where, instead of using a thermometer to measure the temperature of something, you used a temperature sensor connected to a computer. The computer would then plot a graph for you, removing the need for students to learn how to draw graphs. Dataloggers could also collect data overnight, which sounded pretty cool. Someone went to the BETT show and came back with some dataloggers, so we used them with the desktops. But it wasn’t very practical because we didn’t have enough.
Then, one year, someone went to the BETT show and came back with a box with some antennae on it. You could plug this into the network access point at the front of a lab and then wirelessly connect lots of laptops. This was fantastic. Suddenly, every room became a potential computer room. We bought a trolley and a set of laptops and stared down the future like a boss.
Except it wasn’t very practical. The laptops started to lose keys. Every time I booked them there would be a few keys missing or some keys would have switched places. And the little antennae thing didn’t work very well. I tried putting it on top of a box and I tried fiddling with the aerials. One day, I tried standing on a chair and holding it above my head but this was somewhat limiting and had health and safety implications. Whatever I attempted, there would always be four or five laptops that couldn’t log on. Of those that did, there would be a few students who couldn’t remember their network passwords or whose accounts had been suspended.
I largely missed the iPad trend – I’d moved to Australia and no longer knew people who went to BETT – but I do remember my whole school going wireless. Now things would be different. This wasn’t just some little box, this was a proper wireless system. So I asked permission to run a trial. We gave my senior physics class a laptop each to use in class. The laptops were touch-screen so that the students could write on them with a stylus – this circumvented the need to type equations.
Except that it wasn’t very practical. There were the same issues with connectivity, passwords and logins as I had experienced before.
And so I abandoned the trial the next year. We went back to paper and exercise books, but I had only written off laptops temporarily. Despite the false dawns, I assumed that the technology would work one day, and that at that point we could all reap the benefits of having computers in the classroom. The future might have won for now, but we’d be back.
I had never questioned the basic assumption that sat beneath all of this: the assumption that computers were good for learning. This just seemed obvious. After all, there was a whole industry trying to figure out ways of getting more tech into classrooms. Why else would they be doing that?
When I started to develop an interest in research, I became aware of a strange gap. Where was the research showing that tech led to better learning? Why did there not seem to be much? After all, lots of people were buying and using edtech products. It would not be hard to run a randomised trial or two, and we could soon sort out the relative benefits of using different tech in lessons. But this evidence just didn’t seem to be there. Why not?
Astonishingly, the evidence that did exist seemed to hint at potential negative effects. I make the point that it was only a hint. We can’t be too conclusive when looking at correlations. For instance, Australia’s heavy investment in computers seems to be associated with worse outcomes but it is hard to strip out the different factors involved or to isolate a cause. It could be the case that increased computer use causes lower outcomes or it could be the case that lower levels of literacy and numeracy cause greater investment in computers. Perhaps there is no direct cause-and-effect relationship and other factors are at play. It is suggestive, but not the kind of evidence you might get from a randomised controlled trial (RCT).
Which makes a recent randomised controlled trial all the more interesting: 726 students at the West Point military academy were randomly assigned to an economics class. Half of the classes allowed the use of laptops and other devices and half banned them. Banning laptops was associated with a small, statistically significant improvement in performance. I take the point that the effect was small, that these are students at a military academy who are very different to children in schools, and that economics is not as central to the school curriculum as English and maths. If you want to disregard this evidence then perhaps you can but, if so, I ask you this:
Why are we all so convinced that laptops and iPads are great?
Should we not perhaps pause before prescribing more computers for schools?
Should we not at least question whether computers are necessarily a good thing?
If the evidence I have presented does not trouble you then what evidence are you being guided by?