The digital revolution is a ubiquitous juggernaut; hardly any element of social and working life is now carried out unaccompanied by a screen of one sort or another. It is a brilliant technology, and has enhanced many facets of our working and cultural lives. So it’s perhaps inevitable that digital technology is increasingly used in education, even primary education. But thereby hangs a potential problem, one best summed up by Edward Tenner’s comment: ‘it would be a shame if brilliant technology were to end up threatening the kind of intellect that produced it’. Whatever one’s feelings about the value of digital media for adults, we need to take great care when we consider the effect of those media on children, and this is especially true in the realm of education.

A good deal of independent research has been carried out into the use of computers in schools, but it is difficult to find, largely because it is buried under a snowdrift of positive promotion from the adherents of the ‘digital is good’ school of thought (that’s Google searching for you). It does exist, however, and it suggests that we should exercise a good deal of caution before we make our curriculum an online or screen-dependent process.

Where independent research has been carried out, a consensus has arisen; put crudely, computers make good schools a little bit better, and bad schools a little bit worse. That is to say, add computers to a school with plentiful resources, and motivated, skilled teaching staff, and the learning process is enhanced (though not radically, studies suggest); put computers into an under-resourced school, with low teacher morale and a basket of social problems, and things get worse (sometimes substantially so). This ought not to be a surprise; the most successful teaching methods, low-tech and old-fashioned as they may appear, are well-established precisely because they are successful, and they are about good teachers in a good environment. They were established long before the advent of digital technology, or any other mass medium, including the printed book.

The arguments around the use of screen technology in schools are complex, and can only be hinted at in a brief article, but they are worth stating, and worth thinking about. I want to draw a few themes out of this complex argument, because I think they are core to our understanding of what computers can and cannot do for education: the case for computer literacy; the problem of attention; memory and retention of information; economics; and innovation.

The argument that computer literacy is vital for schoolchildren is now obsolete, for two reasons. First, the technology is becoming simpler and simpler to use (think of the swipe of a touchscreen); we have all seen how quickly even very small children learn to operate computers, smartphones, iPods and the like. Second, the skills that make up ‘literacy’ are used only in increasingly specialised areas – being able to program a computer doesn’t teach a person to think better in any other context, and the skill is useful only if you are going to be a computer programmer. Yet the argument about literacy is still used as a justification for increasing the amount of digital media in education.

The history of technology in education carries a warning we ought to heed. In 1922, Thomas Edison famously predicted that film would revolutionise education; that didn’t happen. When television became the pre-eminent medium for mass culture, it was again said that it would revolutionise education; again, that didn’t happen. We now have stacks of data confirming that television is not a particularly good medium for education, and certainly can’t compete with good teachers and good books. It is a perennial truth of technological innovation that those who espouse it tend to espouse it for everything; it is also perennially true that they are usually wrong to do so.

Attention is a prerequisite for education; if a child is distracted, that child will not learn as easily as the child who concentrates (indeed, the most vital part of the process of education is, surely, teaching children how to concentrate). For that reason, the ideal classroom, while a stimulating environment, is not a distracting one. There is considerable evidence to show that putting a screen (of any kind: television, computer, video) into a child’s environment tends to make the child distracted, and tends to lower the amount – and value – of personal interaction that happens in that environment. This is already a problem in many children’s homes; it may not be helpful to bring the same problem into the classroom.

There is a whole avalanche of studies showing quite clearly that digital gadgets reduce our attention spans. This is largely because modern screens are dynamic; they rarely contain only one piece of text or a single image, but rather are filled with other images, other texts, other buttons for us to press: a forest of distractions on every page, and a recipe for over-stimulation. And as Torkel Klingberg puts it, when we are over-stimulated, ‘we find distractions more distracting’; given that children are easily distracted to begin with, does this sound like a good idea in a school?

The medium itself makes a huge difference to how and what we learn, and this has implications for attention. Computers are attention-grabbing, but they do not encourage retention of information: ‘the Internet seizes our attention only to scatter it’ (Nicholas Carr, The Shallows). This is particularly true when it comes to working memory. Most neuroscientists now agree that we operate two different memory systems: working memory (what is in our conscious minds right now) and long-term memory (the vast store of individual learning). One could define education as the process of moving information from working memory into long-term memory (this is perhaps what William James meant when he said that ‘the art of remembering is the art of thinking’). But screens tend to crowd our working memory with distractions, and make it harder for information to stick; to paraphrase David Brooks, the magic of digital technology is not that it allows us to know more, but that it allows us to know less – and I might add that it also gives us the illusion that we know more when in fact we know less.

The economic argument is not convincing either. If one uses the model of total cost of ownership (TCO), then computers, given their relatively short working lives, are expensive, and place demands on limited resources that are not justified by their educational value. And there is little value in the argument we might characterise as the ‘get used to it’ gambit: computers are everywhere, so of course they should be in schools. By that argument, we should be teaching all schoolchildren to drive.

A major difference between the use of books in schools and the use of computers is that by the time books became part of a system of universal education, they were already a fixed technology – we knew what we were getting. This is not the case for computers; the technology is still in its infancy, still in what scholars of innovation term a period of ferment. There are considerable risks in allowing a developing technology to become prevalent in a system as sensitive as education; at the very least, it makes our children the experimental subjects of an unproven idea. At some point, we need to have a public debate about the value of computers and screens in schools, and that point should come before the experiment costs a generation of children their learning.