Education Technology or Technology Education? Can computers make an impact in schools?

"Computers 'do not improve' pupil results", say headlines this week on a new OECD report. Is the evidence telling us that digital learning if flawed? Should schools be moving away from new technologies and back towards traditional skills?

In my previous blog post I explored the debate we should be having about what we mean by ‘educational outcomes’, and argued that learning about technology is in itself a hugely important aim. But what of the potential of technology to support more traditional outcomes such as Reading, Maths and Science?

As Decoding Learning showed, there are some strong examples of technology being used to support these outcomes in schools. However, there is little evidence of digital technology making a significant, consistent difference to them at scale. In fact, this is often the case for educational approaches and interventions more generally: in education we tend to try things we think will make a difference, based on experience, and until recently there has been relatively little large-scale experimental testing and evaluation of whether they actually do. This deficit of evidence led to the UK government setting aside £110 million, in the form of the Education Endowment Foundation, to run experiments on 'what works' in teaching and learning.

As cited in the OECD report, Hattie & Yates suggest that digital technology could make a difference to traditional learning outcomes if it is integrated into existing teaching approaches. Although the evidence for what works is not as comprehensive as we might want it to be, there are areas in which there is good evidence of impact.

One such area is one-to-one tutoring, where the research evidence bears out the logical assumption that giving young people one-to-one support can enhance their learning. We are exploring whether this can be built upon with technology through our remote tutoring research project, which provides expert maths tutors to primary school children who are falling behind in the subject. The explicit aim is to secure their knowledge and skills in the run-up to their standardised tests at the end of primary school.

This intervention builds on the existing evidence base for a teaching approach, uses technology to make it more available and efficient, and applies it to a clearly defined problem in traditional learning outcomes. It is a focused application of technology based on a plausible theory of change, and the difference it makes to this clearly defined outcome is now being measured: a rigorous evaluation is taking place as a randomised controlled trial involving 600 primary pupils. We are still awaiting the data from this particular trial, but we think it exemplifies the right kind of approach.
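To make the shape of that evaluation concrete, here is a minimal sketch in Python of the random allocation step that underpins a two-arm trial of this kind. The pupil identifiers, the even 300/300 split and the seed are invented for illustration, and do not describe the actual trial's design or analysis.

```python
import random

# Illustrative only: allocate 600 pupils at random to a tutoring group and a
# business-as-usual control group. All identifiers and numbers are invented.
pupil_ids = [f"pupil_{i:03d}" for i in range(600)]

random.seed(2015)         # fixed seed so the allocation can be reproduced
random.shuffle(pupil_ids)

tutoring_group = pupil_ids[:300]   # offered online one-to-one maths tutoring
control_group = pupil_ids[300:]    # continue with normal classroom teaching

print(len(tutoring_group), len(control_group))  # 300 300
```

The point is simply that allocation is by chance rather than by choice, so any later difference in test scores between the two groups can more credibly be attributed to the tutoring itself.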

If digital technology is to make an impact on traditional learning outcomes within schools, it is this kind of approach that is needed. A use of existing knowledge and evidence, clear and explicit aims and focus, and a commitment to evaluating whether those defined aims have been met. This is a long way from a characterisation of simply putting computers in classrooms and hoping that children learn better.

In measuring how many students have desktop, laptop or tablet computers in schools, the OECD capture only what is there, not how it is used. More detailed questions about the kinds of activities students report using computers for in mathematics give a glimpse into what they actually do with them, but this covers only one subject and presents a limited range of activities. It therefore does not comprehensively capture how computers are used by students, and does not capture at all how they are used by teachers.

"[Students] also report […] that teachers use ICT equipment during lessons (perhaps projectors and smartboards). Such teacher-centred approaches to integrating ICT into education are only imperfectly covered by PISA measures. Similarly, the use of smartphones at school may not be captured by the questions referring to ‘computer’ use." (p. 54)

Interestingly, the OECD's own methods of data collection demonstrate the potential of digital technology in the area of assessment. Young people were tested on their ‘digital reading’ abilities by being asked questions that required them to navigate multiple pages. Their skills were assessed not just by looking for the right answers, but by analysing how they navigated through the texts. Novice and experienced ‘digital readers’ navigate online texts in different ways; PISA was able to track this in its tests and assess not only the answers, but also the process students went through to reach them.
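As a purely hypothetical illustration of what such process-based assessment can look like (this is not PISA's actual scoring model), the sketch below compares the pages a student actually visited with a minimal path that reaches the relevant material; the function name and the metric itself are invented.

```python
def navigation_efficiency(visited_pages, minimal_path):
    """Ratio of the shortest useful path to the path actually taken.

    1.0 means the student went straight to the relevant pages; values near 0
    suggest a lot of aimless clicking. An invented metric, for illustration.
    """
    if not visited_pages:
        return 0.0
    return min(1.0, len(minimal_path) / len(visited_pages))

# Example: the task can be answered by visiting two pages, but this student
# clicked through five before reaching the relevant article.
student_log = ["home", "sports", "home", "science", "science/article"]
required = ["home", "science/article"]

print(navigation_efficiency(student_log, required))  # 2 / 5 = 0.4
```

An experienced digital reader's log would look closer to the minimal path, while a novice's would be longer and more circuitous, which is exactly the kind of difference such tracking could pick up.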

For a report generating headlines that question the use of computers in schools, it certainly provides an excellent case study of the power of digital technology for assessing learning.

The OECD reach similar conclusions to our own: digital technology has great potential, but we are not yet realising it. Computers in schools can be used to develop the understanding and skills needed to manipulate the technology that forms such powerful tools in our society. They can also be used to achieve more traditional learning outcomes, but to do this they need to be designed and implemented intentionally.

 

Photo Credit: crimfants via Compfight cc

Author

Oliver Quinlan

Head of Impact and Research, Raspberry Pi Foundation

Oliver was a programme manager for Nesta’s digital education projects. He is now Head of Impact and Research at the Raspberry Pi Foundation.
