Technology in Education: What Teachers Should Know
One of the most frequently cited reasons for justifying the need for change in education, or at least for labeling education as old-fashioned, is the enormous technological (r)evolution our world has undergone in recent years. Nowadays, we have the Internet in our pocket, in the form of a smartphone, which has exponentially more computing power than the Apollo Guidance Computer that put the first men on the moon!
A school with desks, blackboards or whiteboards, and—perish the thought—books seems like some kind of archaic institution, one that, even if it does use a smartboard or a learning platform, operates in a manner that bears a suspiciously strong resemblance to the way things were done in the past.
In education, we often have the feeling that we are finding it harder and harder to reach our students. That is why we are so feverishly interested in smartboards or learning platforms or anything new on the market that might help. Every new tool seems like a possible solution, although sometimes we really don’t know what the problem is or even if there is one.
Regrettably, we have become saddled with a multiplicity of tools, methods, approaches, theories, and pseudotheories, many of which have been shown by science to be wrong or, at best, only partially effective. In this article, which is drawn from our book Urban Myths about Learning and Education, we discuss these miracle tools and the idea that young people today are somehow “digital natives,” and we examine the fear that technology is making our society and our students less intelligent.
To illustrate that many claims about technology in education are in fact spurious, we will focus in this article on five specific myths and present the research findings that dispel them.
Myth 1: New technology is causing a revolution in education.
School television, computers, smartboards, and tablets such as the iPad—it was thought that all these new tools would, or will, change education beyond recognition. But if you look at the research of someone like Larry Cuban, it seems that classroom practice has remained remarkably stable during recent years. Even Microsoft cofounder Bill Gates—whom you would hardly suspect of being against technology in education—summarized his view on the matter as follows:
“Just giving people devices has a really horrible track record.”
The correct use of tools and resources nevertheless does have the potential to change education. Very often these change phenomena are general rather than specific. For example, the influence of the printed word is gigantic, but this influence—like so many other tools and resources—is anchored in society as a whole. You need to come down to the level of something like the book or the blackboard if you want to consider a resource that has specifically changed education.
In 1983, Richard Clark published a definitive study on how it was pedagogy (i.e., teaching practice) and not the medium (i.e., technological tools and resources, such as whiteboards, hand-held devices, blogs, chat boards) that made a difference in learning, stating that instructional media are
“mere vehicles that deliver instruction but do not influence student achievement any more than the truck that delivers our groceries causes changes in our nutrition.”
In 1994, Clark went as far as to make a daring prediction: namely, that a single medium would never influence education. He based this position on his opinion that, at that time, there was no proof to show that a medium was capable of ensuring that pupils and students could learn more or more effectively. He saw the medium as a means, a vehicle for instruction, but believed that the essence of learning remained—thankfully—in the hands of the teacher.
We are now 20 years further down the line, and the question needs to be asked: Does Clark’s position still hold true? During those 20 years, we have seen the explosion of almost unimaginable technological possibilities. Even so, Clark and Richard Mayer continue to assert that nothing has fundamentally changed. They argue that 60 years of comparative studies about teaching methods and teaching resources all confirm that it is not the medium that decides how effectively learners learn.
Clark and David Feldon confirm that the effectiveness of learning is determined primarily by the way the medium is used and by the quality of the instruction accompanying that use.
When media (or multimedia) are used for instruction, the choice of medium does not influence learning. John Hattie described, for example, how instructional methods that are more effective within conventional environments, such as learner control and explanative feedback, are also more effective within computer-based environments.
This can be called the “method-not-media” hypothesis, as tested in a study in which students received an online multimedia lesson on how a solar cell works, consisting of 11 narrated slides with a script of 800 words. To vary the instructional medium, students received the lesson either on an iMac in a lab or on an iPad in a courtyard. The instructional method was varied as well.
Students received either a continuous lesson with no headings (this was the standard method) or a segmented lesson in which the learner clicked on a button to go to the next slide, with each slide having a heading corresponding to the key idea in the script for the slide (this was the enhanced method). By combining changes in both medium and method, we can see what matters most. Across both media, the enhanced group outperformed the standard group on a transfer test where students had to use the information in settings other than those in the text, yielding a method effect on learning outcomes for both desktop and mobile medium.
Across both methods, looking at the medium, the mobile group produced stronger ratings than the desktop group on self-reported willingness to continue learning, yielding a media effect on motivational ratings for both standard and enhanced methods.
Effective instructional methods can improve learning outcomes across different media, whereas using hand-held instructional media may increase students’ willingness to continue to engage in learning.
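The logic of that study can be sketched as a small 2×2 comparison. The numbers below are invented purely to illustrate the “method-not-media” pattern described above (a method effect within each medium, and little difference between media within each method); they are not the study's data:

```python
from statistics import mean

# Hypothetical transfer-test scores (0-100) for the 2x2 design described
# above: two media (desktop, mobile) crossed with two instructional methods
# (standard = continuous lesson, enhanced = segmented lesson with headings).
# All numbers are made up for illustration.
scores = {
    ("desktop", "standard"): [62, 58, 60],
    ("desktop", "enhanced"): [74, 71, 77],
    ("mobile",  "standard"): [61, 59, 63],
    ("mobile",  "enhanced"): [75, 72, 76],
}

def cell_mean(medium, method):
    """Mean transfer-test score for one cell of the 2x2 design."""
    return mean(scores[(medium, method)])

# A method effect: the enhanced method wins within each medium...
for medium in ("desktop", "mobile"):
    assert cell_mean(medium, "enhanced") > cell_mean(medium, "standard")

# ...while the medium itself makes little difference within each method.
for method in ("standard", "enhanced"):
    gap = abs(cell_mean("desktop", method) - cell_mean("mobile", method))
    assert gap < 3
```

Crossing medium and method in one design is what lets researchers attribute the learning gain to the method rather than to the device it ran on.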
If we look at the influence of technology on the effectiveness of instruction, the picture is not fully clear. This can partly be explained by the fact that relatively little research has been carried out that involves the comparison of two similar groups, one group learning with and the other group learning without the benefits of a new technology.
The different metastudies on this subject, analyzed by Hattie, reveal a considerable variation in results.
A review study on the implementation of technology, more specifically Web 2.0 tools such as wikis, blogs, and virtual worlds, in K–12 and higher education, suggests that actual evidence regarding the impact of those technologies on student learning is fairly weak. There are still a number of studies that point to a positive gain in learning terms, but the majority equate the positive learning effect resulting from the good use of technology with good teaching.
The crucial factor for learning improvement is to make sure that you do not replace the teacher as the instrument of instruction, allowing computers to do what teachers would normally do, but instead use computers to supplement and amplify what the teacher does.
A 2009 metastudy about e-learning did, however, tentatively conclude that the combination of e-learning and contact education—known as blended learning—produces better results than lessons given without technology. The same holds for computer game–based learning: one meta-analysis concluded that instruction is still needed for games to have a real, significant learning effect. Such instructional support may appear in several forms, such as providing feedback, scaffolding, and giving advice.
Still, there remain some questionable claims that technology can change, by itself, the present system of education. Clark and Feldon summarize the various claims and responses:
- The claim: Multimedia instruction accommodates different learning styles and so maximizes learning for more students. Clark and Feldon describe how learning styles have not proven to be “robust foundations on which to customize instruction.” And, as we explained in our book, the idea of learning styles in itself is already a very stubborn and harmful urban myth in education.
- The claim: Multimedia instruction facilitates student-managed constructivist and discovery approaches that are beneficial to learning. In fact, Clark and Feldon found that “Discovery-based multimedia programs seem to benefit experts or students with higher levels of prior knowledge about the topic being learned. Students with novice to intermediate levels of prior knowledge learn best from fully guided instruction.” This is another example of how the medium does not influence learning. Prior knowledge is an individual difference that leads to learning benefits from more guidance at low to moderate levels but not at higher levels, regardless of the media used to deliver instruction.
- The claim: Multimedia instruction provides students with autonomy and control over the sequencing of instruction. Although technology can deliver this, the more important question is whether this is a good thing. Letting students decide the pace of learning (e.g., by allowing them to pause or slow down videos or presentations) is beneficial to learning. But only a small group of students benefits from being given the chance to select the order of lessons, learning tasks, and learning support. For the majority of students, this has a mostly negative influence on learning.
The point that teachers should remember is this: the medium seldom influences teaching, learning, and education, nor is it likely that one single medium will ever be the best one for all situations.
Myth 2: The Internet belongs in the classroom because it is part of the personal world experienced by children.
How often have you heard this? It sounds so logical, doesn’t it? At the same time, many teachers have discovered, to their cost, that using information and communications technology in their lessons “randomly,” in an unstructured way, does not always have lasting success. The problem is that most research studies have been evaluations of relatively short-term projects. Some research, for instance, focuses on the extent to which participants liked the medium being used during the actual test, which, for a given student, lasted only about 12 minutes.
Also note that in this research, being motivated because of the medium did not help learning as much as the chosen pedagogical approach. But when we discuss implementing technology and the Internet in the classroom, people argue not for using it once or only for a short period, but for long-term implementation. Therefore, it is the impact over a longer period that really needs to be determined.
A study by the Canadian Higher Education Strategy Associates described how students had a preference for “ordinary, real life” lessons rather than e-learning or the use of some other technology. It was a result that surprised the researchers.
“It is not the portrait we expected, whereby students would embrace anything that happens on a more highly technological level. On the contrary—they really seem to like access to human interaction, a smart person at the front of the classroom.”
The findings also revealed that the more technology was used to teach a particular course, the fewer the students who felt they were able to get something out of that course. While the 1,380 students from 60 Canadian universities questioned for this survey were generally satisfied with the courses they took, the level of satisfaction fell significantly when more digital forums, online interactions, or other technological elements were involved. Yet, at the same time, more than half the respondents said that they would skip a lesson if the same information or a comparable video lesson were available online.
Although these results at first glance seem to be fairly negative for e-learning, the responses to some additional questions were more positive. The majority of students (59.6 percent) said that they would like more electronic content in their courses. When asked what they would specifically like to see online, 53.6 percent answered that they would like more online course notes, with 46.4 percent advocating more recordings of lessons on the web.
These findings are broadly in keeping with the results of a 2011 literature study that investigated the expectations of young people with regard to new forms of education and information and communications technology.
The study reached the following conclusions:
First, the technological gap between the students and their teachers is not enormous, and certainly not so large that it cannot be bridged. In fact, the relationship is determined by the requirements teachers place on their students to make use of new technologies. There is little evidence that students expect the use of these new technologies.
Second, in all the studies consulted, the students persistently report that they prefer moderate use of information and communications technology in their courses. (“Moderate” is, of course, an imprecise term that is difficult to quantify.)
Third, students do not naturally make extensive use of many of the newest technologies, such as blogs, wikis, and virtual worlds. Students who need or are required to use these technologies in their courses are unlikely to object to them, but there is not a natural demand among students for any such use.
Maybe this will change as technology becomes more and more ingrained. However, a study of students in Glasgow, Scotland, found little change; these students appeared to conform to fairly traditional pedagogies, albeit with minor uses of technology tools that deliver content. Research comparing traditional books with e-readers shows that students prefer paper.
The sad thing is that even if students did prefer to use technology in school, this would not mean that they would learn more. In 2005, Clark and Feldon wrote, “The best conclusion at this point is that, overall, multimedia courses may be more attractive to students and so they tend to choose them when offered options, but student interest does not result in more learning and overall it appears to actually result in significantly less learning than would have occurred in ‘instructor led’ courses.” A decade later, based on 10 years of additional research, Clark and Feldon stand by this conclusion.
In her book It’s Complicated, Danah Boyd describes the main reasons young people use technology. These reasons are mainly social, such as sharing information with each other, and meeting each other online and in real life. They do discuss schoolwork with each other, but this is very different from using Facebook as a learning tool or their phone as a learning machine.
Myth 3: Today’s “digital natives” are a new generation who want a new style of education.
Digital natives! Whenever the question of digital innovation in education is discussed, this is a term that immediately comes to the surface. But it should be avoided. Even the person who coined the term digital natives, Marc Prensky, admitted in his most recent book, Brain Gain, that the term is now obsolete.
The concept is usually used to describe young people who were born in the digital world and for whom all forms of information and communications technology are natural. The adults who were born earlier are therefore “digital immigrants,” who try with difficulty to keep up with the natives. Prensky first coined both terms in 2001.
With this concept, he referred to a group of young people who have been immersed in technology all their lives, giving them distinct and unique characteristics that set them apart from previous generations, along with sophisticated technical skills and learning preferences for which traditional education is unprepared. However, Prensky’s coining of this term—and its counterpart for people who are not digitally native—was based not on research into this generation, but rather on his rationalization of phenomena he had observed.
As the digital native concept became popular, extra claims were added to the initial concept. Erika Smith, of the University of Alberta, describes eight unsubstantiated claims found in current discourses on digital natives:
- They possess new ways of knowing and being.
- They are driving a digital revolution and thereby transforming society.
- They are innately or inherently tech savvy.
- They are multitaskers, team oriented, and collaborative.
- They are native speakers of the language of technologies and have unique viewpoints and abilities.
- They embrace gaming, interaction, and simulation.
- They demand immediate gratification.
- They reflect and respond to the knowledge economy.
Smith is not alone in concluding that there is little to no proof for these claims. A meta-analysis conducted in 2008 had already shown that there was little hard evidence to support the use of the term digital natives.
But maybe the concept of digital natives was more a kind of prediction, and we just had to wait. Perhaps today’s young people are true digital natives. If we look at the research performed in high-tech Hong Kong by David M. Kennedy and Bob Fox, the answer is more nuanced. Kennedy and Fox investigated how first-year undergraduate students used and understood various digital technologies. They discovered, like Danah Boyd did with the American teenagers, that the first-year undergraduate students at Hong Kong University do use a wide range of digital technologies.
The students use a large quantity and variety of technologies for communicating, learning, staying connected with their friends, and engaging with the world around them. But they are using them primarily for “personal empowerment and entertainment.” More importantly, Kennedy and Fox describe that the students are,
“not always digitally literate in using technology to support their learning. This is particularly evident when it comes to student use of technology as consumers of content rather than creators of content specifically for academic purposes.”
Other researchers have reported that university students use only a limited range of technologies for learning and socialization. For example, one study found that “the tools these students used were largely established technologies, in particular mobile phones, media player, Google, [and] Wikipedia. The use of handheld computers as well as gaming, social networking sites, blogs, and other emergent social technologies was very low.” This finding has been supported by a number of other researchers who came to similar conclusions, namely that university students do not really have a deep knowledge of technology, and what knowledge they do have is often limited to basic Microsoft Office skills (Word, Excel, PowerPoint), emailing, text messaging, Facebook, and surfing the Internet.
When looking at the same topic in another continent, Europe, the large-scale EU Kids Online report of 2011 placed the term digital natives in first place on its list of the 10 biggest myths about young people and technology. Just 36 percent of Europe’s 9- to 16-year-olds said that they knew more about the Internet than their parents.
Studies in other countries, including Australia, Austria, Canada, Switzerland, and the United States, all come to the same conclusion: there is no such thing as a generation of digital natives.
Myth 4: The Internet makes us dumber.
In recent years, a new group of technology critics has emerged—often neurologists, such as Baroness Susan Greenfield and Manfred Spitzer, author of the 2012 book Digitale Demenz (“Digital Dementia”)—who seem to agree that we are all becoming more stupid because of the technology we are using.
What they posit in their books—very strong, sometimes not completely well-founded positions—needs to be taken with a grain of salt. They refer to the plasticity of the brain in arguing that the Internet is rewiring our brains in a harmful way.
It is certainly true that what’s known as the Flynn effect (the observed rise in IQ scores over time) has come to a halt in some countries, but the reasons for this halt are neither uniform nor clear. James Flynn, for whom the effect is named, shared his doubts in his 2012 book Are We Getting Smarter? about whether the effect actually means that we really have become smarter. There are other plausible reasons for the rise in test scores, such as education more closely mimicking IQ tests. Research even suggests that the better scores on IQ tests result partly from luckier guessing on harder test items.
As a result, it is not easy to say whether the Internet might be partly responsible for the halt in the phenomenon, as we do not know for certain what actually caused the Flynn effect. Some authors even see the use of new media as an important contributory factor in the rise of average IQ that has been evident in recent years.
Nowadays, we are relying more and more on technology. As an illustration of this fact, Betsy Sparrow, a professor at Columbia University in New York, has described the “Google effect.” Together with her team, she discovered that students remember information more easily if they think that this information is not likely to be available on the Internet. Her study also revealed that students are better at remembering where to find something on the Internet than they are at remembering the information itself. In this respect, the popular Google search engine is increasingly acting as a kind of “external memory.”
But is this really evidence to show that the Internet is making us dumber? To be honest, we don’t know. At the moment, there is no conclusive, empirical proof that decides the issue one way or the other. Although Nicholas Carr has provided many indications in his book The Shallows, his arguments are personal and anecdotal, rather than scientific. Perhaps Steven Pinker is right when he says that we are now making better use of our brains by using Google for “unnecessary information,” just as we now use satellite navigation or another global positioning device instead of a map. And in the final analysis, we certainly know more now than we did in the past. So why should we be more stupid?
In an opinion piece from 2010, in reaction to the publication of Carr’s book, two leading neurologists explain why the digital alarmists are wrong:
The basic plan of the brain’s “wiring” is determined by genetic programs and biochemical interactions that do most of their work long before a child discovers Facebook and Twitter. There is simply no experimental evidence to show that living with new technologies fundamentally changes brain organization in a way that affects one’s ability to focus. Of course, the brain changes any time we form a memory or learn a new skill, but new skills build on our existing capacities without fundamentally changing them. We will no more lose our ability to pay attention than we will lose our ability to listen, see or speak.
Still, there are reasons to be careful about the total amount of screen time that children have in a normal day. The American Academy of Pediatrics (AAP) warns that studies have shown excessive media use can lead to attention problems, school difficulties, sleep and eating disorders, and obesity. This view has been confirmed by a study by researchers from Iowa State University.
Therefore, the AAP recommends no more than one to two hours of screen time a day for children two years and older. John Hattie also describes a clear negative impact of excessive television consumption on learning. Finally, a recent review article in The Neuroscientist paints a disturbing picture of what is happening to this group:
Growing up with Internet technologies, “Digital Natives” gravitate toward “shallow” information processing behaviors characterized by rapid attention shifting and reduced deliberations. They engage in increased multitasking behaviors that are linked to increased distractibility and poor executive control abilities. Digital natives also exhibit higher prevalence of Internet-related addictive behaviors that reflect altered reward-processing and self-control mechanisms.
Recent neuroimaging investigations have suggested associations between these Internet-related cognitive impacts and structural changes in the brain.
Note that many of these studies examined the influence of television rather than the influence of interactive technology, such as smartphones and social media. Also note that most of these studies found a correlation rather than a causal relation; that is, there may be other reasons why children who watch a lot of television have poorer school results.
Myth 5: Young people don’t read anymore.
Of course young people read. They read a lot. As Amelia Hall Sorrell and Peggy F. Hopper explain, teenagers constantly read what is available to them through the different forms of technology that continue to evolve. But when people think that young people today read less, it’s not about reading online content or text messages, it’s about reading books.
In 2010, Reader’s Digest in the United Kingdom conducted a survey on the reading habits of some 2,000 adults and 700 children. The results revealed that one in five children hardly ever reads a book, one in three never reads a book, and one in 20 has never read a book. These figures support a perception that many people seem to have; namely, that young people and children don’t read anymore, and certainly not for pleasure. But is a survey in a popular monthly magazine a reliable source for such a sweeping claim?
Perhaps more scientifically gathered data could tell us more. A 2007 report, To Read or Not to Read, describes a significant decline in reading by youngsters in the United States in the previous 20 years. The study compared data from 1982 and 2002, and found that less than one-third of the 13-year-olds were daily readers. The percentage of 17-year-olds who read nothing at all for pleasure doubled over the same 20-year period. Yet the amount they read for school or homework stayed the same. However, these data are already quite old and stem from the beginnings of the digital era.
The Programme for International Student Assessment (PISA) study carried out by the Organisation for Economic Co-operation and Development (OECD) looks not only at learning results but also at the learning behavior of the respondents. In 2011, PISA published a report analyzing the pleasure reading of young people. This study found that, on average, two out of three students read every day for pleasure. It also noted that the percentage of students who reported that they read for enjoyment daily dropped in the majority of OECD countries between 2000 and 2009, but in some countries that proportion increased. In the United States, the average remained the same. Boys and girls from families with a higher socioeconomic status read more than young people from families with a lower socioeconomic status; moreover, the gap between the two has increased between 2000 and 2009.
In 2012, Stage of Life polled teenagers about their reading habits and found that 77.7 percent of them read at least one extra book per month for personal pleasure beyond what is required for school. Nearly a quarter (24.5 percent) read five or more books per month outside of school. These figures are much higher than the PISA figures, but this probably is due to the way the teenagers were selected.
In the United States, the Pew Research Center examined the reading habits of the American audience in 2012, youth included. Book readers under the age of 30 consumed an average of 13 books in the previous 12 months and a median of six books; in other words, half of book readers in that age cohort had read fewer than six, and half had read more than six.
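The average-versus-median distinction in the Pew figures is worth a moment: a median of six with an average of 13 means a minority of heavy readers pulls the mean well above the midpoint. A tiny sketch with invented book counts (not Pew's raw data) shows how that pattern arises:

```python
from statistics import mean, median

# Illustrative (made-up) yearly book counts for ten young readers, showing
# how a few heavy readers pull the average well above the median -- the
# pattern in the Pew figures cited above (average 13, median 6).
books_read = [2, 3, 4, 5, 6, 6, 8, 15, 30, 51]

avg = mean(books_read)     # inflated by the two heavy readers
mid = median(books_read)   # half read fewer, half read more

assert avg == 13
assert mid == 6
```

For skewed distributions like reading habits, the median is usually the more honest summary of the "typical" reader.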
Still, even in these digital times, libraries remain important to many American youngsters. Pew found that in the 12 months before the survey in 2013, 53 percent of Americans aged 16 and older had visited a library or bookmobile, 25 percent had visited a library website, and 13 percent had used a handheld device such as a smartphone or tablet computer to access a library website.
To sum all this up, young people are still doing a lot of reading, and these statistics make clear that many of them are reading for pleasure. However, we need to be careful about making too many sweeping assertions, since the reading figures in many countries are falling. Even so, we know that reading continues to be important: both reading by young people themselves and parents reading to their children.
Though there is good empirical proof out there refuting these myths, they persist. Why? Anthropologists tell us that myths function in culture and society to express, enhance, and codify belief, while language historians attribute their persistence to increased, almost unlimited, information availability. Our society serves up so much instant and pervasive information, which we fail to examine discerningly, that we end up circulating and strengthening myths through repetition and enhancement.
This vicious cycle is compounded by what journalist Farhad Manjoo discusses in True Enough: Learning to Live in a Post-Fact Society. Self-styled experts (educational charlatans) publish anything they want and come at us from all directions, in every medium, without any “check” on their expertise. The “real danger of living in the age of Photoshop isn’t the proliferation of fake photos,” Manjoo writes. “Rather, it’s that true photos will be ignored as phonies.”
In education, how do we combat this? In our view, there is only one answer: the educational sciences must be driven by theories and theory development, and not by simple observations and conclusions. Strong empirical data must come from experiments set up according to good research methodologies (i.e., randomized controlled trials, real control conditions, samples large and representative enough to justify implementation decisions, etc.) rather than legends and hype. Only after these evidence-informed methods are slowly but surely tested in real-life settings can we think about large-scale implementation.
Finally, teachers, administrators, and politicians must learn to become knowledgeable and aware consumers. To that end, we suggest keeping in mind the following: if something sounds too good to be true, it probably isn’t true.
Adapted by Pedro De Bruyckere, Paul A. Kirschner, and Casper D. Hulshof.
Pedro De Bruyckere is a researcher at Arteveldehogeschool University College in Ghent, Belgium. Paul A. Kirschner is a professor of educational psychology and Distinguished University Professor at the Welten Institute at the Open University of the Netherlands; he is also a visiting professor of education at the University of Oulu, Finland. Casper D. Hulshof is a researcher at the University of Utrecht, the Netherlands. This article is adapted from their book Urban Myths about Learning and Education (Academic Press, 2015). Reprinted with permission from Elsevier Inc. Copyright ©2015.