Everyone working in any facet of education and educational nonprofits needs to read Geek Heresy: Rescuing Social Change From the Cult of Technology; put down whatever other books you’re reading—you are reading, right?—and get a copy of this one.
In it, Kentaro Toyama describes how computers and related technologies are not a panacea for education or any other social service field. He writes that, “like a lever, technology amplifies people’s capacities in the direction of their intentions.” Sound familiar? It should: we’ve written about “Computers and Education: An Example of Conventional Wisdom Being Wrong” and “How Computers Have Made Grant Writing Worse.” We’ve been writing grant proposals for programs that increase access to digital technologies since at least the late ’90s; for example, we’ve written numerous funded 21st Century Community Learning Centers proposals. Despite all that effort and all those billions of dollars spent, however, it would be polite to say that educational outcomes have not leapt forward.
As it turns out, the computers-in-education trope is part of a general pattern. After years in the field, Toyama eventually realized that technologically driven educational projects tend to follow stages: “the initial optimism that surrounds technology, the doubt as reality hits, the complexity of outcomes, and the unavoidable role of social forces.” That’s after Toyama describes his work in India, where he discovers that “In the course of five years, I oversaw at least ten different technology-for-education projects [. . .] Each time, we thought we were addressing a real problem. But while the designs varied, in the end it didn’t matter – technology never made up for a lack of good teachers or good principals.” Studies of the One Laptop Per Child project show similarly disappointing results.
Chucking technology at people problems does not automatically improve the people or solve the problem: “Even in a world of abundant technology, there is no social change without change in people.” Change in people is really hard, slow, and expensive. It can be hastened by wide and deep reading, but most Americans don’t read much: TV, Facebook, and the other usual suspects feel easier in the short term. Everyone who thinks about it knows that computers are incredibly useful for creating, expressing, and disseminating knowledge. But they’re also incredibly useful for wasting time. Because of the way computers can waste time and drain precious attention, I actually ban laptops and phones from my classrooms. Computers and phones don’t help with reading comprehension and writing skill development. That primarily happens between the ears, not on the screen.
Problems with laptops in classrooms became apparent to me during my one year of law school (I fortunately dropped out of the program). All students were required to use laptops. During class, some used computers for the ends imagined by administrators. Most used them to gossip, check sports scores, send and receive nude photos of classmates, etc. And those were law students, who’d already been selected for having decent discipline and foresight. What hope do the rest of us have? Laptops were not the limiting factor in my classes and they aren’t the limiting factor for most people in most places:
Anyone can learn to Tweet. But forming and articulating a cogent argument in any medium requires thinking, writing, and communication skills. While those skills are increasingly expressed through text messaging, PowerPoint, and email, they are not taught by them. Similarly, it’s easy to learn to ‘use’ a computer, but the underlying math skills necessary for accounting or engineering require solid preparation that only comes from doing problem sets—readily accomplished with or without a computer.
Problem sets are often boring, but they’re also important. I tell my college students that they need to memorize major comma rules. They generally don’t want to, but they have to memorize some rules in order to know how to deploy those rules—and how to break them effectively, as opposed to inadvertently. Computers don’t help with that. They don’t help with more than you think:
Economist Leigh Linden at the University of Texas at Austin conducted experimental trials in India and Colombia. He found that, on average, students exposed to computer-based instruction learned no more than control groups without computers. His conclusion? While PCs can supplement good instruction, they don’t substitute for time with real teachers.
The obvious counterpoint to this is “yet.” Still, those of us who have computers and Internet connections are probably sensitive to how much time we spend doing stuff that might qualify as “work” versus time spent on YouTube or games or innumerable other distractions (pornography sites are allegedly among the largest sites on the Internet, measured by megabytes delivered).
Moreover, the poorer the school districts or communities, the harder it was to set up and maintain the equipment (another challenge many of us are familiar with: don’t ask me about the fiasco that upgrading from OS X 10.6 to 10.10 entailed).
In addition, Toyama points out that there is a long history of believing that technology in and of itself will ameliorate human problems:
We were hardly the first to think our inventions would transform education. Larry Cuban, a veteran inner-city teacher and an emeritus professor at Stanford, has chronicled the technology fads of the past century. As his examples show, the idea that technology can cure the ills of society is nothing new. As early as 1913, Thomas Edison believed that ‘the motion picture is destined to revolutionize our educational system.’ Edison estimated that we only learned 2 percent of the material we read in books, but that we could absorb 100 percent of what we saw on film. He was certain that textbooks were becoming obsolete.
Oops. Radio, TV, filmstrips, overhead projectors, and other technologies were heralded with similar promise. The problem is that technology is much easier than motivation, concentration, conscientiousness, and perspicacity.
Some quotes should remind you of points we’ve made. For example, Toyama says, “Measurement undoubtedly helps us verify progress. There’s a danger, though, of worshipping the measurable at the expense of other key qualities.” That’s true of many grant proposals and is consilient with our post on why evaluations are hard to do. Measuring what’s easy to measure is usually much easier than measuring what matters, and funding authorities rarely care in a deep way about the latter.
In his chapter on “Nurturing Change,” Toyama notes that individuals have to aspire to do more and to do better in order for a group or culture to see mass change. This is close to Robert Pirsig’s point in Lila’s Child: An Inquiry Into Quality, which extols the pleasure and importance of craftsmanship. Defined broadly, “craftsmanship” might mean doing the best work you can regardless of who’s watching or what the expected consequences of that work might be.
Geek Heresy is not perfect. Toyama repeats the dubious calumny that the poverty rate “decreased steadily [in the United States] until 1970. Around 1970, though, the decline stopped. Since then, the poverty rate has held steady at a stubborn 12 to 13 percent [. . . .]” But the official rate is likely bogus: “If you look at income after taxes and transfers you see that the shape of American public policy has become much friendlier to the poor during this period.” Or consider this reading of the data, which finds the “Adjusted percent poor in 2013 [is] 4.8%.” This also probably jibes with what many of our older readers have actually experienced: most manufactured goods are far, far cheaper than they used to be, and official definitions of poverty rarely account for that. On a non-financial level, far more and better medical treatments are available. In 1970 there was no chickenpox or HPV vaccine, regardless of how wealthy you were.
The flaws in Geek Heresy are minor. The important point is that technology will not automatically solve all of our problems and that you should be wary of those who think it will. Until we understand this—and understand the history of attempting to use technology to solve all of our problems—we won’t be able to make real progress in educational achievement.