[In this article by Paul Peterson and Michael Horn for Education Next, a publication from the Harvard Kennedy School, the authors tap both crowdsourcing and experts to arrive at the “ideal.”]
As the use of technology in schools grows rapidly, whether in blended-learning environments, for project-based learning, or simply because it is the fad du jour, how much time students should spend learning on a computer remains a point of contention. Still, more and more people agree that digital learning in K–12 classrooms works best under the oversight of a teacher. Chants of “teachers not technology” and “laptops for layoffs” increasingly look like relics of the past. A student can learn effectively via computer when an educator is there to assist and supplement, and teachers are recognizing the power that computers, properly used, have to enhance their craft.
But differences remain. Pessimists worry that students get too much screen time, that technology interferes with relationships between students and teachers, and that it opens the door to violations of privacy. Optimists counter that technology can personalize learning for each student; create more engaging learning environments while freeing teachers to do what only humans do well, namely provide empathy, understanding, and mentorship; and help students master core knowledge on their own, so that class time can go to projects and discussions.
So how should schools navigate these differences? What is the right balance of computer and teacher? How much time should students spend learning independently on a computer? To answer these questions, we did some carefully designed crowdsourcing of the sort that has made Yelp and TV game shows famous: if you don’t know something, ask the audience. More often than not, the crowd’s average answer comes pretty close to the right one.