Personalized learning is the hot new idea in education reform, but some versions could get a little too personal.
While personalized learning is a broad and ill-defined field these days, many folks want to harness computer power to match students up with perfectly suited educational materials. This involves some sort of algorithm that collects and crunches data, then spits out a result, not unlike the way Facebook or Netflix collect data on users in order to match them up with the right products, or at least the best marketing for those products. As we’ve seen with the Cambridge Analytica scandal, there are some real privacy issues with data mining on this scale, but that has not stopped developers from digging deeper and deeper.
Personalized learning can be as simple as an exercise management system. Pat completes Widget Studies Worksheet 457A/rq, and because Pat missed questions 6, 9, and 11, the algorithm says Pat should next complete Worksheet 457B/sg, and so on until Pat completes Unit Test 1123-VZ and is declared a master of widgetry. This may sound like a boring mass of worksheets, but instead of paper, the modern system puts all the worksheets on a computer and students complete them on a screen, so it’s like super-exciting.
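The routing rule described above is simple enough to sketch in a few lines. This is a hypothetical illustration only: the remediation map, the worksheet sequence, and the `next_worksheet` function are all invented here, not taken from any real product.

```python
# Hypothetical sketch of a worksheet-routing rule. The mapping from
# missed questions to follow-up worksheets is invented for illustration.

# Maps each worksheet to the remedial worksheet assigned when certain
# questions are missed.
REMEDIATION_MAP = {
    "457A/rq": {6: "457B/sg", 9: "457B/sg", 11: "457B/sg"},
}

# Where a student goes next if nothing needs remediation.
NEXT_IN_SEQUENCE = {"457A/rq": "Unit Test 1123-VZ"}

def next_worksheet(current_id, missed_questions):
    """Pick the next assignment: remediate missed skills, else advance."""
    remediations = REMEDIATION_MAP.get(current_id, {})
    for q in missed_questions:
        if q in remediations:
            return remediations[q]
    return NEXT_IN_SEQUENCE.get(current_id, "done")

# Pat missed questions 6, 9, and 11, so Pat is routed to Worksheet 457B/sg.
print(next_worksheet("457A/rq", [6, 9, 11]))  # 457B/sg
print(next_worksheet("457A/rq", []))          # Unit Test 1123-VZ
```

The point is how little intelligence "personalization" of this kind requires: it is a lookup table, not a tutor.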
Data mining academics is central to many personalized systems. AltSchool, the Silicon Valley wunderschool (now a business marketing a wunderschool-in-a-box), touted its massive data mining, with teachers recording every significant learning moment and turning it over to a data team in order to create a program of perfectly personalized instruction for each student.
But many personalized learning developers are certain that data mining academics alone is not enough. Social and emotional learning is another growth sector in education programming, and besides, many folks have noticed that young people are not automatically entranced by dull work just because it’s on a computer screen.
So we’re seeing attempts to mine other sorts of data. NWEA, the company that brought us the MAP test, now offers a feature that tells you whether the student taking the computerized test is engaged. They believe that by analyzing the speed with which a student answers questions, they can determine whether said student is trying. During test time, the teacher dashboard will toss up a little warning icon beside the name of any not-trying-hard-enough student so that the teacher can “redirect” the student.
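The underlying logic is likely something like rapid-guessing detection. NWEA has not published its actual method; the threshold and the three-answer window below are invented for illustration.

```python
# Hypothetical sketch of a speed-based "engagement" flag. The real
# vendor algorithm is unpublished; this threshold and rule are assumed.

def flag_disengaged(response_times_seconds, min_plausible=4.0, window=3):
    """Flag a student whose last `window` answers all arrived faster
    than a plausible reading-and-thinking time (likely rapid guessing)."""
    if len(response_times_seconds) < window:
        return False
    recent = response_times_seconds[-window:]
    return all(t < min_plausible for t in recent)

# Three rapid-fire answers in a row would trip the warning icon.
print(flag_disengaged([12.0, 9.5, 2.1, 1.8, 1.2]))  # True
print(flag_disengaged([12.0, 9.5, 8.0, 7.2, 6.5]))  # False
```

Note that the whole judgment rests on one proxy, answer speed, which is exactly the kind of inference a teacher makes at a glance.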
That is more redundant than creepy; many teachers perform a similar analysis and intervention with a technique called “looking with their eyes.” But the personalization can get creepier.
Consider LCA and its Nestor program, which uses the student’s webcam to track and analyze facial expressions in order to determine whether the instructional program is working. Monitoring programs like Nestor (there are several out there) claim they can read the student’s face for different emotional reactions, the better to personalize the educational program being delivered. The beauty of these systems, of course, is that if we have students taking computerized courses that read their every response, we don’t really need teachers or schools. Anywhere there is a computer and a webcam, school is in session and the program is collecting data about the students.
Does that seem excessive? Check out Cognition Builders, a company that offers to help you deal with your problem child by monitoring that child 24/7.
There are huge issues with all of these. From the educational standpoint, we have to question if anyone can really develop an algorithm or a necessarily massive library of materials that will actually work better than a trained human. From a privacy standpoint, the data collection is troubling. It’s concerning enough to create a system that allows employers to “search” for someone who is strong in math and moderately strong in written language based simply on algorithm-driven worksheet programs. It’s even more concerning when the program promises that it can also screen out future workers who are flagged as “Uncooperative” because of behavior patterns marked by a computer program in third grade.
And we still haven’t found the final frontier of creepitude.
Meet the field of educational genomics. The dream here is to use genetic information to create “precision education,” which, much like “precision medicine,” “precision agriculture,” and “precision electioneering,” would use vast amounts of data, down to the genetic level, to design a perfect program. The MIT Technology Review this spring profiled $50 DNA tests for IQ.
Imagine a future in which doctors perform a DNA test on an embryo and by the time that child is born, an entire personalized education program is laid out for her. Constant computer monitoring collects her performance and behavior data, so that by the time she’s ten years old, her digital record already constitutes a complete profile of her, with an algorithm judging her academic abilities as well as whether she’s a good person.
There are a thousand reasons to question whether we could do any of this well or accurately. But before we try to see if we can enter this impersonally personalized brave new world, we really need to talk about whether we should.