Winona Ryder and the Internet of Things


New Horizons

"In the living room the voice-clock sang, Tick-tock, seven o'clock, time to get up, time to get up, seven o'clock!"

—Ray Bradbury, "There Will Come Soft Rains"

The 2015 film Experimenter is based on the true story of Stanley Milgram, the Yale University psychologist who became famous for his 1961 social behavior experiments that tested the obedience of volunteers who thought they were administering electrical shocks to strangers. In the film, the character of his wife, Alexandra "Sasha" Milgram, is played by Winona Ryder, and she serves as the on-screen stand-in for the film audience. Our ethical response to what happens in the film is registered on her face. In several scenes, the camera focuses on the face of Winona Ryder watching the experiment unfold—her skin twitching, her body shifting uncomfortably, her eyes wide with both horror and a certain awe at what humans are capable of.

In his experiment, Milgram asked a "teacher" (the subject of the experiment) to shock a "learner" (an actor) for getting wrong answers on a simple test. An "experimenter" would order the teacher to give increasingly powerful shocks, and more often than not, the teacher complied. The study is not without baggage,1 but the results remain compelling nonetheless. At one point in the film, Winona Ryder as Sasha Milgram asks to experience the shock herself, the same very small shock that the teachers were also given during the setup of the experiment. The scene is played out with a certain menace as the various accoutrements are put into action. Visually, she is overwhelmed by the devices that surround her: the electrodes, the teacher's microphone, a series of digits that light up to show the learner's answers, a pen, a clipboard, the gray of the experimenter's lab coat, a recording device, and the large box of switches through which the teacher delivers the shocks.2 All of the devices play clear roles in maintaining and even eliciting compliance. And the subtler and more intricate or inscrutable the mechanism, the more compliance it appears to generate—because the human brain fails to bend adequately around it. The camera works a similar magic on the film viewers as it ominously traces over these objects. Like our on-screen surrogate, Winona Ryder, we too sit still—complicit, both horrified and awed by what we see and our inability to stop it.

In the 1915 book Schools of To-Morrow, John Dewey wrote: "Unless the mass of workers are to be blind cogs and pinions in the apparatus they employ, they must have some understanding of the physical and social facts behind and ahead of the material and appliances with which they are dealing." The less we understand our tools, the more we are beholden to them. The more we imagine our tools as transparent or invisible, the less able we are to take ownership of them.

At the interview for my current job at the University of Mary Washington, the inimitable Martha Burtis asked me to reflect on the statement: "It's teaching, not tools." What assumptions does this oft-bandied-about phrase make? What does it overlook? Like Martha, I find myself increasingly concerned by the idea that our tools are without ideologies—that tools are neutral. Of course, they aren't. Tools are made by people, and most (or even all) educational technologies have pedagogies hard-coded into them in advance. This is why it is so essential that we consider them carefully and critically—that we empty all our LEGOs onto the table and sift through them before we start building. Some tools are decidedly less innocuous than others. And some tools can never be hacked to good use.3

In 2014, the EDUCAUSE Learning Initiative (ELI) report "7 Things You Should Know About the Internet of Things" noted: "The Internet of Things (IoT) describes a state in which vast numbers of objects are interconnected over the Internet and can collect data and transmit and receive information. . . . The IoT has its roots in industrial production, where machine-to-machine communication enabled the manufacture of complex items, but it is now expanding in the commercial realm, where small monitoring devices allow such things as ovens, cars, garage doors, and the human heartbeat to be checked from a computing device."4 At the point when our relationship to a device (or a connected series of devices) has become this intimate, this pervasive, the relationship cannot be called free of values, ethics, or ideology.

I'll be candid. I am quite often an unabashed fan of the Internet of Things. I like that my devices talk to one another, and I enjoy tracking my movement and my heart rate. I even find myself almost unable to resist my curiosity about something like the new Bluetooth-enabled cup (https://www.myvessyl.com/) that can track how much water I drink. I like controlling my car from my phone and feeling the tickle of an incoming text message on my wrist. But my own personal curiosity and fascination are outweighed by my concern at the degree to which similar devices are being used in education to monitor and police learning.

The ELI report continues: "E-texts could record how much time is spent in textbook study. All such data could be accessed by the LMS or various other applications for use in analytics for faculty and students." I am worried by how words like "record," "accessed," and "analytics" turn students and faculty into data points. I am worried that students' own laptop cameras might be used to monitor them while they take tests. I am worried that those cameras will report data about eye movement back to an algorithm that changes the difficulty of questions. I am worried because these things take us further away from what education is actually for. I am worried because these things make education increasingly about obedience, not learning.

Remote proctoring tools can't ensure that students will not cheat. The LMS can't ensure that students will learn. Both will, however, ensure that students feel more thoroughly policed. Both will ensure that students (and teachers) are more compliant. In his 1974 book Obedience to Authority: An Experimental View, Milgram described "the tendency of the individual to become so absorbed in the narrow technical aspects of the task that he loses sight of its broader consequences." Even if I find the experiment itself incredibly problematic, Milgram offers useful reflections on the bizarre techno-theater that helped elicit obedience.

When Internet-enabled devices have thoroughly saturated our educational institutions, they run the risk of being able to police students' behavior without any direct input or mediation from teachers. By merely being in the room, the devices will monitor students' behavior in the same way that the cameras and switches and lab coats did in Milgram's experiments. How will learning be changed when everything is tracked? How has learning already been changed by the tracking we already do? When our LMSs report how many minutes students have spent accessing a course, what do we do with that information? What will we do with the information when we also know the heart rate of students as they're accessing (or not accessing) a course?

I maintain a great deal of excitement about the potential of the Internet of Things. At the same time, I find myself pausing to consider what Milgram called "counteranthropomorphism"—the tendency we have to remove the humanity of people we can't see. These may be people on the other side of a wall, as in Milgram's experiment, or people mediated by technology in a virtual classroom.

Winona Ryder has very few lines of dialogue in Experimenter, and yet her performance is a pivotal one because she offers a guide, a moral compass, for the off-screen audience. She is complicit in her passivity and yet rebellious in her willingness to register raw and genuine emotion, something no other character can muster. And as the film unfolds, the shock and awe on her face give way to compassion. As she looks upon the scene of the experiment, she sees human beings and not the experiment.

We must approach the Internet of Things from a place that doesn't reduce ourselves, or reduce students, to mere algorithms. We must approach the Internet of Things as a space of learning, not as a way to monitor and regulate. Our best tools in this effort are the ones that encourage compassion more than obedience. The Internet is made of people, not things.

Notes

  1. Cari Romm, "Rethinking One of Psychology's Most Infamous Experiments," The Atlantic, January 28, 2015.
  2. Milgram described this last device as "an impressive shock generator. Its main feature is a horizontal line of thirty switches, ranging from 15 volts to 450 volts, in 15-volt increments. There are also verbal designations which range from SLIGHT SHOCK to DANGER—SEVERE SHOCK" (Stanley Milgram, Obedience to Authority: An Experimental View, 1974). I sense glee in the language Milgram uses ("impressive"), something theatrical in his excess ("thirty switches"), and a fastidiousness in his attention to detail in reporting all of this.
  3. Jesse Stommel, "Who Controls Your Dissertation?" Vitae, January 7, 2015.
  4. EDUCAUSE Learning Initiative (ELI), "7 Things You Should Know About the Internet of Things," October 6, 2014. I find something ominous about the capital I and capital T in the acronym IoT, a kind of officiousness in the way these devices are described as proliferating across our social and physical landscapes.

Jesse Stommel (Twitter: @Jessifer) is executive director, Division of Teaching and Learning Technologies, at the University of Mary Washington. He is the director and founder of Hybrid Pedagogy and Digital Pedagogy Lab. His personal site is at http://www.jessestommel.com.

© 2016 Jesse Stommel. The text of this article is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

EDUCAUSE Review 51, no. 4 (July/August 2016)