The Connection Business


People's attitudes about technology and disconnection have changed dramatically over the years, with varied implications for the "connection business" of higher education information technology.

[Image: two orange goldfish in separate bowls facing each other. Credit: Adam Gault / Getty Images; fishes: Cookelma / iStock © 2020]

Nearly a decade ago, in August 2010, a post appeared on the EDUCAUSE CIO Constituent Group listserv titled "Classroom Internet ‘Kill Switch.'" The writer had been asked by faculty whether there might be a way to equip classrooms with a device that would allow them to disable wireless access to the internet. With such a switch, professors would no longer have to police students who had become distracted by Facebook, online games, and other entertainments. The query was, for the most part, dismissed by those who responded on the listserv. They noted that the goal of technology is to connect people, not to disconnect them. If students were distracted, the fault lay not with the technology but with the instructors.

We were puzzled by the reaction to the initial post. Giving faculty the option to turn classrooms into Faraday cages that would block off the internet seemed (to us) like a good idea.1 After all, back in 1845 Henry David Thoreau had done something similar. Tired of the hustle and bustle of his family's pencil-manufacturing business, Thoreau built a small one-room cabin on the shore of Walden Pond and lived there for more than two years. Having removed himself from the distractions of everyday life, he was able to write and philosophize; his time away allowed him to begin work on essays and books like "Civil Disobedience" and Walden, which have become centerpieces in the American literary canon.

If it was good for Thoreau, why shouldn't we extend a bit of that privileged remove to today's students? Wouldn't spending a little time off the proverbial grid be helpful? We wondered why disconnection was no longer seen as a virtue. What historical events might explain the change? And was there anything in Thoreau's technological asceticism worth recovering?

These questions ultimately impelled us to research and write Bored, Lonely, Angry, Stupid: Changing Feelings about Technology, from the Telegraph to Twitter (2019).2 As we discovered, people's attitudes about technology and disconnection have changed dramatically. Today, we have far less tolerance for boredom and empty moments than did earlier generations. We regard and manage attention differently. Many of us can no longer abide being alone. These changes are creating a new sense of self in the 21st century. Many today regard their emotions and desires as timeless and natural. And many justify their technological choices by suggesting that those choices serve innate psychological needs such as alleviating boredom and loneliness. But it turns out that these needs and emotions are much more contingent than we often realize. We change. Our feelings change. And as we change, so too does our understanding of which technologies are best able to cater to our shifting needs and desires.

For instance, consider the experience of having nothing "exciting" to do. Today, we might think of ourselves as bored and try to end the feeling as soon as possible. Certainly our students seem eager to avoid boredom, turning to their phones whenever an empty moment presents itself. Constant connectivity allows them to do so. This is a departure from the past: in the 19th century, Americans did not necessarily expect that everything would be exciting or that every moment would be full of sensory stimulation. Empty time sometimes hung heavy, but few worried about it. People sometimes described tasks or days as monotonous or tedious, but they were relatively unconcerned about the emotional effects of such dullness. In fact, the word boredom—denoting an internal psychological state of understimulation—did not even exist until the mid-19th century. Only after it entered the language did boredom become an inner problem in need of a solution.

Just as earlier generations did not expect constant entertainment and stimulation, neither did they expect constant socializing with hundreds, perhaps thousands, of friends. As the Unitarian minister William Rounseville Alger explained in 1867: "There is more loneliness in life than there is communion. The solitudes of the world out-measure its societies." Like Thoreau, many people at the time found that being disconnected, alone, and on one's own brought benefits and opportunities for insight. They often termed this solitude. Some even believed that solitude ultimately made one more sociable. Alger, for instance, noted: "One of the most valuable uses of solitude is to prepare us for society. He who studies, when alone, to understand himself, and to improve himself . . . takes the surest means to commend himself to his fellow-men. He employs the best method both for giving and securing pleasure when he shall return from his retirement to mingle with others again."3

When 19th-century men and women did seek social connection, they typically looked for it with their neighbors, family, and friends. Unlike many of us in the 21st century, they did not believe their loneliness would end by being in touch with strangers from across the nation or the globe. Little wonder, then, that they did not initially see much use for the telegraph or telephone; in fact, many resisted these new devices and looked for the Victorian equivalent of the "kill switch." For instance, journalists and other observers reported on recurring sabotage to telecommunications systems in the United States. There were numerous accounts of men and women chopping down telegraph and telephone poles and wires, which they considered to be noisy, unsightly invaders that brought few benefits. In fact, some believed that telegraph wires might be harmful to both body and spirit. Rumors circulated that the wires transmitted disease and were responsible for the cholera epidemic of 1849. Meanwhile, preachers claimed the telegraph was defying God's will, since it seemed to give humans supernatural powers of communication. In later years, although most people had embraced both the telegram and the phone call, many remained skeptical that the new wires and poles could "cure" aloneness.

The rise of radio in the 1920s and 1930s sparked similar debates. While cultural critics recognized that radio was popular because it brought entertainment into the home, they suggested that its constant noise diminished individuals' capacities to sit by themselves in quiet. By 1942, a reporter noted that Americans had become so dependent on the radio that they could no longer accept or appreciate solitude. She wrote: "I've nothing against radios. Indeed, I . . . have installed them myself in various parts of my home, including my car." However, she added: "I am very much against our hysterical need of constant noise and diversion as a means of escape from solitude." She explained: "Solitude is not a blight nor a nightmare. It is a normal and necessary part of our human experience, and no character can become . . . poised without large amounts of it."4

In the early days of radio, such attitudes were often shared by college and university leaders. The New York Times surveyed educators in 1930 about their campus policies toward radios. Yale University leaders discouraged radios on campus, as Clarence Mendell, dean of Yale College, explained: "We have, at Yale, no central radio for broadcasting to the student body, nor do we encourage private sets. I believe that life is already too complicated and noisy for the best results without introducing any further disruptions." Lieutenant Colonel S. Whipple, of the United States Military Academy, explained that his institution prohibited radios in students' rooms, believing they were "a hindrance to the concentrated study required."5

This fear that students might be overwhelmed by too much information reflected another key assumption that shaped both higher education and the larger social world. Through the early 20th century, physicians and psychologists believed that humans had finite storage capacity in their brains and inherent limits on how much information they could take in. In 1881, for example, the neurologist George Beard critiqued the educational system of his day and declared that the brain "is an organ of very feeble capacity. . . . The brain can hold but little."6 He therefore warned educators against trying to pack too much into their students' heads. Like Beard, other physicians, philosophers, educators, and psychologists filled scientific journals with accounts of students and businessmen suffering from dangerous illnesses that they labeled as cerebral hyperaemia, neurasthenia, and mental fatigue—all the consequence of studying too hard and of using too much brainpower to pay attention to the torrent of information rushing over telegraph wires, railroad tracks, and telephone lines. They believed the brain possessed natural limits that should not be transgressed—for the brain was, after all, a finite organ.

Exceeding these natural limits could have fatal consequences. According to medical experts, excessive brain work caused "the decomposition of brain substance,"7 distended blood vessels, intense head pains, fainting, vertigo, even death. Doctors filled scientific journals with accounts of "mental workers" who had collapsed, suffered strokes, or died after concentrating too hard. The cure was mental rest. Patients had to limit the amount of information they took in, retreat from the world of work and thought, and avoid any mental exertion. Even a game of chess might endanger those suffering from mental strain. Humans needed mental recesses, empty moments when they were disconnected from the world of work and excitement.

These views shaped how men and women conceived of themselves and their abilities. They believed there were limits on experience and on themselves. They did not expect or even want constant connection, unlimited friends, unceasing entertainment, infinite information. Nor did they believe their brains could handle such a deluge. Humans were finite.

These attitudes have changed dramatically. Today we flee boredom, fear loneliness, and believe our brains are infinitely powerful. Those changes have implications for how we see ourselves. They shape how we train our students and what they expect from life. These changes also explain why it is so hard for many today to see disconnection as desirable or even possible.

As noted earlier, one key change was in how people regarded boredom. As labor became more industrialized and a growing number of workers toiled on assembly lines, complaints of boredom multiplied. By the 1930s, psychotherapists suggested that individuals had a right to expect diversion from the world around them. By the 1950s, psychologists were labeling boredom a pathology, and by 1986, they had developed a "boredom proneness scale" to measure the condition.8 Whereas people in the 19th century had expected and endured dullness, in the 20th century they came to fear it as a psychological malady and tried to avoid it at all costs.

The invention of laptops, smartphones, podcasts, video games, and social media aided the flight from boredom, as users tried to fill every waking moment with some kind of activity. By 2014, the Onion, in its typically satirical style, reported: "Citizens are loudly calling for a device or program capable of keeping them captivated as they move their eyes from a computer screen to a smartphone screen, arguing that a new source of video and audio stimulation is vital to alleviating the excruciating boredom that currently accompanies this prolonged transition."9

In our own research, we found that the need for constant entertainment was widespread among the students we interviewed. H., a student at Grinnell College, told us boredom felt "dangerous" to her. She believed that her smartphone, and technology in general, had accustomed her to constant entertainment. As a result, she noted: "Boredom has to immediately be filled in with something. I think that's a product of technology. Always having something at your fingertips. It's like, if you're bored, fix it fast."

Just as boredom changed, so too did the experience of loneliness. In the 18th and 19th centuries, people regarded loneliness as an expected, though not always pleasurable, part of the human condition. By the 20th century, this view was shifting. Many were coming to see near-constant connection as necessary for a good life and were starting to regard loneliness as a problem. Telecommunications companies helped promote this view, promising that they could eliminate the problem. In 1912, the Nebraska Bell telephone company advertised that a phone "banishes loneliness." Phonograph and radio companies made similar promises. "Buy an Edison Phonograph and you will never be lonesome," assured a 1905 ad.

Advertisers were joined by self-help writers who also denigrated aloneness. In the 1930s, success advisors like Dale Carnegie told readers they should worry about being alone and should try to have as many friends as possible. Those who were sociable and outgoing would succeed in life. If they failed, it was a result of their own personalities and their inability to connect in a meaningful—and profitable—way with others. By the mid-20th century, this attitude was so widespread that a new word entered the American language: loner. It was a pejorative term, a label for someone who stood outside the bustle of social life, who didn't try hard enough to be sociable. To be disconnected from others risked social stigma.

The pressure to connect only increased. By the 1970s, the sociologist Robert Weiss had declared that a "loneliness industry" seemed to be both publicizing and profiting from the feeling.10 An array of psychologists had taken up the study of loneliness, creating new anxieties about social disconnection. They developed a loneliness scale, offered self-help cures to encourage outgoing behavior, and celebrated gregarious sociability as a sign of psychological adjustment. Telecommunications companies also refined their advertising messages during this era, with Bell Telephone encouraging Americans to "reach out and touch someone."

In the 21st century, technology companies have celebrated connection even further, and loneliness has come to appear all the more worrisome. Facebook Co-Founder and CEO Mark Zuckerberg, for instance, has promoted online connection as a "human right," and Facebook Vice President Andrew Bosworth has stated: "The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good."11 Whereas people in the 19th century were sometimes ambivalent about being connected, saw its downsides as well as its upsides, and believed sociability should be calibrated, in the 21st century we fear being disconnected even momentarily. Doing so runs afoul of prevailing social norms.

College and university students surely feel this imperative in their own lives, particularly online. In 2014, the Pew Research Center reported that the average number of Facebook friends per adult user was 338 and that the median number was 200. Of users who were ages 18 to 29, 27 percent had more than 500 friends.12 Alta, a radiology major at Weber State University, told us that she knew people so eager to appear popular that they had "friends on their Facebook that they don't even know just so that they can have the numbers. . . . I think there's too much importance on it now." While she said that she was not obsessed with the number of friends she had, she added that she nevertheless felt the need to be constantly connected: "When I leave the house now and I don't have my phone, I feel naked. Like, where is it? I have to have it on me at all times. . . . Just because what if someone calls or what if someone texts and I'm not there to respond? I don't love that feeling but . . . it's with me all the time. . . . So walking away from it is very nerve-wracking." The fear of missing out on social life and the worry that one doesn't have enough friends reflect contemporary expectations for constant connection and intolerance of solitude.

That people think they can process dozens (or hundreds) of social media updates, respond to scores of texts, and still manage their home life is symptomatic of the new view of the brain. Unlike those in the 19th century, who believed that they had to limit the amount of information coming their way because their brains had finite capacity, people in the 21st century believe their brains can handle it all. This faith that they can manage the data deluge drives the multitasking we so often see today.

The early-20th-century philosopher William James played a crucial—if unwitting—role in promoting the idea of infinite brainpower when he declared: "Few men live at their maximum of energy."13 He maintained that with training, people could do more. From this statement emerged the idea that humans were underutilizing their brains and that they might have excess capacity still to use. James hadn't quantified how much brainpower people wasted, but others did. In 1936, Lowell Thomas's preface to Dale Carnegie's How to Win Friends and Influence People claimed that individuals used only 10 percent of their mental power, attributing the idea to James. Carnegie's book, a "bible" for anyone pursuing success, reassured readers that their minds could do ever more; there were no limits on their abilities or aspirations. This new vision of the power of the brain spread during the 20th century, and though unfounded,14 it continues to shape how we conceive of our mental powers—even when they are degraded by distraction. Today many people believe that their brains have untapped capacity and that, with the right apps and drugs, they can take in more information. A 2013 Harris Interactive poll found that 65 percent of those surveyed believed they were using only 10 percent of their brains.15

When smartphones, laptops, and tablets emerged on the market, they reinforced the belief that it was possible to multitask and seemed to offer a way to use our alleged untapped mental powers. Unlike those in the 19th century, who believed that they should control the amount of information they took in and that they should sometimes disconnect so as not to surpass their natural limits, today we believe that we can absorb it all. Will, a student at Grinnell College, offered a sense of this optimistic attitude: "I have at my desk my laptop screen and . . . a second monitor screen. . . . Whether I am typing up a paper or . . . just relaxing . . . both screens are active at the same time. . . . I'll have my paper up here, and on the other screen there'll be like 50 tabs open. . . . All the time, I'm multitasking with the two screens. . . . I have the ability to search a billion things at the same time." When he wasn't multitasking, he felt bored, convinced that his brain wasn't "doing anything."

Will is not a lone case. Most of the students we interviewed for our research found it difficult to disconnect. Though today that may seem natural, their need to be hyperconnected is actually the result of technological and cultural changes that have reconfigured our sense of self. In the past, many people accepted loneliness, boredom, and limited knowledge as part of the natural order of the world. Today, that acceptance of limits has largely disappeared.

When higher education IT leaders make policy choices about which technology solutions to implement and which to dispense with, we often justify those choices by arguing that they cater to a specific need or desire of our students. We turn the choices into an ergonomic argument: our students have a particular set of feelings and needs, so we should deploy technologies that fit those needs. But as we have seen, those needs are not completely innate. They have been—and continue to be—reshaped and amplified over many years by cultural forces and business imperatives, including the loneliness industry and the entertainment industry, which have economic incentives to paint empty moments and aloneness in a negative light. When those of us in higher education information technology make policy, let's keep this history in mind. Even though we are in the connection business, perhaps we should reconsider our 19th-century ancestors' thoughts on the virtues of occasional disconnection.

Notes

  1. See Luke Fernandez, "Should IT Provide Faculty with Tools to Disable Wi-Fi in Their Classrooms?" Transforming Higher Ed (blog), EDUCAUSE Review, April 30, 2018. For an opposing viewpoint, see Joe Moreau, "Managing Campus Wi-Fi Networks: In Favor of Connectivity," Transforming Higher Ed (blog), EDUCAUSE Review, April 30, 2018.
  2. Luke Fernandez and Susan J. Matt, Bored, Lonely, Angry, Stupid: Changing Feelings about Technology, from the Telegraph to Twitter (Cambridge: Harvard University Press, 2019).
  3. William Rounseville Alger, The Solitudes of Nature and of Man; or, The Loneliness of Human Life (Boston: Roberts Brothers, 1867), pp. 20, 145.
  4. Elsie Robinson, "Listen, World! Learn to Be Alone," Portsmouth [NH] Herald, February 3, 1942.
  5. "Radio Inside Campus Gates," New York Times, December 7, 1930.
  6. George M. Beard, American Nervousness: Its Causes and Consequences (New York: G. P. Putnam's Sons, 1881), p. 317.
  7. William A. Hammond, Cerebral Hyperaemia: The Result of Mental Strain or Emotional Disturbance (New York: G. P. Putnam's Sons, 1878), p. 15.
  8. Woodburn Heron, "The Pathology of Boredom," Scientific American, January 1957; Richard Farmer and Norman D. Sundberg, "Boredom Proneness: The Development and Correlates of a New Scale," Journal of Personality Assessment 50, no. 1 (1986).
  9. "Americans Demand New Form of Media to Bridge Entertainment Gap While Looking from Laptop to Phone," Onion, July 30, 2014.
  10. Robert Weiss, quoted in Zick Rubin, "Seeking a Cure for Loneliness," Psychology Today 13, no. 5 (October 1979).
  11. Maeve Shearlaw, "Mark Zuckerberg Says Connectivity Is a Basic Human Right: Do You Agree?" The Guardian, January 3, 2014; Ryan Mac, Charlie Warzel, and Alex Kantrowitz, "Growth At Any Cost: Top Facebook Executive Defended Data Collection in 2016 Memo—and Warned That Facebook Could Get People Killed," BuzzFeed News, March 29, 2018.
  12. Aaron Smith, "What People Like and Dislike about Facebook," FactTank, Pew Research Center, February 3, 2014.
  13. William James, On Vital Reserves: The Energies of Men, the Gospel of Relaxation (New York: Henry Holt and Company, 1911), p. 7.
  14. Robynne Boyd, "Do People Only Use 10 Percent of Their Brains?" Scientific American, February 7, 2008.
  15. "New Survey Finds Americans Care about Brain Health, but Misperceptions Abound," Michael J. Fox Foundation for Parkinson's Research (MJFF), September 25, 2013.

Luke Fernandez is Assistant Professor in the School of Computing at Weber State University. He is co-author of Bored, Lonely, Angry, Stupid: Changing Feelings about Technology, from the Telegraph to Twitter (2019).

Susan J. Matt is Professor of History at Weber State University. She is co-author of Bored, Lonely, Angry, Stupid: Changing Feelings about Technology, from the Telegraph to Twitter (2019).

EDUCAUSE Review 55, no. 1 (2020)

© 2020 Luke Fernandez and Susan J. Matt.