My Computer Romance

© 2007 Gardner Campbell

EDUCAUSE Review, vol. 42, no. 5 (September/October 2007): 12-27

Gardner Campbell is Professor of English at the University of Mary Washington. Comments on this article can be sent to the author at [email protected] and/or can be posted to the Web via the link at the bottom of this page.

In modern literature . . . [romance] denotes a work of prose fiction in which the scenes and incidents are more or less removed from common life and are surrounded by a halo of mystery, an atmosphere of strangeness and adventure. —William Rose Benét, The Reader's Encyclopedia, second edition (1965)

The SuperBrains were in the back of Wilson Hall.

That's where they kept the machines in the early 1980s at the University of Virginia. I passed the SuperBrains room several times each day as I went to my English teaching assistant desk in the office I shared with a faculty member. Now and then I'd look in at the machines and the people sitting at attention in front of them. The tableau looked like a typing pool, or like the control room of a television studio. I had read the English department newsletter describing the machines and the protocols for reserving and using them. I had seen, in the secretary's office, the five-and-a-quarter-inch-square SuperBrain data and program disks. I knew my colleagues were busily writing their papers and dissertations with the machines, and I dimly understood the advantages of writing in this manner. As I look back on this scene, however, what surprises me most in light of my own subsequent development is how incurious I was about what I was seeing. I never reserved a machine, never typed a single word of an essay or dissertation on a SuperBrain. My writing tool was a Smith-Corona electric typewriter, proudly purchased with fellowship money in my first semester of graduate studies—a machine that sported a ribbon cartridge (no threading!) with built-in correcting tape (no whiteout!). A rapid but not always accurate typist, I was greatly enamored of this feature. My goal was always to exhaust the ink ribbon before I exhausted the correcting tape.

Today I know that the fullest realization of that modest goal lay in the SuperBrains room, where the green and glowing word-processing technology meant I would never run out of either ink ribbon or correcting tape. I've been using a computer to write for nearly twenty years now. Given my habits of vision and revision, I can't imagine writing any other way. Using a computer sets my writing free. For example, I'm writing these words with a computer, a tablet PC running on batteries, as I sit in a coffeeshop on Cary Street in Richmond, Virginia, enjoying a fluid medium of syntax and semantics in which I can play to my heart's content.

Although the SuperBrains were immobile, their displays rudimentary, and their operation mysterious to me, they offered a substantial number of today's computer-writing benefits. What kept me from seeing and acting on those benefits? The question interests me, and not only out of self-regard. The question is at the heart of "faculty development," a crude, even misleading phrase that cannot suggest the trick of imagination needed to bring substantial, important knowledge into plain sight and to develop in faculty the resolve and courage to risk failure. For an academic, "failure" is often synonymous with "looking stupid in front of someone." For many faculty, and maybe for me back in the 1980s, computers mean the possibility of "pulling a Charlie Gordon," as the narrator poignantly terms it in Daniel Keyes's Flowers for Algernon.

There's probably an even larger issue here of coteries, of carefully guarded knowledge that can make one feel not only stupid but excluded. I don't want to make too much of this, since I admire the 1960s hacker culture of devoted and sometimes wild-eyed intensity, but there is certainly a "private language" aspect to information technologies, just as there is in any immersive pursuit. The problem, of course, is that we're not talking about fly-fishing, scrapbook-making, or mountain-biking; nor are we talking about pursuits as self-evidently serious as literary exegesis or scientific research. We're talking about a technology as basic, pervasive, and revolutionary as printing, or perhaps as writing, or perhaps as language itself. (Language too is an information technology.)

In this regard, I remember the one thing I did use a computer for in Wilson Hall, in the mid-1980s: my short-lived work on the "Cultural Literacy" project. I was one of the four graduate students who were editing the project materials (and, truth to tell, generating a good bit of them too), and in our work we all used WordPerfect on a departmental IBM PC-XT to compile our contributions. One of my colleagues, the first one in my career whom I would describe as "computer-savvy," knew how to append files to each other in order to create one large file combining all our work. When we offered to do some of that compilation ourselves if he would teach us how, he politely declined, no doubt as a courtesy to us but also, I suspect, because he knew the value of having what was at that time a fairly arcane skill.

Outside of Wilson Hall, there was one other truly interesting thing I did with computers during my stay in Charlottesville (besides the Captain Video games that ate most of my disposable income—measured, in true grad-student fashion, in quarters). That interesting thing involved a network. My experience was brief and limited, but looking back, I can see that it was a turning point for me. My wife was taking some computer-programming classes to aid her work in helping to bring the university's library catalogue online. At the same time, a friend of mine from Wake Forest University had decided to get a master's degree in computer science at the University of Virginia. Surrounded by their talk, I became curious about BASIC. My friend told me that I could experiment with BASIC on the university's VAX minicomputer. To do so, however, I would need to have an account set up. (That was the first time I heard the word account used in this way. It still gives me a small shiver of delight.) Once my account was ready, I could "log on" and begin to experiment.
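
None of those early experiments survives, and memory supplies only their flavor: the little guessing games and loops that every BASIC primer of the day taught. For the curious reader, here is a hypothetical sketch of one such exercise, rendered in Python rather than the BASIC I actually typed:

```python
# A hypothetical sketch of a first time-sharing experiment: the
# guess-the-number game found in nearly every BASIC primer of the
# era, translated into Python. None of my actual VAX programs survive.
import random

def guess_the_number(low: int = 1, high: int = 100) -> int:
    """Pick a secret number and prompt the player until it is found."""
    secret = random.randint(low, high)
    tries = 0
    while True:
        tries += 1
        guess = int(input(f"Guess a number from {low} to {high}: "))
        if guess < secret:
            print("Too low.")
        elif guess > secret:
            print("Too high.")
        else:
            print(f"You got it in {tries} tries.")
            return tries

if __name__ == "__main__":
    guess_the_number()
```

Trivial as such a program is, it was enough to make the machine answer back, and that was the point.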

I enjoyed playing with BASIC, very much so, but what really captured my imagination was the simple act of logging on to a networked computer. I vividly remember sitting at a terminal in Clemons Library and entering my password for the first time. A password! I felt like Alice with her key to the garden door, at last able to enter a hidden world. The strange thing is that I had no idea what the hidden world was. I had no e-mail address, no knowledge of file-transfer protocols—nothing. I knew only a little bit of BASIC to fool around with. But the idea that I was on a computer with other people, all of us using this central brain in various ways more or less simultaneously, gave me a persistent galvanic charge, not unlike the way I feel when I'm watching a good movie in the presence of other people. I dimly intuited something about communication and community in that moment, and though I could not have articulated or predicted where that would lead me, I knew I should pay attention.

Arcane skills, word-processing, programming languages, networked time-sharing—for years I kept bumping up against these things, and my interest was piqued, but for years they made absolutely no difference in my development as a Ph.D. student. The turning point came in 1988, when I bought my first personal computer. Back then, choosing a personal computer was even harder than it is today. At that time, buyers faced a choice among multiple platforms. Some CP/M machines were still around. The Atari ST was the choice of many musicians, and it could do word-processing too. Commodore, makers of the venerable Commodore 64, had recently introduced the Amiga, a multimedia personal computer that could do everything from office work to 3D animation and video editing. The Mac-vs.-PC question was even more charged than it is today: Apple's market share was much closer to IBM's, and for any serious multimedia or desktop publishing, the Macintosh was the only real choice. Of course, all these platforms were incompatible. The machines were typically quite expensive, and a 20-megabyte hard drive seemed a mandarin extravagance. For those wanting only the basics and unwilling to pay the high price, there was even the possibility of buying a dedicated "word processor," a kind of super-typewriter that was compatible with nothing and that forced the user to work from a coarse, three-line LCD display. I discarded this option quickly, for I knew I wanted a full-fledged microcomputer, not only a machine to do a task but an extensible device that could run programs and thus do things I had not yet imagined. Looking back, I can see why the computer store salespeople cringed when I came through the door. Their stock question was always: "What do you want to use the computer for?" I would itemize a few tasks but inevitably close my description with: "I want to use it to teach me what I want to use it for." In my early computer romance, I somehow sensed (though I don't know how) the recursive potential of this universal machine we call a "computer."

I asked a trusted professional writer for advice, and he steered me toward an IBM PC or a PC-compatible computer, advising me that the many Mac options for fonts and layouts would distract me from my main purpose of writing. (I suspect he's changed his mind since.) He also recommended that I bypass all the top word-processing applications in favor of XyWrite, which he used and recommended for its interoperability with professional typesetting applications. After a little more research on my part—remember, this was a decade before Google and long before I had ever seen, let alone purchased, a modem, so we're talking research in back issues of Byte and PC Magazine—I discovered Nota Bene, a XyWrite derivative crafted especially for language and literature scholars. I ordered this intriguing package right away and learned another important "development" lesson: this computer was not so much a machine that I would run as it was a socket for custom-built and customizable applications tailored for my specific needs. That part of my computer romance wouldn't bloom fully until around 2004, when I discovered the open-source/Web 2.0 world, but it was certainly a powerful force in the early seduction and courtship stage of the relationship.

So there I was, armed with Nota Bene and a brand-new IBM PS/2 Model 30 (unkindly but accurately termed "an XT in drag" by one reviewer), complete with grayscale monitor and a built-in MCGA graphics controller. The first thing I did with my new gear had nothing to do with my scholarship, however. Instead, I put in the demo disk that came with the computer, a disk that explained what a computer was and how it worked. And I was immediately hooked. I saw moving pictures generated by the data on a pocket-sized disk. I discovered that I would not break the computer by opening it up and installing expansion cards. I figured out the difference between RAM and the data storage of a hard drive, a tricky distinction that still eludes many first-timers. And I learned that the computer's operating system (PC DOS 3.3, since my little "XT in drag" would not run the new OS/2 operating system) not only let the computer do its work but offered me the opportunity to think about information architecture: root directories, subdirectories, copying, deleting. Somewhere in my dazzled neophyte mind, I could sense the fluid possibilities here for organizing and representing information.
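
For readers who have only ever clicked through folder icons, a sketch may help. What follows is hypothetical, with Python standing in for a session at the DOS prompt, but it captures the tree-shaped thinking that commands like MKDIR and COPY invited:

```python
# A hypothetical sketch of the directory-tree thinking DOS invited:
# build a small hierarchy of subdirectories (as MKDIR would), copy a
# file between them (as COPY would), then walk and print the result.
from pathlib import Path
import shutil
import tempfile

def build_tree(root: Path) -> None:
    for sub in ("DISS/CH1", "DISS/CH2", "LETTERS"):
        (root / sub).mkdir(parents=True)
    draft = root / "DISS" / "CH1" / "DRAFT.TXT"
    draft.write_text("Chapter one, first draft.")
    shutil.copy(draft, root / "DISS" / "CH2" / "DRAFT.TXT")

def print_tree(path: Path, indent: int = 0) -> None:
    # Directories get a trailing backslash, DOS-style.
    print("  " * indent + path.name + ("\\" if path.is_dir() else ""))
    if path.is_dir():
        for child in sorted(path.iterdir()):
            print_tree(child, indent + 1)

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp) / "THESIS"
        root.mkdir()
        build_tree(root)
        print_tree(root)
```

What seized me was not any one file but the hierarchy itself: a structure for thought, built out of names.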

At this point, I am uncomfortably aware that my story diverges pretty dramatically from that of many of my colleagues, for whom computers offer no sense of quest or adventure and for whom computers inspire no affection. Over time I have learned that the moment of delight I felt when I saw a crudely animated sketch of my computer's innards displayed across the monitor is fairly rare for most scholars and indeed for most computer users. Much more common is this response, from a newspaper story quoted by Donald A. Norman in Emotional Design: "[Computer rage] starts out with slight annoyance, then the hairs on your neck start to prickle and your hands begin to sweat. Soon you are banging your computer or yelling at the screen, and you might well end up belting the person sitting next to you."1

Yet the world of new media in which children today play and grow has little of this rage, if my own experience of child-rearing is representative. Perhaps this, even more than ideas of fluency or digital nativism, is the sea change we await. Perhaps the next generation of faculty will have had enough computer romance in their lives (in the sense of both love and adventure) that, in my playful imagination, computer rage will be as socially unacceptable as kicking a dog.

Are there other such rages in our professional lives? I've never heard of library rage, for example. No colleague has ever screamed in frustration about how many books there are in the university library, though I imagine most scholars have felt that sinking feeling, probably while working on their dissertations, when it becomes abundantly clear how little mastery of the world's knowledge they will ever achieve, even in their various sub-sub-sub-specialties. But that's a different feeling altogether. Computer rage is more like the rage we feel when we confront a stack of papers or a useless committee assignment or when we lose our car keys. Knowledge about computers seems irrelevant, and working with them is at best a necessary evil. Very few mainstream scholars demonstrate much curiosity about computers as manifestations of mind, which of course is all they are. Very few traditional colleges or universities introduce computers in this way to students or the faculty who teach them. At the extreme, and I suspect my colleagues in higher ed IT shops will bear me witness on this, knowing something about computers in 2007 is like knowing how to type in the mid-twentieth century: a quotidian and somewhat demeaning skill farmed out to support staff. Of course, many in the nobility felt that way about literacy in the pre-modern era. But I digress.

I wish I could say that when my computer romance was developing, I was a far-seeing, all-grokking visionary, but I wasn't. Not only did I not "get it," I had almost no idea what "it" was or what "getting it" might mean. Yet the possibilities I distantly intuited would not leave me alone: possibilities for representation, for communication, for a meta-awareness of the way minds encountered and made sense of the world and shared that sense-making with others.

My computer romance turned from adventure to quest in 1990, when I began my full-time teaching career. I was a visiting instructor in the English Department at the University of Richmond, teaching while finishing my dissertation at the University of Virginia. During my orientation at my new institution, I was introduced to the university's computer network as a place for communication via e-mail, a new concept for me, and e-mail groups, mysteriously called "listservs." I could gain access to these tools by way of an account (ah, that word again), a modem that would let me dial into the university's network, and some training in BITNET, a communications network linking many colleges, universities, and research institutions throughout North America and Europe.

Now my wonder, and the work I was doing on my dissertation, became something much larger, something I could not have predicted. When I bought my $300 2400-baud external modem (I still hadn't opened the cover on my PC, though I knew I wouldn't break it by doing so) and logged on to the campus network for the first time, I had a flash of excitement like the one I had experienced the first time I logged on to the VAX minicomputer at the University of Virginia. This time, however, the network of minds became explicit to me as I began to communicate in this digital medium. Here culture also played an important part, as the kindly, owlish geek who set up my university account also had materials on hand related to a local Bulletin Board System (BBS) called "The Blue Ridge Express," a discussion forum, communications hub, and downloadable software repository to which I could also dial up and log on. Now the representation of mind became for me also an enactment of mind. I got so excited that I subscribed to Prodigy, an online service that I soon tired of—an early lesson that not all networks were going to be equally satisfying, and that for me the deepest satisfactions came from a suddenly enlarged sphere for sharing knowledge and enthusiasms. I had begun to sense where the intersections lay between communication, community-building, teaching, and learning, all via this digital medium of networked computers. Online, I met someone who loved movies and Glenn Gould as much as I did—someone who would become a lifelong friend. (I verified his Gould enthusiasm by inspecting his music collection at our first face-to-face meeting.) Online, I received a congratulatory e-mail on the birth of my first child. Shortly before I left Richmond for San Diego and my first tenure-track job, Kevin Creamer (now of the University of Richmond Center for Teaching, Learning, and Technology) set up Milton-L, an international listserv for Miltonists. Milton-L would become a key resource for an entire generation of scholars, and it certainly kept me alive during my first years as a Miltonist. I could feel my consciousness expanding.

Still, it would be another five years before I started using these technologies in my teaching. That was partly because most students did not start bringing computers with them to campus before the mid-1990s and partly because campuses did not routinely furnish faculty, staff, and students with access to high-speed data networks before that time. There is also the small matter of a technology called the World Wide Web, whose use began to expand in the mid-1990s as well. As I reflect on those revolutionary developments, however, I see that they did not make nearly so dramatic a difference in my own thinking about computers as did an interesting little project I first saw demonstrated at an IBM convention in downtown San Diego in 1992. In this project, I suddenly saw the potential for information technologies as a powerfully integrative platform that could connect students and faculty and that could help both groups synthesize different information sources, and different modes of experiencing that information, in ways hitherto impossible. I could also see that this integrative platform not only would further traditional modes of teaching and learning but also would, inevitably, lead to innovative understandings of what "teaching" and "learning" can mean in practice.

This IBM project was called "Illuminated Books and Manuscripts," and the example I saw, one of the first developed, illuminated Tennyson's "Ulysses." In an October 31, 1991, post to the SHAKSPER listserv, Ann Miller, at the Carrier Library of James Madison University, quoted from a brochure that itemized the hardware involved:

The Illuminated Books and Manuscripts is recommended for use with the IBM Personal System/2 Model M57 SLC with 6 MB of memory and an 80 MB hard disk drive. The following components are also required:

--IBM VGA Monitor or Large Screen Computer Display
--IBM PS/2 Mouse
--IBM SCSI Internal CD-ROM Drive
--Pioneer LD-V8000 LaserDisc Player
--M-Audio Capture/Playback Adapter Card
--Matrox Illuminator-16/Micro Channel Video Card2

This equipment, top-of-the-line in 1993, illuminated the poem from without and within, imbuing it with the light of many attentive minds and responsive hearts. The text of Tennyson's poem was augmented by pop-up text that commented on and glossed the poem, video clips of talking heads (including the poet and scholar John Hollander as well as several students, as I remember) analyzing parts of the poem, multiple performances of the poem, and a thick set of pictures and reference materials that also helped to elucidate and contextualize the poem. The "illuminated book" of "Ulysses" rendered the essentially performative aspect of literary studies plain and compelling. It was also beautiful to view. Most of all, it made the experience of the poem as rich, complex, and—for lack of a better word—musical at first glance as it had become for me after many years of study.3

My first thought, after I had recovered my composure (I actually felt dizzy when I first encountered this "illuminated book"), was that I wanted to take "Ulysses" home with me. My second thought was that I wanted to create something like it for every poem I loved. My third thought was that I wanted to teach this poem with this tool. I felt as if I no longer had to give directions on how to get to the subtle, complex experience of a poem. Instead, my students and I could travel there together and immerse ourselves in the poem. We could share our minds in the moment and trust that sharing, because what we would share had emerged from many powerful, inspired, expert minds working through time to bring us that moment. In other words, the "Ulysses" project integrated the components of understanding into a richly reflective performative instance of that understanding. It did what a great teacher does: it brought the poem to life. Every bit of information it conveyed—and the information bandwidth was very wide indeed—served the goal of bringing the poem to life. And that complex, integrated liveliness could not have been accomplished in any other medium.

Communication of mind, representation of mind, enactment of mind: all brought to life by computers. These machines were useful as appliances, as super-typewriters, but what grabbed my imagination and never let go of it was the experience of mind that I could have, and dream on, in the presence of computers.

I strongly believe that there is a "Ulysses" project for every scholar, for every faculty member sitting through endless training sessions or "how to" seminars teaching Excel or Photoshop techniques. There is nothing wrong with Excel or Photoshop—nothing at all. But these programs are a means to an end. In many respects, the goal is easier to demonstrate in a Web 2.0 world. Photoshop makes contributions to Flickr more attractive and popular. Excel provides charts a user can stick into a blog to demonstrate a point. Communication, representation, and enactment are easier to find and easier to understand these days—but only if one understands that these are the goals. For many of my colleagues, they are not the goals. For them, computers are still inert appliances, even "useless nuisances that ought never to have been invented," as George Minafer complains of automobiles in Booth Tarkington's The Magnificent Ambersons. I'm sympathetic to these complaints. Computers, like most innovations, are disruptive technologies. Moreover, the argument that "this is the way the world is going, and we'd better accept it" has never by itself been a persuasive one for me. Higher education has a responsibility not only to reflect culture but to reflect on it and, at times, to be countercultural. For me, the reason to use computers emerges from the way they augment human intellect. That augmentation, different in degree but not in kind from that of all other technologies of augmentation (e.g., language, money, transportation, clothing, housing), brings with it some overhead. There are nuisances involved with these technologies. Yet computer-enabled augmentation conveys an uncanny all-at-once-ness that feels to me like what the Miltonist William Kerrigan has described as "the enfolded sublime"—a part that not only implies but somehow contains the whole. As I sometimes explain it to my students: "The inside is bigger than the outside."

Networks, animation, information architecture, communication, integration: these are the defining episodes of my computer romance, but they are not its culmination. It took a classroom—in fact, several of them—to bring all of this home to me. This stage of my development, flowing from and recursively flowing back into all the earlier stages, was and is both quest and consummation. This part of my history is still being written. I first used listservs in the classroom in the mid-1990s. In 1997 I began experimenting, alone and in collaboration with my colleague Bill Kemp, with publishing freshman writing projects to the World Wide Web.4 At about this same time, as an homage to the "Ulysses" Illuminated Book project, I created a tool called "HyperPoem" in an effort to bring the activity of interpretation to life for literature students by enabling direct comparisons of recorded performances of lyric poetry. Since then, I have used blogs and wikis and other new media—in fact, nearly everything I can think of—to illuminate the works I teach and to focus the lights from the minds and hearts of my students onto these works until the students feel the excitement of their own illuminations. But that is another story, the story of how computers became, more or less completely, an extension of my own mind, so thoroughly a part of how I imagine and express my experience that to give an account of it would mean a radical shift in the narrative—and would probably double the length of this essay.

The story I've tried to tell here is the story of what led me to decide that computers are as important to my professional life as are pencils, chalk, and books. It's a story as well of how my professional life and my personal life intersected in a realm of love and adventure, in a genre known as romance (though, contrary to the definition that opens this article, my romance is nonfiction). I love computers, these machines, for the same reasons that I love symbols—for the qualities of mind and heart, apprehension and comprehension, that they represent and encourage. An emerging consensus in the cognitive sciences emphasizes that mind and heart not only are related but are, in some respects, versions of each other. In Emotional Design, Norman states: "As I've said, cognition interprets and understands the world around you, while emotions allow you to make quick decisions about it. Usually, you react emotionally to a situation before you assess it cognitively, since survival is more important than understanding. But sometimes cognition comes first. One of the powers of the human mind is its ability to dream, to imagine, and to plan for the future. In this creative soaring of the mind, thought and cognition unleash emotion, and are in turn changed themselves."5

My computer romance taught me how these sophisticated calculating machines—particularly in a high-speed, high-bandwidth network—could represent, communicate, and integrate these creative soarings of the mind. I have a hunch that a log-on moment, or a "Ulysses" transformation, awaits every thoughtful scholar. Those moments and those transformations will differ from mind to mind and from discipline to discipline. But they are there. We must find them, and we must share them with our colleagues if we want to truly enable faculty development and, with that development, transform the lives of both faculty and students. If we cannot find these moments and transformations, we must make them. For a computer romance is a romance with civilization, with community, with the possibilities of collective intelligence and fresh conceptualizations. A computer romance is a romance with human capabilities—with symbols themselves and with the infinite suggestiveness they embody.

This computer romance is my own, but it has a point beyond mere reminiscence. My computer romance is also a story of literacy. My parents were literate, though they were not highly educated. My father, born in 1907, had a sixth-grade education and worked as a laborer all his life. My mother, born in 1919, went to secretarial college instead of becoming a statistician, because a guidance counselor told her that "women didn't become statisticians." Neither of them grew up in intellectual surroundings. Yet both understood the value of education, and both understood the vital importance of literacy. Both read constantly, mostly from newspapers and magazines and the Bible. My mother, in fact, yearned to be a professional writer. From my parents I learned that full engagement with the world and its citizens would come only through the deep understanding, and effective use, of the skills of reading and writing.

Now, in 2007, "reading and writing" still means working with words. I do not propose that books be abolished, nor do I think that working with words has lost its central importance. But today there are other literacies that have become centrally important. We live in a world mediated through sound, image, and moving pictures. The tools we need to work within these new media are inexpensive, numerous, and increasingly simple to use. Students are creating interesting, detailed, and sometimes important works with these tools, and their creations can be shared for free with a sizable fraction of the world within weeks or even days of publication. Their creations offer valuable windows into their learning minds, windows onto cognition, windows that can help faculty carry out the wish that Nadia Boulanger—the French composer, conductor, and music professor—shared with her students: "May I have the power to exchange my best with your best."

These tools are the tools of learning. They are increasingly the tools of citizenship, of democracy. This year, four of the French candidates for president had campaign offices in Second Life, a persistent online virtual world. Political candidates in the United States are exploiting blogs, YouTube, Twitter—indeed, just about any new avenue of communication or aggregation. These tools are, in fact, among the most vital tools of civilization itself.

Are we playing in a zero-sum game? Will the noise introduced by networked computers inevitably drown out any gains in useful signal? I don't know the answer to that question, but I do know that the question is not new. The waste products of civilization, and the efforts of shallow or unscrupulous individuals, may finally reduce our accomplishments to those of Shelley's Ozymandias—our legacy, like his, little more than a ruined monument whose inscription amid the rubble solemnly instructs its viewers: "Look on my works, ye Mighty, and despair!" But that is only one possible answer to the question I'm asking, a question that computers do indeed make more urgent. As the far-seeing engineer Doug Engelbart would put it, the speed and sophistication of these computers also raise the level of hopefulness that we can increase our capabilities faster than our liabilities.6 Is that not the dream of education itself—that if we school ourselves and our children, we may be able to progress as a society, if only to take two steps forward for every one step back?

We live in 2007. Faculty complaints are real and serious. In lives full of teaching, advising, reading, marking papers, writing, presenting at conferences, publishing—more demands each year—faculty do not have the time to learn these new literacies. Based on their past experience, faculty fear that whatever they do learn will likely be obsolete within a few years. If faculty are successful in learning these new forms of reading and writing and in working within them, their achievements are often not valued in the tenure and promotion process. And if faculty incorporate these new literacies into their teaching, they still may not understand how to evaluate student work within these literacies. I hear these complaints from my faculty colleagues, from faculty at other U.S. colleges and universities (from liberal arts colleges to research institutions), and from faculty around the world. These are valid complaints. They must be addressed, especially by administrators who can align institutional resources to bring relief and opportunities to those faculty ready to engage with these tools. All faculty who are ready deserve a place where they too can enjoy a computer romance.

Yet faculty must move forward before the professional infrastructure is completely hospitable. Faculty can no longer afford to wait. We faculty live in 2007, and we all must be ready. These technologies are not going away. Their promise is enormous and only beginning to be realized. They are essential components of every aspect of our lives, and we owe it to ourselves and our students not only to understand them but to delight in them, to learn within them, and to share those delighted experiences of learning with our students. Only when our students see our own learning blossoming within a computer romance will they listen to us when we tell them to use these tools more wisely themselves.

Lives of curiosity, creativity, and discovery within this new digital realm await us all if we are prepared to calm our fears, share our ideas (whether or not they're half-baked7), and remember the excitement that called us to this place, this vocation. The computers are us. The world is our wiki.

If we carry the ring, we will find the way.

Web Bonus!

The author's reading of "My Computer Romance" is available as a podcast. The podcast is also available on the author's blog, Gardner Writes: <http://www.gardnercampbell.net/blog1>.

Notes

1. Donald A. Norman, Emotional Design: Why We Love (or Hate) Everyday Things (New York: Basic Books, 2004), 7–8.

2. Archived at http://www.shaksper.net/archives/1991/0275.html.

3. For a contemporary review of the IBM project, see Peter H. Lewis, "Personal Computers: Importance of Being Multimedia," New York Times, November 5, 1991, http://query.nytimes.com/gst/fullpage.html?res=9D0CE1DA133CF936A35752C1A967958260.

4. See, for example, our Stranded project, begun in 1997, at http://www.stranded101.info.

5. Norman, Emotional Design, 13.

6. For an explanation of Engelbart's "augmentation framework," see his Web site: http://www.bootstrap.org.

7. See Jon Udell's excellent blog post on academia's often destructive unwillingness to share half-baked ideas: http://weblog.infoworld.com/udell/2006/09/15.html#a1524.