© 2005 EDUCAUSE
EDUCAUSE Review, vol. 40, no. 4 (July/August 2005): 62–75.
The following is an excerpt of "From Tin Cans to the Holodeck: The Future of Networking in Higher Education," chapter 11 in Judith A. Pirani and Gail Salaway, with Richard N. Katz and John Voloudakis, Information Technology Networking in Higher Education: Campus Commodity and Competitive Differentiator, EDUCAUSE Center for Applied Research (ECAR) Study, 2005, vol. 2 (Boulder, Colo.: EDUCAUSE, 2005), key findings available at http://www.educause.edu/ir/library/pdf/ecar_so/ers/ERS0502/ekf0502.pdf.
Any sufficiently advanced technology is indistinguishable from magic. —Arthur C. Clarke
Networking’s history has been breathtaking. The 1980s promise of information "anytime and anywhere" has been achieved and surpassed. Siebel Systems’ tagline updates this promise succinctly: the network provides "what you need, when you need it." While the computer has indeed evolved into a sophisticated tool to compute and to store and edit the files that reside on our desktops (and much, much more), the emergence of networking and the Internet has extended the computer’s reach and recast the computer as a communication tool. It is a tool for looking outward as well as inward. Sun Microsystems’ Bill Joy makes the point forcefully: "Disconnected from the network, my computer is nearly as useless as a cell phone in an area with no cellular service."1
It was in many ways easier to forecast networking’s progress to this point than to predict its future. Early advocates and pioneers who could anticipate and envision personal computers, a common user interface, search engines, and the like could also foretell how linking more people and devices through a shared network would alter how we work, learn, socialize, and recreate.2 Looking ahead to the next 25 years isn’t so easy. In fact, most futurists refuse to speculate this far into the future, and they are quick to point out that Moore’s Law and other rules of thumb that predict the growth of storage capacity, network connectivity, and the number of network connections suggest growth by a factor of a million in fewer than 30 years. Such capabilities and their social, economic, and political implications more closely resemble science fiction than science, and it is reasonable to restrict our discussion to a shorter time frame, focusing on changes likely to occur on "this" side of the science–science-fiction boundary.
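As a rough check of that factor-of-a-million figure (assuming the familiar 18-month doubling period, which the text does not specify): doubling every 18 months yields $30 / 1.5 = 20$ doublings over three decades, and

$$2^{20} = 1{,}048{,}576 \approx 10^{6},$$

so a millionfold increase arrives in roughly 30 years, and sooner still if the doubling period is any shorter.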
Two Shifts Ahead
According to HP CEO Carly Fiorina, "We have entered an age now where every process and all content will become mobile, virtual, and personal."3 In the higher education context, Fiorina’s characterization will likely present itself in the form of two interrelated shifts that will accelerate in the next 5 to 10 years.
Toward an Integrated Cyberinfrastructure
Computers and networks have changed how research is conducted in many academic disciplines. New and emerging disciplines like computational chemistry, computational biology, bioinformatics, atmospheric informatics, and others bear witness to the revolution in scholarship that is under way. The 2003 report of the National Science Foundation Blue Ribbon Advisory Panel on Cyberinfrastructure described these changes as constituting "a revolution in how we create, disseminate, and preserve scientific and engineering knowledge."4 The NSF panel described the concept of an advanced infrastructure layer on which "scientific and engineering research and education environments could be built."5
This layer sits between the exponentially growing network and computing infrastructure and the complex of scientific instruments, data, knowledge, disciplines, and communities of practice. The envisioned cyberinfrastructure is a layer of "enabling hardware, algorithms, software, communications, institutions, and personnel [organized] … for the empowerment of specific communities of researchers to innovate and eventually revolutionize what they do, how they do it, and who participates."6 This layer would include "grids of computational centers … comprehensive libraries of digital objects … multidisciplinary, well-curated collections of scientific data … thousands of online instruments and vast sensor arrays … toolkits for resource discovery, modeling, and interactive visualization … and the ability to collaborate with physically distributed teams of people using all of these capabilities."7
Recognizing the importance of such a coordinated cyberinfrastructure for all scholarship, the American Council of Learned Societies (ACLS) organized the Commission on Cyberinfrastructure for the Humanities and Social Sciences in 2004 and charged its members with
- describing the cyberinfrastructure’s current state for the humanities and social sciences,
- articulating the requirements and potential contributions of the humanities and the social sciences to the cyberinfrastructure’s evolving definition, and
- recommending areas of emphasis and coordination between the ACLS and other organizations and institutions that will be developing the cyberinfrastructure.8
The ACLS will disseminate its findings and recommendations in 2005.
Elements of the envisioned cyberinfrastructure already exist. Federally funded projects such as the Network for Earthquake Engineering Simulation (NEES), the National Virtual Observatory, and the Space Physics and Aeronomy Research Collaboratory (SPARC) represent cyberinfrastructure elements that support the physical sciences, while the Human Genome Project and the National Institutes of Health (NIH) Biomedical Informatics Research Network (BIRN) represent just two such elements that support the life sciences. States and regions are preparing the ground for the cyberinfrastructure by acquiring dark fiber and establishing governance structures that will link K–12 and higher education, enabling the kind of collaboration among physically distributed teams that the NSF panel described.
Campuses, too, are building and linking elements of this envisioned cyberinfrastructure. Syracuse University, for example, is funding work on wireless grids to explore the intersection of wireless technology and high-performance grid supercomputing, while the Cal-(IT)2 program at the University of California, San Diego (UCSD), explores the use of light pipes (lambdas) to provide researchers with a solid, "jitter free," predictable network on which to build the grid.
The integration challenge will be daunting. Organizations like Internet2, the National LambdaRail (NLR), and the Globus Alliance are collaborating to develop and promote the standards and governance that will knit the disparate research networks, cybertools, and data sets into the envisioned cyberinfrastructure. Internet2’s Doug Van Houweling counsels, "This time around, the technical issues are larger than they were in the past, so we need more brains and better cooperation than we had in the past. We have many efforts and activities in all these areas throughout the higher education community. We need to make sure that these efforts are integrated and use a common architecture, so our network infrastructure does not become Balkanized."
Toward Pervasive and Personalized Intelligence and Communications
If the networking community’s battle cry of the 1980s was "[information] anytime, anywhere," and if that cry evolved in the 1990s into "what you want, when you need it," then perhaps networking’s driving vision in the future will be "all you can imagine, all the time." This vision of the cyberinfrastructure suggests quite clearly what Frances Cairncross called the "death of distance" and the blurring of the lines between real and virtual in the context of learning and scholarship.9 In the coming years, the lines that distinguish the real from the virtual will indeed grow fainter, driven by four key trends:
- logical connectivity,
- smart and talkative devices,
- convergence, and
- personalized on-demand and reliable services.
These capabilities, of course, must be integrated and deployed in ways that are compellingly human and that foster community and not social isolation.
The Institute of Electrical and Electronics Engineers (IEEE) defines pervasive computing as systems that are mobile and ubiquitous. Mobility and ubiquity in turn depend on systems that are portable; untethered from desktops; always on, and in fact, scalable on demand; and that, as Sun Microsystems’ Bill Joy puts it, "unbottle" media and interactions from their conventional containers.
Logical Connectivity
Much of networking’s history has been the history of wires. In fact, UCSD’s Larry Smarr sums it up well when he proclaims, "Conduit is power." Continuous engineering and management effort has been devoted over 35 years to expanding the number of bits that can be passed along electrical currents in copper wires and, more recently, to the transport of bits on light waves through optical fibers. These techniques, and the accompanying electronics, made networking a captive of the physical environment. In higher education, therefore, early and continuous attention has been paid to "wiring the campus"—that is, installing backbone networks and distribution systems across campuses and into offices, classrooms, laboratories, dormitories, and so forth, and connecting these backbones to access points to the Internet, Abilene, or other specialized external networks. The idea and benefits of transporting bitstreams on the backs of radio waves, microwaves, or other spectral waves that demand no wires, conduit, trenches, or building construction have long been understood and used, but the technologies and standards needed to make this possible cost-effectively are recent.
Today, as this report suggests, wireless networking is widespread throughout higher education. And as this technology penetrates higher education more broadly, and as wireless broadband and security solutions present themselves, higher education’s cyberinfrastructure managers are enlarging their views about wireless networks’ role and importance. The Darwin Group’s Mike Roberts notes, "Broadband wireless has a long way to go in terms of utility. But it has already changed computing more than the old-timers thought it would. Look at the extent to which everyone is assuming a wireless environment—in the classroom and now even homes." Indeed, Intel now tracks and publicizes the "Most Unwired Cities," and a recent survey of senior corporate executives revealed that the most popular technology among respondents was wireless Internet connections at home.10 Spending on home networking in 2004 reached $8.4 billion and is forecast to reach $17.1 billion by 2008.
To a great extent, campus networking has already gone wireless. In 2002, ECAR reported that 7 percent of survey respondents had implemented comprehensive wireless networks and that an additional 52 percent had implemented a limited amount of wireless networking on campus. By 2004, more than 75 percent of those responding to this study’s survey had implemented or were planning to deploy 802.11g-based wireless networks widely in the next 12 months. Our students are responding as well. The 2004 ECAR Study of Students and Information Technology reported that 93.4 percent of the 4,374 freshmen and seniors who responded to this survey owned computers and that 46.8 percent of these owned laptop computers. Importantly, many laptop computers now come preconfigured with wireless access capabilities, and ownership of laptop computers among freshman respondents in the ECAR study is higher (52.7 percent) than among seniors.11
Smart and Talkative Devices
Another amazing and challenging aspect of the future of networking is the embedding of communicating "intelligence" in anything and everything. The Information Age is moving from the extreme early skepticism of those like Digital Equipment Corporation’s Ken Olsen or Microsoft’s Bill Gates, who could not imagine uses for home computers, to Wal-Mart’s mandate that its 100 largest suppliers were to have all of their cases and pallets "chipped" with radio frequency identification (RFID) tags by January 1, 2005.12
In terms of pervasive networking, placing a brain and vocal cords in things like paper, postage stamps, cases of wine, children’s backpacks, and so forth means that ultimately everyone and everything is reachable on the network. Today, for example, logistics and distribution management firms like United Parcel Service of America (UPS) and Federal Express (FedEx) integrate bar codes, scanners, wireless messaging, databases, and the Internet to track shipments’ progress from supplier to consumer. This complex of technologies fosters information resources and business processes that incorporate the customer, reducing customer phone calls and enabling things like proactive online customer alerts. Tomorrow, RFID chips and sensor networks will let packages, library books, and other objects announce themselves, their whereabouts, expiration dates, and condition to servers and ultimately to the Internet. These capabilities will enable orders to be refilled, perishable stocks to be replenished, shoplifters to be nabbed, and lost pets and children to be found. New York’s Museum of Modern Art (MoMA), for example, has already "chipped" works of art to track them in transit. More interestingly, MoMA curators plan ultimately for highly intelligent chips that will interact with museum patrons’ handheld or wearable devices, enriching the patrons’ education and experience.
The implications for other cultural and educational institutions are evident. Gartner Research Fellow Martin Reynolds describes a world in which "a hospital could track every patient and every pill in the building. Airlines could track every passenger and every bag." In fact, the United States is already deploying "chipped" passports, and the European Union (EU) is contemplating embedding intelligence in paper euros. UCSD CIO Elazar Harel concludes that "with embedded chips and RFID coming, not just everyone, but everything, is reachable—for better or worse."
The proliferation of embedded technology will require the development, deployment, and management of sensor networks. David Culler of the University of California at Berkeley and Intel is among those working on the hardware and software that will enable thousands or even millions of sensors to communicate. His team has pioneered ways to conserve power and has developed the TinyOS operating system for sensor devices. Other teams are working on making sensors smaller and smarter and on developing the network electronics needed to route and switch the things that things say.
Convergence
Another force driving this pervasiveness is the convergence of voice, data, and video networks and the deployment of converged services. In fact, Clifford Stoll describes the Internet as a "telephone system that’s gotten uppity!" According to Burton Group’s Irwin Lazar, "Disjointed forms of personal communications will rapidly converge into a unified application that combines voice, Instant Messaging (IM), video, collaboration, and presence. The result will improve organizational efficiency, allowing individuals or groups to communicate directly with each other through a common system, regardless of device or application."13 Lazar goes on to describe the characteristics of a converged communications environment:
- Individuals control how they are contacted.
- Various forms of communication can interact.
- Communicators can learn about each other’s availability and location.
- Communications systems keep track of people’s accessibility regardless of the device or system they use.
- People can set the parameters for application-to-voice interactions.
- The converged environment will support legacy communications applications.14
In the next few years, higher education’s IT leaders will be challenged to manage convergence on the technical, organizational, legal, and social levels. Technically, we will need to deploy sufficient bandwidth to accommodate the inevitable rise in video traffic on institutional networks. We will need to deploy an infrastructure—or acquire one—that will support unified messaging. We will need to review, acquire, and deploy tools that will let end users really integrate converged services and tools into their work. Indeed, the evidence is strong that when telephones got smart, most of us did not, and our phone systems’ "advanced features" today go largely unused. Technical convergence also signals a multiplicity of intelligent devices. The emergence of "integrated communicators" will pose tremendous support challenges as technical capabilities race, as always, with technical standards and human learning curves. Those who manage the institution’s IT and information resources will have to make and continually revisit choices about which platforms to support and about the breadth, quality, and duration of help desk hours as devices cross the boundaries between institutional and personal use. The challenge, as Lazar suggests, is presence: the ability to convey real-time information about people’s current location and the forms of communication (audio phone, IM, video conference, and so on) they can use.
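To make the notion of presence concrete, the sketch below models a minimal presence service of the kind Lazar describes. The class names, channel list, and selection rule are illustrative assumptions for this article, not the API of any particular product or of the systems discussed in the study.

```python
from dataclasses import dataclass, field
from typing import Optional

# Communication channels a person might expose, in rough order of richness.
CHANNELS = ["video", "voice", "im", "email"]

@dataclass
class Presence:
    """Illustrative presence record: who, where, and how they can be reached."""
    user: str
    location: str = "unknown"
    available: list = field(default_factory=list)  # channels currently usable
    do_not_disturb: bool = False

class PresenceService:
    """Minimal registry that tracks presence and suggests a contact channel."""

    def __init__(self):
        self._records = {}

    def update(self, record: Presence) -> None:
        # A real system would receive these updates from devices and networks.
        self._records[record.user] = record

    def best_channel(self, user: str) -> Optional[str]:
        """Return the richest channel the user currently accepts, if any."""
        record = self._records.get(user)
        if record is None or record.do_not_disturb:
            return None
        for channel in CHANNELS:
            if channel in record.available:
                return channel
        return None

# Example: a faculty member moves from a wired office to a wireless-only spot on campus.
service = PresenceService()
service.update(Presence("prof_lee", location="office", available=["video", "voice", "im"]))
print(service.best_channel("prof_lee"))  # -> "video"
service.update(Presence("prof_lee", location="campus lawn", available=["im"]))
print(service.best_channel("prof_lee"))  # -> "im"
```

In practice, of course, such presence information would flow in from many device types and carriers, which is precisely the support and policy challenge described above.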
Convergence will also challenge traditional campus assumptions about networking control, as campus citizens walk in and out of campus networks’ range and into the range of cellular carriers and others (see The Vanishing Frontier—Regulation and Taxation, below). Harmonizing these disparate environments will prove complex from economic, technical, and policy perspectives.
Organizationally, convergence is already disrupting the college and university workplace. Voice communications are going digital, are ripe for technical convergence, and no longer represent an attractive source of institutional cost recoveries. As the economics of stand-alone voice communications become increasingly problematic and as the technical barriers to integrating communications services abate, most institutional leaders will move to invest not only in technical convergence but also in the organizational integration of often separately organized voice and data communication teams. Burton Group’s Irwin Lazar advises, "To understand and plan for this fundamental change, enterprises must bring together disparate teams responsible for separate individual applications into a unified convergence work group that will set strategic direction for enterprise communications."15
Convergence, in the end, is less a technical exercise than a social one. It promises technology-mediated collaboration and community. According to the University of Manchester’s Mark Clark, "The nature of documents is increasingly trending to compound documents that incorporate image, data, text, and voice annotation. E-mail is likely to shrink as a way of sharing documents, giving way to the increased use of collaborative working environments for document development, analysis, editing, and even drafting. Video conferencing, particularly that on the high end associated with technologies such as access grids, is showing exponential growth. Increasingly, virtual communities will be built upon networks as the glue to provide social cohesiveness." Managing the deployment and then integration of converged technologies into a cohesive, converged service environment—and ultimately into the kind of rich collaborative environment Clark describes—will likely demand considerable attention in the future.
Personalized "On-Demand" and Reliable Services
The commercial sector describes a world of "competitive Darwinism" in which "unstoppable drivers are creating a new on-demand environment where competition is intense, change is continuous, financial pressures are unrelenting, and threats are unpredictable."16 These drivers are impelling businesses to deploy and manage technical environments (and business processes) that are focused, responsive, variable, and resilient. In higher education, this technical challenge is often described as grid computing, a vision of distributed computing wherein large-scale resources are shared in a flexible, secure, and coordinated manner among individuals, institutions, and resources.
The promise of grid computing, on-demand business, five-9s reliability, personalized "lambdas," and other technical and organizational innovations and directions will lead to the creation of a secure cyberinfrastructure that is
- highly leveraged (across individuals, institutions, and resources);
- responsive to real-time changes in demand; and
- available 24 x 7 in an uninterrupted manner.
Realizing these complementary visions will not only require flexible technical architecture, protocols, services, interfaces, and software development kits but will also depend on "coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations."17
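As a point of reference for the "five-9s" target mentioned above (a standard availability calculation, not a figure from this study): 99.999 percent availability permits roughly

$$(1 - 0.99999) \times 365.25 \times 24 \times 60 \approx 5.3 \text{ minutes}$$

of downtime per year, which is why on-demand, always-available services place such heavy demands on redundancy, monitoring, and operations.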
Implications
These shifts have enormous and exciting implications for global society in general and higher education in particular, including
- broadband for all;
- a new era of data-intensive scholarship; and
- increased virtuality, mobility, and community in the academy.
Broadband for All
Tom West, president of National LambdaRail, builds on the point about networking’s collaborative imperative and on higher education’s past and future role: "I believe that the research and education community has a stewardship role to the larger community. We must provide connectivity to every location in the country—that means every school connected by fiber to state, regional, and national networks. Higher education has a part to play in our society’s challenge to make the capabilities of networking available to every citizen."
This ethos impels initiatives like "One Gigabit or Bust" from the Corporation for Education Network Initiatives in California (CENIC). According to CENIC President Jim Dolgonas, "The [California] gigabit initiative is designed to stimulate one-gigabit broadband to all Californians by 2010. Our goal is to help the research sector serve our community best, by doing what the commercial sector isn’t doing." Again, the tradition of collaboration in higher education networking is a paramount success driver. Dolgonas sees "a real challenge in creating and maintaining the sense of community necessary to bring about our higher education private networks. If I buy from a carrier, it is a traditional vendor-customer relationship, and supposedly I am in control. In contrast, in regional and national network consortia, we have equal players, requiring that some participants have to compromise. It is a bit like shared services and—if you can do it—the benefits are great and there are gigantic payoffs."
The enthusiasm and optimism that suffuse this section do not intend to minimize the digital divide’s importance. EDUCAUSE Vice President Mark Luker reminds us that "the missing link is that last mile, where cities and rural areas are not fully wired." For such locales, Luker advises "the huge breakthrough is wireless. Small towns are starting to wire whole communities with wireless—putting antennae on grain silos and water towers. Technology is moving very fast and getting faster. Wireless is a true breakthrough for places that have been underserved."
The Era of Data-Intensive Scholarship
UCSD Professor Larry Smarr describes the future as an "era of data-intensive science."18 At the November 2004 ECAR Symposium, Smarr described some of the key layers and elements that will constitute this era and the place of the network amidst this complex mix of technologies, academic disciplines, and human behaviors.
In November 2004, the U.S. Congress approved a bill that increases funding for supercomputing initiatives in the United States and extends greater access to such systems to academic researchers. The bill, which both houses of Congress passed and President Bush is expected to sign, directs the Department of Energy to "deploy a high-end computing system that is among the most advanced in the world." The bill also requires the Department of Energy to give academic researchers access to supercomputing systems. Although the bill does not appropriate funds, it authorizes the department to spend up to $165 million over three years. Daniel A. Reed, vice chancellor for information technology at the University of North Carolina at Chapel Hill, believes that in the past few years the United States has not devoted sufficient resources to high-tech research projects and said the bill will help put U.S. supercomputing "back on the front burner."19
The era of data-intensive science ahead will not just be defined by or confined to supercomputing and scientific uses of data. Today’s young social science investigators are exploring big econometrics, big sociology, and even big history, for example. Choreographers are using visualization and simulation techniques to model and teach dance, orchestral performers conduct master classes across great distances, and literature scholars are using algorithms to conduct content analyses of texts long believed to have been "mined out." Geographers are experiencing an intellectual renaissance, having incorporated remote sensing, global positioning systems (GPS), and other data-intensive techniques into their research practice.
The scale of computing, storage, and networking is changing profoundly. Two University of Houston engineering professors recently won a $1.1-million grant from the National Science Foundation to develop a storage device using nanotechnology. This technology could allow the complete contents of the Library of Congress to fit on a handheld computer.20 Doug Van Houweling describes the era of data-intensive scholarship in terms of "disruptive applications," which by themselves can take much of any shared bandwidth that is available. Such applications include
- real-time access by physicists to particle-collision data at CERN, Fermilab, and elsewhere, requiring 6- to 7-gigabit throughput;
- access to pathology tissue banks for telemedicine, requiring gigabit speeds per simultaneous user; and
- access to data from distributed radio telescopes, microscopes, and other high-performance instruments.
According to Smarr and to Burton Group’s Daniel Golding, trends like these are changing how connectivity between data centers is architected. Past models called for relatively small carrier circuits connecting many enterprise data centers. However, several recent developments suggest that we rethink how data centers (and data sources) are designed. In private industry, data-center consolidation efforts, combined with increasing regulatory burdens and more-serious disaster recovery planning, are changing enterprise data-center design. In research-intensive universities, where Gigabit Ethernet traffic out of scientific instruments is becoming commonplace, institutions with greater bandwidth demands and more data and storage networks to pass between data centers are taking a new look at an established carrier technology: wavelength division multiplexing (WDM).
WDM lets an optical fiber carry many signals by combining several light wavelengths into a single transmission. Each light wavelength carries a discrete channel of data, and each channel can carry as much data as a classical optical network—10 gigabits or more. This multichannel approach allows many networks to be carried on a single fiber pair. The decision to support optical networking will in turn drive changes in wavelength capacities, network topology, and protection and restoration mechanisms. As with traditional network technologies, the economics, capacity, expandability, and manageability of optical networking solutions will vary, suggesting the ongoing need for sophisticated engineering talent within institutions that remain on the frontier of higher education networking capabilities. In an era of data-intensive scholarship, scholars in the humanities and social sciences will also need remote access to large data sets, instruments, and archives. These disciplines might not have their own high-performance networks, suggesting the need for commercial access to bandwidth through Internet2 or NLR or through shared services arrangements with state, regional, or other academic network providers.
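To put the WDM description above in rough numbers (the channel count is assumed for illustration; it varies by system and era): with 40 wavelengths per fiber pair and 10 gigabits per second per wavelength,

$$40 \times 10\ \text{Gbps} = 400\ \text{Gbps}$$

of aggregate capacity rides on a single fiber pair, and doubling the wavelength count doubles that figure again without pulling new fiber.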
Mark Clark, CIO at the University of Manchester, raises important research questions that designers of tomorrow’s networks must answer: "How can the scholar handle complexity that is enabled by bigger and bigger supercomputers which are harder to program and where it is harder to understand the phenomena that are being simulated or analyzed? And how do we handle the data deluge? Data will be the problem of the future: handling larger and larger volumes, mining and visualizing complex data sets, and managing the data-sharing issues such as privacy, confidentiality, provenance, and archiving. There is a change in the skills and education needed by the research, professional, and general workforce."
Virtuality, Mobility, and Community in Academe
This chapter has, to this point, dealt exclusively with the potential of evolving network infrastructure and services to foster an era of data-intensive research. Of course, higher education’s mission is broader than discovery, and networking’s transformative potential on teaching, learning, community engagement, and administration is similarly exciting. Higher bandwidth, sound identity management, pervasive wireless networking, and the affordable availability of many computing and communications devices will open the door to the proliferation of rich-media virtual environments. The Web will likely become a three-dimensional environment with navigation that uses virtual portals and avatars. Progress on Web descriptors and locator frameworks will proceed as a continued race between growth in the number and size of virtual haystacks and the tools in place for finding virtual needles. Collaborative environments will likely become increasingly "human," particularly as rendering tools and techniques from the gaming world become widely available and usable, and as voice over IP (VoIP) enables those collaborating in virtual spaces to converse and to modulate, locate, and attenuate their voices in group settings. As the standard technologies for displaying network media improve (high-definition video and megapixel displays, for example), video conferences and other activities that incorporate the real and the virtual will increasingly blur the boundaries between the two by tricking the senses. These environments exist today in limited scale.
These capabilities’ ongoing improvement, in concert with ongoing investments in network capacity and performance, will likely render debates about virtual, distance, hybrid, or face-to-face education meaningless. Questions of educational policy and practice will become simultaneously simple to ask and complex to answer: Whom do I teach? How do I teach? How, when, with whom, and at what cost do I learn? And how does all of this cool stuff get paid for? As network-mediated learning opportunities disrupt higher education’s traditional market segmentation, pricing strategies, brands, and so on, the focus of institutions will likely shift to the achievement of social and educational outcomes; of teachers, to pedagogy; and of learners, to affordability, lifestyle, and learning, career, and social goals. The question "Where did you go to college?" may in the long run yield to the question "With whom did you study?" Institutional personalization of experience will compete with faculty free-agency to determine whether higher education institutions, like many others, will be disintermediated by the network. Institutions wishing to compete in part on the basis of "place" will likely continue in what some have called an "arms race" of investment in the campus built environment and in student services.21 Changes like these led one higher education association to declare, "We have become a people unable to comprehend the technology we invent."22
The emergence of plentiful, customizable, and secure bandwidth along with the eventual integration of rich media will also foster the formation and diversity of learning communities. As collaborative work tools are rendered more and more human, as the vision of a collaborative cyberinfrastructure matures, as humans become better acclimated to cyberspace, and as incentives are redrawn to foster interdisciplinary and interinstitutional work, new and rich linkages between teachers, learners, and others in the academy will prosper. The network has the potential to become the ultimate leveler of social distinctions in higher education as interests seek and find compatible interests, and talent seeks and finds compatible talent in cyberspace.
For wireless networking and mobility, the future is now. Wireless networking is already subtly changing all institutions, and once again higher education is in the fray. In some cases, these changes force reconsideration of long-standing space-usage practices and could lead ultimately to reconsiderations of mission. For example, responding to reports showing strong demand for Internet access, officials at the British Library announced in November 2004 the implementation of a wireless network in the library’s reading rooms, auditorium, restaurants, and outdoor area. "A study recently showed that 86 percent of library patrons carry laptops and that 16 percent came to the library to use it as a business center."23 The story on higher education’s campuses is no different. Many institutions appear to be "leading from the rear," observing new patterns of student behavior enabled by mobile network access and then redefining common physical spaces into "information commons," "flexible learning environments," and so forth.
The implications of higher education’s shift toward wireless networks and mobility are important. Coalition for Networked Information Executive Director Clifford Lynch reminds us that "this ubiquity business is really important. We all remember the rhetoric in the 1980s about the wired classroom, built with Ethernet to every desktop. A few were built at very high cost, they were a specialized place, and they were scheduled from dawn to dusk." Lynch goes on to remind us that when classroom networking is a scarce good, faculty rarely have an incentive—or the means—to radically rethink their pedagogy to incorporate networked information resources. "Now," says Lynch, "we have actually done it wirelessly—creating the wired classroom of the 1980s on a very broad scale. And a lot of faculty are freaked about this." Indeed, the broad implementation of wireless networking in classrooms has led to reports of faculty requests to "turn the network off" in failing attempts to curtail students’ Web surfing or, even worse, the passing of derogatory instant messages and evaluations in class! These tales remind us of Arthur Schlesinger’s observation that "science and technology revolutionize our lives, but memory, tradition, and myth frame our response."24
In the long run, virtuality, mobility, and community in higher education will also reshape its business landscape. In mitigating the effects of distance, the network is already enabling some institutions to rethink IT and resource governance. Soon more institutions will consider data-center consolidations as one means of curbing IT spending growth while assuring service levels, providing prudent backup, and the like. Over time, more institutions will use the network to reach beyond the campus to share or host services with other institutions, corporations, cultural institutions, or state governments. In the longer term, institutions could successfully implement service-oriented architectures that will permit invoking the services necessary to operate the enterprise on demand, over the network. In this longer-term vision, the service provider’s location, governance, or ownership will be transparent and irrelevant to that service’s consumer, except as regards service quality and cost. And advantages will accrue to the virtual. ECAR Senior Fellow Robert Albrecht reminds us that "traditional universities cannot handle the increasing number of students between now and 2015 when their numbers peak. Virtual universities—and those that blend the physical and the virtual—are not only better positioned to respond to this enrollment demand, but they can scale quickly as enrollments escalate and can thus continue to grow."
Importantly, adopting highly distributed systems to operate higher education’s business enterprise will depend not only on reliable networks but also on a trust fabric woven in middleware and relationships.
Conclusion
Predicting the course of networking’s future is simultaneously easy and difficult. Predicting this future is also ultimately unnecessary.
Prediction is easy because the broad outlines and vectors are clear:
- Networks will get faster.
- More things will be attached to networks.
- The "exponentials" are changing as bandwidth begins to become cheaper than storage.25
- More and more services will be delivered over networks.
- Network bandwidth and services will become easy to customize on demand.
- Distinctions between the real and the virtual will become unclear.
Predicting the course of networking’s future is difficult because of at least two factors. First, the adoption rate and the ultimate shape and texture of the networked information future will depend, as always, on human factors. These, in turn, will depend on how imaginative, cost-effective, interoperable, secure, and easy to use the services delivered over networks turn out to be, and on humans’ ability to adapt to and toggle between virtual and real environments. As Larry Smarr comments, "What we really need are social scientists to figure out what people need to work together well over the network."
Second, predicting our network future proves difficult because we are at a nexus point from which emanate several possible futures that will be determined less by technology and funding and more by public policy, regulation, trust, and human behavior. Technically, we will be able to deliver dedicated, end-to-end terabit speeds affordably. We will also be able to guarantee quality of service (QoS) affordably. The public policy questions relate to the fundamental issues of openness, scalability, and collaboration: will the signal-to-noise ratio on the "open Internet frontier" be in such a balance that users of a shared Internet value the benefits of openness (with noise) more highly than they do closed networks, with reduced noise? These options will more likely manifest themselves as a range of choices among virtual political economies, with unregulated democracy at one extreme and tight, private (totalitarian) controls at the other. And of course, events like 9/11 will surely shape dominant directions within the continuum of choice. The potential for innovation could hang in the balance.
Within any of these future scenarios, too, one can imagine social movements among those wanting to go "off grid," along with the digital divide implications of such scenarios. Massachusetts Institute of Technology’s Neil Gershenfeld reminds us that "to a species that seeks to communicate, offering instantaneous global connectivity is like wiring the pleasure center of a rat’s brain to a bar that the rat presses over and over until it drops from exhaustion."26 Under any conditions, the network can and will amplify educational divides. Those who lack literacy, numeracy, and information literacy will be increasingly estranged from social institutions, cultural institutions, and government.
Predicting networking’s future is unnecessary because the network’s logic and its potential have been embedded in our global psyche. Nearly everyone would agree on the six trends described above. Indeed, the Center for the Digital Future recently identified 10 trends worth noting here:
- In the United States, the digital divide is closing but is not yet closed, as new divides emerge.
- The nation’s media habits have changed, and continue to change.
- The Internet’s credibility is dropping.
- We have just begun to see the changes to come in buying online.
- The Internet’s "geek-nerd" reputation is dead.
- Privacy and security concerns remain high.
- The Internet has become the number-one information source for its users.
- The Internet’s benefits and drawbacks for children are still coming into focus.
- E-mail: "E-nuff already."
- Broadband will change everything—again.
The real prediction issues, then, are about when these eventualities will occur and who controls them, not about what these eventualities are or whether they will occur. Antoine de Saint-Exupéry reminds us, "The machine does not isolate man from the great problems of nature but plunges him more deeply into them."27
For higher education, networks present great hope for the future, while our cultures continue to pose great challenges. California State University Executive Vice Chancellor Richard P. West says it best: "We are now at the stage when technology can really pervade what we do. This is how some of us who have been active for many years in higher education’s IT journey idealized it. Technology and our networks are prevalent in their use and will begin to have a significant compounding effect on our missions and our students. At Cal State, incoming students now take math placement assessments online and can take required courses online at their own pace. It really is anytime, anyplace–oriented learning—with outcomes-based learning. This simple application is extremely usable and embeds rich pedagogy, making it very much a Trojan horse. The challenge is getting the faculty to do things like this. Technology today is truly making it possible to revolutionize higher education delivery. We are going to need some sort of catalyst to make that happen, since even without any changes we continue to grow 3 to 5 percent per year. So why should the faculty change? Why should we change? Maybe when 2015 comes and the student population peaks, maybe there will be a change."
1. B. Joy, "Design for the Digital Revolution," Fortune, Mar. 6, 2000, p. F-10.
2. See, for example, R. Heterick, "A Single System Image," CAUSE Professional Paper No. 1 (Boulder, Colo.: CAUSE, 1988). See also R. N. Katz and R. P. West, "Sustaining Excellence in the 21st Century," CAUSE Professional Paper No. 8 (Boulder, Colo.: CAUSE, 1992).
3. M. J. Miller, "HP’s Fiorina Talks Tech," PC Magazine, Oct. 19, 2004, p. 8.
4. National Science Foundation, Revolutionizing Science and Engineering through Cyberinfrastructure: Report of the NSF Blue Ribbon Advisory Panel on Cyberinfrastructure, Jan. 2003, p. 4, http://www.cise.nsf.gov/sci/reports/toc.cfm.
5. Ibid., p. 5.
6. Ibid.
7. Ibid., p. 7.
8. American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, The Charge to the Commission, http://www.acls.org/cyberinfrastructure/cyber_charge.htm.
9. F. Cairncross, The Death of Distance: How the Communications Revolution Will Change Our Lives (Cambridge: Harvard University Press, 1997).
10. See "Big Businesses Back BlackBerrys," e-Marketer, Nov. 24, 2004, http://www.emarketer.com/Article.aspx?1003149.
11. R. B. Kvavik, J. B. Caruso, and G. Morgan, ECAR Study of Students and Information Technology, 2004: Convenience, Connection, and Control (Boulder, Colo.: EDUCAUSE Center for Applied Research, Research Study, Vol. 5, 2004), p. 32.
12. D. Barlas, "Wal-Mart’s RFID Mandate," Line 56, June 4, 2003, http://www.line56.com/articles/default.asp?ArticleID=4710.
13. I. Lazar, Communication Convergence: The Power of Presence (Salt Lake City: Burton Group, In-Depth Research Report, 2004), p. 1.
14. Ibid., pp. 7–8.
15. Ibid., p. 5.
16. R. Hancock, P. Korsten, and G. Pohle, On Demand Business: The New Agenda for Value Creation (New York: IBM, 2003), p. 4, http://www-1.ibm.com/services/us/imc/pdf/g510-3312-00-on-demand-business-new-agenda.pdf.
17. I. Foster, C. Kesselman, and S. Tuecke, "The Anatomy of the Grid: Enabling Scalable Virtual Organizations," Int’l J. Supercomputer Applications, Vol. 15, No. 3, 2001.
18. L. Smarr, "Providing Your Faculty Access to the Instruments of Scientific Discovery," a presentation made at the Third ECAR Symposium, Nov. 16, 2004.
19. V. Kiernan, "Congress Passes Supercomputing Bill That Could Help Academic Research," Chronicle of Higher Education, Nov. 19, 2004, http://chronicle.com/prm/daily/2004/11/2004111902n.htm.
20. See the University of Houston Daily Cougar, Vol. 70, Issue 64, Nov. 19, 2004, http://www.stp.uh.edu/vol70/64/index.html.
21. G. C. Winston, "The Positional Arms Race in Higher Education," in Forum Futures (San Francisco: Jossey-Bass Publishers, 2000).
22. Association of American Colleges, Integrity in the College Curriculum: A Report to the Academic Community, 1990, http://www.csuci.edu/accreditation/1.%20EER/2.%20Task%20Force%20Reports%20and%20Exhibits/Standard_4/Task_4.7(a)/Exhibits_4.7(a)/4.7a.6%20FAC%20DEV%20Library.pdf.
23. BBC News, "British Library Gets Wireless Net," Nov. 18, 2004, http://news.bbc.co.uk/2/hi/technology/4020241.stm.
24. A. J. Schlesinger, "The Challenge of Change," The New York Times, July 27, 1986.
25. L. Smarr, op. cit.
26. N. Gershenfeld, When Things Start to Think (New York: Henry Holt & Company, 1999), p. 3.
27. A. de Saint-Exupéry, Wind, Sand and Stars (New York: Harcourt, Brace, 1939).