Surveying the Digital Landscape: Evolving Technologies 2004

© 2004 EDUCAUSE Evolving Technologies Committee

EDUCAUSE Review, vol. 39, no. 6 (November/December 2004): 78–92.

The EDUCAUSE Evolving Technologies Committee

Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference.

This year, under the leadership of Committee Chair Charles R. Bartel, the committee selected six evolving technologies, presenting a brief overview at EDUCAUSE 2004. Published below are excerpts from the white papers on each topic, written by individual members of the committee: "Spam Management," by Pablo G. Molina; "Legal P2Ps," by John C. Meerts; "Learning Objects," by Michael D. Roy; "Convergence of Libraries, Digital Repositories, and Web Content Management," by James M. Duncan; "Nomadicity," by S. Alan McCord and Leslie P. Hitch; and "Regional Networks," by Bonita M. Neas. The full white papers, with contributions by additional committee members, can be found on the committee Web site (http://www.educause.edu/issues/etcom/). These full papers address many other strategic areas for each evolving technology: key questions to ask; the implementation challenges; the major vendors and how to judge among them; how to proceed and the issues to be addressed; and the likely impacts in the next three to five years.

2004 EDUCAUSE Evolving Technologies Committee

Charles R. Bartel, Committee Chair
Director of Network Services
Carnegie Mellon University
John S. Bojonny
Director, IT Services
Montgomery College Rockville
Emilio DiLorenzo
Director, Infrastructure and Technical Support Services
Rochester Institute of Technology
James M. Duncan
Assistant Director for Technology Services, Hardin Library
University of Iowa
Leslie P. Hitch
Director, Academic Technology Services
Northeastern University
S. Alan McCord
Professor of Management and Executive Director of Academic Program Administration
Lawrence Technological University
John C. Meerts
Vice President for Information Technology
Wesleyan University
Pablo G. Molina
CIO, Law Center Campus
Georgetown University
Bonita M. Neas
Assistant Vice President, Federal Government Relations, and Director, CHPC
North Dakota State University
Michael D. Roy
Director of Academic Computing Services and Digital Library Projects
Wesleyan University

Spam Management

Two things are certain in Internet life in 2004: spam and outages. Successfully dealing with the former may help reduce the latter. To do so, colleges and universities need to develop methods, processes, policies, and tools to manage and control the flow of spam—unsolicited e-mail messages.

Why Is Spam Management Important to Higher Education?

Spam management is critically important in order to preserve the technology resources of an institution, including but not limited to bandwidth, e-mail infrastructure, support services, and information security defenses. For example, the Georgetown University Law Center Campus recently implemented an outsourced spam-management solution. In June 2004, a slow month for academic and computer virus activity, the service trapped more than 50 percent of incoming e-mail messages as useless: 10 percent of the e-mail traffic was infected with computer viruses, and 42.5 percent was identified as spam. For a campus of about 3,000 people, the monthly volume of spam and virus e-mail exceeded 500,000 messages, an average of 167 per person. Given that many community members have been using personal spam-avoidance techniques for the last couple of years, this number is all the more worrisome.
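
A quick back-of-the-envelope check shows how these figures fit together (the calculation below simply restates the reported numbers; the implied incoming total is an estimate, not a figure from the service):

```python
# Reported: 52.5% of incoming mail trapped (10% virus + 42.5% spam),
# more than 500,000 trapped messages per month, campus of ~3,000 people.
trapped = 500_000
campus_population = 3_000
trapped_fraction = 0.10 + 0.425          # 52.5% of incoming traffic

per_person = trapped / campus_population  # trapped messages per person
implied_incoming = trapped / trapped_fraction  # estimated total incoming mail

print(round(per_person))       # 167 trapped messages per person per month
print(round(implied_incoming)) # roughly 952,000 incoming messages per month
```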

Similarly, the e-mail system used by the majority of faculty, staff, and students at Georgetown University’s main campus was so overloaded with virus and spam messages that the system was brought down on several occasions during the first half of 2004. Fortunately, the technology team alleviated the problem with an overhaul of the e-mail infrastructure. A working group continues to explore an institution-wide spam-management approach.

Spam management can help respect institutional values and meet community expectations. A clear example for me was an alarming phone call from Sister Dorinda Young, who called the technology team to say that she was receiving some "very strange e-mails about body parts that I do not have." Likewise, Associate Dean James Feinerman expressed great concern when he was working at home and accidentally opened an offensive spam message, received on his university account, while family members were present. The Georgetown University Acceptable Computer Use Policy clearly states: "The University cannot protect individuals against the existence or receipt of material that may be offensive to them. As such, those who make use of electronic communications are warned that they may come across or be recipients of material they find offensive." Obviously, individuals’ expectations differ.

How Is Spam Management Evolving?

New technologies are being deployed to reduce spam. On May 5, 2004, the Wall Street Journal announced: "Microsoft Corp. is adopting an anti-spam tool that gives favored treatment to certified mass e-mailers, the first major operator to do so." The technology, developed by IronPort Systems, works by associating e-mailers with known IP addresses to guarantee the authenticity of messages and the legitimacy of their senders.

A second approach under research is the idea of charging a tax for every e-mail message sent. The tax does not need to be assessed in monetary units. For example, the system could require that senders’ computers solve a mathematical equation for every e-mail message sent. The cost would be negligible for an individual sender but, multiplied across millions of messages, would demand prohibitive computing power from spammers.
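
This proof-of-work idea (popularized by schemes such as Hashcash) can be sketched in a few lines. The sketch below is illustrative rather than any particular production system: the receiver demands a "stamp" whose hash begins with a given number of zero bits, which is cheap to mint for one message but expensive at spam volume, while verification costs the receiver only a single hash.

```python
import hashlib
from itertools import count

def mint_stamp(recipient: str, bits: int = 20) -> str:
    """Search for a nonce whose SHA-1 hash has `bits` leading zero bits.
    Each extra bit roughly doubles the sender's expected work."""
    for nonce in count():
        stamp = f"{recipient}:{nonce}"
        digest = hashlib.sha1(stamp.encode()).digest()
        # Interpret the 160-bit digest as an integer; the top `bits`
        # bits must all be zero for the stamp to be valid.
        if int.from_bytes(digest, "big") >> (160 - bits) == 0:
            return stamp

def verify_stamp(stamp: str, recipient: str, bits: int = 20) -> bool:
    """Verification is a single hash -- trivially cheap for the receiver."""
    if not stamp.startswith(recipient + ":"):
        return False
    digest = hashlib.sha1(stamp.encode()).digest()
    return int.from_bytes(digest, "big") >> (160 - bits) == 0
```

With 20 required bits, minting takes about a million hash attempts on average per message; a legitimate sender never notices, but a spammer sending millions of messages would need enormous computing power.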

Legal and administrative actions are proving partially useful in controlling spam. Many states have passed spam laws over the last few years, yet many of their provisions may be preempted by federal laws. On December 16, 2003, President George W. Bush signed the "Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003." The CAN-SPAM Act, as it is known, went into effect on January 1, 2004; it regulates interstate commerce by imposing limitations and penalties on the transmission of unsolicited commercial e-mail via the Internet. On April 28, 2004, the Washington Post reported: "U.S. authorities charged four people in Detroit with e-mailing fraudulent sales pitches for weight-loss products, the first criminal prosecutions under the government’s ‘can spam’ legislation, which requires unsolicited e-mails to include a mechanism so recipients can indicate they do not want future mass mailings."

Dan Updegrove, the University of Texas at Austin vice president for information technology, shared with other college and university CIOs the following news on March 27, 2004: "United States District Judge Sam Sparks issued an important ruling supporting our university’s right to block unsolicited commercial email (spam), even if the email is judged to be legal under the recently-enacted CAN-SPAM Act of 2003. . . . In this case a company obtained over 50,000 UT Austin student, faculty, and staff email addresses last spring, via a valid state open records request, then used the addresses to promote the company’s business (a dating service, ‘LonghornSingles.com’). After the company did not comply with a request to cease and desist such email broadcasts, UT blocked the site. The company sought and received a temporary restraining order to remove the block in state court, which order was rescinded soon after by a federal judge. The company then sought a permanent injunction, which was denied last week."1

On June 15, 2004, the Federal Trade Commission announced that a "do-not-spam registry" service would not be created in the near future. The commission had been obligated to consider the proposal under the CAN-SPAM Act. The reason behind this decision was that the FTC would find itself "largely powerless to identify those responsible for misusing the registry."2

Conclusion

Most higher education organizations should be developing and implementing interim strategies to reduce spam. At the same time, technology companies and government agencies must address the problem from technical, legislative, and enforcement perspectives.

Legal P2Ps

It was only a few years ago that Napster was the scourge of every CIO at every institution of higher education in the United States. Internet connections were overwhelmed by this new menace. Institutions dealt with Napster in various ways, mostly by buying more bandwidth from their Internet service providers or by trying to banish Napster, KaZaa, and other incarnations of the phenomenon, but they ultimately learned to live with MP3 downloads and other P2P applications.

The recording industry, fearing a tremendous reduction in profitability, tried to stem the tide by suing first Napster and other "providers" of illegal MP3s and then the users of these services. But these lawsuits have had mixed results, and members of the recording industry are now embarking on a new approach: providing legal commercial music downloading and streaming services. In turn, they are trying to sell these services to colleges and universities.

Apple’s iTunes (http://www.itunes.com) was the first and most successful attempt to sell music over the Internet. Driven by its hugely popular iPod portable MP3 player, Apple has sold millions of songs for $.99 apiece. This has convinced the music industry that there is a future in selling music over the Internet rather than selling CDs through the traditional retail channels. Soon the new Napster (http://www.napster.com) was born, and RealNetworks followed suit with a product called Rhapsody (http://www.rhapsody.com).

Whereas iTunes allows patrons to purchase individual songs, Napster and Rhapsody provide another service. By purchasing a subscription, patrons can stream unlimited amounts of music to their computers. They can even download unlimited numbers of songs and play them while off-line, a concept called tethered downloads. As long as the patrons are keeping their monthly subscription up-to-date, they can play any stored and any streamed music files. All of these providers feature additional services, such as unique playlists, search capabilities, and players that let users organize libraries of music for playback on their computers, entertainment systems, and portable MP3 players.

Over one million songs are now available from iTunes; Napster provides access to 700,000 titles. All genres of music are available, but these providers tend to ignore the independent music markets coveted by students. A startup company called Ruckus (http://www.ruckus.com) is trying to address this gap. Still in development, it plans to provide music from independent labels and perhaps more innovative work from local bands that frequent the college scene. In addition, Ruckus will deliver movies, college cult TV and film programs, and local entertainment events listings. It will provide cover art, critical reviews, student-produced content, and other information specifically geared toward the college student. Ruckus will store the content locally on campus-based servers.

Why Are Legal P2Ps Important to Higher Education?

Legal P2Ps are still in their infancy. But just like campus networks a decade ago and wireless only a few years ago, what appears to be brand-new and optional today soon becomes old news and required.

Digital life is critically important for students. The current generation of students is so entrenched in the digital world that colleges and universities may well be forced to provide non-educational digital content in the not-too-distant future. Today many campuses provide cable TV, an FM radio station, and, through the library, access to traditional analog content of various kinds. A plausible argument can be made that delivering all of this content digitally will give campuses a competitive advantage. Just as campuses had to address the "Yahoo Most Wired" list in past years, they may have to deal with the "Best of Digital Entertainment" list tomorrow.

Focus groups commissioned by Ruckus (the study is available from Ruckus on request) show how students are using various entertainment options today. Strikingly, traditional analog media is becoming less important. The computer becomes both the core of the entertainment center and a tool for instant communication with peers on and off campus. Watching a movie on a computer or listening to a song while having multiple instant messaging conversations and perhaps also writing a paper for class is relatively common behavior. Students tend to use "Resnet" predominantly for entertainment purposes. Campuses could choose to ignore these trends or, alternatively, could try to put some fences around them and potentially derive some benefits.

Still, the legitimate question remains: should campuses provide these services or should they let the entertainment industry do it for them? Whatever the outcome of that debate, these trends will have an impact on the networks and general IT infrastructure of higher education. It is for this reason that the higher education IT community needs to be aware of this emerging issue.

How Are Legal P2Ps Evolving?

Digital convergence is perhaps most interesting in the entertainment world. The large media conglomerates are positioning themselves to produce and deliver digital content over varying networks: cable, DSL, broadcast, satellite. A second important trend is that small, independent producers of content, including students themselves, are asking for and are getting "airtime." A third trend is the move from scheduled programming to on-demand access to all (digital) content. Finally, Ruckus believes that there will soon be national networks of student-produced content. Indeed, with affordable digital movie cameras, this future is already upon us as students produce TV shows at relatively little production cost.

Conclusion

Legal P2Ps have arrived, and it is unlikely that they will go away. Those of us in higher education need to be at least aware of our options. An increasing number of us are already actively engaged in providing the service.

Learning Objects

On campuses across the country, faculty are fabricating strange entities called learning objects. They might not even know that their creation is a learning object, but that’s what they’re making.

Two quite sensible definitions of learning objects have emerged over the past years. David A. Wiley defines a learning object as "any digital resource that can be reused to support learning." Meanwhile Laurence F. Johnson, from the New Media Consortium, defines a learning object as "any grouping of materials that is structured in a meaningful way and is tied to an educational objective."3 But a much more situational definition is that if something is used in the learning process, it is a learning object: animation, simulation, interactive map, game, applet—anything reusable in multiple contexts. A Web site at Wesleyan University (http://learningobjects.wesleyan.edu/about/examples.html) has examples.

Why Are Learning Objects Important to Higher Education?

Learning objects are important to higher education because their use in particular instructional contexts provides new ways of visualizing, thinking about, presenting, interacting with, and understanding complex topics. Although they are not a universal solvent, their use will increasingly differentiate "old" ways of teaching from "modern" teaching techniques. There is no definitive study that proves or disproves that the use of learning objects always improves learning outcomes, but there is a growing amount of anecdotal evidence suggesting that rich media, when used effectively, improve student satisfaction, student retention, time-on-task, and other significant indicators. Well-designed learning objects allow students with different learning styles to interact with the materials according to their preferred way of learning.

The production of learning objects is complex and costly. Yet if one considers all of the money that is spent on course management systems, on computer networks, on multimedia classrooms, on instructional computing staff and training, and on faculty development, then the costs, though still high, become less daunting. And the cost of use (not of production) is much, much less.

Learning objects are at a turning point. There is an opportunity to create a diverse, global network of learning object developers, repositories, and users who, if they can effectively organize and coordinate their activities, will be able to produce a library of high-quality, pedagogically sound, free (or inexpensive) materials that will make all of the investments in infrastructure pay off in the educational experience for students.

How Are Learning Objects Evolving?

In the late 1990s, the Department of Defense declared that any vendor wanting to do business with the department would have to conform to a standard that it had invented: SCORM (Sharable Content Object Reference Model). Given the size of the department’s budget, it is not surprising that all of the major software vendors that traffic in tools and technologies related to learning objects declared their commitment to SCORM. This means that in the not-too-distant future, learning objects will "talk" to each other, share components with each other, and interact with Learning Management Systems (e.g., WebCT/Blackboard/Sakai/Moodle).
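
As a rough illustration of the packaging that makes this interoperability possible, a SCORM content package ships with a manifest file (imsmanifest.xml) that a conforming LMS reads to discover the object's structure and launch point. The sketch below builds a heavily simplified manifest; a real manifest requires schema namespaces, version attributes, and metadata omitted here, and the identifiers, title, and file name are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Build a minimal, illustrative SCORM-style package manifest.
manifest = ET.Element("manifest", identifier="demo-learning-object")

# The organizations section describes the object's navigable structure.
orgs = ET.SubElement(manifest, "organizations")
org = ET.SubElement(orgs, "organization", identifier="org-1")
item = ET.SubElement(org, "item", identifierref="res-1")
ET.SubElement(item, "title").text = "Interactive Map of Ancient Rome"

# The resources section maps identifiers to the actual content files.
resources = ET.SubElement(manifest, "resources")
ET.SubElement(resources, "resource", identifier="res-1",
              type="webcontent", href="index.html")

print(ET.tostring(manifest, encoding="unicode"))
```

Because any conforming LMS can parse this same structure, the object can move between systems without being rebuilt for each one.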

The tools for learning object production are also becoming easier to use. Faculty are thinking in terms of multimedia, partly because the younger faculty have grown up in a world of GameCubes and Xboxes and partly because there are growing numbers of good examples to be emulated. Most visions of the future of learning objects predict a robust, albeit complex, mixture of learning object producers: commercial and academic publishing houses, academic institutions, professional societies, and other consortial entities. The vast majority of faculty (and perhaps institutions) will opt to be learning object consumers and will avail themselves of these materials within the context of a learning object marketplace. The shape of that marketplace is very much up for grabs.

Conclusion

Institutions might think that learning objects—unlike a payroll system, or a campus network, or even e-mail—could be "skipped" while the campus waits for the next wave of new technology. Doing so, however, is both an impossibility and a mistake. It is impossible because unlike the use of a new payroll system, which requires authorization and input from the central administration, the use of learning objects is almost certainly already happening on most campuses. Very likely, early adopters are busily creating these materials. More important, not engaging with this new way of thinking about instructional technology is a mistake because it is the collective responsibility of all institutions to shape the future of how this new medium can help faculty and students (and therefore colleges and universities) achieve their educational goals.

Convergence of Libraries, Digital Repositories,
and Web Content Management

Information technology and library expenditures for course management systems, student information systems, e-portfolio systems, library content (e-journals, e-books, and databases), library management systems, digital asset management systems, and general-purpose content management systems—when taken together and figuring in total cost of ownership, licensing, and staff support—can easily add up to millions of dollars at an individual higher education institution today. These complex Web systems and comprehensive e-resources are designed to serve the teaching and/or research mission of colleges and universities. There is no question that institutions have become dependent on a mix of these Web-based systems to manage the day-to-day business of teaching, learning, and research, but how much attention are colleges and universities paying to how these systems integrate to improve return on investment? Convergence may help.

For some institutions, convergence refers to the seamless integration of various e-content and e-delivery systems. For others, convergence is the expression of a merged organization, the blending of library and IT services. For my purposes here, convergence is considered to be the natural (or, in some cases, planned) evolution of multiple systems and multiple services into a single, holistic environment, one completely accessible from the friendly neighborhood Web browser.

Why Is Convergence Important to Higher Education?

"Libraries, digital repositories, and Web content management" is an umbrella heading under which all these systems and services can be listed. The convergence of systems and services offers exciting opportunities for improved faculty/instructor and student utilization of valuable, and increasingly expensive, digital resources. Convergence creates dialogue among stakeholders, expands institutional understanding of the roles played by each, and opens new doors to collaboration.

Organizations spend an immense amount of time planning and delivering services associated with various systems. By not considering ways that these systems and services can integrate, campuses perpetuate duplication of effort and fail as stewards of institutional funds. They send a message to vendors: "We don’t care if our systems can connect to each other. We don’t value the end user experience. We don’t talk across organizational lines."

By tangibly demonstrating that they are pursuing and planning for added-value convergence in their systems, institutions go a long way toward cultivating user advocates for these systems and layered services. Convergence is an ideal showcase for cooperative planning and smart IT investment.

How Is Convergence Evolving?

Across various systems markets, we see convergence in product development. In other words, the capabilities and functions offered by one system may be duplicated in what was previously a disparate system. Three scenarios follow:

  • CMS and DAMS. The type of asset-management functions offered by a course management system (CMS) vendor’s newest release resembles or even duplicates the functions of the digital asset-management system (DAMS) already in use by the library. Are the individuals who support each system talking?
  • New LMS. The vendor of an administrative data processing, payroll, and human resources system begins offering a companion product—a learning management system (LMS)—that competes with the institution’s existing, internally designed online training/compliance system. Will there be duplication if the new product is licensed?
  • Portal Overload. The vendors of student information systems, e-portfolio systems, course management systems, and library "automation" systems all promise a customizable experience and interface for users. Each contends that its portal product should serve as the launch point for the online educational experience. Will users appreciate the patchwork quilt of options?

The danger in not recognizing convergence in external vendor products or in systems being developed locally is clear: an institution could be purchasing or creating redundant systems and services. In larger institutions, this is likely already occurring.

Conclusion

Regardless of an institution’s budgetary climate, generating support among financial and academic leaders for the systems and associated services offered by IT and the library requires ongoing education. A local convergence project can serve as a terrific springboard for an educational campaign highlighting the value of convergence: improved system interoperability, improved user experience, and improved communication across organizational lines.

Nomadicity

Students arrive on campuses today using multifunction cell phones, personal wireless networks, voice over IP (VoIP), peer-to-peer file sharing, digital video capture and editing, personal storage, and wireless data cards. As a result, they expect remote electronic access to all existing campus software and services. There is a name for this plethora of devices and technological convergence: nomadicity. Although wireless is certainly included within nomadicity, the term is not limited to wireless but encompasses multiple devices and methods of mobile computing and communications.

Why Is Nomadicity Important to Higher Education?

In response to an e-mail query sent to a representative sample of higher education CIOs, the majority reported a bombardment of consumer technology—PDAs, MP3 players, DVDs, videogames, laptop computers, tablets, wireless hotspots—all creating havoc with campus support and security expectations, budgets, and infrastructure. How are their institutions responding to this demand? Are they ignoring (or actively opposing) this trend by continuing to build, overbuild, or inappropriately preserve hard-wired, place-bound infrastructures? Or are they adapting while still attempting to protect the integrity of campus technology resources?

The CIOs’ e-mail responses fell into five categories:

  • Most defined mobile technology as wireless.
  • Most saw wireless technology as augmenting, not replacing, wired networks.
  • Many questioned how to control the nomads and how to address security issues.
  • Several thought that a more mobile environment might alter the type and shape of computing and service (such as reducing the need for general-purpose labs or necessitating 24x7 support).
  • All saw a growing demand for multiplatform/multidevice support.

Several examples illustrate how nomadicity is creating a sandstorm on campuses. At Evergreen State College, David Metzler reports that faculty are beginning to use wireless-enabled tablets to access information immediately or to manage e-mail before a meeting takes place. Arthur Giovannetti, at Johns Hopkins School of Public Health, reports an "expanding need for remote access to applications and data." He cautions about the security issues already affecting his campus: "When attached to networks (usually at home) that do not have the same level of perimeter protection as we have at JHSPH, these machines are frequently infected with some form of malware. When they are connected to the JHSPH network again, the malware has effectively evaded our perimeter defense." At the College of New Jersey, Craig Blaha is intent on keeping the wired network primary because of spotty wireless security and reliability.

Philippe Hanset, of the University of Tennessee, said that many people are dropping their wired connections completely. "We thought of it as a complement, and it’s becoming for many people a replacement." This makes support issues "huge." At Oakland University, Theresa Rowe explained: "Our university decided that handheld computers are personal devices and therefore cannot be purchased with university funds. That didn’t stop people from buying them with personal funds, then asking central IT to ‘make it work’—and we had to stop support. We can’t afford to buy one of everything to figure out the individual wireless connection’s intricacies and such." Giovannetti concurs: "With desktops, we could be fairly confident in what was physically and permanently connected to our network. . . . [Now] there is a demand for wireless roaming. People actually walk around with open laptops moving from one wireless segment to another."

Support, many of the respondents noted, is predicated on a robust WLAN. This is the case at the new Franklin W. Olin College of Engineering in Needham, Massachusetts. Even at Olin, only four years old and unfettered by old buildings and preexisting technology, keeping up with the current nomadic changes requires careful planning. Joanne Kossuth noted: "As a new institution, we have a fully converged category 6 network. Wireless access points are automatically specified and planned for in new construction along with the wired network. We are working with our vendor partners regarding the next generation of wireless IP phones."

Despite the demands for wireless and for multiplatform/multidevice support, almost all of the CIOs reported that wireless was still being viewed as an "add-on" to the infrastructure. Tracey Leger-Hornby, of Brandeis, added that the faculty has not yet requested mobile technologies. At the University of Tennessee, Philippe Hanset said: "Wired will not lose its trend." David Smallen, of Hamilton, noted that he saw "central services as essential to interoperability."

How Is Nomadicity Evolving?

Users need to be able to access programs, to fulfill computational and communication needs, "as they move from place to place in a way that is transparent, integrated, convenient and adaptive."4 Simply stated (but not so simply executed): the wide range of campus information services, once limited to a physical site such as the tethered desktop, will now need to be accessed from just about anywhere and on just about any type of Internet-enabled gadget.

Joel Hartman, CIO at the University of Central Florida, outlines the challenge: "It’s not just our desktop machines, laptops, servers, and mainframes, but now also hand held PDAs of various kinds, tablet computers, and even security devices and video projectors—all kinds of things are now getting IP addresses and interacting through the network or exchanging information, that several years ago would not have been part of the picture." Hartman adds that he finds himself "looking up information minutes before the meeting starts, as opposed to hours or days before."5

Conclusion

To say that nomadicity is having a profound impact on campuses is an understatement. Nomads are likely to be more advanced users who use new and emerging technologies such as voice/data convergence, PDAs, open source alternatives, and specialized software applications. This suggests a help-desk model very different from the one that services only a certain brand or two of workstation. It also suggests a different funding model, one in which we will likely need to make every effort to "unravel" complex funding so that we can support a wide variety of legacy and next-generation technologies.

The key to nomadicity is flexibility, which is often at odds with standardization of services and implies higher cost. But there is no denying that the technological nomad is here to stay.

Regional Networks

In 1985, the National Science Foundation (NSF) founded NSFNet (http://moat.nlanr.net/INFRA/NSFNET.html), higher education’s first Internet research-and-development backbone, connecting faculty at leading research institutions from coast to coast. NSF soon provided funding for the development of regional networks, since the only way many institutions could afford connectivity to NSFNet was by collaborating with other institutions to create a critical mass of users and to share the cost of access.

Many of the initial regional networks were developed by members of established academic organizations. One such early regional network was NorthWestNet (http://www.nwacc.org/archive/nwacc-announce/nwnet-sale.html), created by the North West Academic Computing Consortium (NWACC). A six-state regional academic organization, NWACC (http://www.nwacc.org/) was established in 1987 to bring Internet access to the northwest region of the nation. NWACC’s members come from a larger, fifteen-state regional organization, the Western Interstate Commission for Higher Education (WICHE), which has been in existence since the 1950s. This interstate compact was created by formal state and congressional legislation, with a mission to "facilitate resource sharing among the higher education systems of the West" (http://www.wiche.edu/about/).

Through these networks, the members of regional networks such as NorthWestNet often gained their first Internet access to one another. Once the NSF funding was no longer available, usually after about three years, most institutions found other ways to provide Internet access, and many of the initial regional networks disbanded. In about 1995, NSF turned the Internet over to the private sector. In 1997, Internet2 was born and soon developed the Abilene network (http://abilene.internet2.edu/). Abilene's footprint was very similar to that of NSFNet, and it once again created impetus for the development of regional networks. Some of the earlier regionals were still in place, such as the Southeastern Universities Research Association (SURA) (http://www.sura.org/) and the New York State Education and Research Network (NYSERNET) (http://www.nysernet.org/), and new regionals appeared, such as the Great Plains Network (GPN) (http://www.greatplains.net/). These networks once again shared similar characteristics: driven by the continued need to access and share educational and research resources, members collaborated to create the critical mass needed to share the costs of access. In 2003, National LambdaRail (http://www.nlr.net/) was born, and another iteration of regional networks is now developing. One such regional is the Northern Tier Network Consortium (NTNC) (http://www.ntnc.org).

Why Are Regional Networks Important to Higher Education?

Higher education is about teaching, learning, and research (TLR). Regional networks play an important role in maximizing the TLR experience of their participants. The people involved in regional networks are primarily faculty from various disciplines and IT professionals. They work together to share information (teach), create new information (research), and enable both through the regional infrastructure. The institutions—including colleges and universities, libraries, nonprofit hospitals, K-12 schools, and state agencies—bring their resources together to create the critical mass so necessary to acquire or provide access to national and global resources. In addition, the higher education community increasingly recognizes that an institution's participation in the global research community through regional networks enhances the economic vitality of its region.

The NTNC can be used as an example. In the eight states of the NTNC (Wisconsin, Minnesota, Iowa, North Dakota, South Dakota, Montana, Idaho, and Washington), there are at least fourteen Internet2 institutions. The region is rich in scientific resources, such as a federal U.S. Geological Survey (USGS) laboratory in South Dakota, the Mayo Clinic in Minnesota, and databases of Native American culture and the region's ecosystems and weather. The long international border also provides opportunities for collaborations to help keep the nation safe. The lack of robust Internet access creates barriers not only for regional scientists, who lack good access to global scientific resources, but also for the global community, which lacks good access to the region's scientific assets.

How Are Regional Networks Evolving?

The NTNC had its first meeting in Minneapolis in the spring of 2003. Present were representatives, primarily from Internet2 member institutions, who had seen, for nearly twenty years, the development of a national R&D backbone (NSFNet and Abilene) without a presence across the upper Midwest, upper Great Plains, and the Northern Rockies. It was clear from that first meeting that with the development of a third R&D backbone, the National LambdaRail, a way had to be found to ensure that the region was not again overlooked as part of the country's R&D networking infrastructure.

Since that time, an executive committee has been named. This committee is responsible for policy and procedures for the NTNC and has a representative from each state. A technical committee was named, with a member from each participating institution. This committee is responsible for engineering a potential NTNC physical network. Internet2 serves as the administrative agency, basically "incubating" the project. A nominal fee of $1,000 was paid by each institution wanting to participate in the project. To date, there are fifteen members, including Internet2 institutions, a tribal college, federal labs, and state networks.

Conclusion

The NTNC supports the continued build-out of advanced networks and the networking research they enable. It also supports extending that access to the global community. With nearly one-fifth of the nation lacking adequate access to global scientific resources, the region is working hard, with limited resources, to make this happen. Communication with many different communities to create awareness is beginning to produce positive results.

The regional network model works. Find a common cause, and people and institutions will work together to make it happen. Since those early days of yearning for Internet access for our institutions, the Internet has become a global tool for teaching, learning, and everyday commerce. Regional networks will continue to enable colleges and universities to bring some of our brightest talents together to teach, to learn, and to work.

Notes

1. Dan Updegrove, e-mail to EDUCAUSE CIO Constituent Group Listserv, March 27, 2004.

2. "FTC: No ‘Do-Not-Spam’ List," CNNmoney, June 15, 2004, http://money.cnn.com/2004/06/15/technology/spam_list/.

3. David A. Wiley, "Connecting Learning Objects to Instructional Design Theory: A Definition, a Metaphor, and a Taxonomy," http://www.reusability.org/read/chapters/wiley.doc, p. 7, chapter 1.1 in David A. Wiley, ed., The Instructional Use of Learning Objects (online version, © 2000), http://www.reusability.org/read/; Laurence F. Johnson, Elusive Vision: Challenges Impeding the Learning Object Economy (San Francisco: Macromedia, 2003), http://www.nmc.org/pdf/Elusive_Vision.pdf.

4. Leonard Kleinrock, "Breaking Loose," Communications of the ACM, vol. 44, no. 9 (September 2001): 43.

5. "Campus Networking: A CIO’s Perspective," Syllabus, January 2004, http://www.syllabus.com/article.asp?id=8711.

EDUCAUSE Related Resources

The Evolving Technologies Committee is one of thirteen EDUCAUSE advisory committees, which guide association strategies in various arenas. The ten EDUCAUSE program committees develop content for specific conferences. All EDUCAUSE committees consist of member volunteers (http://www.educause.edu/volunteer/).