There is a frequently voiced belief that information technology (IT) will transform the educational process. We often hear assertions that high-speed data networks, rich media content, and compelling user interfaces created with tools like Macromedia Flash and Dynamic HTML will surely revolutionize learning. Although many educators feel that technology will have a significant—even profound—impact on learning, how this will happen is less certain.
We’ve been down this road before, of course. When television entered the scene in the mid-1950s, similar predictions surfaced about its influence on education. Television, it was thought, would bring the classroom to a vast, nationwide audience. Early attempts to educate through television used the new medium to replicate traditional classroom instruction, with a lecturer addressing the camera as if it were the class.
While superimposed graphics provided a high-tech variant of chalk and a blackboard for televised classes, the traditional classroom teaching method remained unchanged. And television provided only a pale representation of the classroom experience. Conspicuously absent were dialogue with the instructor, student-to-student interaction, and the instructor’s ability to sense whether students "get it."
Should we then conclude that television has had no appreciable impact on education? Not so fast. Sesame Street, the History Channel, and the Discovery Channel play a key role in educating the American public, but not in the traditional sense. They do not use television to replicate the experience of the classroom. They provide a different type of learning, driven by the particular characteristics of the medium.
What, then, are we to make of broadband digital networks and interactive multimedia? How will these technologies affect learning in the future? Will we use IT merely to replicate traditional teaching methods—substituting Web-based documents for printed textbooks and remote "distance-learning" sessions for on-campus classrooms? Or will these technologies introduce new methods of learning that differ fundamentally from traditional classroom teaching techniques such as the teacher-directed lecture or the case-study method? The short answer is… we don’t know. While it is reasonable to believe that IT will change the nature of education to some degree, the details are much less clear.
The early enthusiasm over the potential impact of technology on education is now facing a backlash of dissenting viewpoints noting that the initial lofty expectations of e-learning have not been met. A recent report by Robert Zemsky and William F. Massy from the University of Pennsylvania’s Learning Alliance for Higher Education titled "Thwarted Innovation: What Happened to E-Learning and Why"1 sought to answer the question, Why did the boom in e-learning go bust?
The key conclusion supported by the study is that the bust occurred because expectations (and proposed solutions) grew too rapidly. "[T]he boom-bust cycle in e-learning stemmed from an attempt to compress the process of innovation itself."2
In short, the concept has not been disproved but, rather, has not had sufficient time to evolve. It is not that we now know that technology-enhanced learning won’t be effective; we simply haven’t done enough empirical research in the classroom to know what approaches will deliver lasting value.
Wharton’s Learning Lab
With funding from Alfred West, Jr., an alumnus of The Wharton School of the University of Pennsylvania (see the sidebar on The Wharton School) and chairman of SEI Investments, Wharton Dean Patrick T. Harker established the Alfred West Jr. Learning Lab to explore ways "to reach a deeper understanding of how people learn and to push that process to a higher level using advancements in technology and learning science."3 Toward this end, the Learning Lab is working with a broad range of business faculty to develop a series of tools for classroom instruction.
The Learning Lab is intended to be a lab in the traditional sense—conducting an ongoing series of experiments on computer-enhanced learning. By exploring multiple avenues over time, the project hopes to uncover what works and what doesn’t—what endures versus what is merely passing fancy.
As Alec Lamon, a technical director on the Learning Lab team, stated, "We fully expect that several years from now, we’ll look back on some of our early Learning Lab applications and realize how wrong-headed they were. But we also expect that some will prove to be of lasting value. And it is these applications that we will continue to develop and learn from over time."
The foundation for the Learning Lab was established in fall 2000 with a pilot program that included two new projects and expansion of an existing one. All three projects reflected Harker’s commitment to research that would have a significant impact within the school. The new projects were earmarked for MBA "core" courses—those taken by all first-year Wharton MBA students. The third project was the expansion of a portfolio-management simulation previously developed at Wharton for one of the school’s finance faculty members (which eventually became Wharton’s Online Trading and Investment Simulator, or OTIS).
Lab Structure and Process
The pilot phase of the Learning Lab provided valuable feedback from faculty that helped set the direction for subsequent phases of the program. At the outset the Learning Lab team believed that much of their work would focus on developing interactive modules or "widgets" for demonstrating fundamental concepts—for example, an interactive Web-based tool demonstrating the characteristics of the central limit theorem.4 Indeed, work on one of the two new projects selected for the pilot phase of the Learning Lab focused on this type of product.
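To make this concrete, the following is a minimal sketch of what such a central-limit-theorem widget computes. It is an illustrative standalone script, not the Learning Lab’s actual Web-based tool, and every name in it is our own:

```python
# Minimal sketch of the idea behind a central-limit-theorem demo
# (illustrative only; not the Learning Lab's actual Web widget).
import random
import statistics

def sample_means(population_draw, sample_size, num_samples):
    """Draw repeated samples and return the mean of each one."""
    return [
        statistics.mean(population_draw() for _ in range(sample_size))
        for _ in range(num_samples)
    ]

# Even for a decidedly non-normal population (uniform on [0, 1]),
# the sample means cluster around 0.5 and the spread shrinks as the
# sample size grows: the central limit theorem in action.
for n in (2, 10, 50):
    means = sample_means(random.random, sample_size=n, num_samples=2000)
    print(f"n={n:3d}: mean={statistics.mean(means):.3f}, "
          f"stdev={statistics.stdev(means):.3f}")
```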
This approach gained little traction with the faculty, however. While they found such tools helpful, this type of application failed to significantly advance the teaching process. Other projects—with more complex simulations that illustrated concepts not easily demonstrable in class—gained greater favor with the faculty. Subsequent collaboration over the next several years confirmed this trend.
Based on the experiences of these initial pilot projects during the 2000–2001 school year, Wharton developed a more formal structure to support the expansion of the program in fall 2001, including the appointment of two senior Wharton faculty members as directors. These faculty co-directors work closely with an internal Faculty Steering Committee that selects projects for development and manages their direction. An external Oversight Board consisting of thought leaders in various disciplines provides feedback and guidance on the Learning Lab’s direction.
Faculty who seek to have an instructional application implemented by the Learning Lab submit a proposal to the faculty co-directors. The Learning Lab’s IT staff then assesses the technical feasibility of the project. This candidate-proposal phase often involves one or more follow-up meetings with the faculty member to elicit additional details. The members of the Learning Lab’s Faculty Steering Committee assess the project’s pedagogical value and expected impact on the school.
Once the committee approves a project, the co-directors respond to the originating faculty member with a final description of the application to be developed along with target dates for "alpha" (prototype) and final versions, as well as an estimate of how much time the faculty member is expected to dedicate to the project. If the originating faculty member agrees to the project as stated, the implementation process begins.5
An early milestone in each project is the delivery of a prototype version of the product for the faculty member to test. Even with a formal proposal process, getting the details right is often iterative. In most cases, the development team creates an initial prototype to validate its assumptions about the design and to generate additional feedback on the exact requirements of the application.
The Web-based development tools used by the Learning Lab’s technical staff are particularly helpful in allowing the development team to construct and quickly modify prototype applications. The Wharton School’s development environment consists primarily of Macromedia ColdFusion MX and Microsoft SQL Server along with Macromedia Flash and Macromedia Dreamweaver.
A typical Learning Lab simulation requires three to six months of effort from one full-time equivalent (FTE) employee to prototype, develop, test, and deploy in an initial classroom application. During the development cycle, the originating faculty member spends approximately 15 to 25 hours collaborating with the development team.
By the fall 2004 semester, the Learning Lab had developed 18 Web-enabled simulations, real-time learning experiences, and interactive programs that challenge students to think strategically across multiple business functions. Wharton is deploying these applications throughout the curriculum—in core MBA courses, undergraduate business classrooms, executive MBA programs, and non-degree executive seminars. (See the sidebar "Wharton Learning Lab Simulations" for brief descriptions of some of the applications.)
Approaches to Technology-Enhanced Education
While it is beyond the scope of this article to classify the large and varied universe of learning technologies, a brief overview may clarify where Wharton’s Learning Lab stands in relation to these approaches. Although there are a number of exceptions to this simple taxonomy, the major categories of technology-based or technology-enhanced education can be described as follows:
- Electronic textbooks
- Distance learning
- Computer-assisted communication
- Blended or hybrid models
Electronic textbooks typically use HTML, Adobe’s Portable Document Format (PDF), or Macromedia FlashPaper to disseminate electronic content to a broad audience via the Internet, CD-ROM, or DVD. While Macromedia Flash or Java Applets can add interactive components to these documents to provide capabilities not available in the traditional printed book, the pedagogical paradigm closely parallels that of the traditional textbook.
Distance learning initiatives typically seek to bring the classroom experience to an audience that is not physically present. They focus on using technology to achieve a larger scale or to address an audience beyond the reach of a physical classroom.
Computer-assisted communication uses electronic technology (typically Internet-based) to assist communication outside the classroom. Tools in this category include e-mail, bulletin boards, instant messaging, and groupware products such as Documentum’s eRoom.
Blended or hybrid models combine one or more of the above techniques with traditional classroom instruction. The rise of such models is in part a response to the perceived limitations of these other approaches. Because electronically reproduced content and electronically mediated communications do not have the richness of classroom interaction, blended models combine traditional classroom experience with electronic services outside the classroom.
All the techniques described above use technology to replicate the classroom experience outside a physical classroom, to side-step the classroom entirely, or to supplement classroom activities outside of class sessions. In general, the activities of Wharton’s Learning Lab follow none of these paths.
The Learning Lab Approach
Based on faculty feedback from the initial pilot projects, a key goal emerged for the Learning Lab: to enhance the classroom experience, not replace it. Learning Lab applications typically seek to expand the depth of the educational experience, not extend its reach. These products aim to teach students better by using technology to create situations that would be difficult or impossible to experience in an instructional setting by any other means. The technology serves to strengthen student-faculty interaction, not replace it. Although not all the projects fit this model, these traits characterize most Learning Lab initiatives.6
Most Learning Lab projects would be categorized as simulations, although the term is fraught with ambiguity. Everything from multimedia cases with a simple branching structure to full emulations of complex control systems falls into the category of "simulation." Other than the attempt to emulate real-world events or processes, these tools have little in common, and their pedagogical outcomes may be very different.
Many of the Learning Lab’s projects, however, share a number of characteristics that differentiate them from other technology-enhanced learning models. In general, Wharton Learning Lab simulations:
- Have open-ended outcomes.
- Don’t always present the object of the game as the object of the game.
- Encompass more than meets the eye.
- Teach by doing rather than describing.
- Facilitate interaction and dialogue.
Open-Ended Outcomes
One weakness of many business simulations is the tendency of participants to focus on the mechanics of "the game" rather than on the underlying principles the simulation attempts to teach. This is only natural for participants who know they are engaged in a simulation whose goal is to "win." Unless the simulation possesses a high degree of complexity, participants often find it more efficient to look for the underlying algorithm of the simulation than to learn the abstract concepts the simulation seeks to demonstrate.
Many of Wharton’s Learning Lab applications seek to avoid this situation in their construction. Some—such as OTIS—are driven by real-world data. The simulation consists of recording, tracking, and reporting the students’ financial portfolios as if their trades actually occurred. The calculations that drive the application reflect changes in actual market data. Anyone who can figure out the underlying algorithm of this type of simulation can be equally successful on Wall Street.
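As a rough sketch of the bookkeeping this implies (record the trades, then mark the resulting portfolio to actual market prices), consider the fragment below. The data structures are hypothetical; the article does not describe OTIS’s internals:

```python
# Hypothetical sketch of a market-data-driven trading simulation:
# record trades, then value the portfolio at real market prices.
# Not OTIS's actual implementation.

def portfolio_value(trades, market_prices, starting_cash):
    """Value a list of (symbol, shares, price_paid) trades at market."""
    cash = starting_cash
    holdings = {}
    for symbol, shares, price_paid in trades:
        cash -= shares * price_paid  # buys spend cash; sells (negative shares) add it back
        holdings[symbol] = holdings.get(symbol, 0) + shares
    # Mark each position to the latest real-world price.
    positions = sum(shares * market_prices[symbol]
                    for symbol, shares in holdings.items())
    return cash + positions

# A student "buys" with simulated cash, but any gain or loss
# comes from genuine market movement.
trades = [("ACME", 100, 25.00), ("ACME", -40, 27.50)]
print(portfolio_value(trades, {"ACME": 26.00}, starting_cash=1_000_000))
```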
Most Learning Lab simulations are not based on real data, of course, but they are nevertheless open-ended. In many cases there is no "right answer" nor a single, optimal outcome. The behavior of the participants determines the outcome of the simulation. With OPEQ (Oil Pricing EQuilibrium), Fare Game, and Power Play, for example, teams compete against other teams of individuals. The computer merely stores the state of the game, transmits the players’ moves, and presents the results. There is no computer to outsmart—only the other human competitors in the game.
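The scorekeeper role described here fits in a few lines of code. The sketch below is our own illustration, not the actual OPEQ, Fare Game, or Power Play implementation: the server records each team’s move and reports the round’s results only after every team has played, with no game-playing intelligence of its own.

```python
# Minimal sketch of the "computer as scorekeeper" pattern: the server
# only stores moves and presents results; strategy stays with the
# human players. (Illustrative; not Wharton's actual code.)

class RoundState:
    def __init__(self, teams):
        self.teams = set(teams)
        self.moves = {}

    def submit(self, team, move):
        if team not in self.teams:
            raise ValueError(f"unknown team: {team}")
        self.moves[team] = move  # store the move; no AI, no evaluation

    def results(self):
        if set(self.moves) != self.teams:
            raise RuntimeError("waiting on moves from all teams")
        return dict(self.moves)  # reveal everyone's moves together

round1 = RoundState(teams=["Alpha", "Beta", "Gamma"])
round1.submit("Alpha", {"barrels": 6_000_000})
round1.submit("Beta", {"barrels": 3_000_000})
round1.submit("Gamma", {"barrels": 2_000_000})
print(round1.results())
```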
Comparing Wharton’s VIBE (Virtual Interactive Bond Engine) bond portfolio management tool to earlier pen-and-paper exercises, Wharton Finance Professor Michael Gibbons stated, "The old problem sets were ‘toy’ examples. These are ‘serious’ problems—much more open-ended," problems that require students to work toward a solution. As in the real world, there are multiple ways to arrive at a correct outcome. According to Gibbons, the students "learn real-world techniques and understand their application to solve open-ended, real-world problems."
The Object of the Game Isn’t the Object
In David Fincher’s 1997 movie The Game, Nicholas Van Orton (Michael Douglas) is involved in a complex game, the nature and goal—even the existence—of which is not clear to him. "The object of the game is to discover the object of . . . The Game," he is told at one point.
While none of Wharton’s simulations embodies the complexity (or the perversity) of the game staged by the mysterious CRS Corporation in the film, in many cases the object of the simulation is not entirely known at the outset—and may not be the explicit topic of the exercise. Some Learning Lab simulations, such as the OTIS equities trading simulation and the VIBE bond-trading environment, do teach the explicit topics they present (equities markets and fixed-income investment strategies), but often this is not the case.
Although OPEQ is ostensibly about trading oil on the open market, the real point of the simulation is to demonstrate various negotiating strategies. Each team determines the number of barrels of oil its fictional country will produce during each time period, with the goal of maximizing profits (and making more money than the teams playing other oil-producing countries). Although the students have enough information to calculate the "optimum" strategy (and many, in fact, do), the intent of the simulation isn’t to teach the mechanics of global markets or pricing strategies. OPEQ creates a series of situations in which students have to negotiate with representatives from other teams in increasingly difficult or ambiguous circumstances. The oil trading exercise is merely a platform to get students invested in the outcomes of their negotiations as they try to "win" the oil pricing game. (See Figure 1.)
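The article does not disclose OPEQ’s actual formulas, but a stylized example with an assumed linear price function shows why such a game rewards negotiation rather than mere calculation: collective restraint raises every team’s profit, yet each team is individually tempted to defect.

```python
# Stylized oil-pricing arithmetic under an assumed linear demand curve.
# All numbers are our own assumptions; OPEQ's real model is not public.

PRICE_CEILING = 100  # market price if no oil were produced (assumed)
PRICE_SLOPE = 2      # price drop per unit of total output (assumed)
UNIT_COST = 10       # production cost per unit (assumed)

def profit(own_output, others_output):
    price = max(0, PRICE_CEILING - PRICE_SLOPE * (own_output + others_output))
    return (price - UNIT_COST) * own_output

# Three countries agree to a cartel quota of 7.5 units each...
print("keep the deal:", profit(7.5, 15.0))    # price 55 -> profit 337.5
# ...but a lone defector's best response to the others' restraint is 15 units...
print("defect alone: ", profit(15.0, 15.0))   # price 40 -> profit 450.0
# ...and if everyone plays the non-cooperative output of 11.25 units,
# all earn less than they did under the deal.
print("all defect:   ", profit(11.25, 22.5))  # price 32.5 -> profit 253.125
```

The payoff structure is a classic prisoner’s dilemma, which is precisely what forces the students to negotiate.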
Similarly, in Fare Game student teams price seats and select routes for major airline carriers. Although the game loosely models the conditions when discount carrier Midway Airlines entered the Chicago market to compete against the established carriers, the point of the simulation isn’t the economics of airlines. Rather, it is about signaling and how pricing strategies can be used to communicate intentions and influence the behavior of competitors.
In Wharton’s FutureView, students browse through dozens of screens of detailed product information and user opinions on futuristic auto-piloted vehicles, but the point of the exercise has little to do with the automotive industry per se. Rather, it demonstrates how tools can be developed to generate quantitative marketing data for radically new technologies.
In all of these cases, the simulation is used as a tool to stimulate a series of interactions or produce a situation that becomes the basis of the teaching point.
More Than Meets the Eye
Many of the simulations have a "reveal" in which the faculty member "pulls back the curtain" to unveil new facts or additional details not apparent during earlier stages of the simulation. A number of applications combine a homework-based exercise with a subsequent, in-class discussion of what was "really going on."
In OPEQ, new twists are added to the conditions of the simulation as the game progresses. In FutureView, after students have gone through the detailed information of the simulation and answered a simple survey, they later learn that different students saw different scenarios in the simulation and that the survey results have been analyzed in light of these variations.
Teach by Doing
Textbooks and classroom lectures typically provide information about a concept. Wharton’s Learning Lab simulations often demonstrate the concept itself.
With OTIS, students manage equities portfolios based on real stock data. (See Figure 2.) Kelly Kamm, who teaches finance at the University of Texas using OTIS, stated that "[Students] can read about [complex investment strategies like] hedging, but they don’t really understand how it works until they actually do it." Kamm finds that her students quickly learn the details of investing "when they have a million dollars [to invest in OTIS], and they’re watching [their portfolio] start to go up or down."
Rather than describing how information acceleration can be used to uncover quantitative market data for radically new technologies, FutureView actually implements an information accelerator. In class, the faculty present students with an analysis of the data generated by their classmates. Wharton Marketing Professor Peter Fader, one of the faculty members who developed FutureView, pointed out the impact this has on his students: "It’s the vivid example that will be remembered years from now—and, hopefully, this brings with it the teaching point as well." When the student actually experiences a vivid instance of learning by doing, "the teaching points are not washed away," he said.
According to Wharton Finance Professor Gibbons, who developed Wharton’s VIBE product for his finance class, "Students now learn things they didn’t learn before. For example, some computationally complex techniques—like Monte Carlo simulations—begin to have real meaning for the students," since these can be helpful in calculating the potential future value of the investment instruments in the VIBE universe. "They have to figure out the value of the VIBE securities, and this leads them to explore many areas they never would have seen otherwise."
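As a toy illustration of the Monte Carlo valuation Gibbons alludes to, the sketch below estimates a claim’s value by averaging many simulated outcomes. The lognormal price model and every parameter are our assumptions; the article does not describe how VIBE’s securities actually behave.

```python
# Toy Monte Carlo valuation: estimate a security's value as the
# discounted average payoff over many simulated price paths.
# The lognormal model and parameters are illustrative assumptions.
import math
import random

def monte_carlo_value(spot, rate, volatility, years, payoff, trials=100_000):
    """Discounted expected payoff under lognormal price dynamics."""
    total = 0.0
    for _ in range(trials):
        z = random.gauss(0.0, 1.0)
        terminal = spot * math.exp(
            (rate - 0.5 * volatility**2) * years
            + volatility * math.sqrt(years) * z
        )
        total += payoff(terminal)
    return math.exp(-rate * years) * (total / trials)

# Example: a call-like claim on a hypothetical security.
value = monte_carlo_value(
    spot=100, rate=0.05, volatility=0.2, years=1.0,
    payoff=lambda s: max(s - 100.0, 0.0),
)
print(f"estimated value: {value:.2f}")  # roughly 10.45 for these parameters
```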
Facilitate Interaction
Wharton’s Learning Lab products are typically not stand-alone, self-paced learning modules. Often the purpose of the application is to create a situation that fosters interaction and dialogue. OPEQ, for example, puts student oil-producers in a situation in which they must negotiate with competing students. The VIBE portfolio management product contains features to allow students to form teams for each round of exercises.
According to Gibbons, these tools are more than just a convenience; they "create socialization in the class." Students start by finding teammates, creating teams, and planning their group strategy. Even though many of these processes occur online, this type of interaction is key to the learning process. "This is particularly important in large classes," said Gibbons, "when students might not otherwise interact as often." Even exercises that include homework modules for individual exploration are typically geared toward stimulating classroom discussion once the details of the underlying application are revealed to the students in class.7
Marketing Professor Fader believes stimulating discussion is the goal of most teaching materials, whether case studies or simulations. "[When taught properly], both case studies and simulations are used to catalyze discussion. [It’s important to] turn the simulation off [to discuss] what’s good and what’s bad."
In short, applications such as these create a unique classroom experience rather than providing a substitute for it.
Phased Assessment
If Wharton’s Learning Lab is a lab, how are its experiments validated? How do we know if these tools have a positive impact on the educational process? From the outset, Wharton’s Learning Lab sought to follow a phased approach to evaluating the success of the program. This plan has three steps:
- Faculty acceptance, adoption, and expansion
- External adoption
- Data collection and assessment
Faculty Acceptance, Adoption, and Expansion
Acceptance by Wharton faculty was a key initial goal. While not a replacement for more formal evaluation, faculty acceptance is a necessary step for the continued growth of the Learning Lab and the development of a sufficient base for later quantitative assessments. If the faculty members who invest their time to develop these simulations don’t find them valuable, the applications will not continue to be used in the classroom.
Faculty acceptance means more than positive feedback from the faculty. Although the Learning Lab informally seeks opinions from all faculty participants, it also tracks how an application is used beyond its original deployment.8 The main assessment criteria follow:
- Frequency: Does the original faculty member use the simulation more than once? How often is the simulation used?
- Enhancement: Does the faculty member who developed the simulation have additional ideas on how to extend the product and submit a proposal for an enhanced version following the product’s initial use in the classroom?
- Expansion: Does use of the application expand to encompass additional classes and new audiences beyond the original classroom deployment? Does use of the application spread from the originating faculty member to additional faculty?
- Pervasiveness: Does the use of these applications become widespread in Wharton’s curriculum?
The results of assessing these criteria for the first 14 applications in production by the spring 2004 semester follow:
- Frequency: Twelve applications are still in use9 and have been used in each successive school year since their initial deployment.
- Enhancement: Ten applications generated subsequent proposals for additional features and enhancements based on their earlier use in class.
- Expansion: Eight applications were used in additional courses besides their original target class (with four used by courses outside the Wharton School). Seven are now being taught by additional faculty at Wharton, and the commercial version of OTIS is now used by more than 100 faculty at other institutions.10
- Pervasiveness: Since a number of the early applications focused on MBA core courses, by the end of the second full year of the Learning Lab virtually all first-year Wharton MBA students had experience with at least three Learning Lab applications. FutureView, one of the earlier Learning Lab simulations still in production, has been used by more than 3,000 Wharton students.
External Adoption
In February 2003, Wharton entered into a partnership with the Addison-Wesley division of Pearson to make Wharton Learning Lab applications available to other educational institutions.11 This external adoption is the second threshold in measuring the project’s success. By engaging a larger number of external faculty and students in teaching and learning with the tools, the Learning Lab builds a broader base from which to garner feedback and insights.
A pilot deployment of the first Wharton Learning Lab Series product—OTIS—was launched in August 2003. Based on feedback from an initial group of 30 schools, the first major commercial version was launched in August 2004. As of this writing, more than 2,000 students at more than 100 educational institutions have used OTIS.
Data Collection and Assessment
Now that Learning Lab applications have achieved a sufficient scale of usage, the project is beginning the third phase of the evaluation—quantitative data collection. This past year, the Dean’s Graduate Student Advisory Council (DGSAC), a group of Wharton MBA students who work on special projects for Dean Harker, conducted a preliminary survey of MBA students from the classes of 2004 and 2005 on the impact of simulations in the classroom. Overall, the 290 students who responded to the survey were pleased with the use of computer-based tools in their classes and found them to be effective. Seventy-seven percent of the respondents were either very satisfied (30 percent) or satisfied (47 percent) with computer-based tools in classes. Eighty-six percent of the respondents said that computer-based tools significantly enhanced (21 percent) or enhanced (65 percent) learning in class.
When asked to rate the importance of 10 criteria in facilitating overall learning in a classroom setting, students ranked "attention and engagement" the highest, with 70 percent of the respondents rating this category "very important." (In contrast, only 46 percent rated the next-highest ranked category, "applicability to your professional goals," as very important.)
On the same list of criteria, students assessed computer-based tools as being most effective on the dimension they regarded as most important to learning—"attention and engagement" (42 percent rating them very effective). Also highly ranked were "team collaboration" (42 percent very effective) and "student-student interaction" (37 percent very effective).
Students rated computer-based classes as more effective than case-based classes on several criteria, most notably "attention and engagement" (62 percent), "team collaboration" (67 percent), "fun" (69 percent), "retention of material" (47 percent), and "student-student interaction" (51 percent).
Similarly, students rated computer-based tools as more effective than lecture-based classes in enhancing "attention and engagement" (79 percent), "retention of material" (54 percent), "team collaboration" (81 percent), "student-student interaction" (72 percent), and "fun" (80 percent).
The students’ assessment of specific Learning Lab products reveals an interesting grouping based on the nature of the application. Of the Learning Lab applications listed in the survey, the highest ranked (WSX, see Table 1) involves real-time game play among a class of 30 or more students. The next highest ranked applications—Fare Game, OPEQ, and Power Play—involve real-time team play in the classroom. The next simulations in the satisfaction ranking—OTIS and VIBE—involve complex, long-term (semester-long) exercises played by teams. Applications that function as stand-alone, self-directed exercises (Marketing Math Essentials) or that are assigned as homework exercises with a classroom "reveal" and discussion (FutureView, Rules of Engagement, and RATE) were rated somewhat lower.12 These results appear to support the notion that a key function of these applications is to stimulate interaction in the classroom.
Based on the results of this preliminary survey, Wharton plans to develop a more formal assessment process. Wharton will both conduct specific surveys about the Learning Lab and its impact on classroom instruction and include questions on technology-enhanced learning in the school’s annual series of stakeholder surveys.
Future Directions
In addition to working with faculty to develop an increasingly rich portfolio of learning applications, key future goals for the Learning Lab include the following:
- Establish a community of educators.
The Wharton Learning Lab products distributed through Pearson Addison-Wesley extend the use of these simulations to a significant number of faculty and students. Professor Kamm at the University of Texas Department of Finance has prepared detailed lesson plans for using OTIS that are available to other instructors using the application. Wharton hopes to foster a "community of educators" who use these tools in their instruction, continue to provide ideas and feedback for their enhancement, and extend the products into new areas.
- Provide an increasingly sophisticated user experience.
The past few years have seen a rapid evolution in tools to create rich user experiences over the Web. The first generation of Learning Lab applications typically used HTML interfaces connected to the Macromedia ColdFusion application server with Oracle or Microsoft SQL Server databases. More recent Learning Lab applications have interfaces written entirely in Macromedia Flash.
- Perform ongoing evaluation.
Wharton plans to develop more formal assessment tools and to expand the scope of these assessments to other schools that are using Wharton Learning Lab products.
Through these methods of ongoing experimentation in the classroom, Wharton hopes to discover how technology can have a lasting impact on learning. As Wharton School Dean Harker said recently to Wharton’s MBA students, "Business thrives on innovation, and innovation doesn’t spring just from the wisdom of the ages. It arises from the knowledge we create through experimentation and analysis."13