Assessing Online Learning
Patricia Comeaux, editor
Anker Publishing Company, Inc., 2005
$39.95 (cloth), 206 pp.
Reviewed by Joan Getman
Patricia Comeaux was motivated to edit Assessing Online Learning by her experiences both as a teacher of online courses and as a student in an online doctoral program. She has come to believe that assessment carries implications for learning that extend beyond grades. Instead of using assessment merely as a way to measure learning by looking at the outcomes and products of a learning process, the authors support her view that assessment can be a natural extension of learning. Comeaux asks whether online assessment can support a constructivist, student-centered learning model, and the authors respond strongly that it can. When assessment is ongoing and implemented through communication, participation, and interaction, the instructor is likely to be an effective partner in creating a student-centered learning experience.
Drawing from multiple disciplines, with contributions by instructors, a graduate student, academic technologists, and instructional designers, this collection of essays offers a useful resource for practitioners. Readers will find strategies, rubrics, guidelines, and frameworks that demonstrate how assessment can significantly influence the learning process itself. The authors make a point of telling the whole story, including the challenges they faced and the lessons learned from trying strategies that may not yet be accepted as best or most effective practices. The book's focus on online environments adds further value: the authors identify characteristics of online interaction with students that can be leveraged to increase the impact of formative assessment strategies on learning.
One chapter is especially relevant to administrators. In describing eight community colleges, the author emphasizes the pervasive expectation that institutions of higher education be accountable in several dimensions. The common thread among the vignettes in this chapter is that accountability can be addressed in part by assessment planning at an institutional level.
Another chapter highlights the particular advantages and opportunities presented by online environments, such as automatic tracking and monitoring, content creation tools, unlimited access to course material, and records of interactions for reflection. It offers a well-organized rationale for modifying assessment when moving away from the face-to-face classroom to take advantage of these characteristics. The authors introduce the concept of assessment at three stages, emphasizing the importance of assessment as an ongoing activity and its value to students, who can use it to identify knowledge and skill gaps.
Another key concept is the motivation provided by engaging students in real-world problems and teamwork. The author provides a useful rubric for team assessment that combines feedback from the learners themselves, peers, and experts, with the idea that being able to work in teams contributes to learning a critical skill set that spans many disciplines and professions.
Continuing with the theme of teamwork, the authors of this chapter offer a collaborative learning assessment methodology and several frameworks for implementation. This will be welcome information to those who have been challenged by the assessment of individual and team performance in a collaborative learning situation. On behalf of the learners, the authors point out that instructors need to guide this process because collaborative learning is often new to students and not easy to engage in. Other practical tools and resources include a checklist for team assessment guidelines; a strategy for authentic assessment; principles of cooperative learning; and a diagnostic tool for team assessment that can be used to check the alignment between curriculum goals and assessment strategies. The authors remind practitioners that the organization and design of a curriculum are much more visible to learners online, which can be an advantage for students when a course is well designed but can also expose the aspects that are not.
Though the book focuses on assessment in online environments, the authors of this chapter view pedagogy and not technology as the real challenge. Information-age workers need to know how to be effective members of virtual teams, which represents a new set of skills that spans both disciplines and professions.
An interesting comparison is drawn between a traditional method of assessing learning, which evaluates a process that moves from transmission to acquisition and finally to regurgitation, and a newer method of assessment that addresses the process of collective and individual knowledge creation. Technology offers tools with which to think, but instructors must first consider a different pedagogy. The authors suggest that developing a course syllabus that points to learning experiences, rather than topics and content, would be a starting point.
The case study presented in this chapter demonstrates how learners were able to produce a "collaborative artifact" as the result of constructivist discourse. This is significant in that the authors contend that constructivist learning is often "shallow," leaving learners with little motivation to participate and little to show for their investment when they do. In the end, the case study also shows how simple technology tools were used to accomplish sophisticated knowledge building. This chapter has much to offer with regard to principles of implementation, including a piece on peer tutoring and assessment in online environments.
Discussion boards are often set up with great enthusiasm and then either go quiet or become an archive of interactions that is more about the number of contributions than the quality. The authors suggest discussion-board strategies for teaching reading, writing, and critical thinking skills. They have based their approach on Virtual Learning Circles and describe the role of the instructor in guiding, re-focusing, and motivating discussion with eight different kinds of "prompts." Their case study demonstrates that online environments enable students to learn the skill of deliberate reflection and enable instructors to do more precise grading of these skills. The authors also make the general point that assessment supports learning most effectively when students know the particulars of what is to be scored. Readers will find another set of rubrics comparing holistic versus analytic assessment approaches.
This chapter reinforces the connection between assessment and learning. The author supports her discussion with a table that aligns specific assessment tasks with learning outcomes and processes, and she maintains that instructors need to evaluate evidence of the social construction of knowledge.
Who says that assessment has to be text-based? The authors of this chapter present a case where digital video was used to implement assessment and provide feedback to students struggling with math and statistics concepts. By creating a video of a fictional character in a real-world situation who encounters the same applied statistics problems the students are experiencing, the instructors helped students assess their learning and address skill gaps in a way that is empathetic and less discouraging for those who are struggling. It is an interesting and apparently effective approach, though it raises the questions of how much was invested in the video and how scalable the approach is.
This chapter begins by acknowledging that in online environments, instructors are giving up a degree of control over the testing situation, but the author uses this point as a segue to a discussion of what instructors are assessing in the first place. She maintains that evolving online test formats and rethinking time- and place-bound testing lead to more learner-centered, constructivist-appropriate assessment. The author asks instructors to consider whether they want to assess learners' ability to memorize and recall or their ability to find information, work toward the right answers, and demonstrate understanding.
Independent learning is not a natural skill, and the author suggests that learners need help developing it. Using a case study that revolves around learning library skills, the author describes using a WebCT discussion forum and reflective portfolios to develop skills at two levels: learners were assessed on their discipline-specific skills and on more general skills, such as reflection and transferable library skills. A novel idea in this chapter is that both the instructor and the learner are required to be reflective and responsive.
Computer-mediated communication (CMC) facilitates interaction among groups of people, and, in the case study presented in this chapter, the instructors combine CMC with face-to-face instruction. Among the advantages the author cites is an instructor's ability to identify patterns more easily. He also observed a greater degree of accountability among learners engaged with each other through CMC than in their face-to-face interactions. Learners can also engage in multiple modes, public and private, in online environments in ways that would be disruptive in a face-to-face teaching and learning situation. The author uses his case study to support the premise that CMC supports transformational learning, in which learners validate, modify, and replace prior knowledge.
It is a challenge, in the space of one chapter, to convey both overarching principles and enough detail to make a case study relevant to readers and their own practices. In general, the authors of Assessing Online Learning strike a reasonable balance between the two. Practitioners will also find the book rich with references and resources that will lead to more in-depth information, if they are inclined to look for it. The authors are consistent in their support of the central premise that assessment can be an integral and highly influential part of learning.