Meeting the Challenges of Remote Assessment [video]


Asking faculty to reevaluate how they do assessment during the pandemic has created several challenges. These challenges not only touch on technical issues around proctoring, but also student access and equity.


Authors:

Thomas Cavanagh
Vice Provost for Digital Learning
University of Central Florida

John Fritz
Associate Vice President for Instructional Technology
University of Maryland, Baltimore County

Cynthia Golden
Associate Vice Provost and Executive Director of the University Center for Teaching and Learning
University of Pittsburgh

Gerry Bayne: I'm Gerry Bayne, multimedia producer for EDUCAUSE and I'm here with Tom Cavanagh, John Fritz, and Cynthia Golden. Could you guys introduce yourselves starting with you, Tom?

Tom Cavanagh: Sure, I am Tom Cavanagh. I am Vice Provost for Digital Learning at University of Central Florida in Orlando.

John Fritz: I'm John Fritz. I'm the Associate Vice President for Instructional Technology at the University of Maryland, Baltimore County, UMBC.

Cynthia Golden: I'm Cynthia Golden. I am Associate Vice Provost and Executive Director of the University Center for Teaching and Learning at the University of Pittsburgh.

Gerry: I'm curious about assessment. How is your institution approaching faculty to be more focused on learning outcomes? Assessment obviously becomes more complex given the situation we are in and the unplanned move to online. Can you talk about assessment and how you are approaching that strategy?

Tom: I'll go ahead and start. It's been, I think, a series of increasingly urgent triages. When we moved this spring completely online, it was a bit of a fire drill, and we had a bunch of faculty who had never really thought about online assessments before. Particularly in the hard sciences and engineering, they were used to giving closed-book classroom exams, especially written exams where you would build a formula and they would check your work, particularly in the engineering and computer science disciplines. So for them, it was a bit of a challenge. Other faculty, who we have worked with traditionally in online development, didn't have as hard a time with it because they already understood the affordances and limitations of online assessment and how to attempt authentic assessment as opposed to knowledge-recall kinds of assessments. But for those faculty who had already created their assignments and assessments, in disciplines that are not used to that kind of strategy, we did have to talk them through proctoring strategies and the technical challenges sometimes associated with proctoring, which was exacerbated by the fact that many of our students just didn't have the equipment. They didn't have webcams; they didn't have their laptops. When we went remote, UCF happened to be on spring break, so students' stuff was still in their dorms and they couldn't go and get it. Even if they had the equipment, they couldn't access it in some cases. In other cases, we had students who didn't know they needed to purchase a webcam when they signed up for the course last November, and by the time they tried to obtain one, you couldn't get them. So it was a real struggle, trying to talk faculty through some of these different strategies and trying to recognize that not every student has the same homogenous setup, whether it's equipment or bandwidth.
Take something as simple as a test strategy of showing one question at a time in the LMS versus showing the whole test at once. Faculty liked to show one question at a time and randomize the questions; it's an academic integrity strategy. However, we started getting a bunch of complaints from students that their bandwidth was so low that each question took too long to load, and by the end of the test they hadn't gotten through it all. So we started encouraging faculty to show all the questions at once, which they weren't thrilled about. It was very practical things like that that we were dealing with.

John: We had some challenges just like Tom described in terms of what students had available to them. We had some students, especially in a computer science class where the faculty member would say, "You have to use Respondus LockDown Browser and the webcam monitor," and they couldn't buy a webcam. The faculty member was digging in on that requirement, and so we had to have some conversations about how, even if the students wanted to, they couldn't do it. So it's been a sort of massaging, asking faculty to rethink how they do assessment, which is then a short hop, skip, and a jump back to: what are your learning outcomes and goals, and are they in alignment, regardless of the pandemic? Students hate it when they're taught one way but assessed a different way. When those things are out of sync, students ask, "What do you mean, a 50-question multiple-choice test? We never went over these kinds of things in the class, in lecture or anywhere." Those kinds of disconnects pop up quite a bit, especially in a situation like this. But getting faculty to think about, at scale, how you can do assessment, where maybe it's not everybody taking the same test, and certainly not the same questions, randomizing those questions, that has been the challenge for us, particularly in the STEM disciplines, 'cause that's just what they've been accustomed to doing for years.

Cynthia: For us at Pitt, we did have some similar issues. Our decision to go online happened while our students were on spring break, and we started our remote classes March 23rd, I think it was. Because our semester ends earlier than a lot of others, we finished our classes toward the end of April, and our commencement is usually the last weekend in April. Because of that timing, we knew when the inquiries started coming in about remote proctoring that we were not going to have time to develop an enterprise solution. Some of our schools, our professional schools, had remote proctoring solutions in place. The university runs a large testing center for in-person, proctored testing, but that obviously would not be open, so what we tried to do was work with faculty and encourage them to think about other ways to approach assessment. Alternatives like open-book exams, short answers, or writing a transformative essay were things that our consultants suggested, and we spent time with faculty helping to move them in that direction. We also thought it might be a really good opportunity to rethink high-stakes exams, how to really measure student learning, and to consider some alternatives to the 100-question fill-in-the-blank tests. Most of our students at Pitt were able to come back to the residence halls and get their things in the week that we took to prepare for remote learning, so we didn't have some of those issues that John described, for example. But one of the things that our central IT group did in conjunction with the Provost's office was work to provide tablets and hotspots to students who didn't have robust enough access from the devices they had, or who didn't have devices they could use. And interestingly enough, we had some faculty in that situation, mostly with the network. Some of our faculty lived in places where there was just not good, dependable network access.
So we had to deal with some of those similar issues at Pitt, and it was a really nice partnership between IT and the Provost's office to make these devices and hotspots available to people.