That's the Power of Words: The Effects of Changing Survey Question Language


A look back in time revealed how small, intentional changes to the ETRAC student survey language yielded some unexpected results.


If we had the ability to hop into a time machine and travel back to 1955 like Marty McFly did in the film Back to the Future, we would see some stark reminders of how far technology has come over the years: in 1955 telephones had cords and dials, computers were so large they filled entire rooms, and paper maps were used to plot travel routes. Many things look very different due to the evolution of technology over the past 65 years. However, even if the time circuits on the DeLorean were set to just 16 years ago, the world still looks pretty different by tech standards: in 2004 there was no iPhone, no Siri or Alexa, and no Instagram. And most people used one device to take pictures and another to listen to music. The EDUCAUSE Center for Analysis and Research (ECAR) started administering ETRAC surveys in 2004, and when we compare that first survey to our most recent—or even when we compare our most recent surveys to those from just a few years ago—it's evident that the questions have changed over time to reflect advancements in technology, as well as user needs and habits.

Reviewing the ETRAC student and faculty surveys from previous years reminds us how higher education technology has advanced and shifted, and how our analysts, statisticians, and researchers must continually think about ways to ask questions that yield valid and useful data. Relevance is critically important, and as a result, ETRAC survey questions must reflect technology trends and demographic changes, as well as the cultural shifts that often result from these changes. So each year, the ECAR team takes great care to develop survey instruments that are sound, germane, and as useful as possible to stakeholders. This sometimes involves changing the language used in a survey question to keep pace with the times. But changing a survey question can be tricky. Similar to the Hollywood concept of time travel, where a small change to the past can have unintended consequences for the future, replacing even one word can alter how a question is understood and interpreted. Even a slight change to a question can affect the analysis of longitudinal data and the tracking of trends over time. A modification like that won't unravel the very fabric of the space-time continuum (as Doc Brown so ominously warns Marty McFly), but it does cause ripple effects. Take, for example, the changes we've made over the last few years to a question about student learning environment preferences.

Let's start with a minor temporal displacement and go back to 2013, when the student study first explored learning environment preferences with four simple choices: students could choose among courses with no online components, courses with some online components, and completely online courses (or indicate no preference). The majority of students (63%) indicated a preference for blended environments. "No online components" (26%) was selected by more than twice as many students as "completely online" (11%). In addition to asking about personal preferences, the 2013 survey also asked students to identify the environment in which they learn the most. The response options were identical to those in the learning environment preferences question, and the results of the two questions were within a few percentage points of each other. In 2013 and 2014, a clear theme surfaced: students preferred face-to-face over online learning.

Moving along to 2015 (roads not needed), we added "mostly but not completely online" as a response option to increase the precision of the blended learning responses. After getting feedback from subject-matter experts that our scale needed a middle category, in 2017 we added a "half-and-half" option between courses with no online components and those that are completely online (see figure 1). While we refined the blended learning options, we also explored the differences between students' preferences and their self-reported best environment for learning. Asking about the "best" environment was motivated by a desire to gain insight into maximizing student success but tempered by the limitations of self-reporting. Ultimately, we settled on the more straightforward option—asking students what they prefer—noting that results have been nearly identical in the years we presented both questions.

Figure 1. Changes to learning environment preferences response options
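To make the bookkeeping behind these revisions concrete, here is a minimal sketch, in Python, of how an analyst might record the response scale year by year and flag where options were added or dropped, and thus where longitudinal trend lines may break. The option labels are taken from the surveys themselves (see table 1); the data structure and the flag_scale_changes function are our hypothetical illustration, not ECAR's actual tooling.

```python
# A minimal sketch, not ECAR's actual tooling: record each year's response
# scale and flag where options were added or dropped between survey years.
# Note that a renamed option shows up as an add/drop pair, which is exactly
# the kind of change that can break a longitudinal trend line.
SCALE_BY_YEAR = {
    2013: [
        "Courses with no online components",
        "Courses with some online components",
        "Courses that are completely online",
        "No preference",
    ],
    2017: [
        "One with no online components",
        "One with some online components",
        "About half online and half face-to-face",
        "One that is mostly but not completely online",
        "One that is completely online",
        "No preference",
    ],
    2018: [
        "One that is completely face-to-face",
        "One that is mostly but not completely face-to-face",
        "About half online and half face-to-face",
        "One that is mostly but not completely online",
        "One that is completely online",
        "No preference",
    ],
}

def flag_scale_changes(scales: dict[int, list[str]]) -> None:
    """Print options added or dropped between consecutive survey years."""
    years = sorted(scales)
    for prev, curr in zip(years, years[1:]):
        added = sorted(set(scales[curr]) - set(scales[prev]))
        dropped = sorted(set(scales[prev]) - set(scales[curr]))
        if added or dropped:
            print(f"{prev} -> {curr}:")
            print(f"  added:   {added}")
            print(f"  dropped: {dropped}")

flag_scale_changes(SCALE_BY_YEAR)
```

Run on the scales above, the diff makes plain that the 2018 revision relabeled the face-to-face end of the scale without changing its structure, which is part of what made the resulting shift in responses so surprising.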

When preparing the 2018 survey instrument, the ECAR team identified an additional opportunity to refine and improve upon the question about students' preferred learning environment. Up to this point, we had largely defined our scale in terms of the presence, level, and absence of online components. We realized, however, that a singular focus on the degree to which learning environments had (or did not have) online components masked the face-to-face aspect of any of those options and forced the respondent to imagine what an environment with "no online components" looked like. To render the item response options more accurate and to reduce the cognitive load—or mental effort—needed to respond to the question, we revised the response options once again to include the language of "face-to-face" and "online" environments (see table 1).

Table 1. Evolution of learning environment preferences survey question and response options

2013: "What type of learning environment do you prefer?"
  • Courses with no online components
  • Courses with some online components
  • Courses that are completely online
  • No preference

2017: "In what type of learning environment do you most prefer to learn?"
  • One with no online components
  • One with some online components
  • About half online and half face-to-face
  • One that is mostly but not completely online
  • One that is completely online
  • No preference

2018: "In what type of learning environment do you most prefer to learn?"
  • One that is completely face-to-face
  • One that is mostly but not completely face-to-face
  • About half online and half face-to-face
  • One that is mostly but not completely online
  • One that is completely online
  • No preference

Well, we certainly changed the way students interpreted this question! While only 10% of students selected "no online components" in 2017, 38% selected the theoretically equivalent "completely face-to-face" option in 2018. The 2017 "some online components" option garnered 49% of student responses, which amounted to about two-thirds of all blended-environment responses.1 In comparison, the 2018 "mostly but not completely face-to-face" option drew only 32% of responses,2 though that was still a comfortable majority of the blended responses (see figure 2). The flip-flop in 2018 was not a fluke: the results of the 2019 survey, based on the same wording, are nearly identical.3 These findings support our conclusion that the results were driven by the wording changes in the response options, not by a dramatic shift in student preferences.
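As a rough illustration of why a 10%-to-38% swing cannot be dismissed as sampling noise, here is a minimal sketch of a two-proportion z-test in Python. The sample sizes (10,000 respondents per year) are placeholders we chose for illustration, not ECAR's actual respondent counts, and this is not the analysis ECAR ran; the reports cited in the notes document the real samples.

```python
# Hedged sketch: two-proportion z-test comparing the 2017 "no online
# components" result (10%) with the 2018 "completely face-to-face" result
# (38%). Sample sizes are placeholders, not ECAR's actual counts.
import math

def two_prop_ztest(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for a difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value via the standard normal CDF:
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 10% of 10,000 respondents (2017) vs. 38% of 10,000 respondents (2018)
z, p = two_prop_ztest(1_000, 10_000, 3_800, 10_000)
print(f"z = {z:.1f}, p = {p:.3g}")  # |z| around 46: far beyond chance variation
```

With any plausible sample size in the thousands, the swing sits orders of magnitude outside chance variation, which is consistent with our reading that the wording, not the students, changed.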

Figure 2. Differences in results based on wording changes to the learning environment preference options

One possible explanation for this change in our results is that students interpreted the "no online components" option literally. Even in a completely face-to-face course, students would expect some online interaction, such as accessing the syllabus or checking grades in the LMS. A respondent might read "online component" as any kind of networked tool or resource, rather than as a question about whether a course meets online or in person in a brick-and-mortar classroom. In this light, "completely face-to-face" may ring true for more students.

The challenge for the ECAR team has been to capture the learning environment landscape accurately and to convey that landscape to students with a few simple response options. Sometimes we rely heavily on responses from previous years to shape our perspectives and adjust questions accordingly. We did not foresee the shift in responses spurred by our wordsmithing; we made what we considered minor improvements simply because we wanted to be as clear and accurate as possible. Fortunately, although the changes to the survey response options shifted the distribution of responses, they did not change the story the results tell (no worries about a time-travel-like paradox here). In other words, the conclusions drawn from this survey question remain consistent: students prefer blended learning, with a lean toward the face-to-face end of the spectrum. But this is a clear example of the caution and attention to detail that are imperative in research and survey design, especially when questions and response options change iteratively over time.

For more information and analysis about higher education IT research and data, please visit the EDUCAUSE Review Data Bytes blog as well as the EDUCAUSE Center for Analysis and Research.

Notes

  1. D. Christopher Brooks and Jeffrey Pomerantz, ECAR Study of Undergraduate Students and Information Technology, 2017, research report (Louisville, CO: ECAR, October 2017).
  2. Joseph D. Galanek, Dana C. Gierdowski, and D. Christopher Brooks, ECAR Study of Undergraduate Students and Information Technology, 2018, research report (Louisville, CO: ECAR, October 2018).
  3. Dana C. Gierdowski, ECAR Study of Undergraduate Students and Information Technology, 2019, research report (Louisville, CO: ECAR, October 2019).

Dana Gierdowski is a Researcher at EDUCAUSE.

Ben Shulman is a Statistician at EDUCAUSE.

D. Christopher Brooks is Director of Research at EDUCAUSE.

© 2020 Dana Gierdowski, Ben Shulman, and D. Christopher Brooks. The text of this work is licensed under a Creative Commons BY-NC-ND 4.0 International License.