Online learning is widely employed by higher ed institutions of all Carnegie classifications. It’s no mystery why: online programs benefit students of all demographic backgrounds, they are cost effective, and they underscore an institution’s commitment to accessibility.
Years back, such programs were lauded as a nod to the future of learning. Now that we’ve solidly entered the digital age, just how are these programs serving students? Are the policies, technologies and pedagogies related to distance education actually working?
The University of Wisconsin-Milwaukee’s National Research Center for Distance Education and Technology Advancements (DETA) has sought answers. DETA (pronounced “data”) is led by Tanya Joosten, Director of eLearning Research and Development, whose passion for research and the social sciences is palpable in regular conversation.
The SXSWedu advisory board member knows that everything starts with asking the right questions.
“How do we know how to design content? How can we best design interactions between instructors and students? We are only scratching the surface, but we’re finding more evidence behind whether various aspects of the teaching and the student support services we’re providing actually work,” she said.
Joosten, who has studied both communication technology and organizational communication, including the large-scale implementation of technologies, initially felt that the standards of rigor in online and distance education research were lacking.
“Early on, coming into distance education, online and blended learning, and instructional design — the quality wasn’t at the level that I and many others were accustomed to in our academic fields,” Joosten notes. “We needed solid research designs, as much of what was shared at the time was anecdotal evidence or descriptive in nature.”
In conversations with colleagues from a variety of institutions, Joosten determined that practitioners and administrators in distance education needed access to rigorous research design and methodologies.
“Practitioners aren’t researchers,” she noted. “The field is rapidly evolving, with new technologies churning out all the time. There was a definitive need to do applied research, so that we know we’re meeting our goals when we’re implementing practices to impact student learning.”
A need for ‘responsive research methods on the fly’ led to funding for the creation of DETA, which recently developed a publicly available research toolkit that has attracted considerable attention. With hundreds of downloads by faculty members, researchers, administrators, technology companies, and others around the world, Joosten has her own evidence of both the interest in and the need for the tool.
Joosten has brought the knowledge gained from the project back to the field. “We shared our experience with the toolkit at the ELI Annual Meeting last year, and have been conducting workshops at different conferences,” Joosten said. “‘How to conduct research at your institution’ has been a well-received topic.” (In November 2016, she’ll lead an ELI course on Promoting Effective Teaching and Learning Ecosystems through Research and Proven Practice, offering insights on how faculty members can design and implement their own research programs.)
The DETA team also convened 50 experts from across the U.S. to participate in a summit at the ELI Annual Meeting in February 2015. Their goal? To develop a collective agenda to explore and understand what factors influence access, instructional and learning effectiveness, and student satisfaction in distance and online education.
“We recognized the need to develop something everyone can use so we’re on the same page,” Joosten explained. “Bringing together experts from the community and engaging them in the process lends extra weight, whether they’re involved in making technology decisions, developing technology, or on the front lines teaching or conducting research. It helps the longevity of the process.”
Several months later, DETA issued a nationwide Call for Proposals supporting the evaluation of student success in online programs, which employed insights from the summit. (The grant was supported by the U.S. Department of Education and EDUCAUSE.) Now, they’re in the process of gathering data and findings from studies they funded, including cross-institutional surveys and experimental studies.
The latter is a methodological approach for understanding whether an ‘intervention’ (a pedagogical approach or a technology used) is actually impacting learning outcomes, and it is beginning to yield results. Several institutions able to establish both control and treatment groups participated. Joosten highlighted a couple of quasi-experimental studies examining content (a sketch of this kind of analysis follows the list):
- At Oregon State, researchers examined the impact of closed-captioned videos on student learning;
- Faculty at Cal State-Fullerton studied student-created content videos and their impact on student learning.
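To make the design concrete, here is a minimal sketch of how a treatment-versus-control comparison on a learning outcome might be analyzed. The file name, column names, and grouping below are illustrative assumptions, not details from the DETA-funded studies.

```python
# Minimal sketch: comparing a learning outcome between a control group and a
# treatment group (e.g., students who received closed-captioned videos).
# The file name and columns ("group", "final_score") are hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("course_outcomes.csv")  # hypothetical outcome export

control = scores.loc[scores["group"] == "control", "final_score"]
treatment = scores.loc[scores["group"] == "treatment", "final_score"]

# Welch's t-test: did the intervention group differ on the outcome measure?
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"Mean (control):   {control.mean():.2f}")
print(f"Mean (treatment): {treatment.mean():.2f}")
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A quasi-experimental design like this stops short of random assignment, so such a comparison suggests rather than proves that the intervention caused any difference observed.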
Other institutions participated in survey studies, including cross-institutional studies, examining the relationship between instructional characteristics (such as course and instructional design) or learner characteristics and student outcomes. Merging survey data with student information system data allows for more robust analysis than relying on a single data source. These institutions included Milwaukee Area Technical College, Florida SouthWestern, University of Central Florida, and San Diego Community College.
Of particular interest to Joosten is the intersection of survey data, student information system data, and learning technology platform data. Identifying meaningful student and instructor behavior data in these platforms makes it possible to better understand how to teach and which learning behaviors predict success. Describing her frustration with measuring student engagement from a single data source, Joosten noted, “Engagement is a far more complex construct than how often students are touching the technology.”
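As a rough illustration of that kind of multi-source analysis, the sketch below joins three hypothetical data sources on a shared student identifier and builds an engagement measure that blends a behavioral signal with a self-reported one. Every file and column name here is assumed for illustration.

```python
# Rough illustration: joining survey responses, student information system
# records, and learning platform activity on a shared student ID.
# All file and column names are hypothetical.
import pandas as pd

survey = pd.read_csv("survey_responses.csv")      # self-reported engagement, satisfaction
sis = pd.read_csv("sis_records.csv")              # demographics, GPA, completion
platform = pd.read_csv("platform_activity.csv")   # logins, posts, time on task

merged = (
    survey.merge(sis, on="student_id", how="inner")
          .merge(platform, on="student_id", how="inner")
)

# Engagement as more than "touching the technology": average a behavioral
# signal (forum posts) with a self-reported one, each as a percentile rank.
merged["engagement_index"] = (
    merged["forum_posts"].rank(pct=True)
    + merged["self_reported_engagement"].rank(pct=True)
) / 2

# Relate the blended measure to an outcome from the SIS data.
print(
    merged[["engagement_index", "course_completed"]]
    .groupby("course_completed")
    .mean()
)
```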
Edtech companies recognize this, she notes. “They realize the importance of having data from multiple sources beyond their platforms to determine how different aspects of their technology are, in fact, impacting student learning. It’d be a huge win-win if we as educators can have solid evidence of a technology's influence on student learning and, in return, impact how technology is being developed.”
To that end, Joosten would like to work with learning technology companies and others to identify meaningful measures, gather data from multiple sources, and analyze data to better understand student and instructor behaviors and predict student learning outcomes.
Interpreting quantitative and qualitative findings and translating them into changes in instructional and institutional practices can ultimately have a huge impact on students. “For example, advocating for instructional practices that work for all students — including first generation, minorities, low income, and those with disabilities — can be done most effectively once we know what works.”
More research ideas are in progress. Joosten wants to study how online course pedagogy and student support mechanisms, such as online learning readiness, just-in-time support through coaching and analytics, and virtual learning communities in and outside of class, increase college completion.
She acknowledges that one reason people don’t do research is a lack of access to relevant data. “One of our long-term goals is to build an open distance education data warehouse,” Joosten said. Such a resource would propel research for decades to come.
They have been gathering data from the institutional partners and subgrant awardees; all current data is scrubbed. Each institution is identified by a code based on IPEDS data (indicating region, whether the institution is 2-year or 4-year, its size, and so on) combined with a random element, so an outsider couldn’t figure out which institution it is.
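A simple sketch of how such a scrubbed institutional code might be constructed follows; the characteristics and the encoding scheme here are illustrative assumptions, not DETA’s actual implementation.

```python
# Illustrative sketch of a de-identified institution code: coarse IPEDS-style
# characteristics plus a random element so the institution can't be
# re-identified. The scheme is an assumption, not DETA's implementation.
import secrets

def institution_code(region: str, level: str, size: str) -> str:
    """Combine coarse characteristics (e.g., 'midwest', '4yr', 'large')
    with a random suffix that carries no institutional information."""
    random_part = secrets.token_hex(4)  # 8 hex chars an outsider can't guess
    return f"{region}-{level}-{size}-{random_part}"

print(institution_code("midwest", "4yr", "large"))
# e.g., 'midwest-4yr-large-9f3a1c2e'
```

The code preserves exactly the characteristics researchers need for analysis while the random suffix breaks any link back to a specific campus.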
“It’d be great for administrators,” Joosten said. “Imagine you’re developing an online program, and you have this dashboard. You could go to this dashboard and enter criteria, say 4-year, in this discipline, etc., and it’d indicate what you should be doing in your classrooms — such as what learner characteristics, instructional and course design would be most influential in your scenario.”
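In code terms, the imagined dashboard amounts to filtering a pooled, de-identified dataset by program criteria and summarizing what worked. Everything in the sketch below (the dataset, its columns, and the criteria) is hypothetical, since no such warehouse exists yet.

```python
# Hypothetical dashboard query: filter an open, multi-institution dataset by
# program criteria and summarize which practices correlated with completion.
# Dataset, column names, and criteria are all assumptions for illustration.
import pandas as pd

deta_open = pd.read_csv("deta_open_dataset.csv")  # imagined open data warehouse

criteria = (deta_open["level"] == "4yr") & (deta_open["discipline"] == "nursing")
subset = deta_open[criteria]

# Which instructional-design features were associated with higher completion
# at institutions like this one?
summary = (
    subset.groupby("design_feature")["completion_rate"]
          .mean()
          .sort_values(ascending=False)
)
print(summary.head())
```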
Joosten emphasized that solving problems with open data sets that include multiple data sources, such as student information systems, survey tools, and learning technology platforms, could be the future of online programming. Securing funding for such an effort will be the first step.
Interested in learning more? Download the DETA Research Toolkit.
Kristi DePaul of Founders Marketing provides editorial support and regular contributions to the Transforming Higher Ed column of EDUCAUSE Review on issues of teaching, learning, and edtech.