Student Feedback on Quality Matters Standards for Online Course Design


Key Takeaways

  • Online course offerings continue to surge, yet design issues often short-circuit the goals of students and instructors alike.
  • A team of researchers sought to illuminate this issue by studying student perceptions of a large online course designed using the Quality Matters (QM) standards.
  • The team's goal was to more fully understand how to improve online courses; its findings offer both validation for using QM standards as a basis for course design and tips on improving specific areas, such as accessibility and usability.

Online learning continues to expand in higher education, growing consistently for the past 13 years. Indeed, the Babson Group found that 28 percent of college students will take at least one online course in their academic career.1 The group's study also found that higher education administrators recognize that online learning is critical to their strategic initiatives for course delivery and that students want online courses among their choices.2

With the increase in online delivery, quality concerns also increase. To address this, in 2016 the federal government released a set of proposals to develop quality indicators for online learning.3 A review of these indicators noted that determining quality can be complex and often comes down to "a tension between two roles of quality assurance as a means of accountability and as a route to quality improvement."4

In reviewing online course quality, student success indicators must also be considered. Online learning has developed a reputation as an environment that can be detrimental to this success, as students tend to "disappear" within it and never complete the course. For higher education institutions dealing with increased demands for online learning and low student retention rates, it is a frustrating situation.5

Developing online courses structured to ensure that students will complete them successfully is an important goal that impacts students, instructors, and institutions alike. Given this, students' perceptions about online course structure could be crucial in their own success, as well as the success of online courses in general.

In earlier research,6 students identified several essential components to good online course design, including effective communication, assistance in working with other students, active involvement in the content, prompt feedback, time management, clear expectations, motivation, and hands-on learning. All of these factors are provided for in a well-designed online course that follows the Quality Matters (QM) Higher Education Rubric standards for proper design.

The goal of our team of researchers was therefore to assess, from the student perspective, the course design of a large online course based on these QM standards. The team also sought to more fully understand how to improve online course design and instruction to help students succeed.

Given this, the study was guided by two research questions:

  • Do students perceive when a large online course has an overall quality design based on QM Higher Education Rubric standards?
  • If students perceive a QM Higher Education Rubric standard as not meeting expectations, what can be done to improve the online course design?

As the "Online Course Design" box describes, research shows that students in online courses want to be able to find what they need, understand course expectations, and feel engaged. QM can help design courses that meet these requirements.

Online Course Design: Barriers and Opportunities

When teaching or learning in an online environment, barriers to learning always come to mind. Instructors and students alike may have preconceptions of online learning and/or had online learning experiences in the past that did not motivate them to continue teaching/learning online. When investigating how to enhance online teaching and learning, it is essential to review the barriers and opportunities that lie ahead.

Barriers to Student Success

Lin Muilenburg and Zane Berge noted eight student barriers to online learning7:

  • Administrative issues

  • Social interaction

  • Academic skills

  • Technical skills

  • Learner motivation

  • Time and support for studies

  • Cost and access to the Internet

  • Technical problems

Other researchers have focused primarily on findability. As Peter Morville defines it, findability is "the degree to which a particular object is easy to discover or locate, [as well as] the degree to which a system or environment supports navigation and retrieval."8

In their work, Bethany Simunich and colleagues investigated findability and its importance to student perceptions of or satisfaction with online courses, and thus whether findability should be considered more heavily in online course design.9 They found that findability is paramount for online students, because if those students cannot find important course components, they cannot use them. The likely result is frustration, lowered motivation, and decreased self-efficacy, all of which can impact both student learning and course attrition.10

Moreover, as other researchers noted, findability is part of usability in that, given findability, "the user can do what he or she wants to do the way he or she expects to be able to do it, without hindrance, hesitation, or questions."11

Indeed, research has singled out findability as not only the most significant predictor of both self-efficacy and motivation among students in online courses, but the only significant variable that predicts these two key factors. Aside from previous experience in online courses, no other single variable, including age, rank/year in school, GPA, or comfort level with using a computer, has proven significant here. Because lack of motivation is a recognized barrier to student learning in the online environment, findability and usability are doubly important in course design. The QM Higher Education Rubric has specific standards related to navigation, which, when properly designed, improves findability and usability.

Opportunities: The Student Perspective

George Bradford conducted a study to explore whether, from students' perspective, a relationship existed between cognitive load and student satisfaction with online learning.12 Bradford collected data from 1,401 college students who said they had experience with asynchronous, online courses prior to the term in which the study was conducted. Using factor analysis on the student data, he identified the following three factors as significant.

Awareness. Bradford found that when students can track their progress within a certain timeframe and receive timely feedback, they are more satisfied in general. Moreover, students look for clarity and clear expectations in the course syllabus, content presentation, due dates, and assignments. Furthermore, students are satisfied when instructors demand respectful online etiquette to maintain an enjoyable environment.

Challenge. Students find greater satisfaction when a course has relevance and provides appropriate challenges. Therefore, course design should incorporate a degree of certainty for overcoming the challenge if the performance details are clearly communicated. Additionally, the course design should include a realistic schedule to support challenges that require high levels of effort. Students reported that challenging assignments also have intrinsic value that further increases their satisfaction. Therefore, instructors should use rubrics, communicate clearly and in a timely way, and include supporting material for students to review so that they can successfully complete challenging assignments.

Engagement. Bradford's study found that students dislike the feeling of isolation in online learning environments. Therefore, instructors should incorporate communication into the course design and offer opportunities for communication to occur. Engagement also increases when course activities relate to students' major field of study or life experiences. Students also prefer to have options when completing assignments. Therefore, instructors should provide those options both to offer students variety and encourage individuality. Finally, students prefer learning through active communication; Bradford therefore recommends that instructors employ strategies that leverage active communication, such as modeling how to be a productive engaged communicator, developing effective threaded discussions, and asking probing questions. Active communication strategies should also leverage student motivation and offer positive feedback through engagement as students participate.

As Bradford's findings suggest, engagement can be positively influenced when students are aware of course conditions and performance standards, and a challenge is properly set.13 The QM Higher Education standards address course conditions and performance standards, and connecting appropriate content to high-level standards creates the challenge.

The Learning-Performance-Design Connection

The QM standards recognize the need to connect learning outcomes with performance and assessment to design a challenging, engaging online course. A few QM standards that represent these essential components include the following:

  • The instructor's plan for classroom response time and feedback on assignments is clearly stated.

  • The requirements for learner interaction are clearly stated.

  • The assessments measure the stated learning objectives or competencies.

  • The course grading policy is clearly stated.

  • Specific and descriptive criteria are provided for the evaluation of learners' work and are tied to the course grading policy.

  • The instructional materials contribute to the achievement of the stated course and module/unit learning objectives or competencies.

These and other key rubric standards, described in the main article, can help instructors design better online courses.

The Quality Matters Rubric Standards

The QM Higher Education Rubric was developed as a tool for designing online courses and promoting continuous improvement in online course development.14 The QM Rubric's eight general standards and their intentions follow:

  1. Course Overview and Introduction: ensures that students can easily understand how to begin a course, what the course is about, who the instructor is, and what the student and instructor can expect from each other.
  2. Learning Objectives: outlines the purpose of the course; each objective must be written from a student perspective and summarize the skills that a successful student will demonstrate by the end of the course.
  3. Assessment and Measurement: ensures that assessments align with the learning objectives, the grading policy is clear, and students have a good understanding of how they will be evaluated.
  4. Instructional Materials: focuses on whether instructional materials provide an adequate resource for students to achieve the course learning objectives and considers whether materials are thoughtfully selected, current, and demonstrate multiple perspectives.
  5. Course Activities and Learner Interaction: seeks to verify whether activities and interactions further the attainment of the learning objectives based on the interactions with content, the instructor, and other students.
  6. Course Technology: ensures that course technologies are used to achieve learning objectives in an optimal way.
  7. Learner Support: focuses on information about or links to the university's academic policies and services and student support services, and how students can access such policies and services.
  8. Accessibility and Usability: looks for documentation of course material accessibility as well as that of other course tools and activities, to ensure they are accessible to all students. If students need accommodations when using course materials, instructions for obtaining accommodations should be provided.

Each of these general standards consists of various specific standards, with a total of 43 specific standards for the entire rubric.

Research Methodology

The course studied was a large online-only course at the University of South Carolina focused on learning Microsoft Excel in an asynchronous environment. The course had been previously approved through the university's Center for Teaching Excellence (CTE) Distributed Learning Quality Review (DLQR) program.

DLQR is based on the QM standards and serves as an internal review of all university courses with more than 50 percent of the content delivered through distributed learning. The DLQR review is completed by CTE instructional designers. Our interest, however, was in using the QM standards to obtain a more student-centered review of the course.

Study Participants

The course had 199 students enrolled; 109 participated in our study for a response rate of 55 percent. The course is a requirement in the College of Hospitality, Retail, and Sport Management; it also appeals to many students in other departments across campus. Of the 109 students participating in our study, 100 were taking the course because their major or minor required it, 5 took the course because their advisor recommended it, 3 were interested in the topic, and 1 took it as an elective. Table 1 shows the general demographics of our participants.

Table 1. Participant demographic variables

| Demographic | Total (n) | Percentage |
|---|---|---|
| Age ranges | | |
| 18–19 | 55 | 50% |
| 20–21 | 41 | 38% |
| 22–23 | 7 | 6% |
| 24–25 | 5 | 5% |
| 26 or older | 1 | 1% |
| Gender | | |
| Female | 60 | 55% |
| Male | 49 | 45% |
| Race | | |
| Caucasian | 92 | 84% |
| African American | 12 | 11% |
| Asian | 3 | 3% |
| Native American | 1 | 1% |
| Multi-racial | 1 | 1% |
| Number of prior online courses | | |
| 1–2 | 84 | 77% |
| 3–4 | 19 | 17% |
| 5 | 4 | 4% |
| 6 or more | 2 | 2% |

Course Procedures

The students used Blackboard as the learning management system (LMS), interacted with each other and the professor through frequently asked question (FAQ) discussion boards, and interacted with course content through Blackboard and Pearson's MyITLab.

MyITLab provided simulations and content interactions in various ways, along with capstone summative activities in which students used Microsoft Excel to complete assignments. These assignments, referred to as capstone projects, were completed at the end of each chapter. Students uploaded these projects into MyITLab, where they were automatically graded, providing immediate feedback. Students also completed multiple-choice quizzes in Blackboard that were automatically graded and provided immediate feedback as well.

Students could complete the MyITLab Microsoft Excel simulations and Blackboard quizzes as many times as they wanted to earn the preferred grade. They could also complete the capstone projects twice, with the highest grade recorded. However, the final exam (which resembled a capstone project) could be completed only once.

QM Survey

Selma Vonderwell suggested that student perspectives can provide an in-depth understanding of the effectiveness of student learning in an online environment.15 We therefore focused on the QM Higher Education Rubric standards to obtain student feedback regarding the design of the online course and the implication of that feedback for instructors.

We gave each student participant a link to a 42-item online survey on Blackboard, offering minimal extra credit if a student completed the survey. The QM Rubric has 43 specific standards; however, due to researcher error, one standard was unintentionally omitted from the survey instrument. The omitted item was Standard 3.4: "The assessment instruments selected are sequenced, varied, and suited to the learner work being assessed." Because three other standards pertained to assessment, and Standard 3.4 was not an essential standard, we believe omitting this standard did not impact our findings.

We asked participants to provide feedback on how the course performed in relation to each of the remaining 42 standards by choosing from one of three ratings: (1) Did Not Meet Expectations; (2) Met Expectations; (3) Exceeded all Expectations.

Within the QM Rubric's 43 specific standards are 21 essential standards. In a traditional QM course review, all 21 essential standards must be met, and a score of at least 84 out of 99 points must be earned, for a course to qualify as a Quality Matters approved course. Each standard is assigned a specific point value; essential standards are worth three points each, and other specific standards are worth one or two points each. For example, under the Course Overview and Introduction general standard are the following essential and regular specific standards, respectively (a sketch of this scoring arithmetic follows the list):

  1. Instructions make clear how to get started and where to find various course components. Essential standard (3 points)
  2. Etiquette expectations for online discussions, e-mail, and other forms of communication are clearly stated. Regular standard (2 points)
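To make the scoring arithmetic concrete, here is a minimal Python sketch of how a review outcome could be tallied under these rules. The data structure and the sample standards are illustrative assumptions; only the point values and thresholds (3 points for essential standards, 84 of 99 points to pass) come from the rubric as described above.

```python
# Minimal sketch of QM review scoring; the standards listed here are
# illustrative placeholders, not the full 43-standard rubric.
from dataclasses import dataclass

@dataclass
class SpecificStandard:
    description: str
    points: int   # essential standards = 3; other specific standards = 1 or 2
    met: bool     # reviewer's judgment for this course

def qm_review_outcome(standards, passing_score=84, total_possible=99):
    """A course qualifies only if every essential (3-point) standard is met
    and the earned points reach the passing score."""
    essential_met = all(s.met for s in standards if s.points == 3)
    earned = sum(s.points for s in standards if s.met)
    return {
        "points_earned": earned,
        "points_possible": total_possible,
        "all_essential_met": essential_met,
        "qm_approved": essential_met and earned >= passing_score,
    }

# Example using the two Course Overview and Introduction standards above:
sample = [
    SpecificStandard("Instructions make clear how to get started", 3, True),
    SpecificStandard("Etiquette expectations are clearly stated", 2, True),
]
print(qm_review_outcome(sample))
```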

Personality Assessment

In addition to the survey, we also gave students a learner assessment based on previous learning personalities research.16 This research labels personality types according to color; our goal here was the same as in the previous study: to identify learning personalities and matching teaching strategies and thereby increase opportunities for success.

The assessment consisted of questions related specifically to the previous personality trait research. These questions included True Colors character words that students ranked from "most like me" to "least like me." The assessment was provided as an addendum to the QM Standards survey questions. The overall makeup of the students enrolled in this course was as follows (a simple tallying sketch appears after the list):

  • Orange: 32 percent
  • Gold: 26 percent
  • Blue: 23 percent
  • Green: 19 percent
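As a rough sketch of how ranked answers of this kind can be tallied, the code below awards descending points to each student's ranking of the four color groups and classifies the student by the highest total. The 4-3-2-1 point scheme is an assumption for illustration, not the instrument's actual scoring.

```python
# Hypothetical tally for a ranked "most like me" to "least like me" assessment.
from collections import defaultdict

COLORS = ["Orange", "Gold", "Blue", "Green"]

def dominant_color(ranked_answers):
    """Each answer lists the four colors from 'most like me' to 'least like me'.
    Award 4 points to the first color, 3 to the second, and so on; the color
    with the highest total is treated as the student's learning personality."""
    totals = defaultdict(int)
    for ranking in ranked_answers:
        for points, color in zip((4, 3, 2, 1), ranking):
            totals[color] += points
    return max(COLORS, key=lambda c: totals[c])

# Toy example: a student who consistently ranks Orange first.
print(dominant_color([
    ["Orange", "Blue", "Gold", "Green"],
    ["Orange", "Gold", "Blue", "Green"],
]))  # -> Orange
```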

Orange students need freedom to choose how to complete a project, appreciate immediate feedback, and prefer less specific guidelines. Given the freedom to take quizzes and complete simulations multiple times with immediate feedback, coupled with the ability to complete capstone projects twice, Orange students would be more likely to succeed.

Gold students need detail and appreciate specific guidelines. The syllabus in this course provided considerable detail, step-by-step instructions, and specific due dates. These components helped Gold students get organized, which is one of their strengths, and provided an opportunity to "check off" tasks as they were completed.

Blue students are social and have a need to help others; the FAQ discussion boards allowed for such interaction, assistance, and social engagement.

Green students need to be challenged and work independently; the Microsoft Excel comprehensive content offered challenge, and the online course allowed for independent work.

Study Findings

After reviewing the students' survey responses, we were able to determine how the students viewed the course in relation to the quality-approved design and to investigate where improvements could be made.

We analyzed the data using multiple response categorization, combining all participants' responses for each of the eight general standards. Participants overwhelmingly rated every general standard as Met Expectations or Exceeded All Expectations (see table 2).
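As a rough illustration of this aggregation step, the following Python sketch pools hypothetical per-item responses for a general standard and reports the percentage in each rating category. The input format and the toy numbers are assumptions for illustration, not the study's actual data or analysis code.

```python
# Hypothetical sketch of combining survey responses per general standard.
from collections import Counter

RATINGS = ["Did Not Meet Expectations", "Met Expectations", "Exceeded All Expectations"]

def combine_responses(responses_by_standard):
    """Pool the responses to all specific-standard items under each general
    standard and report the percentage falling into each rating category."""
    summary = {}
    for standard, responses in responses_by_standard.items():
        counts = Counter(responses)
        total = sum(counts.values())
        summary[standard] = {
            rating: round(100 * counts.get(rating, 0) / total, 2)
            for rating in RATINGS
        }
    return summary

# Toy example with made-up responses for a single general standard:
example = {
    "Accessibility and usability": ["Met Expectations"] * 52
    + ["Exceeded All Expectations"] * 45
    + ["Did Not Meet Expectations"] * 3
}
print(combine_responses(example))
```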

Table 2. Percentage of participants selecting each rating*

| General Standard | Did Not Meet Expectations | Met Expectations | Exceeded All Expectations |
|---|---|---|---|
| 1. Course overview and introduction | 2.04% | 41.49% | 56.47% |
| 2. Learning objectives | 1.47% | 46.97% | 51.56% |
| 3. Assessment and measurement | 0.92% | 43.81% | 55.28% |
| 4. Instructional materials | 1.38% | 45.11% | 53.06% |
| 5. Learner interaction and engagement | 0.69% | 46.10% | 53.21% |
| 6. Course technology | 2.39% | 48.07% | 49.54% |
| 7. Learner support | 0.46% | 50.00% | 49.54% |
| 8. Accessibility and usability | 2.75% | 52.66% | 44.59% |

*Across all general standards, the large majority of students (97.25–99.54 percent) rated the standard as having met or exceeded all expectations.

  1. Course Overview and Introduction: 97.96 percent of students reported that the standard met or exceeded all expectations because the course's structure was clearly outlined, the instructor provided thorough instructions on how to get started in the course, and student and instructor introductions were appropriate and available. Moreover, the instructor stated the minimum technical skills expected of students to succeed in the course.
  2. Learning Objectives: 98.53 percent of students reported that the standard met or exceeded all expectations due to clearly defined learning objectives, written from the students' perspective and described as measurable outcomes for the course.
  3. Assessment and Measurement: 99.09 percent of students reported that the standard met or exceeded all expectations because the course grading policy was clearly stated and the type of assessments used to evaluate student progress aligned with the learning objectives. Additionally, the assessment instruments were deemed appropriate for the course activities.
  4. Instructional Materials: 98.17 percent of students reported that the standard met or exceeded all expectations because the materials were current and presented a variety of perspectives, and the materials used for learning activities were clearly explained.
  5. Learner Interaction and Engagement: 99.31 percent of students reported that the standard met or exceeded all expectations because activities promoted student learning objectives and these activities provided opportunities for student engagement. The response time for instructor feedback was also clearly stated.
  6. Course Technology: 97.61 percent of students reported that the standard met or exceeded all expectations because course tools and media appropriately supported and guided them toward a positive learning experience. Students also had access to the required technologies for this course.
  7. Learner Support: 99.54 percent of students reported that the standard met or exceeded all expectations due to clear instructions on how to access technical support. For instance, the course clearly explained how university resources can help students succeed and how to obtain access to those resources.
  8. Accessibility and Usability: 97.25 percent of students reported that this standard met or exceeded all expectations based on how well the course employed accessible technologies and offered guidance on obtaining accommodations.

Room for Improvement: Accessibility

When conducting a study of this type, it is essential to review the specific standards within the general standards to begin to improve course design and teaching in those areas. In our case, the general standard most frequently rated as not meeting students' expectations (2.75 percent) was Accessibility and Usability.

Table 3 shows the survey data on the specific standards within this general standard.

Table 3. Accessibility and Usability specific standards rated "Did Not Meet Expectations"

| Specific Standard | Total (n) | Percentage |
|---|---|---|
| Course navigation facilitates ease of use.* | 5 | 4.6% |
| Information is provided about the accessibility of all technologies required in this course.* | 2 | 1.8% |
| The course provides alternative means of access to course materials in formats that meet the needs of diverse learners. | 4 | 3.7% |
| The course design facilitates readability. | 4 | 3.7% |
| Course multimedia facilitate ease of use. | 0 | 0.0% |

* Essential standard

Providing all content to all students is essential. Therefore, even if only a small percentage of students indicate that this area did not meet expectations, it is imperative that all issues be corrected; this relates directly to the findability and usability research discussed previously.

To improve course navigation, instructors must plan for the learner's ease of movement from one place to another. This includes considering all icons, links, and other controls that contribute to consistent, user-friendly navigation. Labeling and content organization are essential as well, because students new to online learning might need assistance moving from one location to another. Navigation should feel seamless and intuitive rather than difficult to decipher.

All content must be accessible, whether through screen readers, PDF documents, captions, or transcripts. Providing alt-text for graphics and other equivalent, accurate representations of content is essential. It is also important to use white space effectively and to carefully consider font color, style, and size, along with how content relationships are conveyed visually.

General Course Weaknesses and Strengths

Student comments within the survey illuminated the course's weaknesses and strengths. Because students rated the course highly, there were few comments on weaknesses. However, it is still important to review all weaknesses and strengths to obtain an overall picture of the course as we consider ongoing assessment and improvements. The following weaknesses and strengths are organized by QM Rubric general standards and summarized into the bulleted themes that arose from the comments.

  • Learning Objectives
    • Class too hard
    • Volume of work
  • Assessment and Measurement
    • Quizzes did not seem to assess course content
    • Some automatic grading glitches
    • Limit of one submission on final exam
  • Instructional Material
    • No lecture
    • Repetitive structure
  • Course Activities and Learner Interaction
    • Interaction with other students
  • Accessibility and Usability
    • Mac issues with MyITLab

Comments about course strengths included the following:

  • Assessment and Measurement
    • Immediate feedback on assignments
    • Ability for multiple submissions
  • Instructional Materials
    • Training videos
    • Simulations
    • Interactive assignments
  • Course Activities and Learner Interaction
    • FAQ discussion board for help
    • Good interaction with professor
  • Other
    • Ability to work at own pace
    • No F2F class meetings
    • Freedom

Recommendations and Conclusions

As previous research has shown, we cannot assume that "good teaching" is an inherent trait among instructors.17 For students to succeed in an online learning environment, the instructor must implement good teaching via a good online course design and seek continued feedback from their students.

The QM Higher Education Rubric provides thorough guidelines to help instructors meet this goal. For example, students are often anxious when beginning a new course; one way to alleviate their anxiety is by describing the course's general purpose, as well as clearly defining the course outcomes and how they align with assignments. This gives students a better understanding of the purpose of course activities.

When students are informed of each assignment's purpose and evaluation criteria, they are more likely to understand expectations, work to meet them, and attain knowledge at a higher level. Instructors should also ensure that assessments and evaluations appropriately measure student learning outcomes, and they should clearly state the course grading policy.

Students will be more likely to succeed in online courses when instructors provide thorough explanations and define goals clearly — and also seek student feedback in a timely manner. Doing so not only allows instructors to continue improving the course design and their own teaching methods, it makes students feel more valued and engaged than if they are simply passive consumers of course content.

Notes

  1. D. Frank Smith, "Report: One in Four Students Enrolled in Online Courses," EdTech.
  2. Ibid.
  3. Sri Ravipati, "Federal Government Seeks to Regulate Online Education Programs," Campus Technology, July 26, 2016.
  4. Stamenka Uvalic-Trumbic and Sir John Daniel, A Guide to Quality in Online Learning, Academic Partnerships, 2016, 1–28.
  5. Papia Bawa, "Retention in Online Courses: Exploring Issues and Solutions—A Literature Review," SAGE Open, Vol. 6, No. 1 (January 5, 2016); DOI: 10.1177/2158244015621777.
  6. Tena B. Crews, Kelly Wilkinson, and Jason Neill, "Principles for Good Practice in Undergraduate Education," MERLOT Journal of Online Learning and Teaching, Vol. 11, No. 1 (March 2015): 87–103; see 88.
  7. Lin Y. Muilenburg and Zane L. Berge, "Student Barriers to Online Learning: A Factor Analytic Study," Journal of Distance Education, Vol. 26, No. 1, 2007: 29–48.
  8. Peter Morville, Ambient Findability, O'Reilly, 2005, 3.
  9. Bethany Simunich, David B. Robins, and Valerie Kelly, "The Impact of Findability on Student Motivation, Self-Efficacy, and Perceptions of Online Course Quality," American Journal of Distance Education, Vol. 29, No. 3, 2015: 174–185.
  10. Ibid.; see 174.
  11. Jeffrey Rubin, Dana Chisnell, and Jared Spool, Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 2nd ed., Wiley, 2008.
  12. George R. Bradford, "A Relationship Study of Student Satisfaction with Learning Online and Cognitive Load: Initial Results," The Internet and Higher Education, Vol. 14, No. 4, 2011: 217–226.
  13. Ibid.; see 223.
  14. Quality Matters, Why QM?, 2017.
  15. Selma Vonderwell, "An Examination of Asynchronous Communication Experiences and Perspectives of Students in an Online Course: A Case Study," The Internet and Higher Education, Vol. 6, No. 1 (1st Quarter 2003): 77–90; DOI: 10.1016/S1096-7516(02)00164-1.
  16. Tena B. Crews, Sradha Sheth, and Tamlyn Horne, "Understanding the Learning Personalities of Successful Online Students," EDUCAUSE Review, February 24, 2014.
  17. Tena B. Crews, Kelly Wilkinson, and Jason Neill, "Principles for Good Practice in Undergraduate Education."

Tena B. Crews is a professor and associate provost of Academic Programs and director of Distributed Learning at University of South Carolina.

Tiffany M. Bordonada is an assistant professor in the Counseling and Human Services Department at the University of Scranton's College of Professional Studies.

Kelly Wilkinson is a professor and associate dean in Indiana State University's Scott College of Business.

© 2017 Tena B. Crews, Tiffany M. Bordonada, and Kelly Wilkinson. The text of this EDUCAUSE Review article is licensed under the Creative Commons BY-NC-ND 4.0 license.