Join keynote speaker Goldie Blumenstyk for a signing of her book, American Higher Education in Crisis?

Books are available for pre-purchase for $16.95 (+ tax).

Conference Program now posted! This year's line-up includes:


OLC Excellence and Effective Practice Award Recipients Announced


Add/remove sessions from the Program Listing on the website or in the mobile app to create a list of sessions you want to attend!

My Schedule

Join Keynoters Goldie Blumenstyk (Chronicle of Higher Education) and Phil Hill and Michael Feldstein (MindWires Consulting)

BYOD (bring your own device) to learn, explore, and share knowledge within this lab environment

Test Kitchen

Save the Dates

22nd Annual OLC International Conference
November 16-18, 2016 | Orlando, Florida | Walt Disney World Swan/Dolphin Resort

OLC Innovate 2016 - Innovations in Blended and Online Learning
April 20-22, 2016 | New Orleans, LA | Sheraton New Orleans Hotel

Moving Beyond Quality Matters: Building a Custom Fit Multimodal Approach for Evaluating Course Design

Nicola Wayer (Champlain College, USA)
Additional Authors
Josh Blumberg (Champlain College, USA)
Session Information
October 16, 2015 - 10:45am
Institutional Strategies & Innovations
Major Emphasis of Presentation: 
Practical Application
Institutional Level: 
Multiple Levels
Audience Level: 
Session Type: 
Information Session
Oceanic 7
Session Duration: 
45 Minutes
Concurrent Session 10

The challenge: to maintain and measure online course quality and prioritize development needs. See how one college developed an integrated, multi-faceted approach to quality assurance.

Extended Abstract


One of the challenges of online learning is maintaining high-quality online courses and programs and identifying which courses need to be updated or redeveloped. Course quality starts at the planning and design phase and continues through the course production and delivery phases. In evaluating the quality of an online course, three pillars encompass the life cycle of a course: course design, production, and facilitation.

Popular methods for evaluating courses include standardized course evaluations for student feedback and instructional design rubrics such as those developed by Quality Matters (https://www.qualitymatters.org/) or California State University, Chico (http://www.csuchico.edu/eoi/). Some institutions, however, may find that neither of these rubrics fits their needs and that student evaluations focus more on course facilitation than on design. Champlain College addressed this problem by creating its own set of course evaluation tools, including a course design rubric, design-focused student feedback surveys, production checklists, and an automated course analysis tool. These tools support an integrated approach to quality assurance (QA) throughout the life cycle of a course, spanning the planning and design, production, and facilitation phases. This presentation will describe the multi-faceted approach Champlain uses to save time and allow data from multiple sources to feed into the planning process. It will focus on the pillar of course design and will describe the process Champlain used to identify what the institution values in online courses and how it developed a course evaluation rubric to meet its needs based on those values.

Quality Matters (QM) is designed to be a "nationally recognized, faculty-centered peer review process designed to certify the quality of online courses and online components" (https://www.qualitymatters.org/higher-education-program). Like many other institutions, Champlain College has its own template for online course design, with guidelines for syllabus and course components that faculty must follow. Courses are developed in partnership between the eLearning department and faculty subject matter experts (SMEs), and faculty members are supported throughout the process. Given the College's requirements for all online courses, aspects of course design such as the QM rubric's Course Overview and Introduction, Course Technology, Learner Support, and Accessibility and Usability sections are less meaningful because these aspects are addressed by the eLearning department rather than by individual faculty members. The remaining sections of the QM rubric take a broader view of assessment, activities, and instructional materials within a course. To evaluate the quality of each specific course element (e.g., weekly overviews, lectures, discussions, assignments), the instructional design team at Champlain developed its own rubric that reflects research-based best practices for online teaching and learning. As part of the development process, the rubric was piloted and inter-rater agreement was calculated using Fleiss' kappa, with a mean agreement of 0.334, suggesting fair agreement. This presentation will share the process used to develop the Champlain Instructional Design Evaluation Rubric (CIDER), examine the rubric itself, and show how data are tabulated and shared to support evidence-based decision-making.
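The Fleiss' kappa statistic cited above can be computed directly from a ratings table. The sketch below is a minimal, generic implementation; the pilot data are not published in this abstract, so the example table is hypothetical (four rubric items, three reviewers, three quality levels), and the function name `fleiss_kappa` is our own.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table where ratings[i][j] is the number of
    raters who placed item i in category j (equal raters per item)."""
    N = len(ratings)         # items rated (e.g., rubric criteria)
    n = sum(ratings[0])      # raters per item
    k = len(ratings[0])      # rating categories

    # Observed agreement: per-item agreement P_i, averaged over items
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P) / N

    # Expected agreement from the marginal category proportions
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_e = sum(x * x for x in p)

    return (P_bar - P_e) / (1 - P_e)


# Hypothetical pilot: 4 rubric items, 3 reviewers, 3 quality levels
table = [
    [3, 0, 0],
    [1, 2, 0],
    [0, 2, 1],
    [0, 0, 3],
]
print(round(fleiss_kappa(table), 3))  # → 0.5
```

By the commonly used Landis and Koch benchmarks, values between 0.21 and 0.40 (such as the reported 0.334) indicate fair agreement.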

Like many institutions, Champlain uses the Individual Development and Educational Assessment (IDEA) Student Ratings of Instruction survey (http://ideaedu.org/) for all courses. The IDEA survey uses the same questions for all courses across the College, regardless of modality. To better understand what is happening specifically in online courses, brief student surveys were embedded at three different points in each course. These surveys align with the criteria of the rubric and focus on discussions, assignments, and lecture content as well as instructor facilitation. Further correlation studies are being conducted to determine how well students' feedback about different design elements of courses correlates with ratings using the rubric and with the IDEA survey ratings. This presentation will discuss and demonstrate how the surveys were used to gather course design feedback from students and how that feedback was used to identify development needs.

Using these QA tools, Champlain College's instructional design team has been able to give evidence-based advice to deans and program directors as to where they should focus their development funds and what changes could be made to courses for a maximum return on investment. For example, evaluation using the rubric identified courses where simple improvements to the Weekly Overview or adjustments to discussion prompts would have student impact with minimal involvement from subject matter experts, who are paid for their work.

This presentation will share Champlain's strategy for evaluating course quality and offer suggestions for how other institutions could adapt it to their own needs in measuring course quality, identifying development needs, and making data-driven decisions when allocating resources.