This effective practice was developed to address a strategic need for a comprehensive process and instrument for evaluating teaching practices in online and hybrid courses, for use by department chairs and others responsible for such assessments at an institution of higher education. While eLearning courses at the university where this practice was developed pass a rigorous peer-review process to ensure high-quality design before being offered to students, stakeholder interviews with administrators, faculty, and students indicated a need for review strategies that evaluate teaching practices while eLearning courses are actually being delivered. The dynamic, constantly evolving nature of eLearning makes evaluating course facilitation practices challenging. To meet this need, an online dashboard was established to house the individual components of a comprehensive review system: a course site and facilitation practices review survey instrument, a facilitator's self-assessment, and resources such as an overview of best practices in online and hybrid course facilitation, suggestions for conducting post-review conversations, and supports for mentoring faculty and facilitating ongoing professional development.
The assessment of teaching practices in online and hybrid courses was a demonstrated need that was not being effectively met at the institution. This effective practice was developed by a faculty member with a background in designing, teaching, and coordinating online courses, in collaboration with the university's Center for eLearning and Continuing Education, as part of a university leadership development program. The vision of the project was to support a positive and productive culture of assessment in online and hybrid courses. Its goal was to equip department chairs and others responsible for faculty evaluations, some of whom were not familiar with online education, with the knowledge, tools, and suggested procedures to assess the quality of teaching in online or hybrid courses with confidence. In addition, the post-review facet of the process aimed to facilitate ongoing conversations with faculty about continuing professional development needs, ensuring high-quality facilitation practices and transformative learning experiences for students. An online portal with checklists, resources, tutorials, and evaluation instruments was developed. The evaluation instruments were built and delivered in Qualtrics, an online survey system that allowed for flexible delivery, assessment, and reporting. The practice also allowed department chairs and faculty supervisors to customize their evaluation instruments and procedures to meet the specific needs of their department and its various disciplines, and it offered flexibility in familiarization with the process, which could be completed online, face-to-face, or in a hybrid format.
This program was developed in response to: 1) a demonstrated need to address divergences from eLearning quality course components observed during delivery to students in some online and hybrid courses that had already passed initial design quality reviews; and 2) stakeholder interviews with chairs, course coordinators, faculty, and students, collected during a university-wide strategic planning process that included a component focused specifically on eLearning needs and future directions. The pilot program for this effective practice is currently underway with department chairs representing the departments with the greatest number of online and/or hybrid course offerings across campus and with varying degrees of eLearning expertise. Throughout the pilot, participating chairs are providing feedback on the need for this practice and on the review process itself. They completed a brief introduction to the dashboard system and review instruments and are using the process to evaluate selected instructors from their departments. Individual components of the system (e.g., the course site and eFacilitation practices review survey, the facilitator's self-assessment), along with an overview of the three facets of the comprehensive system (an introduction to eLearning best practices, use of the review instruments, and post-review conversations and mentoring), have been shared through venues such as the university's Chairs' Council with the Provost and faculty enhancement mini-conferences. The practice will be presented this fall at the Sloan Consortium national conference on online learning and has been submitted to the D2L Ignite regional conference. Presenting this effective practice will raise awareness of one of its primary advantages: it can easily be replicated with minimal resources at a variety of institutions, and its flexibility allows for application across teaching fields.
This approach strikes a critical balance between the need for evaluation and the flexibility required to accommodate diverse and innovative online teaching practices.
This effective practice primarily addresses Learning Effectiveness, but it is also inextricably related to Student and Faculty Satisfaction and additionally promotes Scale and Access. Effective evaluation practices enhance the overall quality of online and hybrid course instruction, thereby increasing the effectiveness and value of eLearning experiences and student satisfaction with them. The evaluation framework supports a productive, ongoing dialogue between faculty and leadership rather than one-time performance reports. This promotes more successful evaluations, greater faculty satisfaction with their work experiences, and more intentional professional development activities, which in turn lead to more successful teaching that, once again, increases student satisfaction with learning opportunities. In addition, online faculty benefit from having evidence of successful online teaching for promotion and review dossiers, just as their on-campus peers do. They also benefit from a system that lends accountability, and therefore increased credibility, to a form of teaching that remains somewhat of a mystery to colleagues unfamiliar with eLearning. Given that reviewers of online faculty are often faculty members themselves with little experience teaching online, having a guided, informative evaluation system in place increases their confidence in their ability to review eLearning faculty. This practice also addresses the pillar of Scale: it was developed using systems already in place at the university, rendering it very cost effective. Moreover, all components, resources, and training can be delivered or accessed online at one's convenience, reducing expenses related to the use of facilities and faculty and staff time.
If we consider that this practice entails an educational experience (a course, of sorts) for evaluators and facilitators alike, access is reflected not only in the anytime, anywhere, and timely availability of resources, but also in the consideration of feedback from administrators, faculty, and students in the decision to create the practice, in its development process, and in the continuing refinements made through the pilot program. Lastly, findings from online teaching reviews can assist institutions as they are increasingly called upon by accrediting organizations to demonstrate assessment of eLearning.
This practice was implemented by creating a resource dashboard, resembling an online course site, in the university's learning management system, Desire2Learn, together with Qualtrics, an online survey delivery system also already in use at the university.
This practice can be implemented at no additional cost beyond that already incurred for technology infrastructure serving other needs at many institutions, which can be adapted for the purposes of an eLearning facilitation review system. For example, both Desire2Learn and Qualtrics were already in use at the institution developing this practice, so all that was needed was a creative, innovative idea for using them to build and implement the review process. Expenses in the form of faculty and staff time dedicated to developing the various components of the comprehensive system (site framework, evaluation instruments, resources, and training for those conducting faculty reviews) fall within the institution's established, ongoing faculty and staff research and development work to promote transformative eLearning and meet the university's evolving needs toward that goal.
The following resources are available as supporting documentation:
• Procedure checklists for suggested evaluation procedures https://dl.dropboxusercontent.com/u/13539450/master_checklist_V01.html
• Slideshow tour of online portal https://docs.google.com/file/d/0B_PrAuz08nbvY3ZvdFkzNDlUSmM/edit?usp=sha...
• Evaluation instrument (URL) https://uco.us2.qualtrics.com/SE/?SID=SV_7WYjkahBtuznz8x
• Self-assessment instrument (URL) https://uco.us2.qualtrics.com/SE/?SID=SV_1BKVlvd3NLsp2pD
• Evaluation Instrument (print) https://docs.google.com/file/d/0B_PrAuz08nbvSTVkdVlnSThzN00/edit?usp=sha...
Direct access to the online portal can be granted upon request.