Clemson Online staff identified an opportunity to create tools and support to further the fine art of online education at the University. Three distinct processes were developed: CONCERT – an asynchronous online training for faculty; ENCORE(S) – a course certification rubric designed for a positive user experience; and BACKSTAGE – a checklist to ensure effective online delivery. This paper outlines the specifics of each tool.
Clemson Online was organized as a centralized unit to orchestrate the growth and administration of online offerings at Clemson University. The unit was tasked with providing vision, leadership, coordination, and expertise in support of faculty design, delivery, and evaluation of technology-enhanced, blended, and fully online courses and instructional materials. Specifically, Clemson Online was asked to collaborate with faculty, administrators, staff, and students to:
• Develop appropriate requirements, standards, and training for online course development.
• Evaluate courses to ensure they are pedagogically sound and provide a high-quality learning experience for students.
• Implement a strategic planning process and business plan for online education.
• Ensure compliance with government regulations and accreditation standards.
• Create and maintain effective eLearning infrastructure and support for faculty development.
• Market online courses and programs to current and prospective students.
In the spring semester of 2014, Clemson Online conducted an audit of all courses offered online by Clemson University. The audit was designed to bring transparency, accountability, and robust support to faculty and students engaged in online coursework. This audit was particularly timely because of the growth and commitment of Clemson Online to build accessible curricula for all Clemson University stakeholders. The audit involved a review of each course to ensure the following guidelines were met:
• National quality benchmarks for online course design
• Compliance with accreditation standards
• Authentication of student identity
• Security and retention of institutional and student data
• Substantive faculty-student interaction through adopted technologies
The audit included 141 online courses offered in the spring semester of 2014. An analysis revealed that only 37 of those courses were compliant, 40 were categorized as correspondence courses, and 64 courses could not be located in the learning management system. As a result of the audit, Clemson Online staff identified a need to train faculty to develop and deliver online content. From this audit, CONCERT, ENCORE(S), and BACKSTAGE were born.
Based on the disappointing outcome of the audit of online courses, it was clear that an organized initiative was needed to ensure quality and excellence for all online offerings within the University. Clemson Online staff developed a threefold approach to address the quality gaps identified in the audit. First, a six-week faculty certification course was developed for those wishing to teach or create online content. Then, a review process was initiated to ensure quality online course development. Finally, a checklist was developed to support faculty in the delivery of excellent online content. This threefold approach was named CONCERT, ENCORE(S), and BACKSTAGE.
Clemson ONline CERTification (CONCERT) was developed as a six-week course required for all faculty teaching or developing courses online for the University. The course was offered entirely online in an asynchronous format to allow for maximum flexibility. Each week, faculty spent approximately 3-5 hours on the training. In this course, faculty engaged in content spanning three modules:
1) Design and Develop - In this module, faculty learned fundamental skills to design and develop courses in an online format. Information included technology, best practices, and strategies to create engaging content for an online course.
2) Deliver and Teach - In this module, faculty explored strategies to provide quality online instruction. They explored pedagogies appropriate for the online environment and identified ways to engage and encourage student success.
3) Assess and Improve - The final module included a focus on ways to evaluate student mastery of content through assessment, effective grading practices, and quality feedback. In addition, there was an exploration of continuous improvement for online effectiveness and growth.
Once faculty completed CONCERT, they were invited to continue in their online course development either autonomously or in partnership with a course development leader, who was trained to support faculty throughout the course creation process. Once the course was developed, faculty were asked to submit their course for a peer review using the Quality Matters (QM) Rubric. Unfortunately, many faculty resisted the lengthy QM review process and the feedback they received. Clemson Online staff decided to explore alternatives to the QM process. The aim was to:
• Offer faculty involvement prior to review. Self-review and comments prior to peer review help highlight noteworthy features in an online course.
• Account for compliance issues. An online review process should also verify that compliance needs are addressed: substantial interaction, Learning Management System (LMS) usage, faculty-initiated interaction, and grading in the LMS.
• Review both newly developed courses and mature online courses. QM was intended for courses that have been taught two or more times; Clemson Online sought to ensure quality from the first online offering and beyond.
• Reduce the number of syllabus-related standards in relation to course content. In moving away from a QM-based approach, in which more than 30 of the standards measured were syllabus-related, a more efficient review process was desired to coincide with an online syllabus template.
• Respond to changes and new research. Attention to visual design and layout should be included in the review process along with an emphasis on other areas that have gained attention through research on innovative approaches.
• Account for course-level and discipline-specific variations in relation to student population. As noted by Clemson faculty, student needs vary by course-level and discipline; therefore an online review tool should respond accordingly.
• Recognize superior quality indicators. Innovative and other engaging qualities may include attention to mobile learning, advanced use of social learning tools, interaction with others beyond the online classroom, online badges, student multimedia presentations, field research, digital game-based learning, proficiency-based learning, computer apps, or other applications of learning to build an engaging learning environment.
• Offer a cost-effective model in an information-sharing age. By saving on QM subscription costs, reviewer training, and training materials, a more efficient and cost-saving model could be utilized. Training could emphasize practice and norming with reviews.
• Provide an additional method of review for online course delivery metrics. Online student evaluations (or surveys) focus on the instructor's facilitation of the online course and are a major contributor to the student experience; unlike QM, an online review process should prepare for and include a measure of key qualities of online delivery.
In the fall term, faculty were offered the chance to have their newly developed online course reviewed using the new ENCORE(S) process, which included the following criteria:
• Experience of Students
• Navigationally Sound Design
• Collaborative Learning
• Ongoing Faculty Presence
• Relevant Application
• Engaging Content
• Superior Qualities
[The ENCORE(S) scoring rubric is included as Appendix A of this paper].
As a follow-up to the course certification that results from ENCORE(S), Clemson Online staff developed an additional quality checklist to be conducted while the online course is being delivered. While this tool has not yet been deployed, it is designed to ensure quality online teaching and interaction. The checklist includes the following criteria:
• Before Course Starts - Course syllabus, welcome announcement, and first module available to students before course start date. Faculty initiates welcome email or phone contact with students prior to course start.
• Acknowledgement of Students - Each student is individually welcomed to the course by the faculty. Faculty individually addresses student performance and participation needs. (Course analytics may be utilized.)
• Cues to Direct Online Activities - Students are provided guidance to respond to online discussions via faculty facilitation and participation. Students are given feedback about their participation in the discussions.
• Key Dates Identified - Dates and deadlines for current course offering are identifiable in course calendar.
• Student-to-Student and Faculty-to-Student Interaction - Students respond to one another as part of the discussion requirements. Faculty engages students in the discussion forums. Faculty provides feedback to students on assignments, referring to a rubric or project guidelines whenever possible. Faculty clearly indicates how grades are derived for assignments.
• Timely Feedback - Faculty grades course assignments within one week of submission (sooner for a compressed course or longer for graduate-level courses). Faculty promotes use of the Q & A forum and responds to questions within 36 hours. Faculty responds to student emails within 24-36 hours.
• Announcements Throughout Course - Announcements are posted at least weekly and ideally more frequently. Announcements convey important information about the course.
• Grades in LMS - Assignment and final grades are organized and captured within the LMS. Feedback from faculty is included with student scores for easy student access.
• Emerging Strategies and Technologies - Delivery strategies and technology tools build community and promote learning. Students receive multimodal feedback (typed, audio, video, or other).
To date, over 200 faculty have participated in CONCERT. Of those, 134 successfully completed the training and received online faculty certification. Because Clemson Online started with QM as the review tool, 26 courses were certified through this process. Since the transition to ENCORE(S), faculty have indicated a more rewarding review experience. Since implementation of the ENCORE(S) process, 13 courses have been reviewed and 2 have achieved certification. Going forward, all courses will use the ENCORE(S) process.
ENCORE(S) has been normed to ensure inter-rater reliability. Norming groups included faculty as well as staff/administrators of online faculty development at Clemson and other institutions. Further, an ENCORE(S) Reviewer Certification is under development. All of the courses developed lead to new or expanded online certificate or degree programs; improve options for students; coordinate well with existing online courses or programs; have market appeal for both Clemson students as well as transient students (i.e., students at other institutions); and have the potential to lead to significant enrollment and tuition revenues. While this threefold approach continues to evolve, Clemson Online is pleased with early feedback from faculty regarding their experiences developing and delivering online content after engagement in these processes. CONCERT, ENCORE(S), and BACKSTAGE have furthered the fine art of assuring quality in online education.
Learning Effectiveness – This three-tiered process is designed to prepare faculty to develop and deliver high-quality online education across the University. Because many of Clemson’s faculty have limited experience using technology, we identified the need to provide professional development training to help them develop effective online teaching skills. Then, we developed our quality review process to ensure the courses created were of a superior quality to support student learning and success. The final piece of the process is the delivery review, in which faculty performance in the online class is reviewed to ensure students are receiving a learning experience that is equivalent or superior to their traditional learning experiences. This three-tiered process is well aligned with the following aspects of learning effectiveness: Course Design, Learning Resources, Faculty Development, Pedagogy, Interaction, Assessment, and Learning Outcomes.
Scale (Cost Effectiveness and Commitment) – The CONCERT, ENCORE(S), and BACKSTAGE processes were developed in-house to make them scalable and low-cost for the University. By creating these approaches based on best practices documented in the literature, we were able to provide value to students and the University. Because of the finite resources allocated to the online unit, it was necessary to create a low-cost alternative that would yield the highest quality for the institution. We manage all aspects under Clemson Online and support faculty in the cost-effective development and delivery of online offerings. This three-tiered process is well aligned with the following aspects of scale: Cost Effectiveness, Institutional Commitment, Leadership, Institutional Infrastructure, Technical Infrastructure, Methodologies, Policy, Partnerships, Scalability, and Localness.
Access – We have invited faculty from across the institution to engage in our three-tiered process of professional development, course certification, and delivery review. Those who have participated have received praise from students regarding the quality of the online experience at the institution. In addition, access has been shared with external constituencies interested in the processes developed; we treat the three-tiered process as an open educational resource. This three-tiered process is well aligned with the following aspects of access: Technical Infrastructure, Academic Administrative Services, Student Support Services, Learning Resources, Course Design, and Program Access.
Faculty Satisfaction – Faculty surveyed after the CONCERT and ENCORE(S) processes indicated satisfaction in 98% of responses. Faculty not only enjoyed the experience but affirmed that they learned the strategies necessary to provide quality online offerings for students. In fact, many of our faculty champions have decided to serve in supportive roles such as CONCERT instructors or ENCORE(S) reviewers. We have also submitted several award nominations for faculty who have completed the ENCORE(S) review process and developed exceptional online courses. This three-tiered process is well aligned with the following aspects of faculty satisfaction: Institutional Rewards, Administrative Support, Faculty Support, Technological Infrastructure, Online Experience, and Opportunities for Research Publication.
Student Satisfaction – The BACKSTAGE process is specifically tailored to evaluate the delivery components of the online program. We ensure the courses offered are 100% accessible, student engagement is measured, and faculty performance is reviewed on indicators the literature shows to promote student success and retention. Value is placed on student engagement and faculty interaction in the courses offered, and student satisfaction surveys are used to identify any additional modifications needed in the offering. This three-tiered process is well aligned with the following aspects of student satisfaction: Access, Course Design, Technological Infrastructure, Academic Administrative Services, Student Support Services, and Interaction.
A spreadsheet of some type is needed to track faculty completion, course review status, and delivery review outcomes.
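As a minimal sketch of what such a tracking sheet might contain (the column names and status values here are hypothetical illustrations, not Clemson Online's actual fields), the three processes could be tracked per faculty member in a simple CSV and queried for reporting:

```python
import csv
from io import StringIO

# Hypothetical tracking columns covering the three processes.
FIELDS = ["faculty", "concert_completed", "encores_status", "backstage_status"]

rows = [
    {"faculty": "A. Smith", "concert_completed": "yes",
     "encores_status": "certified", "backstage_status": "pending"},
    {"faculty": "B. Jones", "concert_completed": "yes",
     "encores_status": "in review", "backstage_status": "not started"},
]

# Write the tracking sheet as CSV (in memory here; a shared file in practice).
buf = StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)

# Read the sheet back and report courses certified through ENCORE(S).
buf.seek(0)
certified = [r["faculty"] for r in csv.DictReader(buf)
             if r["encores_status"] == "certified"]
print(certified)
```

Any spreadsheet application serves the same purpose; the point is simply that completion, review status, and delivery outcomes live in one queryable place.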
The only cost is in the resources used for CONCERT training and reviewers for ENCORE(S) and BACKSTAGE (approximately $2,000 per month at the current rate of delivery at Clemson University). Costs will vary depending upon the scope of implementation.