
Reflect, Review, Improve: Faculty Course Portfolios, Self-Reflection, and Peer Reviews to Assess Online Course Quality

Darlene Smucny (George Mason University, USA)
Monisha Tripathy (George Mason University, USA)
Session Information
October 15, 2015 - 11:15am
Faculty and Professional Development & Support
Major Emphasis of Presentation: Practical Application
Institutional Level: Universities and Four Year Institutions
Audience Level:
Session Type: Discovery Session
Atlantic Hall
Session Duration: 45 Minutes
Discovery Session 2

Join us for a discussion of best practices and strategies for quality assessment processes that evolve with your institution's changing e-learning landscape.

Extended Abstract

Online course quality is guided by the five pillars of the OLC quality framework (learning effectiveness, cost effectiveness, access, faculty satisfaction, and student satisfaction). Demonstrating learning effectiveness is particularly important for institutions that are new to online education or that seek to expand online programs. To achieve faculty buy-in, obtain institutional support, and meet accreditation requirements, it is critical for an institution to demonstrate that the quality of learning online is comparable to the quality of traditional "on-the-ground" programs. Therefore, online program administrators must develop processes (1) to provide faculty with feedback about the quality of their online courses and (2) to promote continuous improvement of online courses and online teaching. Once in place, these quality assessment processes must change and evolve, as needed, to serve an institution's changing e-learning landscape.

In our Discovery Session, participants will:
(1) Evaluate the relevance of course portfolios, self-reflection, and peer reviews for quality assessment of online courses.
(2) Appraise changes needed in quality assessment models, as online education evolves at institutions.
(3) Share strategies for best practices in quality assessment and continuous improvement in online courses.

The Office of Distance Education at George Mason University currently implements an online course assessment process of self-reflection, peer review, and continuous improvement. This process occurs after an online course has been piloted for the first time. Through the multiple perspectives of self-reflection and peer review, an online instructor obtains information about online course effectiveness. The key element of this process is the faculty online course portfolio. The purpose of the faculty online course portfolio is two-fold: (1) to demonstrate the effectiveness and comparability of online courses to equivalent face-to-face courses; and (2) to provide faculty with self-reflection and peer feedback on their completed online pilot courses.

The steps of our quality assessment process include:

Self-Reflection: An online instructor prepares the course portfolio upon completion of the online pilot course. The Faculty Reflection document is generally considered the most important document in the online course portfolio, providing a "road map" for reviewers for course context, learning outcomes, use of technologies, assessment strategies, and student evaluations. The instructor reflects on the following questions: "What did I plan, how did it work out, and what would I change in my future online course offering?"

Peer Review: In the second step of our process, the course portfolio and online Blackboard classroom undergo an anonymous peer review. Each course portfolio is reviewed by a team of two reviewers, consisting of an experienced online instructor and an instructional designer. The reviewers use common criteria to rate the materials presented in the portfolio. Criteria are grouped into major review areas, including learning outcomes, course presentation, participation and interaction, learning support, faculty reflection, and course comparability.

Continuous Improvement: When reviews are completed, instructors are debriefed by the Office of Distance Education, and each instructor then receives the review results for their online pilot course portfolio. To promote continuous improvement, faculty are encouraged to follow up with the Office of Distance Education for consultations and to participate in faculty professional development opportunities, such as OLC workshops.

Since 2010, this process of course portfolios, self-reflection, and peer reviews has been completed for approximately 174 new online courses developed through the Office of Distance Education at George Mason University. Overall, the assessment process has been successful in providing constructive feedback to faculty about online course quality, effectiveness, and comparability. The basic framework of our "reflect, review, improve" model is sound, but it may require changes as online education at George Mason University evolves and expands. The Office of Distance Education must consider the relevance and scale of our processes: What changes are needed in our quality assessment processes in order to better serve online teaching and learning across the university? What recurrent issues and concerns have we identified in the self-reflections and peer reviews? How can we improve our quality assurance processes in order to promote continuous improvement of online courses?

Our course quality assessment process has been limited: it has focused only on new courses developed with support from the Office of Distance Education. Online courses developed independently in departments and schools across campus have not had the opportunity for Office of Distance Education-sponsored course portfolio reviews. Therefore, in spring 2015, we are piloting an "Open Call" for online course portfolio reviews of courses not developed through our office. Through this effort, we hope to introduce more online faculty to the services offered by the Office of Distance Education and to promote continuous quality improvement in online courses across the entire university.

We also need to better define how to compare online courses to f2f courses. Reviewing past online course portfolios, it appears that faculty often interpret course "comparability" to mean that an online course is taught in exactly the same way as its f2f counterpart. To address this issue, faculty may need additional guidance in transforming "teacher-centered" f2f courses into "learner-centered" online courses, while ensuring that learning outcomes are met in the different delivery formats.

Finally, we must enhance faculty awareness of the accessibility of online courses. Accessibility is frequently mentioned as a concern in the peer reviews of online courses. We are now piloting "accessibility checks" of online courses in partnership with the Assistive Technology Initiative (ATI) Office. As part of our "Open Call" course reviews this spring, ATI will conduct accessibility checks of the online classrooms to identify potential accessibility concerns more efficiently, and will follow up with online instructors with specific, focused guidance and resources for accessibility.

In our Discovery Session, we will describe our evolving quality assessment process and solicit suggestions for additional improvements. We will engage participants in an interactive discussion of best practices and strategies for quality assessment and continuous improvement of online courses, both newly developed and existing.

Lead Presenter

Dr. Darlene Smucny is Assistant Director for Quality in Online Instruction, Office of Distance Education, George Mason University (http://masononline.gmu.edu/). Darlene focuses on assessment of online student learning, retention, and satisfaction; the quality of distance education courses and programs; and online faculty development services and support. She is a Quality Matters (QM) Peer Reviewer and has served on QM review teams to assess the quality of online courses. Prior to coming to Mason, Darlene was Collegiate Professor and Academic Director for Social Sciences in The Undergraduate School, University of Maryland University College (UMUC). At UMUC, Darlene led faculty development efforts for online instruction in the social sciences, including an online faculty workshop series that received the 2011 Award for Excellence in Faculty and Staff Development from the University Professional and Continuing Education Association (UPCEA Mid-Atlantic Region). Darlene holds a Bachelor of Arts from Lake Erie College, a Master of Science in Biology from Cleveland State University, and a Ph.D. in Anthropology from UCLA. She has designed and taught hybrid and online courses in Anthropology, Interdisciplinary Social Sciences, Biology, and the Natural Sciences.


Handouts, including the extended abstract (downloadable PDF) and samples of George Mason University's Distance Education Course Portfolio Review Criteria and Action Plan Template, have been posted for this Discovery Session (olc53593).