Develop a peer review practice to meet challenges, improve accessibility, promote collaboration, and expand expertise in online course design.

Author Information
Author(s): 
Fawn Thompson
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Peer review can be stimulating and enjoyable, but busy faculty and designers often find it difficult to make time for review, reflection, and idea sharing. I conceptualized and pilot-tested methods for small- and large-group peer review that required very little time on the part of my participants and produced successful outcomes for course design and revision. The large-group peer review process can easily be scaled for different group sizes and may be used by faculty, designers, and other program or course stakeholders.

Description of the Effective Practice
Description of the Effective Practice: 

For the first pilot, I tested a method of peer review with a very small group: four instructional designers who work together on a team. Each term, a designer 1) provided a brief review request for her or his own course, and 2) independently reviewed another designer’s course. This pilot lasted a year, during which time I gathered feedback. Next, I redesigned the process for a larger group of 12 instructional designers who would take turns being the reviewee. These designers already met on a monthly basis, and the peer review took place in the first 15-20 minutes of the meeting. The reviewee provided his or her course review challenge in advance of the meeting.

For both peer review processes, I gave reviewers a list of questions as a starting point, but made clear that reviewees were welcome to substitute their own questions and to use the peer evaluation in whatever way was most helpful for assessing and exploring a course design challenge. Reviewees could either take a work-in-progress approach to an upcoming course or request an evaluation of a recently designed course.

Guiding questions for peer course review (very roughly based on the Feldman Method from Varieties of Visual Experience, 1972):

  1. What do you notice?
  2. What elements/methods are used to achieve a design goal?
  3. How might students interpret or interact with parts of this course?
  4. What do you find successful?
  5. What simple suggestions might you offer the designer?

Creative freedom was key to the success of the peer review processes. While designers followed the proffered list in the first round of peer evaluation, they quickly began to explore areas of their own interest in the evaluation requests and reviews. They also brought in challenges provided by the faculty with whom they work.

Common themes emerged in the peer review challenges:

  1. Check out my tech - Does this new technology meet my challenge?
  2. Recommend tech - Do you have any technology recommendations for my design challenge?
  3. Pedagogy - I'm in search of new/different teaching strategies to meet learners' needs.
  5. Accessibility - Does this format meet/exceed standards for accessibility? Does it follow principles of universal design?
  5. Reorganization - How should I reorganize my content for better flow and understanding?
  6. Clarity - I'm too close to this; is it clear?

In the spirit of review, designers also provided helpful feedback and suggestions to make the process more effective for them, and this feedback was incorporated into the peer review processes. For example, participants in the larger group pilot requested an end-of-year recap session where designers could share the outcomes of their challenges.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Participants in both pilots found peer review to be highly beneficial. They enjoyed seeing challenges presented both for completed courses and for works in progress.

Design challenges were rarely unique to a single course. Participants found commonalities in challenges across a range of programs and subject areas, from Social Work to Computer Science. By sharing experiences and solutions, participants also deepened their understanding of each other’s expertise. They began to follow up on one another’s past challenges to see whether earlier solutions would apply to new, related challenges. Solutions were beneficial not only to the immediate experiments but also to a wider range of course outcomes.

Participants reported that they felt more inspired after reviewing challenges together, sharing helpful ideas, and celebrating achievements. (Their enthusiasm for the process also demonstrates how much they care about design.)

  • "It’s been really valuable to me to not only get feedback but to take some time to consider other courses and design challenges that you all face."
  • "These are fantastic ideas and really helped me to see what’s possible in new ways—which is exactly what I was hoping for."

After each pilot, the designers agreed unanimously to continue implementing the peer review processes.

How does this practice relate to pillars?: 

Learning effectiveness – Courses are reviewed by a panel of designers with expertise in pedagogical theory and practice. Principles of universal design and usability testing are consistently applied. Learning outcomes are further measured by student evaluations at the end of each course.

Scale – The peer review practice has been scaled from smaller to larger groups with success. Furthermore, design solutions are beneficial not only to individual courses but also to a wider range of course and program outcomes.

Access – Accessibility is a major component of the peer review. The panel includes designers with expertise in the Web Content Accessibility Guidelines (WCAG) who examine course elements to ensure they meet/exceed standards for accessibility.

Faculty satisfaction – Design challenges taken up by the panel result in solutions that help faculty meet their teaching goals. Instructors have taken elements created for their online courses and brought them to their on-campus students to broaden conceptual practice or deepen understanding.

Student satisfaction – (Note: These comments apply to the online courses we support, which benefit holistically from this practice. Students do not participate in instructional designers’ peer review process.) Students evaluate online courses at the end of each term. Evaluations are carefully read and addressed by online learning teams. Students are also encouraged to reach out to support administrators who work exclusively on online courses.

Equipment necessary to implement Effective Practice: 

Nothing beyond what organizations typically own (computers, meeting space).

Estimate the probable costs associated with this practice: 

Time cost: a total of 45-60 minutes per review, including preparation and discussion.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Fawn Thompson
Email this contact: 
fawnt@bu.edu