Integrated Assessment System for Courses, Overall Program and Post-Program Career Impacts

Author Information
Wayne Pferdehirt
University of Wisconsin - Madison
Institution(s) or Organization(s) Where EP Occurred: 
University of Wisconsin - Madison
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

This practice outlines a system of planned evaluations that together support a process of continuous improvement for a distance degree program at the University of Wisconsin - Madison. Key elements include: an evaluation of each course by students and faculty; an evaluation of the overall program at graduation; and a follow-up survey of alumni, their co-workers, and their family members to measure the program's impact on the professional and personal development of alumni.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

This set of coordinated evaluation tools has been employed in the MEPP program since 1999. An important emphasis with each evaluation is to study results seriously and use the input to improve course design, instructional methods, and the program curriculum. Students see that faculty and staff are serious about continuous improvement, observe real impacts from their own and their classmates' recommendations, and build ownership of the continuous improvement process. The following statistics show the impact of these continuous improvements on ratings across all courses. For each parameter, the program-wide average is given for 2000 and 2003:

Instructor did a good job: 4.1 (2000), 4.3 (2003)
Workload was about right: 3.7 (2000), 4.1 (2003)
Discussion forums: 3.7 (2000), 4.1 (2003)
Teleconferences: 3.6 (2000), 4.1 (2003)
Course texts: 4.1 (2000), 3.9 (2003)
Study guides: 3.6 (2000), 3.9 (2003)
Got help whenever needed it: 4.0 (2000), 4.3 (2003)
Useful in current responsibilities/job: 4.4 (2000), 4.3 (2003)
Useful in future responsibilities/job: 4.4 (2000), 4.4 (2003)

A good example of how the evaluation process works relates to the evaluation of teleconferences. During a faculty meeting in 2000, teleconferences were identified as an improvement target, based on program-wide averages. Training was developed and offered to faculty, focusing on making teleconferences more interactive and student-centered. Faculty used many of the ideas presented in these sessions, and teleconference ratings by students improved significantly.
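The year-over-year comparison above can be tabulated with a short script. This is an illustrative sketch only (not part of the MEPP toolkit); the figures are the program-wide averages quoted above, on a 5-point scale.

```python
# Illustrative sketch: change in program-wide average ratings, 2000 -> 2003,
# using the figures quoted in the text (5-point scale).
ratings = {
    "Instructor did a good job":              (4.1, 4.3),
    "Workload was about right":               (3.7, 4.1),
    "Discussion forums":                      (3.7, 4.1),
    "Teleconferences":                        (3.6, 4.1),
    "Course texts":                           (4.1, 3.9),
    "Study guides":                           (3.6, 3.9),
    "Got help whenever needed it":            (4.0, 4.3),
    "Useful in current responsibilities/job": (4.4, 4.3),
    "Useful in future responsibilities/job":  (4.4, 4.4),
}

for item, (y2000, y2003) in ratings.items():
    # Round to one decimal to avoid floating-point noise in the delta.
    delta = round(y2003 - y2000, 1)
    print(f"{item}: {y2000} -> {y2003} ({delta:+.1f})")
```

Laid out this way, the improvement targets (e.g., teleconferences, +0.5) and the one decline (course texts, -0.2) are easy to spot when the results are discussed at faculty meetings.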

How does this practice relate to pillars?: 

learning effectiveness: An integrated system of evaluations is needed to continuously improve the design of individual courses and the overall degree programs they belong to. A regular system of surveys, debriefings, and planning of improvements supports the culture and practice of continuous improvement. This practice describes the system of evaluations and follow-up debriefings that has been developed and used by the University of Wisconsin-Madison Master of Engineering in Professional Practice (MEPP) program since 1999. MEPP is a two-year interdisciplinary engineering master's degree program focused on engineering technical leadership. This web-based graduate program admits 30 full-time working engineers from across the U.S. each year; these students progress as a stable cohort through a fixed curriculum. The first cohort was admitted in 1999 and graduated in 2001; as of May 2004, 114 engineers have graduated from the program.

End-of-course evaluation form completed by students
At the conclusion of each course, students are asked to complete an online course evaluation. All evaluations are conducted online, anonymously, using an automated survey tool. The program evaluator prepares a summary of the evaluation results and submits the report to the course faculty, instructional designer, course manager, and program director. Results are also shared as appropriate with other program staff, and are discussed by faculty at periodic program faculty meetings.

End-of-course evaluation by instructors
At the conclusion of each course, the course instructor(s) complete an instructor evaluation form. The questionnaire asks instructors about their perceptions of what went well during the course and what could or should be improved. The questionnaire is e-mailed to the instructor, who can add responses digitally or manually.
Completed questionnaires are then reviewed, along with the student-completed evaluations, in a joint meeting that includes the course instructor(s), the instructional designer, and the course manager. In this meeting, the team identifies items to be improved and an action plan for accomplishing those changes before the next offering of the course.

Occasional special evaluations
As needs arise, special evaluations are conducted to inform decisions affecting program design and operation. For example, over the past few years, special surveys have asked MEPP students about their use of the course management system tools; their hardware, operating systems, security systems, and ISPs; and their use of various orientation and help resources.

Informal discussions with faculty, counselor, and program director
MEPP students typically visit their online classrooms and participate in online discussions at least daily. These discussions are monitored daily by the program director, program counselor, and program IT support staff. Students are quick to point out problems and offer suggestions in these classroom forums. As problems are identified, they are resolved as soon as possible or noted for subsequent follow-up. At the end of the course, the messages describing problems are compiled and used as part of the end-of-course review by the instructional team.

Graduate program evaluation
Approximately one week before graduation, MEPP asks graduates to complete an evaluation of the overall program. This evaluation focuses on "big picture" issues that extend beyond any individual course. For example, students are asked to look retrospectively at each course, rate its value, and suggest any changes to the program curriculum. Students are also asked to identify the most important changes in themselves and their abilities as a result of the program, and to identify strengths and weaknesses of the overall degree program.
Results of this survey are compiled, compared with results from previous years, and shared with staff, faculty, and the advisory committee to identify target areas for continuous improvement efforts.

Post-graduation program impact survey
For the past three years, MEPP has also conducted a program impact survey of alumni 6-9 months after graduation. This evaluation seeks to identify how the program has produced real changes in the abilities, attitudes, and career opportunities of graduates. Separate, complementary surveys are conducted of alumni, co-workers of alumni (names provided by alumni), and family members of alumni (names provided by alumni). The surveys of alumni and co-workers ask each, respectively, to rate the alumnus's perceived improvement in 18 skill domains targeted by the program. The alumnus is also asked to note the most significant changes in his or her attitudes, abilities, and opportunities. The survey of family members asks about changes in the alumnus's life skills and attitudes, about the impact of the student's studies on family life, and about how the program can better support students' families. Results from these surveys are compiled, summarized, and discussed with faculty, staff, and the program's advisory committee.

Estimate the probable costs associated with this practice: 

The MEPP program is glad to share the survey instruments used for course evaluations, program evaluations, and the post-graduation program impact evaluation with any educators, to reduce costs and foster collegial exchange. Costs for updating, administering, and analyzing results of these surveys are as follows:

Course evaluations: 1 hr. to update; 0.5 hr. to administer; 2.0 hr. to review results, summarize, and discuss follow-up actions with course faculty.

Program evaluations: 1 hr. to update; 0.5 hr. to administer; 2.0 hr. to review results, summarize, and discuss follow-up actions with the program director, all faculty and staff, and the program advisory committee.

Program impact survey: 2 hr. to update the 3 surveys (graduate, co-worker, and family member); 8.0 hr. to administer (Web-administered, with e-mail and phone follow-up contacts); 3.0 hr. to review results, summarize, and discuss follow-up actions with the program director, all faculty, and the program advisory committee.
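For budgeting, the hour figures above can be totaled per evaluation cycle. This is an illustrative sketch under the assumption that update, administration, and review are the only staff-time components; the numbers come directly from the estimates above.

```python
# Illustrative sketch: total staff hours for one cycle of each MEPP
# evaluation instrument, using the per-task estimates quoted in the text.
hours = {
    "Course evaluation (per course)": {"update": 1.0, "administer": 0.5, "review": 2.0},
    "Program evaluation (per year)":  {"update": 1.0, "administer": 0.5, "review": 2.0},
    "Impact survey (per year)":       {"update": 2.0, "administer": 8.0, "review": 3.0},
}

for instrument, tasks in hours.items():
    total = sum(tasks.values())
    print(f"{instrument}: {total:.1f} hr")
```

The course and program evaluations each run about 3.5 staff hours per cycle; the impact survey is heavier (13 hours), mostly because of the e-mail and phone follow-up needed to reach alumni, co-workers, and family members.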

References, supporting documents: 

Haworth, J. (1996). Assessing Graduate and Professional Education. New Directions for Institutional Research, Vol. 92. San Francisco: Jossey-Bass Publishers.

Other Comments: 

MEPP is glad to share survey instruments with other educators.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Wayne Pferdehirt, Program Director, 608-265-2361