The Intersection of Technology & Institutional Effectiveness: Leveraging MERLOT Content Builder with Emerging Technology to Assess Programmatic Student Learning Outcomes

Award Winner: 
2014 Sloan-C Effective Practice Award
Author Information
Author(s): 
Rick Lumadue, PhD
Rusty Waller, PhD
Institution(s) or Organization(s) Where EP Occurred: 
Texas A&M University-Commerce
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Programmatic student-learning outcomes of an online master’s degree program at a regional university in Texas were assessed in this study. An innovative use of emerging technology provided a platform for the study, and the Astin Model provided the framework for the evaluation. The study offers a model for conducting well-informed instructional and programmatic assessments of student-learning outcomes. The results demonstrated that emerging technology can provide a platform for students to both showcase and preserve their ability to meet programmatic student-learning outcomes.

Description of the Effective Practice
Description of the Effective Practice: 

This online master’s degree program is taught in a fully interactive, primarily asynchronous online format. Asynchronous activities used in the program include threaded discussions, video and audio presentations, written lectures linked to video and audio presentations embedded in the course management system, VoiceThread presentations, faculty-developed MERLOT web pages created with the MERLOT Content Builder, e-textbooks, and the like.
The Astin Model (1993) provided a framework for this assessment. In the Astin Model, quality education not only reaches established benchmarks but also moves students from where they are to the intended competencies. An innovative use of the MERLOT Content Builder combined with emerging technology provided a means for assessing the seven student-learning outcomes in an online master’s program at a regional university in Texas.
Two full-time faculty members and one adjunct faculty member used rubrics to evaluate each programmatic student-learning outcome, assessing a random sample of student assignments drawn from program courses.
The goal of this study was to help students reach the intended learning outcomes for metacognition, digital fluency, communication, cultural fluency, global fluency, servant leadership, and commitment to lifelong learning. These learning outcomes are defined as follows:

• Metacognition: Students will demonstrate the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
• Digital fluency: Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
• Communication: Students will communicate ideas and content to actively engage participants.
• Cultural fluency: Students will evidence understanding of generational and cultural learning styles.
• Global fluency: Students will develop instructional materials appropriate for a global perspective.
• Servant leadership: Students will practice the principles of servant leadership as espoused by Robert Greenleaf in The Servant as Leader (2008). According to Greenleaf, “The servant-leader is servant first. It begins with the natural feeling that one wants to serve first. Then conscious choice brings one to aspire to lead.”
• Commitment to lifelong learning: Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.
Digital education presents many challenges. Barnett-Queen, Blair, and Merrick (2005) identified perceived strengths and weaknesses of online discussion groups and subsequent instructional activities. Programmatic assessment is required of all institutions accredited by the Council for Higher Education Accreditation or the US Department of Education. Walvoord (2003) indicated that good assessment should focus on maximizing student performance. Three questions rise to the forefront: (1) Have graduates mastered programmatic expectations? (2) What relationships exist between student performance and other factors? (3) How can faculty improve the program based on an analysis of student performance? Walvoord further stressed the importance of direct assessment in determining student performance. Indirect measures may provide evidence of student learning, but direct assessment is widely viewed as more valid and reliable.
Brandon, Young, Shavelson, Jones, Ayala, Ruiz-Primo, and Yin (2008) developed a collaborative model for embedded formative assessment. Their study also highlighted the difficulties of broad-based collaboration, given the challenges of formally identifying partners and spanning large geographic distances. Price and Randall (2008) demonstrated the importance of embedded direct assessment in lieu of indirect assessment. Their research revealed a lack of correlational fit between indirect and direct assessment of the same aspect of student learning within the same course in a pre- and post-test design, documenting a difference between students’ perceived knowledge and their actual knowledge. These findings further underscore the importance of direct assessment of student learning. Walvoord’s (2003) findings likewise indicated the need for embedded direct assessment of student learning that is owned and supported by those who will implement change, namely program faculty and students.
Gardner (2007) found that education has long wrestled with defining and assessing lifelong learning. Though loosely defined as the continued educational growth of the individual, lifelong learning is assuming a more prominent place in 21st-century education than it held in the 20th. Brooner (2002) described the difficulty of assessing the intention to pursue learning beyond the completion of a program; intention and subsequent performance are affected by many factors, including normative beliefs and motivation. Educational programs have often been encouraged to avoid assessing behavior beyond the point of graduation, as such behavior has been viewed as beyond the control of program educators (Walvoord, 2003). The question arises as to the importance of future behavior as an indicator of current learning.
Astin (1993) pointed out that educators are inclined to avoid assessment of the affective domain, viewing it as too value-laden. Accordingly, the cognitive domain became the de facto assessment area, even though affective assessment more closely parallels the stated aims and goals of most institutions of higher education. The avoidance of assessment in the affective domain is well documented by Astin, who noted that a change in the affective domain should translate into changed behavior. The advent of social media tools coupled with e-portfolios offers intriguing possibilities for assessment in the affective-behavioral domain.
Secolsky and Wentland (2010) found many advantages of portfolio assessment that transcend regular assessment practices by providing a glimpse into unstructured behavioral activities. Behavior beyond the classroom can be captured and documented within a properly designed portfolio; behavior the teacher has not directly observed can be measured through portfolio submissions comprising a broad collection of relevant, targeted information. Established performance criteria can then be assessed to measure student learning and to identify specific areas for programmatic improvement. Though Secolsky and Wentland point out that reliability and validity concerns still attend portfolio measurement, they concur that portfolio assessment can gauge authentic student performance outside the educational environment. With a portfolio that is transportable beyond program enrollment and across the life experience, the opportunity exists to assess the impact of the instructional experience on real-world student performance. Evaluation of lifelong portfolios promises meaningful insight into the real-life impact of the educational experience; indeed, Astin (1993) viewed changed behavior over time as the real evidence of affective enlightenment.
An interesting finding from this study was the creative manner in which some students layered, or nested, other Web 2.0 technologies into their MERLOT web pages. Examples included embedded student-developed VoiceThread presentations, embedded open-ended discussion VoiceThreads used to promote participation and feedback, embedded YouTube videos, embedded Prezis, and the like.
The integration of MERLOT GRAPE Camp peer-review training into this master’s degree program has provided an additional platform for further research on the assessment of all seven programmatic learning outcomes. For example, metacognition may be assessed as MERLOT peer reviewers serve as content experts in evaluating materials in their own fields. Communication may be assessed through interaction with peers and through peer reviews. Digital fluency is required simply to contribute to MERLOT. Cultural fluency may be demonstrated by peer reviewing submissions from MERLOT’s international community of partners. Global fluency may be measured through the development and contribution of content appropriate for a global community of learners. Servant leadership is embodied in MERLOT’s motto, “Give a Gift not a Burden!” (Gerry Hanley, 2010). Finally, the development of students into lifelong learners will help to establish the identity of the program. Student performance outside the program is one of the best measures of student learning, and the MERLOT Content Builder, together with MERLOT peer reviews, is a strong platform for measuring student-learning outcomes.
Lifelong learning may be assessed by current and former students’ contributions of materials to MERLOT and by their peer reviews of materials contributed to MERLOT. As a benefit of being a MERLOT partner, the dashboard report provides information on contributions made by members of the partner organization; contributions and peer reviews completed by program graduates will appear in this report, making it a powerful tool for measuring commitment to lifelong learning. Ultimately, this study has demonstrated that the MERLOT platform combined with emerging technology is integral to assessing student-learning outcomes in an online master’s program at a regional university in Texas. Other online degree programs should seriously consider the MERLOT Content Builder’s potential to help them assess student-learning outcomes.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The Online Master of Science in Global eLearning equips specialists in education for practice in public education, private education, business, industry, and non-profit organizations. Learning and technology are intertwined as we develop the next generation of enhanced training, development, and teaching to engage learners with key components of instructional technology. Technology provides access to all forms of education, and this program teaches educators how to implement technology across curricula and classrooms of all kinds. With a blend of theory and technical skills, the program prepares teachers and corporate trainers alike.

Metacognition – Students will demonstrate the knowledge and skills for designing, developing, and evaluating personal strategies for learning and leading.
Five journal entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Metacognition Rubric. Scores will be deemed acceptable at an average of 4.0 or higher on a 5-point scale in each of the areas of context & meaning, personal response, personal reflection, and interpretive skills.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Context & Meaning 4.27
Personal Response 4.13
Personal Reflection 4.40
Interpretive Skills 4.47

All standards were met, though the faculty noted that the personal response area scored lowest at 4.13. Accordingly, EDUC 595 Research Methodology was expanded to include more opportunities for students to provide self- and peer-evaluation feedback on projects and assignments. Two assessments were recommended for AY 2013-2014: one course will be assessed in the fall and one in the spring.
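Each outcome below follows the same quantitative pattern used here: sample five artifacts at random, have each of the three reviewers score every artifact against the rubric, average the scores per criterion, and compare each average to the published benchmark. The following minimal sketch (Python) illustrates that bookkeeping; the helper names, the random seed, and the score data are hypothetical, chosen only so the averages approximate the metacognition results above — it is not the program’s actual tooling.

```python
import random
import statistics

BENCHMARK = 4.0  # "acceptable" threshold on the 5-point metacognition scale

def sample_artifacts(artifacts, k=5, seed=None):
    """Randomly select k student artifacts for rubric review."""
    rng = random.Random(seed)
    return rng.sample(artifacts, k)

def report(results_by_criterion, benchmark=BENCHMARK):
    """Average all reviewer scores per criterion and compare to the benchmark."""
    for criterion, scores in results_by_criterion.items():
        avg = statistics.mean(scores)  # 3 reviewers x 5 artifacts = 15 scores
        status = "met" if avg >= benchmark else "NOT met"
        print(f"{criterion:<20} {avg:.2f}  standard {status}")

# Step 1: choose five artifacts at random from the course's submissions.
entries = [f"journal-entry-{i:02d}" for i in range(1, 31)]  # hypothetical pool
print("Sampled for review:", sample_artifacts(entries, seed=2013))

# Step 2: tally hypothetical 5-point scores approximating the results above.
report({
    "Context & Meaning":   [4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 4, 4],  # 4.27
    "Personal Response":   [4, 4, 4, 4, 5, 4, 4, 4, 4, 5, 4, 4, 4, 4, 4],  # 4.13
    "Personal Reflection": [5, 4, 5, 4, 5, 4, 4, 5, 4, 5, 4, 5, 4, 4, 4],  # 4.40
    "Interpretive Skills": [5, 5, 4, 4, 5, 4, 5, 4, 5, 4, 4, 5, 4, 4, 5],  # 4.47
})
```

Substituting the 50-point and 4-point benchmarks used below makes the same sketch applicable to the other six outcomes.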

Communication – Students will communicate ideas and content to actively engage participants.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Assessment of Digital Student Presentation Rubric. Scores will be deemed acceptable at an average of 42 or higher on a 50-point scale in each of the areas of purpose, organization, content, language, and voice & tone. The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Purpose 45.33
Organization 46.67
Content 46.00
Language 44.00
Voice & Tone 44.67
Technology 45.33

All standards were met, though the faculty noted that language scored the lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 515 Intercultural Education to give students an opportunity to develop their language skills through a project designed to heighten sensitivity to language that might be offensive in other cultures.

Two assessments were recommended for AY 2013-2014.

Digital Fluency - Students will evidence digital fluency in the adoption and integration of appropriate technologies into digital presentations.
Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Assessment of Digital Student Presentation Rubric. Scores will be deemed acceptable at an average of 45 or higher on a 50-point scale in the area of technology.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Technology 45.33

The standard was met.
The faculty noted that students tended to use more familiar software and to avoid emerging tools. Accordingly, EDUC 510 Utilizing Effective Instructional Technology was modified to require the use of at least one Web 2.0 application to complete an assignment.

The faculty will conduct two evaluations in AY 2013-2014.

Cultural Fluency – Students will evidence understanding of generational and cultural learning styles.

Five student digital presentations will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Cultural Fluency Rubric. Scores will be deemed acceptable at an average of 3.0 or higher on a 4-point scale in the areas of knowledge & comprehension, analysis & synthesis, and evaluation.

The assessment was conducted by two full-time faculty members and one external faculty member on March 6, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 3.53
Analysis & Synthesis 3.07
Evaluation 3.67

The standard was met, though the faculty noted that analysis & synthesis scored lowest. Accordingly, the curriculum for EDUC 552 Global Fluency was expanded to include group projects on the education systems of other cultures.

The faculty will also conduct two evaluations in AY 2013-2014.

Global Fluency – Students will develop instructional materials appropriate for a global perspective.

Five group-project entries will be selected at random from a course offered in Summer 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Global Fluency Rubric. Scores will be deemed acceptable at an average of 2.8 or higher on a 4-point scale in each of the areas of knowledge & comprehension, application, and evaluation.

The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Knowledge & Comprehension 2.87
Application 3.00
Evaluation 2.87

The standards were met.

Faculty found student performance in this area to be adequate, though some challenges were noted in the use of stereotypes when identifying people from other cultures. EDUC 515 Intercultural Education will be expanded to include a project in which students interview someone from a different culture to discover differing worldviews and share their findings in a forum with classmates.

Servant Leadership – Students will practice the principles of servant leadership as espoused by Robert Greenleaf.

Five student group-project self-assessment packets will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Servant Leadership Rubric. Scores will be deemed acceptable at an average of 40 or higher on a 50-point scale in each of the areas of servant leadership, strategic insight & agility, building effective teams & communities, and ethical formation & decision making.

The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

Servant Leadership 41.33
Strategic Insight & Agility 39.33
Building Effective Teams & Communities 44.00
Ethical Formation & Decision Making 43.33

The standard was NOT met for Strategic Insight & Agility.

Faculty noted problems with the effectiveness of feedback in the peer-evaluation assignment. Accordingly, the group peer-assessment process has been expanded to include MERLOT GRAPE Camp training on conducting peer evaluations. All students will be required to complete the MERLOT GRAPE Camp training, and these changes will be enacted in all new course sections.

Commitment to Life-Long Learning – Students will evidence a commitment to lifelong learning in the production and evaluation of learning materials.

Five portfolio entries will be selected at random from a course offered in Fall 2012. These will be evaluated by the full-time bachelor’s and master’s faculty using the Global eLearning Commitment to Life-long Learning Rubric. Scores will be deemed acceptable at an average of 3.0 or higher on a 4-point scale in each of the assessed areas (production of educational materials, publications, and presentations), including personal response, personal evaluation, and interpretive skills.
The assessment was conducted by two full-time faculty members and one external faculty member on July 22, 2013; the external reviewer was added to strengthen the review. Results were as follows:

MERLOT Web Pages 3.40
Presentations 3.80
Peer Evaluations 3.60

The standard was met, though the faculty noted that MERLOT web pages scored the lowest. The faculty decided to conduct two assessments for the next cycle: one in the fall and one in the spring.

The faculty modified an assignment in EDUC 528 Intro. to Presentation Design to make the MERLOT web page a requirement rather than an option.

Two assessments were recommended for AY 2013-2014.

How does this practice relate to pillars?: 

1) Leveraging the MERLOT Content Builder with emerging technology to assess programmatic student-learning outcomes is scalable: it encourages more online instructors and instructional designers to integrate this model to measure how effectively assignments meet the goals of institutional effectiveness planning.

2) Increases access by using MERLOT’s Content Builder combined with emerging technology to showcase learning outcomes openly, so that students and faculty can assess them from any location with an internet connection.

3) Improves faculty satisfaction by providing faculty with open access to student assignments for assessing programmatic student-learning outcomes as part of institutional effectiveness planning. Since this model was used to complete a recent institutional effectiveness plan for an online master’s degree program in preparation for a regional accreditation visit, other instructors can easily replicate it to evaluate their own programs.

4) Improves learning effectiveness by providing instructors with effective online strategies that are supported by empirical data from assessments of random samples of student assignments.

5) Promotes student satisfaction by providing valuable opportunities for interaction with the instructor and other students. Students work together on group projects for both synchronous and asynchronous presentations, and they complete group and individual projects in which they evaluate the work of their peers and provide feedback. Rubrics are embedded in the grade book of the LMS to evaluate student assignments, and an evaluation tool for the programmatic student-learning outcome tied to each assignment is also included in the grade book to assess the level of student understanding. Students regularly comment on how valuable these practices are to their learning experience.

Equipment necessary to implement Effective Practice: 

The only strictly necessary components are an internet connection and an LMS. In our program, students also used Camtasia, QuickTime, and Captivate to create videos for some of their individual projects. Group projects were completed using Google+ Hangouts, Skype, VoiceThread, and Adobe Connect. Students also created MERLOT web pages, MDL 2 courses, and digital portfolios.

Some of the tools we used have costs associated with them. Here is a list of some of them:

• Synchronous tools: Adobe Connect, Google Hangouts, Google Chat, Skype
• Asynchronous tools: VoiceThread, MERLOT Content Builder, Prezi, MERLOT GRAPE Camp, the Peer Review Workshop, and discussion forums in the LMS
• Reflective tools: journals, self-assessments, and digital portfolios

Estimate the probable costs associated with this practice: 

The only additional costs are optional and involve emerging technologies that are not free or open source. All other resources used in this project were freely available, and we incurred no additional costs using them. There was essentially no budget for this project.

References, supporting documents: 

Astin, A. (1993). Assessment for Excellence. Westport, CT: Oryx Press.

Barnett-Queen, T., Blair, R., & Merrick, M. (2005). Student perspectives of online discussions: Strengths and weaknesses. Journal of Technology in Human Services, 23(3/4), 229-244.

Brandon, P., Young, D., Shavelson, R., Jones, R., Ayala, C., Ruiz-Primo, M., & Yin, Y. (2008). Lessons learned from the process of curriculum developers’ and assessment developers’ collaboration on the development of embedded formative assessments. Applied Measurement in Education, 21, 390-402.

Gardner, P. (2007). The ‘life-long draught’: From learning to teaching and back. History of Education, 36(4-5), 465-482.

Greenleaf, R. A. (2008). The Servant as Leader. Westfield, IN: The Greenleaf Center for Servant Leadership.

Price, B., & Randall, C. (2008). Assessing learning outcomes in quantitative courses: Using embedded questions for direct assessment. Journal of Education for Business, 83(5), 288-294.

Secolsky, C., & Wentland, E. (2010). Differential effect of topic: Implications for portfolio assessment. Assessment Update, 22(1).

Walvoord, B. (2003). Assessment in accelerated programs: A practical guide. New Directions for Adult & Continuing Education, 97, 39-50.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Rick Lumadue
Email this contact: 
proflumadue@gmail.com
Effective Practice Contact 2: 
Rusty Waller
Email contact 2: 
rusty.waller@tamuc.edu