Assessing the Effectiveness of Online Educator Preparation

Award Winner: 
2013 Sloan-C Effective Practice Award
Collection: 
Vendor EPs
Author Information
Author(s): 
Robert L. Blomeyer, Ph.D.
Author(s): 
Dazhi Yang, Ph.D.
Author(s): 
Andy Hung, Ed.D.
Institution(s) or Organization(s) Where EP Occurred: 
Boise State University (Independently conducted construct validity study)
Institution(s) or Organization(s) Where EP Occurred: 
Online Teaching Associates, Ltd. (Secondary Educational Research and Development)
Institution(s) or Organization(s) Where EP Occurred: 
North Central Regional Educational Research Laboratory (2001-2005, Primary R&D)
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Assessment and evaluation of online professional development (PD) programs, workshops, and courses, and examination of their presumed effects on instructors' competency and performance, have relied largely on post-PD self-reported surveys. However, no valid or reliable self-reported surveys (assessment instruments, performance rubrics, or other assessment techniques) were available for such assessment and evaluation. This study reports the validation, using psychometric methods, of an "Online Educator Self-Efficacy Scale" (OESES), which can be used to assess the effects or effectiveness of PD training for teaching online. Implications of the study provide support for: (1) evidence-based evaluations of online professional development programs, and (2) effective "data-driven" decision-making by online program administrators.

Description of the Effective Practice
Description of the Effective Practice: 

See: The Validation of A Research-based Tool for Assessing the Effectiveness of Online Professional Development Programs. By Dazhi Yang, Andy Hung & Robert Blomeyer. Paper presented at the 2013 American Educational Research Association's Annual Conference (AERA) in San Francisco, CA. May 1st, 2013.

http://onlineteachingassociates.com/wp-content/uploads/2013/04/AERA_2013...

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Evidence illuminating the "construct validity" of OTA's Online Educator Self-Efficacy Survey (OESES) is described in the AERA paper provided under "Effective Practice."
The OESES pre and post versions can be downloaded here:

http://onlineteachingassociates.com/?page_id=1167

Both are available under a Creative Commons license (BY-NC-ND).

Preliminary evidence supporting the impact and "effectiveness" of OTA's "Online Teaching Course" (OTA-121) is available for download here:

http://onlineteachingassociates.com/wp-content/uploads/2012/05/Summary-F...

Note: When the data from the 344 cases cited were collected in 2011, OTA could not claim they had been collected using a "valid and reliable" assessment instrument. Also note that the 231 cases used by Yang & Hung to assess the construct validity of the OESES instrument were collected in Wisconsin by CESA-9 using its licensed copy of SurveyMonkey. The data analyzed by Yang & Hung went directly from CESA-9's project administrator to independent educational researchers at Boise State University. Their analysis was conducted with the approval of, and administrative oversight by, BSU's Human Subjects committee.
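The AERA paper documents the psychometric methods Yang & Hung actually used in the construct-validity study. For readers unfamiliar with this kind of analysis, the minimal sketch below illustrates two common preliminary checks on a Likert-type scale (internal consistency and dimensionality of the item set); it is not the study's analysis, and the file name and item-column prefix are hypothetical placeholders.

```python
# Illustrative only: minimal internal-consistency and dimensionality checks of the
# kind that typically precede a construct-validity study. The file name and the
# "item_" column prefix below are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for Likert-type item responses (rows = respondents)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical export of OESES responses from an online survey delivery system.
responses = pd.read_csv("oeses_pre_responses.csv")   # one column per scale item
items = responses.filter(like="item_")               # assumed item-column prefix

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Quick dimensionality check: eigenvalues of the inter-item correlation matrix
# (eigenvalues greater than 1 suggest how many factors may be worth retaining).
eigenvalues = np.linalg.eigvalsh(items.corr().to_numpy())[::-1]
print("Eigenvalues:", np.round(eigenvalues, 2))
```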

How does this practice relate to pillars?: 

Sloan-C's third pillar is "Learning Effectiveness." The following quote comes from the "Background" section of the 2013 AERA paper by Yang, Hung & Blomeyer, provided above under "Description of the Effective Practice."

"The effectiveness of online courses depends mostly on instructor’s effectiveness of teaching online (Rice, 2012). Knowledge and skills developed to teach in face-to-face settings are not adequate preparation for teaching online (Deubel, 2008). Many of today’s online instructors still lack the needed skills and knowledge to teach effectively online. Few teacher education programs in the U.S. offer any training in learning theories or teaching pedagogies appropriate for online environments (Patrick & Dawley, 2009).

According to Dawley, Rice and Hinck’s report on the status of professional development and needs of K-12 online teachers (2010), approximately 12% of brand new teachers had never taught face-to-face and 25% of new online teachers received no training in online teaching pedagogies. Professional development (PD) programs, workshops and courses designed for effective online teaching are usually the most common way for teachers to obtain the necessary knowledge, skills and competency for teaching online.

Assessment and evaluation of such online PD programs and online instructors’ competency and performance after receiving the PD training have been largely relied on post PD self-reported surveys. However, there were no valid or reliable self-reported assessment instruments, performance rubrics or assessment techniques available which could be used in online settings (Wijekumar, Ferguson, & Wagoner, 2006; Yang, Richardson, French, & Lehman, 2011). Therefore, there was a need to develop and/or validate assessment and evaluation instruments for the evaluation of PD training programs, workshops, and courses which target for effective online teaching. Only valid and reliable instrument can provide solid results of the effectiveness of the PD training and provide recommendations for the improvement of such programs, workshops and courses."

Op. cit., p. 2.

Equipment necessary to implement Effective Practice: 

Requirements for using the OESES assessment:

Pre and post versions of the OTA Online Educator Self-Efficacy Survey instrument and any suitable online survey delivery system.

Requirements for using the OTA Online Teaching Course:

Secondary and post-secondary education institutions & programs should contact OTA's President and CEO (Dr. Robert Blomeyer) to inquire about licensing either the secondary or post-secondary version of OTA's Online Teaching Course.

Estimate the probable costs associated with this practice: 

The OESES instrument is available for use by not-for-profit secondary and post-secondary educational institutions and programs without cost, under a Creative Commons BY-NC-ND license.

The OTA Online Teaching Course is available as online professional development for mid-level/secondary teachers and for full-time or adjunct post-secondary faculty members. Costs range from $350.00 per seat for fully facilitated professional development courses (on OTA's Moodle server) to annual ($5,000 per year) or perpetual ($20,000) fees for an unlimited-enrollment license restricted to use of OTA's course on the contracting institution's local LMS. The Online Teaching Course is available for deployment on most LMSs as an IMS-compliant Common Cartridge.

References, supporting documents: 

From: 2013 AERA Paper by Yang, Hung & Blomeyer, referenced above

References
Alagumalai, S., Curtis, D.D. & Hungi, N. (2005). Applied Rasch Measurement: A book of exemplars. Springer-Kluwer.
Allen, E. & Seaman, J. (2012). Going the distance: Online education in the United States, 2011. Retrieved from: http://www.onlinelearningsurvey.com/reports/goingthedistance.pdf
American Educational Research Association, American Psychological Association & National Council on Measurement in Education (1999). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.
Bruce Consulting. (2004). Concept Validity of the Online Teaching Facilitation Course. Naperville, IL: Learning Point Associates. Retrieved from: http://onlineteachingassociates.com/?attachment_id=1259
Cavanaugh, C. & Blomeyer, R. (2007, December). What works in K-12 online learning. In the proceedings of the International Society for Technology in Education, Eugene, OR.
Dawley, L., Rice, K., & Hinck, G. (2010). Going virtual! 2010: The status of professional development and unique needs of k–12 online teachers. White paper prepared for the North American Council for Online Learning. Washington, D.C. Retrieved from: http://www.inacol.org/research/docs/goingvirtual.pdf
French, B. (2006). EDPS 630: Research procedures in education. West Lafayette, IN. Copymat Services.
ISTE. (2011). NETS for Teachers. Retrieved from: http://www.iste.org/standards/nets-for-teachers.aspx
iNACOL. (2011). National Standards for Quality Online Teaching, Version 2. Retrieved from: http://www.inacol.org/research/nationalstandards/iNACOL_TeachingStandard...
Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed.) (pp. 13–104). New York: American Council on Education, National Council on Measurement in Education.
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749.
Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10). Retrieved May 31, 2007, from http://PAREonline.net/getvn.asp?v=7&n=10
Partnership for 21st Century Skills. (2011). Framework for 21st Century Learning. Retrieved from: http://www.p21.org/overview/skills-framework
Rice, K. (2012). Making the move to K-12 online teaching: Research-based strategies and practices. Saddle River, NJ: Pearson Education.
Schwarzer, R. (n.d.). The General Self-Efficacy Scale (GSE). Retrieved from: http://userpage.fu-berlin.de/~health/engscal.htm
Schwarzer, R., & Jerusalem, M. (1995). Generalized Self-Efficacy scale. In J. Weinman, S. Wright, & M. Johnston, Measures in health psychology: A user’s portfolio. Causal and control beliefs (pp. 35-37). Windsor, UK: NFER-NELSON.
Wijekumar, K., Ferguson, L. & Wagoner, D. (2006). Problems with Assessment Validity and Reliability in Web-Based Distance Learning Environments and Solutions. Journal of Educational Multimedia and Hypermedia, 15(2), 199-215.
Yang, D., Richardson, J. C., French, B. F., & Lehman, J. D. (2011). The development of a content analysis model for assessing students’ cognitive learning in asynchronous online discussions. Educational Technology Research and Development, 59(1), 43-70.

Other Comments: 

OTA provided a customized version of the OTA-121 course for a group of faculty and adjuncts from the National College of Education at National Louis University in Chicago in late 2010. The OESES instrument was administered pre- and post-course to that NLU cohort. A pre/post t-test comparison is attached here as a "Supporting Document." Although the number of NLU cases considered in that t-test comparison is relatively small, the results appear comparable to t-tests from 2011 based on as many as 244 cases.
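For illustration only, a pre/post comparison of this kind can be run as a paired t-test. The short sketch below assumes a hypothetical CSV export with per-participant pre and post total scores; the actual NLU analysis is in the attached supporting document.

```python
# Illustrative sketch of a pre/post paired comparison like the one attached as a
# supporting document. The file and column names here are hypothetical placeholders.
import pandas as pd
from scipy import stats

scores = pd.read_csv("nlu_oeses_pre_post.csv")   # hypothetical: one row per participant
pre = scores["pre_total"]                         # assumed pre-course total score column
post = scores["post_total"]                       # assumed post-course total score column

t_stat, p_value = stats.ttest_rel(post, pre)      # paired (dependent-samples) t-test
mean_gain = (post - pre).mean()
print(f"Mean pre-to-post gain: {mean_gain:.2f}  (t = {t_stat:.2f}, p = {p_value:.4f})")
```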

 

Dr. Karen Swan (UIS) served as a paid IES reviewer for the NCREL Online Teaching Facilitation Course (OTFC), developed originally at NCREL using Blackboard in 2002-03. See Yang, Hung & Blomeyer (2013), the section titled "The Online Educator Self-Efficacy Scale (OESES)" on p. 3, for details.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Dr. Robert Blomeyer (OTA)
Email this contact: 
rblomeyer@earthlink.net
Effective Practice Contact 2: 
Dr. Dazhi Yang (BSU)
Email contact 2: 
dazhiyang@boisestate.edu
Effective Practice Contact 3: 
Dr. Karen Swan (University of Illinois Springfield), audience participant at AERA
Email contact 3: 
kswan4@uis.edu