Elevating participation and outcomes with digitized assessments in large-enrollment foundational STEM curricula: An immersive development workshop for STEM faculty

Author Information
Author(s): 
Ronald DeMara
Baiyun Chen
Richard Hartshorne
Institution(s) or Organization(s) Where EP Occurred: 
University of Central Florida
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

The Assessment Digitization Innovation (ADI) faculty workshop at the University of Central Florida (UCF) focuses on digitizing and remediating STEM assessments in large-enrollment foundational STEM curricula. From 2016 to 2018, three cohorts totaling 25 instructors and 23 TAs from various STEM disciplines participated in this workshop and redesigned 14 flipped, blended, and online courses across eleven degree programs, impacting over 10,000 students.

Description of the Effective Practice
Description of the Effective Practice: 

Research has shown that digitized assessments enable instructors to manage their instructional activities efficiently and effectively (DeMara, Chen, Hartshorne, & Thripp, 2017) and ultimately lead to higher student learning across learning modalities (Angus & Watson, 2009; Schurmeier, Shepler, Lautenschlager, & Atwood, 2011). However, the design of effective digitization for formative and summative assessments suitable for online delivery remains an open challenge across disciplines in science, technology, engineering, and mathematics (STEM) (Chen, DeMara, Salehi, & Hartshorne, 2018). The STEM-specific challenges include the need to adequately evaluate conceptual understanding, design skills, and solution structure, which exceed the capabilities of rote multiple-choice formats. At the University of Central Florida (UCF), we have developed and evaluated a six-week cross-disciplinary Assessment Digitization Innovation (ADI) Workshop that supports STEM faculty in developing digitized assessments for flipped, blended, and online courses.

The faculty workshop, having recently completed its third year of implementation, includes four face-to-face in-class sessions and two online modules. At the end of the six-week program, each participating instructor showcases an online assessment designed and developed throughout the workshop. Primary topics of the program include: 1) strategies to construct effective STEM assessments, 2) using relevant question types and features in Canvas, a learning management system (LMS), 3) implementing authentic assessment, 4) strategies to encourage academic integrity in online assessments, and 5) composing exemplar design vignette questions to reinforce connections between concepts to achieve integrative learning (DeMara, Chen, Hartshorne, & Thripp, 2017). As a result of the course redesign, the participating instructors improved the quality and productivity of their instruction by focusing their time and effort on high-impact mentoring and collaborative activities for soft-skill development, rather than low-yield logistical tasks such as distributing paper-based tests and grading (DeMara, Salehi, Hartshorne, Chen, & Saqr, in review).

The content of the ADI workshop was organized into six modules, delivered at a rate of one module per week, plus a preparatory Week 0 module. The workshop began with an overview of STEM pedagogy in Week 1 and then engaged participants in planning the modularization of their target courses. An immersive sample quiz was administered in Week 2 so that participants experienced the same process their students would in a digitized quiz. Weeks 3–4 concentrated on constructing study sets using exemplar vignettes, the process of score clarification as a pedagogical approach, and the question-development flow. A panel discussion with graduate assistants, in which participating faculty could ask questions about logistics, was also held and was well received. Week 6 culminated in a capstone showcase activity, where all participating faculty presented their digitized assessments to their peers and received a graduation certificate for their professional development records.

As promulgated in the ADI workshop, each study set typically focused on a single technical content area and/or principle. STEM problems were then decomposed into detailed subsections to enable partial-credit formulation. The guidance provided in the ADI workshop focused on identifying the governing equations for each sub-step of the problem, in an effort to accommodate varied partial-credit approaches; a sketch of this decomposition follows. As a result, submitted solutions can demonstrate both the approach and the precise calculations required to solve the given problems. Further, the ADI workshop modeled flipped and blended classroom pedagogies by providing out-of-class homework with open solutions, but without submission; credit was instead earned by completing the quiz corresponding to a study set.
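
To make the sub-step decomposition concrete, the following minimal Python sketch awards partial credit for each sub-step of a problem, with each sub-step tied to one governing equation and graded against a numeric tolerance. This is an illustration only, not the workshop's actual tooling (grading was performed within Canvas); the rubric structure, names, and values are assumptions for the example.

    # Minimal sketch: per-sub-step partial credit for a decomposed STEM
    # problem. Rubric structure, names, and values are illustrative.
    from dataclasses import dataclass

    @dataclass
    class SubStep:
        name: str          # sub-step tied to one governing equation
        points: float      # credit available for this sub-step
        expected: float    # correct intermediate value
        tolerance: float   # relative tolerance for numeric answers

    def grade(substeps: list[SubStep], answers: dict[str, float]) -> float:
        """Award full credit for each sub-step answered within tolerance."""
        earned = 0.0
        for step in substeps:
            answer = answers.get(step.name)
            if answer is None:
                continue  # unanswered sub-steps earn no credit
            if abs(answer - step.expected) <= step.tolerance * abs(step.expected):
                earned += step.points
        return earned

    # Example: a two-step circuit problem with made-up values.
    steps = [
        SubStep("equivalent resistance", points=4, expected=60.0, tolerance=0.01),
        SubStep("branch current", points=6, expected=0.2, tolerance=0.01),
    ]
    # The first sub-step is correct; the second misses the 1% tolerance.
    print(grade(steps, {"equivalent resistance": 60.0, "branch current": 0.19}))  # -> 4.0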

The ADI workshop covered various question types for digitizing assessments, including multiple-choice, multiple-answer, and formula questions. Quiz items were constructed as incremental assessments with partial credit to enable precise evaluation of students' comprehension and problem-solving ability. Further, the ADI workshop advocated assigning a multiple-day window for completing each assessment while utilizing several clones of each question, each based on a different version of a core problem, so that different students would be unlikely to receive identical problems, as sketched below.
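
The following Python sketch illustrates one way such question clones could be generated by randomizing the parameters of a core problem; Canvas formula questions provide comparable variable substitution natively. The problem template, parameter ranges, and function names are assumptions for the example.

    # Sketch: generating clones of a core problem by randomizing parameters,
    # in the spirit of Canvas formula questions. Template and ranges are
    # illustrative assumptions.
    import random

    TEMPLATE = ("A resistor of R = {r} ohms carries a current of I = {i} A. "
                "What is the voltage drop across it, in volts?")

    def make_clone(seed: int) -> dict:
        """Produce one version of the core problem with its answer key."""
        rng = random.Random(seed)            # seeded for reproducibility
        r = rng.choice([10, 22, 47, 100])    # standard resistor values
        i = round(rng.uniform(0.1, 2.0), 1)  # current in amperes
        return {"stem": TEMPLATE.format(r=r, i=i), "answer": round(r * i, 2)}

    # Distinct clones, so different students receive different numbers.
    for clone in [make_clone(seed) for seed in range(5)]:
        print(clone["stem"], "->", clone["answer"])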

Overall, this workshop-driven transportability approach has demonstrated efficacy against the challenges facing assessment digitization within STEM curricula. These included novel approaches to surmounting the challenges of digitized delivery of automatically-graded partial credit, solution composability/traceability (including handwritten work via scanned-in scratch sheets), and problem conveyance within the constraints of contemporary LMSs to realize:
1) auto-grading for formative and summative assessments for STEM design problems,
2) computer-based methods for secure self-paced review of solutions by students with rapid remediation not previously possible, and
3) a novel Score Clarification approach utilizing a hierarchy of expertise, with GTAs serving as tutors for routine concerns and the instructor providing deeper guidance and follow-up. This motivates learners, in pursuit of partial credit, to explain the problem-solving flow they used in their formative assessment submissions. Thus, digitization of assessments can increase student engagement through face-to-face or online tutoring interwoven with assessment via Socratic discussions, fostering metacognition.

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

The faculty workshop was delivered in the summer semesters of 2016, 2017, and 2018. Participants included 25 instructors and 23 Graduate Teaching Assistants (GTAs) teaching gateway courses in the STEM disciplines. The participating instructors redesigned their courses with a course-release incentive from their department or the Center for Distributed Learning, impacting over 10,000 undergraduate students at UCF. As a result of the digitization initiative, various funding sources were secured, including a $275K technology fee grant and State of Florida Information Technology Performance Funds totaling roughly $400K to date for technologies, equipment, course releases, and related expenses. Additionally, this faculty workshop laid the groundwork for a subsequently awarded National Science Foundation grant on the topic of Digitally-Mediated Team Learning:
https://www.nsf.gov/awardsearch/showAward?AWD_ID=1825007

How does this practice relate to pillars?: 

This practice relates to pillars in the following ways.

Learning Effectiveness
A total of fourteen faculty participants completely or partially redesigned their courses with digitized assessments and active learning strategies after the conclusion of the ADI workshop. Student learning outcomes in the redesigned courses have been comparable to or better than those of traditional courses. For instance, in one redesigned Electrical Engineering course, withdrawals were reduced by 26.8% across multiple semesters when the same instructor adopted the new modality, increasing retention by elevating engagement of at-risk students via directed tutoring (DeMara, Khoshavi, Pyle, Edison, Hartshorne, Chen, & Georgiopoulos, 2016). In a redesigned Computer Science course, students' assessment scores in the digitized format were comparable with those in the traditional format (DeMara, Turgut, Nassiff, Bacanli, Bidoki, & Xu, 2018). One Mechanical Engineering course was delivered in a blended format for the first time in spring 2018. With its new testing delivery and remediation mechanisms, students' learning achievements increased by up to 16.9% compared to conventional assessment strategies, while utilizing comparable instructor resources and workloads (Tian & DeMara, 2018).

Scale
Here is a list of the course titles that have been delivered in the redesigned format:
EEL3004: Electrical Networks
ESI4234: Quality Engineering
EEL4781: Computer Networks
EEE3342: Logic Design
EML4142: Heat Transfer I
COP4331: Object Oriented Software
CAP4104: Human & Tech interaction
COP3223: Intro Programming with C
EGN3343: Thermodynamics
CDA3103: Computer Logic and Organization
ESI4221: Empirical Methods for Industrial Engineering
CGN3700: Civil Engineering Measurements
EGM3601: Solid Mechanics
EGN3321: Engineering Analysis-Dynamics

Here is a list of the course titles that are in the process of redesign by the instructors:
EGN3310: Engineering Mechanics - Statics
CWR3201: Engineering Fluid Mechanics
EML3034: Modeling Methods in MAE
EML4703: Fluid Mechanics II
EAS3800C: Aerospace Engineering Measurements
BSC2010: Biology I
PHY2053: College Physics
MAC1105C: College Algebra
MAC1147: Pre-Calculus Algebra/Trigonometry

Enrollment in these redesigned courses totals over 10,000 students.

Access
Assessment digitization has transformed the management and delivery of these large-enrollment foundational STEM courses, in which students receive timely, personalized support from their instructors and teaching assistants (TAs) based on their performance on formative assessments. Upon the closing of the testing window of each formative and summative assessment, students receive just-in-time feedback on their performance, including the class high score, low score, mean, and a histogram of the score distribution; a sketch of this summary follows. More importantly, students are afforded an opportunity to review their score, the solution, and their submission with the assistance of a TA. Additionally, the TAs and the course instructor are available to provide personalized tutoring based on students' test performance and to assist students with the design and debugging of projects.
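
As a minimal illustration of this just-in-time class summary, the Python sketch below computes the high score, low score, mean, and a text histogram from a list of scores. The score data and bin width are assumptions for the example; the actual feedback was generated within Canvas.

    # Sketch: class-level feedback released when a testing window closes.
    # Scores and bin width are illustrative assumptions.
    from statistics import mean

    def summarize(scores: list[float], bin_width: int = 10) -> None:
        print(f"High: {max(scores):.1f}  Low: {min(scores):.1f}  "
              f"Mean: {mean(scores):.1f}")
        # Text histogram: one row per score bin, one '#' per student.
        for low in range(0, 100, bin_width):
            high = low + bin_width
            count = sum(low <= s < high or (s == 100 and high == 100)
                        for s in scores)
            print(f"{low:3d}-{high:3d} | {'#' * count}")

    summarize([92, 78, 85, 61, 99, 73, 88, 100, 67, 81])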

Faculty Satisfaction
Upon conclusion of the program each summer, anonymous feedback was collected from participating instructors via a post-workshop survey. The results were overwhelmingly positive. All respondents were “very satisfied” with the in-class sessions, the facilitators of the workshops, and the online modules. Specifically, they rated the program topics, examples, and resources provided as highly relevant. The majority of respondents agreed that the workshop would impact their future course design and development in beneficial ways, such as time savings, convenience, student remediation, and the ability to serve large enrollments. Respondents indicated unanimously that the assessment digitization techniques presented were applicable to their targeted courses. They agreed that the digitization methods would improve their ability to serve large enrollments and enhance the convenience of assessment delivery in their courses, and the majority agreed that the techniques can impart valuable time savings for themselves and their GTAs, since freed-up GTA hours can be reallocated to tutoring, increasing their capacity to identify areas for remediation.

Student Satisfaction
In one redesigned Electrical Engineering course, perceptions from students have been overwhelmingly positive: 90% of learners agreed or strongly agreed that the Evaluation and Proficiency Center (EPC) pedagogy of study sets followed by computerized assessment is more effective for learning than homework alone, 78% responded that access to the EPC resulted in a more personalized learning experience, and 87% responded that they wished tutoring from the EPC were available in other courses. Finally, employing an ensemble of GTAs pooled across a variety of degree programs cultivates diversity of thought and delivery. Thus, their collective instructional involvement strengthens understanding throughout UCF's diverse student population (DeMara, Khoshavi, Pyle, Edison, Hartshorne, Chen, & Georgiopoulos, 2016). Students in the redesigned Mechanical Engineering course also responded favorably in the course survey and agreed that computerized questions are adequate for evaluating engineering problem-solving skills compared to conventional exams (Tian & DeMara, 2018).

Equipment necessary to implement Effective Practice: 

The ADI workshop was delivered as a six-week mixed-mode course. The face-to-face meetings were held in the Faculty Multimedia Center, and the sessions were recorded for online participants. All participants had access to the Canvas training course and a sandbox course for their own development.

Estimate the probable costs associated with this practice: 

The probable costs associated with the ADI workshop and course redesign process include instructor compensation and any costs associated with incentivizing the course redesign. All participants were incentivized with a course release, which provided them with the time and resources needed to succeed in implementing digital assessments and the associated pedagogical practices.

References, supporting documents: 

Angus, S. D., & Watson, J. (2009). Does regular online testing enhance student learning in the numerical sciences? Robust evidence from a large data set. British Journal of Educational Technology, 40(2), 255-272.

Chen, B., DeMara, R., Salehi, S., & Hartshorne, R. (2018). Elevating learner achievement using formative electronic lab assessments in the engineering laboratory: A viable alternative to weekly lab reports. IEEE Transactions on Education, 61(1), 1-10. DOI: 10.1109/TE.2017.2706667

DeMara, R. F., Chen, B., Hartshorne, R., & Thripp, R. (2017). Elevating participation and outcomes with computer-based assessments: An immersive development workshop for engineering faculty. ASEE Computers in Education Journal, 8(3), 1–12.

DeMara, R., Khoshavi, N., Pyle, S., Edison, J., Hartshorne, R., Chen, B., & Georgiopoulos, M. (2016). Redesigning computer engineering gateway courses using a novel remediation hierarchy. Proceedings of the American Society for Engineering Education Annual Conference (ASEE), New Orleans, LA, June 26-29.

DeMara, R. F., Turgut, D., Nassiff, E., Bacanli, S., Bidoki, N. H., & Xu, J. (2018). Automated formation of peer learning cohorts using computer-based assessment data: A double-blind study within a software engineering course. Proceedings of the American Society for Engineering Education Annual Conference (ASEE-18), Salt Lake City, UT, June 24-26.

DeMara, R., Salehi, S., Hartshorne, R., Chen, B., & Saqr, E. (in review). Scaling up collaborative learning in large enrollment STEM courses: An exploration of learner perceptions. Submitted to Journal of Interactive Learning Research.

Schurmeier, K. D., Shepler, C. G., Lautenschlager, G. J., & Atwood, C. H. (2011). Using item response theory to identify and address difficult topics in general chemistry. In D. M. Bunce (Ed.), Investigating classroom myths through research on teaching and learning (pp. 137-176). Washington, DC: ACS Publications.

Tian, T., & DeMara, R. F. (2018). Engineering assessment strata: A layered approach to evaluation spanning Bloom's taxonomy of learning. Proceedings of the American Society for Engineering Education Annual Conference (ASEE-18), Salt Lake City, UT, June 24-26.

Contact(s) for this Effective Practice
Effective Practice Contact: 
Ronald DeMara
Email this contact: 
Ronald.DeMara@ucf.edu
Effective Practice Contact 2: 
Baiyun Chen
Email contact 2: 
baiyun.chen@ucf.edu
Effective Practice Contact 3: 
Richard Hartshorne
Email contact 3: 
richard.hartshorne@ucf.edu