Faculty at the University of Reading's School of System Engineering have adapted customer-driven development techniques to involve students in the production of Assessment Learning Objects (ALOs) for computer programming courses. Students were asked to write multiple-choice questions from which ALOs could be created, with two aims: improving the quality of the ALOs and reducing the effort required to develop them in quantity. Creating the source material proved a worthwhile educational activity for the students, while the finished products serve them as learning aids. Faculty, in turn, benefited from greatly reduced production time and effort.
A software development method called eXtreme Programming (XP) uses customer-driven development to involve customers, in particular end users of all levels, in the software (re)design process. At the University of Reading's School of System Engineering, Andrew Adams and Shirley Williams have adapted this technique to involve students in the production of e-learning materials for computer programming courses. In 2004-05, students in a functional programming module were assigned the task of creating multiple-choice questions which would form the basic content of Assessment Learning Objects (ALOs). Students were asked to select one of a half-dozen topics related to Programming in Caml Light and create one multiple-choice question suitable for assessing student understanding of the content. Students were given specific design parameters for creating the questions, including number of answer choices, tips on creating "distracter" answer choices, and instructions for providing useful feedback for incorrect answers. Based on the assumption that the assigned task had unknown educational value, the instructors assigned a very low weight to the task for grading purposes (1% of the overall assessed grade for the module). These ALOs were used for formative assessment only and made available to help students self-assess their progress.
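The design parameters described above (a fixed set of answer choices, deliberately plausible "distracter" options, and feedback attached to each incorrect answer) can be sketched as a simple data structure. This is only an illustration of the question format students were asked to produce, not the authors' actual tooling; the sample Caml Light question, the class names, and the feedback strings are invented here, and the number of choices in the real assignment is not reproduced.

```python
from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str
    correct: bool
    feedback: str  # shown when this option is chosen; explains why it is right or wrong

@dataclass
class MCQ:
    stem: str
    choices: list = field(default_factory=list)  # one correct answer plus distracters

def respond(q: MCQ, index: int):
    """Return (is_correct, feedback) for the chosen option."""
    c = q.choices[index]
    return c.correct, c.feedback

# Hypothetical question in the spirit of the Caml Light assignment
q = MCQ(
    stem="What is the type of the Caml Light expression `function x -> x + 1`?",
    choices=[
        Choice("int -> int", True,
               "Correct: x is used with integer addition, so the function maps int to int."),
        Choice("int", False,
               "This is the type of the result of applying the function, not of the function itself."),
        Choice("'a -> 'a", False,
               "Adding 1 constrains x to int, so the function is not polymorphic."),
    ],
)
```

For formative self-assessment, a front end would present the stem and choices, then show the per-option feedback, which is what makes the distracter design and feedback instructions matter as much as the correct answer.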
The main purposes of this project were to improve the quality of ALOs by involving students in their development and to reduce the effort required for faculty to produce ALOs in large quantities. To demonstrate the utility of this approach, Adams and Williams designed the project as an experiment with the following outcomes in mind:
Most of the participating students (37 out of 44) produced usable material and achieved good marks (3, 4, or 5 out of 5) for their contributions. Although only three students produced "ideal" questions (i.e., receiving marks of 5 out of 5), most produced questions that could serve as ALO source material, yielding finished questions with relatively minor modifications. Only seven of the forty-two questions submitted could not be used as source material.

Plans to obtain additional evidence: According to the paper, a second study was under way, using the same technique in an online C-language imperative programming module that will be used to teach first-year programming to over two hundred students starting Fall 2007. The expectation is that student-generated ALO content will provide excellent support for the production of suitable online self-assessment ALOs. However, these plans have apparently not come to fruition; as Dr. Adams noted in an e-mail message [3/7/07]: "Unfortunately for longitudinal studies of our student-generated content, the Functional Programming module was phased out of our curriculum the year after we did our experiments with the cohort and so continuing use of the material could not be made...I am considering applying this approach for our Discrete Maths section of our Software Engineering module in coming years."
Learning effectiveness: The authors note that "poor questions came from students who had a lack of understanding of the technical material...These students needed additional coaching so that they could master the subject." In other words, the ALO production process was itself an effective form of formative assessment: it identified which students needed coaching. In addition, a number of students reported that the task was a "useful educational exercise" and that they gained a "significant amount of understanding of the purpose and structure of multiple-choice questions" by performing it.

Student satisfaction: Student feedback indicated that the students found the task interesting and different, and did not consider it a burden.

Cost effectiveness: Upgrading student-generated source material into finished questions required less than half the time needed for faculty to produce such questions from scratch.
Faculty satisfaction: Producing finished questions from student-generated source material eliminated the "boredom factor" by removing the burden of having to produce numerous question variants on the same topic.
Adams, A., & Williams, S. (2006). Customer driven development for rapid production of assessment learning objects. The Electronic Journal of e-Learning, 4(1), 1-6. Available at www.ejel.org