Conference News

Join keynote speaker Goldie Blumenstyk for a signing of her book American Higher Education in Crisis?

Books are available for pre-purchase for $16.95 (+tax).

Conference Program now posted! This year's line-up includes:


OLC Excellence and Effective Practice Award Recipients Announced


Add/remove sessions from the Program Listing on the website or in the mobile app to create a list of sessions you want to attend!

My Schedule

Join Keynoters Goldie Blumenstyk (The Chronicle of Higher Education) and Phil Hill and Michael Feldstein (MindWires Consulting)

Bring your own device (BYOD) to learn, explore, and share knowledge within this lab environment

Test Kitchen

Save the Dates

22nd Annual OLC International Conference
November 16-18, 2016 | Orlando, Florida | Walt Disney World Swan/Dolphin Resort

OLC Innovate 2016 - Innovations in Blended and Online Learning
April 20-22, 2016 | New Orleans, LA | Sheraton New Orleans Hotel

Enhancing Student Learning and Engagement in Online Courses by Using Learning Analytics Data

Florence Martin (University of North Carolina at Charlotte, USA)
Patti Wilkins (University of North Carolina at Charlotte, USA)
Session Information
October 14, 2015 - 12:45pm
Track: Learning Effectiveness
Major Emphasis of Presentation: Research Study, Practical Application, Theory/Conceptual Framework, Blended Program/Degree
Institutional Level: Multiple Levels
Audience Level:
Session Type: Information Session
Location: Europe 3
Session Duration: 45 Minutes
Concurrent Session 2

Student learning and engagement patterns are analyzed, and heuristics are provided for instructors, based on learning analytics data from two Quality Matters-certified online courses.

Extended Abstract

In the fall of 2012, 6.7 million students were reported to be enrolled in at least one online course in higher education (Allen & Seaman, 2013), reflecting massive growth in the number and percentage of university students taking online courses. As the number of online courses increases, so do concerns about student learning and engagement (Nistor & Neubauer, 2010; Patterson & McFadden, 2009). Learning analytics is a comparatively new approach, and a few studies have used it to measure online learning and engagement (Macfadyen & Dawson, 2010; Fritz, 2011). Fournier, Kop, and Sitlia (2011) define learning analytics as "the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (p. 3). Learning analytics is also defined as the "interpretation of a wide range of data produced by and gathered on behalf of students in order to assess academic progress, predict future performance, and spot potential issues" (Johnson et al., 2011, p. 28). The introduction of learning analytics techniques into education research now enables the analysis of student learning and engagement in online courses based on data from the learning management system (LMS).

Macfadyen and Dawson (2010) mined data from the LMS and studied the relationship between student LMS use (e.g., posting discussion messages, completing quizzes) and academic achievement. Their study used logistic modeling to predict failing students, and they noted that "pedagogically meaningful information can be extracted from LMS-generated student tracking" (p. 1). Fritz (2011) used the Check My Activity tool to study the relationship between student performance and activity in the LMS and found that students earning a D or F used the LMS 39% less than students earning a grade of C or higher. Arnold and Pistilli (2012) used an application called Signals, which was developed to give instructors the opportunity to use the power of learning analytics to intervene and provide feedback to students who were not doing well in their courses.
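The kind of logistic modeling Macfadyen and Dawson describe can be sketched as follows. This is an illustrative example only, not the authors' code: a minimal logistic regression trained by gradient descent that predicts pass/fail from per-student LMS activity counts. All feature names and data values are hypothetical.

```python
# Hypothetical sketch of LMS-based at-risk prediction via logistic regression.
# Not the authors' implementation; data and feature names are invented.
import math

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train_logistic(X, y, lr=0.01, epochs=2000):
    """Fit weights and bias by stochastic gradient descent on log-loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi)))
            err = p - yi  # gradient of log-loss with respect to the logit
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

def predict_proba(w, b, x):
    """Probability that a student with activity profile x passes."""
    return sigmoid(b + sum(wj * xj for wj, xj in zip(w, x)))

# Hypothetical per-student counts: [forum posts, quizzes completed, logins]
X = [[42, 10, 85], [3, 2, 12], [28, 9, 60], [5, 1, 15], [35, 8, 70], [1, 0, 8]]
y = [1, 0, 1, 0, 1, 0]  # 1 = passed, 0 = failed (hypothetical outcomes)

w, b = train_logistic(X, y)
print(f"P(pass) for a low-activity student: {predict_proba(w, b, [2, 1, 10]):.2f}")
```

A model like this supports the intervention scenario described for Signals: students whose predicted probability of passing falls below a chosen threshold can be flagged for instructor feedback.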

Quality Matters
This study used the Quality Matters (QM) standards as the pedagogical framework to guide the analysis of design elements in the courses. The pedagogical characteristics include:
- course overview & introduction
- learning objectives
- assessment & measurement
- instructional materials
- course activities & learner interaction
- course technology
- learner support
- accessibility & usability

Purpose of this Study
The central objective of this project was to use learning analytics to enhance student learning and engagement in online courses. This was accomplished by analyzing the pedagogical characteristics of the courses using the Quality Matters rubric and the learning analytics generated by the Moodle LMS in two courses in the 100% online master's program in Instructional Systems Technology (IST) at a southeastern university in the United States.

Research Questions
This study answers two research questions:
1. In what ways do learning analytics data on the pedagogical characteristics of online courses impact student learning and engagement?
2. In what ways might this learning analytics data collection and analysis contribute to the creation of a heuristic for instructors using Moodle or another learning management system in online courses to guide the improvement of online instructional delivery?

Research Design
We used an observational research design, in which participants are studied without experimental control over their assignment. The data analyzed from Moodle were both quantitative and qualitative in nature, so a mixed methods research design, as outlined by Creswell (2012) and Miles and Huberman (1994), was employed.

Tools used

The LMS log files were exported and then analyzed using a general-purpose data visualization tool. The tool provided a means for analyzing and visualizing quantitative data on site usage, time spent, forum posts, and the number of files attached to them; the amount of text generated in these posts was also quantified. The tool also allows grades, student interaction, and collaboration to be aggregated. In this way, we were able to create profiles of student pedagogical characteristics, understand how students interact in the context of a Moodle course, and thereby illuminate issues related to student success as it relates to course design.
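The aggregation step, turning an exported log into per-student engagement profiles, can be sketched in a few lines. This is an assumed illustration, not the authors' tool: the column names ("user", "event", "component") and sample rows are hypothetical, and real Moodle log exports vary by version.

```python
# Hypothetical sketch: aggregating a Moodle-style activity log export into
# per-student engagement profiles. Column names and data are invented.
import csv
import io
from collections import Counter, defaultdict

SAMPLE_LOG = """\
user,event,component
alice,viewed,course
alice,posted,forum
alice,submitted,quiz
bob,viewed,course
bob,viewed,forum
alice,posted,forum
"""

def build_profiles(log_file):
    """Count logged events per student, broken down by LMS component."""
    profiles = defaultdict(Counter)
    for row in csv.DictReader(log_file):
        profiles[row["user"]][row["component"]] += 1
    return profiles

profiles = build_profiles(io.StringIO(SAMPLE_LOG))
for student, counts in sorted(profiles.items()):
    print(student, sum(counts.values()), dict(counts))
```

Profiles like these (total activity, forum participation, quiz submissions) are the raw material that a visualization tool can then chart against grades and course design characteristics.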

Results and Discussion
Visualizations of the data and a discussion of the findings will be presented during the session.
The overarching goal of this project is to improve online teaching and learning by exploring pedagogical factors that contribute to student success (learning and engagement) in online learning. Instructors may use this knowledge in online teaching to design effective online courses. Instructional designers may use this knowledge to recommend best practices on online course design. Administrators may use these results to design successful online programs. Bringing together these points of view will help improve online teaching and learning.

References (not included in the word count)
Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. Proceedings of the 2nd International Conference on Learning Analytics & Knowledge. New York: ACM.
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group and the Sloan Consortium.
Fritz, J. (2011). Classroom walls that talk: Using online course activity data of successful students to raise self-awareness of underperforming peers. The Internet and Higher Education, 14(2), 89-97.
Fournier, H., Kop, R., & Sitlia, H. (2011). The value of learning analytics to networked learning on a personal learning environment. Proceedings of the 1st International Conference on Learning Analytics and Knowledge. Banff, AB.
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon Report. Austin, TX: The New Media Consortium.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588-599.
Nistor, N., & Neubauer, K. (2010). From participation to dropout: Quantitative participation patterns in online university courses. Computers & Education, 55(2), 663-672.
Patterson, B., & McFadden, C. (2009). Attrition in online and campus degree programs. Online Journal of Distance Learning Administration, 12(2).

Lead Presenter

Florence Martin is an Associate Professor in the Instructional Systems Technology program at the University of North Carolina at Charlotte. She received her doctorate and master's degrees in Educational Technology from Arizona State University and a bachelor's degree in Electronics and Communication Engineering from Bharathiar University, India. Prior to her current position, she taught at the University of North Carolina Wilmington for seven years. She has also worked on instructional design projects for Shoolini University, Viridis Learning, Maricopa Community College, University of Phoenix, Intel, Cisco Learning Institute, and Arizona State University, and served as a co-principal investigator on the Digital Visual Literacy NSF grant with the Maricopa Community College District in Arizona. Her research focuses on designing and integrating online learning environments.