Using Text Analytics to Enhance Data-Driven Decision Making

Author Information
Liz Wallace
Melissa Burgess
Phil Ice
Institution(s) or Organization(s) Where EP Occurred: 
American Public University System
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

American Public University System (APUS) developed the Text Analytics for Surveys Model to analyze qualitative nontraditional-student data related to retention and progression. Using this model, APUS was able to analyze large amounts of qualitative data effectively and efficiently.


Description of the Effective Practice
Description of the Effective Practice: 

Recognizing the importance of postsecondary student retention and progression in online environments, the American Public University System (APUS) routinely examines student retention and progression through a tripartite methodological lens: (1) descriptive, (2) inferential, and (3) exploratory analyses. APUS also collects volumes of explanatory (qualitative) data that cannot be analyzed efficiently using traditional qualitative methodologies, given the exceedingly high volume of student replies (over 5,000 per month). At best, random sampling provides a cursory overview that is little more than anecdotal in nature.

Toward addressing this issue, APUS developed the Text Analytics for Surveys Model, an innovative approach to assessing qualitative data related to nontraditional student retention and progression. The Model is framed by the Community of Inquiry (CoI) Framework (Garrison, Anderson, & Archer, 2000): keywords and phrases related to subcategories of the CoI presences (social, teaching, and cognitive) are identified and entered into the Model. The CoI Framework is a process model of learning in online and blended educational environments. It is grounded in a collaborative constructivist view of higher education (Dewey, 1933) that assumes effective online learning requires the development of a community of learners to support meaningful inquiry and deep learning.
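The keyword-and-phrase categorization described above can be sketched as follows. This is a minimal illustration only: the keyword lists here are hypothetical placeholders, not APUS's actual libraries (which were built by trained qualitative coders), and the real IBM SPSS Text Analytics for Surveys product applies far richer linguistic rules than simple substring matching.

```python
# Minimal sketch of CoI-based categorization of survey responses.
# The keyword lists below are illustrative assumptions, NOT the
# libraries APUS built for its Model.
COI_KEYWORDS = {
    "social": ["classmates", "discussion", "community", "peers"],
    "teaching": ["instructor", "feedback", "syllabus", "grading"],
    "cognitive": ["understand", "apply", "critical thinking", "reflect"],
}

def categorize(response: str) -> set:
    """Return the CoI presences whose keywords appear in a response."""
    text = response.lower()
    return {presence for presence, words in COI_KEYWORDS.items()
            if any(word in text for word in words)}

responses = [
    "My instructor gave timely feedback on every assignment.",
    "The discussion boards helped me feel part of a community.",
]
for r in responses:
    print(sorted(categorize(r)))
```

In practice, each category would map to a subcategory of a CoI presence, so aggregate counts across thousands of responses can be triangulated with the quantitative survey results.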

Although libraries exist for text analytics, none are specific to the CoI Framework or to higher education. This implementation leverages existing business rules for text analytics against the CoI Framework. The end product is a rich, explanatory modeling system that lends clarity to APUS's wealth of quantitative analyses.


Supporting Information for this Effective Practice
Evidence of Effectiveness: 

Using qualitative end-of-course survey data based on the Community of Inquiry (CoI) Framework, the Text Analytics for Surveys Model was framed from the results of an initial pilot study (n = 428) and then further validated in the current study (n = 219). Results from the pilot study not only informed further refinement of the Model prior to the current study but also provided baseline criteria for data selection. Data from a two-month sample were collected and entered into the IBM SPSS Text Analytics for Surveys model. Results from the text analysis model were compared to the results of the same data analyzed using traditional qualitative coding methods. An accuracy of 81% was established, evidencing the Model's effectiveness and efficiency. As this technique moves toward full implementation, APUS is empowered to address causality through examination of triangulated quantitative and qualitative data.
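The validation step above amounts to computing simple agreement between the Model's assigned categories and those of a human coder. A sketch, using hypothetical labels rather than the actual APUS study data:

```python
# Sketch of the validation step: compare model-assigned categories
# against hand-coded categories and report percent agreement.
# Labels below are invented for illustration, not APUS study data.

def agreement(model_codes, manual_codes):
    """Fraction of responses where the model matches the human coder."""
    matches = sum(m == h for m, h in zip(model_codes, manual_codes))
    return matches / len(manual_codes)

manual = ["teaching", "social", "cognitive", "teaching", "social",
          "teaching", "cognitive", "social", "teaching", "cognitive"]
model  = ["teaching", "social", "cognitive", "teaching", "teaching",
          "teaching", "cognitive", "social", "social", "cognitive"]

print(f"Agreement: {agreement(model, manual):.0%}")  # prints "Agreement: 80%"
```

The reported 81% figure presumably reflects this kind of comparison at the scale of the full two-month sample; a production validation would also examine where the disagreements cluster (e.g., by presence or subcategory).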

Innovation: The Text Analytics for Surveys Model presents an innovative solution for effectively and efficiently analyzing large amounts of qualitative data related to student retention and progression. Although libraries exist for text analytics, none are specific to the Community of Inquiry Framework or to higher education.

Replicability: The practice can be replicated and implemented easily in a variety of learning environments.

Potential impact: The Text Analytics for Surveys Model is a straightforward means of analyzing retention- and progression-related student data based upon the CoI Framework, and is therefore likely to interest postsecondary institutions that wish to analyze their own qualitative data to inform and improve student retention and progression efforts.

Scope: The Text Analytics for Surveys Model demonstrates strong interrelationships with several other pillars, including learning effectiveness, scale, access, and faculty and student satisfaction.



How does this practice relate to pillars?: 

The Text Analytics for Surveys Model most closely relates to the Sloan-C student satisfaction, faculty satisfaction, and learning effectiveness pillars. The Model relates to student satisfaction because the APUS End-of-Course Survey gives students the opportunity to communicate, both quantitatively and qualitatively, their satisfaction with various components of the course. The Model relates to faculty satisfaction because the qualitative student responses from the End-of-Course Survey help pinpoint areas and course components of concern, so that instructors can modify their course design. From a learning effectiveness perspective, the addition of explanatory qualitative data helps isolate and remediate areas of pedagogical concern on a timely basis.


Equipment necessary to implement Effective Practice: 

SPSS Statistics and the SPSS Text Analytics for Surveys module


Estimate the probable costs associated with this practice: 

Although the SPSS software carries an upfront cost (approximately $10,000), once the keywords and phrases relating to the CoI are identified and entered into the system, no further modifications are needed. Skilled coders and qualitative researchers are also needed to build the libraries for analysis; an estimated 100 hours of preparatory work is required.


References, supporting documents: 

Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conferencing environment. Journal of Asynchronous Learning Networks, 5(2).

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133-136.

Council for Higher Education Accreditation. (2010). Recognition policy and procedures. Washington, DC: Council for Higher Education Accreditation.

Clark, (2011). Blog

Dewey, J. (1933). How we think. Boston: D.C. Heath.

Eaton, J. (2010). Accreditation and the federal future of higher education. Academe, 96(5), 21-24.

Garrison, D. R. (2009). Communities of inquiry in online learning: Social, teaching and cognitive presence. In C. Howard et al. (Eds.), Encyclopedia of distance and online learning (2nd ed., pp. 352-355). Hershey, PA: IGI Global.

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1).

Lowe, J. (1998). Jack Welch speaks. New York: Wiley.

Meacham, J. A., & Gaff, J. G. (2006). Learning goals in mission statements: Implications for educational leadership. Liberal Education, 92(1), 6-13.

Rozycki, E. G. (2004). Mission and vision in education. Educational Horizons, 82, 94-98.


Contact(s) for this Effective Practice
Effective Practice Contact: 
Phil Ice
Effective Practice Contact 2: 
Melissa Burgess
Effective Practice Contact 3: 
Liz Wallace