Using the Community of Inquiry Framework Survey for Multi-Level Institutional Evaluation and Continuous Quality Improvement

Award Winner: 
2009 Sloan-C Effective Practice Award
Author Information
Author(s): 
Phil Ice
Institution(s) or Organization(s) Where EP Occurred: 
American Public University System
Effective Practice Abstract/Summary
Abstract/Summary of Effective Practice: 

Though several models seek to explain the online learning process, the one that has gained the most attention is the Community of Inquiry Framework (CoI). Originally conceptualized by Garrison, Anderson and Archer (2000), the CoI is a process model that describes collaborative, constructivist online learning as a function of the intersection of three presences: teaching, social and cognitive. In 2007, the CoI was operationalized as a survey instrument and validated, on a multi-institutional basis, by eight researchers in the United States and Canada (Arbaugh et al., 2008). Following three semesters of beta testing, American Public University System (APUS) began utilizing the CoI survey, plus four additional items, as its end-of-course survey. Given the wealth of data the surveys would yield, two of the eight creators of the CoI survey (Phil Ice, Director of Course Design, Research and Development, American Public University System, and Jennifer Richardson, Associate Professor of Educational Technology, Purdue University) developed a set of protocols for multi-level data analysis for use at APUS. The process encompasses a broad range of analyses across Schools, Programs, Courses and Instructors. Additionally, survey data are used to assess the efficacy of instructional design trends, technology implementations and retention patterns. The process has yielded a wealth of data that informs continuous quality improvement across all levels at APUS.

Description of the Effective Practice
Description of the Effective Practice: 

At American Public University System, the CoI survey (plus four additional items) is used as the end-of-course survey. Once collected, survey data are aggregated and analyzed in the following ways:

1. Overall – Analysis of all surveys is conducted using descriptive statistics and principal axis factor analysis. Means and standard deviations provide an overview of general satisfaction with each of the survey items across the University. Factor analysis is used to ensure that overall conceptual alignment with the three-presence (teaching, social and cognitive) construct is being maintained across the University; in this process, attention is given not only to the strength of factor loadings but also to potential co-loadings on other factors. For both descriptive statistics and factor analysis, three runs are conducted. First, the data are analyzed on a standalone basis for the current semester to obtain a current-state view. Second, data from all semesters are pooled to determine whether significant changes have occurred in either the current-state descriptive statistics or factor solution as compared to all historical data. Third, a longitudinal comparison of descriptive statistics, using regression analysis, and of factor analyses is conducted to examine potential trends in semester-over-semester data. (A sketch of this per-level workflow appears immediately after this list.)

2. School – The procedure detailed for the Overall analysis is repeated for each of the Schools at APUS. Both descriptive statistics and factor analyses are compared within and across Schools, to determine how Schools compare to each other in terms of student satisfaction with each of the survey indicators and conceptual alignment with each of the three presences.

3. Programs – The procedure described for Schools is repeated for Programs. Comparisons of descriptive statistics and factor analyses are made in two ways: first, between Programs within the same School; second, between Programs across all Schools.

4. Courses – The procedure described for Programs is repeated for Courses. The increasing granularity of comparison is applied at this level, with Courses being compared to all others at the Overall, School and Program levels. While descriptive statistics are available for every semester that a Course is offered, the minimum n for adequacy in principal axis factor analysis (300) is often not achieved for Courses with a low number of sections. In such cases, Course data are aggregated over a number of semesters (typically three to six) until an adequate n is achieved, and factor analyses are compared for Courses across a similar period of time.

5. Instructors – The procedure described for Courses is repeated for Instructors, except that Instructor data are broken out in two fashions: first, all data for a given Instructor are aggregated regardless of the Course from which they were derived; second, data for the Instructor are analyzed by Course taught. As with Course-level analysis, descriptive data are available every semester; however, accumulating sufficient data for factor analysis is frequently a long-term proposition, with four to six semesters being the norm for aggregate Instructor data and six to eight semesters for Instructor-by-Course data.

6. Technology – Work by Ice and others (Ice, 2008a; Diaz, Richardson, Ice & Swan, 2009; Ice & Dom, 2009) has demonstrated that the CoI can be used to detect the effect of new technologies on the 34 CoI survey indicators. At APUS, when new technology implementations are made, courses in which beta implementations occurred are compared to all other courses at the Overall, School, Program and Course levels. Analysis consists of comparative descriptive statistics, comparison of the pattern and strength of loadings in factor analyses, and linear regression. For the linear regression, treatment and non-treatment groups are coded as dummy variables, allowing an analysis across all 34 variables that indicates whether the treatment had a significant impact on a given item and the amount of variance accounted for by the treatment (see the second sketch following this list). In instances where a deeper drilldown into cognitive outcomes is desired, student work products are assessed using a rubric that describes levels of engagement vis-à-vis the Cognitive Presence construct (Ice, 2008b).

7. Social Media – Integration of rich social media is a subset of Technology; however, its effect is more localized, as the impact is most dramatic on the Affective Expression component of Social Presence. One potential reason is that social presence is required, almost as a prerequisite, for the development of teaching and cognitive presence. Moreover, the Affective Expression component deals with participants' perceptions of their own impressions and projection of self in the online environment, a requirement for the other components of social presence. When this special class of social-media-oriented technologies is assessed, the measures used for the general Technology class above are implemented, with specific attention given to the strength of the loadings for the Affective Expression items.

8. Instructional Design – One problem noted in the general CoI literature is that, occasionally, a fourth factor related to instructional design and organization emerges when factor analysis is conducted. Staley and Ice (2009) found that when a course is designed by an instructor without the benefit of collaboration with an instructional design team, and then subsequently taught by another instructor, students frequently perceive a mismatch between the "voice" in which the content is presented and communications from the instructor. As such, at the Course level an examination of factor loadings is conducted to determine whether a fourth factor has emerged and whether this is a function of a Course having had inadequate instructional design support.

9. Epistemological Alignment – Work by Ice, Akyol and Garrison (2009) reveals that even when factor analysis of large sets of CoI survey data produces the expected three-factor pattern, hidden two-factor patterns may be embedded at lower levels as a product of objectivist versus constructivist content and assignment orientations. Special attention is given to detecting this pattern at the School, Program, Course and Instructor levels.

10. Retention – CoI surveys are sent to students using a unique link that allows a response to be correlated with other student data across the University. (Access to these data is available only to the data analysis team; under no circumstances are Deans, Program Chairs, Instructors or Staff permitted access to this information.) One of the primary measures of interest to the University is the impact of various factors on Retention. With enrollment status as the dependent variable, over 100 independent variables, including the 34 CoI indicators, are used to assess factors that influence Retention.
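
The per-level analysis in items 1–5 can be scripted; below is a minimal sketch in Python, assuming a hypothetical export of responses (coi_surveys.csv) with one row per completed survey, hypothetical indicator columns item_01 through item_34, and a grouping column such as school. The factor_analyzer package is used because it supports principal axis factoring with oblique rotation; the .32 co-loading cutoff is illustrative, not a threshold stated in this report.

```python
# A sketch of the per-level descriptive + principal axis factor analysis.
# File name, column names, and the co-loading cutoff are assumptions for
# illustration, not the actual APUS pipeline.
import pandas as pd
from factor_analyzer import FactorAnalyzer

ITEM_COLS = [f"item_{i:02d}" for i in range(1, 35)]  # 34 CoI indicators (hypothetical names)

def describe_and_factor(df: pd.DataFrame, n_factors: int = 3):
    """Per-item means/SDs plus a principal-axis factor solution with oblique rotation."""
    descriptives = df[ITEM_COLS].agg(["mean", "std"]).T
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="oblimin")
    fa.fit(df[ITEM_COLS].dropna())
    loadings = pd.DataFrame(fa.loadings_, index=ITEM_COLS,
                            columns=[f"factor_{k + 1}" for k in range(n_factors)])
    return descriptives, loadings

surveys = pd.read_csv("coi_surveys.csv")  # hypothetical survey export

# Overall run for the current semester.
overall_desc, overall_loadings = describe_and_factor(surveys)

# School-level runs, honoring the n = 300 adequacy threshold mentioned above.
for school, group in surveys.groupby("school"):
    if len(group) < 300:
        continue  # aggregate additional semesters before factoring
    desc, loadings = describe_and_factor(group)
    # Flag items whose second-strongest loading (a co-loading) is non-trivial.
    secondary = loadings.abs().apply(lambda row: row.nlargest(2).iloc[-1], axis=1)
    flagged = loadings[secondary > 0.32]
```

The same function can be reapplied at the Program, Course and Instructor levels by changing the groupby key, aggregating across semesters wherever a single term falls below the adequacy threshold.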
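
The dummy-variable regression in item 6 can be sketched similarly. The following assumes the same hypothetical survey export and a 0/1 column, here called "treatment", flagging sections in the beta implementation; one OLS model per indicator reports the significance of the treatment term and the variance (R-squared) it accounts for.

```python
# A sketch of the treatment vs. non-treatment comparison via dummy-coded OLS.
# The "treatment" column and file name are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm

ITEM_COLS = [f"item_{i:02d}" for i in range(1, 35)]  # hypothetical indicator columns
surveys = pd.read_csv("coi_surveys.csv")             # hypothetical survey export

rows = []
for item in ITEM_COLS:
    data = surveys[[item, "treatment"]].dropna()
    X = sm.add_constant(data["treatment"])  # 1 = beta implementation section, 0 = all others
    fit = sm.OLS(data[item], X).fit()
    rows.append({"item": item,
                 "coef": fit.params["treatment"],
                 "p_value": fit.pvalues["treatment"],
                 "r_squared": fit.rsquared})

effects = pd.DataFrame(rows).sort_values("p_value")
print(effects.head(10))  # indicators most affected by the implementation
```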

Supporting Information for this Effective Practice
Evidence of Effectiveness: 

A discussion of findings associated with each of the above categories at APUS follows:

1. Overall – Descriptive statistics revealed high overall student satisfaction with each of the 34 CoI indicators as well as the three additional quantitative items. Principal axis factor analysis produced the expected three-factor solution. The most recent analysis of year-to-date aggregated data (n = 43,932) can be viewed at: http://tinyurl.com/lfh4ll

2. School – Analysis of descriptive statistics revealed a significant difference between two Schools at APUS. One School had 23 indicators, across the three Presences, that were significantly lower than the means obtained in the Overall analysis. The other School had nine indicators (seven in Teaching Presence, one in Social Presence and one in Cognitive Presence) that were significantly higher than the Overall means. Factor analysis revealed a two-factor solution for the School with significantly lower means across 23 indicators; upon review, this solution was in alignment with the two-factor solution related to objectivist content orientation described by Ice, Akyol and Garrison (2009). For two other Schools, factor loadings in the .5 - .6 range were noted across the majority of Social Presence items, indicating that even though an intact construct existed, its cohesion was not optimal. Conversely, survey data from one School produced exceptionally strong loadings (.85 and above) for all Social Presence items. While analysis at this level was generalized in nature, it provided a guide to which related Programs, Courses and Instructors deserved closer scrutiny in lower-level analysis.

3. Program – Descriptive statistics revealed significant differences among Programs, with five Programs significantly below Overall means across six or more indicators and three Programs significantly above Overall means across the same parameters. With respect to factor analysis, the same five Programs with significantly lower means produced a two-factor solution, representative of an objectivist orientation. Six Programs displayed exceptionally strong factor loadings across Social Presence items, including the three Programs with significantly higher means across six or more indicators. While also guiding subsequent drilldowns, these findings were a catalyst for exploring both Programmatic weaknesses and criteria related to exemplary outcomes; these initiatives utilize standard mixed-methods techniques and are currently in process.

4. Course – Currently, the data obtained from administration of the surveys are not adequate for comprehensive analysis of all 1,536 Courses. To date, only 293 Courses have minimum sample-size adequacy for principal axis factor analysis; however, an adequate n has been obtained for meaningful comparison of descriptive statistics for 919 Courses. With respect to descriptive statistics, 121 Courses were identified as having significantly lower means than Overall means across six or more indicators, and 82 Courses as having significantly higher means across six or more indicators. The latter group of Courses is being examined for evidence of exemplary practices that may be implemented in other Courses, especially those with six or more means significantly below Overall means. With respect to factor analysis, the expected three-factor solution was present in 203 of the 293 Courses with minimum sample-size adequacy. Thirty-two Courses produced a two-factor solution in alignment with the objectivist content orientation described by Ice, Akyol and Garrison (2009). Thirty-seven Courses produced a four-factor solution, discussed in the section on Instructional Design below. The remaining 21 Courses produced four- and five-factor solutions with loadings that are not consistent with any previous CoI survey research; these Courses are currently being explored to acquire an understanding of the anomalous findings.

5. Instructors – Of 914 Instructors, an adequate n for principal axis factor analysis has been obtained for only 328; however, an adequate n has been obtained for meaningful comparison of descriptive statistics for 622 Instructors. With respect to descriptive statistics, 76 Instructors were identified as having significantly lower means than Overall means across six or more indicators, and 36 Instructors as having significantly higher means across six or more indicators. The latter group of Instructors is being evaluated for evidence of exemplary practices that may be adopted by other Instructors, especially those with six or more means significantly below Overall means. Though factor analysis is intended for use at the Course level or higher, exploratory use at the Instructor level at APUS has proved efficacious. Specifically, where different Instructors taught sections of the same Course, comparison of factor analyses has proven interesting: though only six such data sets currently exist, in two instances analysis of individual Instructor data revealed a two-factor solution for one and two Instructors in the respective cases, despite overall three-factor solutions at the Course level. This may indicate that even though a Course is constructivist in terms of content epistemological orientation, an Instructor's facilitation of discourse and direct instruction techniques can subvert that intent and lead to a perception of an objectivist orientation on the part of learners. At APUS, Instructor-level data analyses are used in two ways. First, as direct feedback to faculty on effectiveness, leading to self-directed changes in teaching approach and class design: CoI data are presented to the faculty member at the conclusion of each course for self-assessment, allowing the faculty member to gauge effectiveness in facilitation of discourse and direct instruction, modify instructional methods, and adjust the design for future sections of the class. Second, as institutional evaluation of the faculty member's overall effectiveness, leading to individual faculty and Program-level changes: each quarter the Program Director receives a summary report of the CoI scores for each Instructor, which is reviewed and integrated into the quarterly audits the Program Director conducts on the effectiveness of the Instructor's teaching. These reports are reviewed with the faculty member in a consultative dialogue where constructive feedback leads to changes in teaching style.

6. Technology – To date, the CoI has been used to detect the positive impact of three technological implementations at the University level. In the first instance, the use of embedded audio feedback produced significantly higher means across four Teaching Presence, one Social Presence and two Cognitive Presence indicators as compared to Overall means. In the second, the integration of rich online document creation and editing tools produced significantly higher means across one Teaching Presence, four Social Presence and two Cognitive Presence indicators; qualitative analysis, using the previously described rubric, confirmed student perceptions of cognitive gains. In the final instance, the implementation of a general class of social media applications (e.g., Facebook, Ning and a rich information-exchange student lounge developed in-house) produced significantly higher means across one Teaching Presence and three Social Presence indicators as compared to Overall means.

7. Social Media – Though embedded within the Technology research parameters, the analysis of Social Media is treated as a special class, as its impact can be assessed vis-à-vis a very specific effect on the Affective Expression dimension of Social Presence. Specifically, factor analysis reveals that where Social Media is used, the factor loadings for the three Affective Expression indicators are strengthened by at least .2. The assumption, given the CoI model, is that participants are better able to project themselves and realize the presence of others because the social media mode increases the available social presence cues.

8. Instructional Design – Though the Overall, School, Program, Course, Technology and Social Media analyses can all inform the instructional design process, the most direct impact was assessed by comparing descriptive statistics and factor patterns for: (1) Courses developed by an Instructor with the assistance of the Instructional Design Team and then taught by the same Instructor; (2) Courses developed by a third party with the assistance of the Instructional Design Team and then taught by another Instructor; (3) Courses developed by an Instructor without the assistance of the Instructional Design Team and then taught by that Instructor; and (4) Courses developed by a third party and taught by another Instructor. Levels of satisfaction across all 34 indicators were highest for category 1 and decreased in the order the categories are presented. In addition, for category 4 a fourth factor emerged: Teaching Presence bifurcated, with Instructional Design & Organization loading on one factor and Facilitation of Discourse and Direct Instruction loading on a separate factor (Staley & Ice, 2009). From an institutional perspective, this analysis revealed the importance of the instructional design process at all levels of implementation, but especially for instances in which pre-designed Courses were taught by adjuncts. From a research perspective, it has shed some light on a potential causal factor in the emergence of a fourth factor in CoI survey analyses.

9. Epistemological Orientation – Utilization of research related to epistemological orientations (Ice, Akyol & Garrison, 2009) has been embedded in the discussions above.

10. Retention – Though only six semesters' worth of data are available, initial findings indicate that establishment of adequate Social Presence may have a dramatic impact on Retention. Specifically, in the most recent regression analysis, only five factors out of more than 100 were significant predictors of Retention, together accounting for 22% of the variance. Of those, two were indicators of the Affective Expression component of Social Presence: "I was able to form distinct impressions of some course participants" and "Online or web-based communication is an excellent medium for social interaction." The other significant predictors were the grade obtained in the last course and affiliation with two specific Programs; these latter factors accounted for only 2% of the variance, while the two Affective Expression indicators accounted for 20% of the variance in Retention. This finding may be due in part to participants' recognition that, although online, they are not isolated; isolation in online learning has been shown in previous research to be a major barrier to students' learning and retention in online courses. (A sketch of this regression follows.)
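
The retention model lends itself to a similar sketch. The report describes a regression with enrollment status as the dependent variable and over 100 predictors; because the outcome is binary, the sketch below uses logistic regression with McFadden's pseudo R-squared as a rough analogue of variance explained. The merged file, column names and predictor subset are hypothetical, and the original analysis may have used a different specification.

```python
# A sketch of the retention regression. File and column names are hypothetical;
# the actual APUS model includes over 100 predictors and may differ in form.
import pandas as pd
import statsmodels.api as sm

ITEM_COLS = [f"item_{i:02d}" for i in range(1, 35)]  # hypothetical CoI indicator columns
data = pd.read_csv("retention_merged.csv")  # survey responses joined to enrollment records

predictors = ITEM_COLS + ["last_course_grade", "program_a", "program_b"]  # illustrative subset
X = sm.add_constant(data[predictors].dropna())
y = data.loc[X.index, "re_enrolled"]  # hypothetical 0/1 re-enrollment outcome

fit = sm.Logit(y, X).fit(disp=0)
print(f"pseudo R-squared: {fit.prsquared:.3f}")       # analogue of variance explained
print(fit.pvalues[fit.pvalues < 0.05].sort_values())  # significant predictors
```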

How does this practice relate to pillars?: 

Student Satisfaction – As the CoI survey is based on student perceptions of the effectiveness of 34 indicators across Teaching, Social and Cognitive Presence, this practice allows for assessment of critical processes in online learning vis-à-vis student satisfaction. By drilling down to a granular level, the processes described provide an exceedingly rich body of data that is utilized for continuous improvement across all levels of the University, including faculty development, instructional design, and technology testing and implementation.

Learning Effectiveness – As the highest levels of thought are achieved in a collaborative, constructivist environment, ensuring adequate projection of each of the 34 CoI indicators is considered crucial to optimizing cognitive outcomes. In this endeavor, a great deal of attention is given to descriptive statistics and related regression analyses; however, factor pattern analysis is also considered crucial in that it demonstrates whether proper paradigmatic alignment has occurred at each of the levels described.

Faculty Satisfaction – Being an online instructor differs significantly from teaching in the face-to-face environment. Though many reviewable best practices exist and some degree of training is available at most online institutions, role adjustment issues, generalized anxiety and uncertainty regarding expectations are well documented in the literature. Providing faculty with the type of detailed data described above allows for enhanced reflection, the ability to assess personal practice, and an opportunity to engage in targeted improvement of praxis.

Equipment necessary to implement Effective Practice: 

1. Statistical analysis software, such as SPSS
2. The CoI survey instrument

Estimate the probable costs associated with this practice: 

The CoI survey instrument is free for use and available at www.communitiesofinquiry.com (note: attribution is expected for research published using the instrument). Statistical analysis software, such as SPSS, is required; in the overwhelming majority of instances, institutions already possess such software, and where they do not, the cost is less than $1,000. Continuously collecting and analyzing the data requires a serious institutional commitment. Depending on the degree of integration with other institutional systems, a part-time or full-time staff position (or combined equivalent) will likely be required for survey deployment, database administration and data aggregation. Depending on the size of the data sets, a part-time commitment from a statistician or data analyst will also be required.

References, supporting documents: 

Arbaugh, J., Cleveland-Innes, M., Diaz, S., Garrison, R., Ice, P., Richardson, J., & Swan, K. (2008). Testing a measure of the Community of Inquiry Framework using a multi-institutional sample. The Internet and Higher Education, 11(3-4), 133-136.

Other Comments: 

More cited resources:

Ice, P. (2008a, April). The impact of asynchronous audio feedback on teaching, social and cognitive presence. Paper presented at the First International Conference of the Canadian Network for Innovation in Education, Banff, Alberta.

Ice, P., Akyol, Z., & Garrison, R. (2009, August). The relationship between instructor socio-epistemological orientations and student satisfaction with indicators of the Community of Inquiry Framework. Paper presented at the 7th Annual Hawaii International Conference on Education, Honolulu, HI.

Ice, P., & Dom, J. (2009, June). Using the CoI to assess the efficacy of new technologies. Paper presented at the Second Annual Sloan-C International Symposium: Emerging Technology Applications for Online Learning, San Francisco, CA.

Staley, J., & Ice, P. (2009, August). Instructional design project management 2.0: A model of development and practice. Paper presented at the 25th Annual Conference on Distance Teaching & Learning, Madison, WI.