Studying Student Experience Online

ericwerth

Reading responses to the https://community.canvaslms.com/polls/1421-which-do-you-prefer-online-vs-on-campus?sr=search&searchI... poll has been very informative and reminded me of a study I was involved in a few years ago.  I thought there might be those in the Community interested in what we examined in relation to student choice and experience in online classes.  At the time a few colleagues and I presented this at a regional conference and intended to publish after gathering more data to strengthen the statistical tests, but other priorities arose and the results were not distributed as widely as we had hoped.

Background:  I was working at a liberal arts university that began to develop a series of online degrees.  Many courses within the general education core that were being offered face-to-face were also developed online.  Online courses ran 8 weeks, while the on-ground campus used 15-week semesters.

Online course sections were normally capped at 25 students, and the decision was made to allow on-campus students to use unused “seats,” believing this would help those with scheduling issues and others, like athletes and music majors, with heavy travel schedules.  As a result, online GE courses normally enrolled a mix of online and on-campus students.

We wanted to study this dynamic to determine why on-campus students chose to take an online course, to understand the experience of students in online sections, and to explore whether grades in online sections were higher or lower than in the same courses delivered on-ground.

The study: In the fall of 2014 and spring of 2015, surveys were given to students in (mostly) 100- and 200-level GE courses where both an online and an on-ground section were offered.  These included art and music, public speaking, math, English, biology, philosophy, and religious studies courses.  Surveys included a section where students were asked why they chose an online option (if they were an on-campus student) and various questions about how well the course met certain quality criteria and their expectations entering the course.  Responses were generally indicated on a 5-point Likert scale where 1 represented “very displeased” with that element of the course, 3 was “neutral,” and 5 “very pleased.”  Grades were compared in fall 2014 for matched online and face-to-face sections using data provided by the Registrar.

Results: Between the fall and spring terms, we received 154 survey responses (a 45% response rate).  Of these, 32% were from males and 67% from females.  Forty percent of the surveys were from students taking the online sections and 60% from students taking on-ground classes.  Of those taking an online section, 77% were primarily on-campus students and 23% were fully online students.  While online classes included students in all years of their programs, the majority were either freshmen or seniors.

Primarily on-ground students who chose online sections were able to indicate why they chose an online option from a list or add their own response, and they could select as many choices as applied.  Respondents indicated that the largest motivations for taking the online format over a face-to-face alternative were time convenience (67%) and scheduling conflicts with other classes (57%).  The perception that the online course would be easier was the third most selected option (33%), followed by the desire to try an online class (27%).

To examine the perceptions of students in online and F2F sections, we compared student responses to the Likert-scale questions.  The following table depicts these results.

Table 1 - Comparison of online and F2F student responses

| Course element | Online | Face-to-Face |
| --- | --- | --- |
| Ability of course to meet stated objectives | 3.90 | 3.74 |
| Feedback from instructor | 4.00 | 3.73 |
| Community building with classmates | 3.62 | 3.50 |
| Spiritual reflection & growth | 3.81 | 3.56 |

As can be seen, students in the online classes actually rated courses higher than students in the F2F sections.  Feedback from the instructor was the only difference found to be statistically significant (p = 0.009), and it had a small-to-medium effect size of 0.227.
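
For anyone curious how such a comparison might be run, below is a minimal Python sketch using an independent-samples (Welch's) t-test and Cohen's d. This is just one common way to obtain a p-value and effect size like those above; the exact procedure we used isn't detailed here, and the ratings shown are illustrative placeholders rather than our data.

```python
# Minimal sketch: compare Likert ratings between two sections and report
# a p-value plus Cohen's d as an effect size. Placeholder data only.
import numpy as np
from scipy import stats

def compare_ratings(group_a, group_b):
    """Welch's t-test p-value and Cohen's d for two groups of ratings."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    _, p_value = stats.ttest_ind(a, b, equal_var=False)
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    cohens_d = (a.mean() - b.mean()) / pooled_sd
    return p_value, cohens_d

# Illustrative 1-5 Likert ratings for one survey item (not study data)
online = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]
f2f = [3, 4, 4, 3, 4, 3, 5, 4, 3, 4]
p, d = compare_ratings(online, f2f)
print(f"p = {p:.3f}, Cohen's d = {d:.3f}")
```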

In the survey, we also asked students about their expectation of the quality of the online course before enrolling, their overall experience in the course, and their confidence in their ability to succeed online prior to starting class.  These responses were correlated with the responses listed in the table above.  Results of these correlations are below.

Table 2 - Correlation of overall student feedback and specific course elements

| Course element | Expectation for the quality of the course | Overall experience in the course | Confidence in the ability to succeed online |
| --- | --- | --- | --- |
| Ability of course to meet objectives | 0.186 | 0.487* | 0.346* |
| Feedback from instructor | 0.023 | 0.519* | 0.469* |
| Community building with classmates | 0.157 | 0.517* | 0.463* |
| Spiritual reflection and growth | 0.053 | 0.384* | 0.302* |

*Significant at p < .05

Although a number of these findings were statistically significant, the r-squared values were only moderate.
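
As a rough illustration of how a correlation table like the one above can be produced, the snippet below computes a Pearson correlation, its p-value, and the corresponding r-squared for a pair of survey items. Our original analysis is not reproduced here, and the responses shown are placeholders only.

```python
# Minimal sketch: Pearson correlation between an overall rating and a
# specific course element, plus r-squared. Placeholder data only.
from scipy import stats

def correlate(overall, element):
    """Return Pearson r, its p-value, and r-squared."""
    r, p_value = stats.pearsonr(overall, element)
    return r, p_value, r ** 2

# Illustrative 1-5 Likert responses for two survey items (not study data)
overall_experience = [4, 5, 3, 4, 4, 2, 5, 3, 4, 5]
instructor_feedback = [4, 5, 3, 4, 3, 2, 5, 4, 4, 4]
r, p, r_squared = correlate(overall_experience, instructor_feedback)
print(f"r = {r:.3f}, p = {p:.3f}, r^2 = {r_squared:.3f}")
```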

We attempted to determine whether student experience varied by class standing or by prior experience with online classes.  Although our sample size for this data was too low to draw any concrete conclusions, initial results did not suggest that either of these variables affected students' self-reported class experience.

Finally, after analyzing class grades in the fall, we found high similarity in student achievement across delivery modes.  While some variance existed, students in online and F2F classes normally received approximately the same proportions of A's or B's, C's, and below-C grades.  Results were similar enough that we chose to stop tracking this data after the fall term and only look at the percentage of students who passed or failed.
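
For anyone wanting to compare grade distributions the same way, one straightforward option (not necessarily what we used) is a chi-square test of independence on grade-band counts, sketched below with placeholder numbers rather than our actual grade data.

```python
# Minimal sketch: test whether grade-band proportions differ between an
# online and a face-to-face section. Placeholder counts only.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: sections; columns: counts of A/B, C, and below-C grades
counts = np.array([
    [40, 12, 8],   # online section (illustrative)
    [45, 15, 10],  # face-to-face section (illustrative)
])
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```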

Conclusions:

  • Online classes were of benefit to on-campus students who had scheduling conflicts, but they were not utilized by athletes as heavily as we anticipated (we had expected many student athletes to register for these classes).
  • Students need to be advised effectively to ensure they understand the rigors of online classes and don't overload themselves, particularly when those classes run on an accelerated schedule.
  • Although not a large focus of this study, the grade analysis aligned with published studies suggesting that when classes are built upon a strong pedagogical model, differences in student performance may relate more to factors such as pre-existing academic strength and motivation, and less to the course modality itself. We found no reason to believe that grades were higher in online classes or that students struggled due to the self-directed nature of online learning.
  • The fact that student self-reported satisfaction in several important areas was higher in online sections is likely due to the intentional inclusion of these elements during the instructional design process. During course development, faculty building online classes worked closely with an instructional designer to ensure alignment between course objectives, content, and student assessment.  A focus of course design was also student engagement and interaction with the instructor.
  • Correlations between student confidence in their ability to succeed online and the four elements of course satisfaction studied suggest that efforts to boost student confidence, either through advising or an orientation course, may be beneficial, although no causal relationship can be inferred from the correlations alone.
  • Similarly, correlation between the aspects of student satisfaction and overall experience in the class suggests a value in both developing a pedagogically sound course and training faculty for effective course facilitation.

The evaluation process led to a number of other benefits around campus.  One example was a reevaluation of our 100-level course offerings.  When we dug deeper into why a large number of seniors were taking classes normally associated with freshman standing, we learned that we were not offering enough sections of these courses for first-year students, so a number of them had to put these classes off until later in their educational careers.

I would be interested in whether anyone else here has conducted a similar study, either formally or anecdotally.  Is your experience similar to what we found a few years ago or different?

Best wishes!
