Studying Student Experience Online

ericwerth
Community Explorer

Reading responses to the https://community.canvaslms.com/polls/1421-which-do-you-prefer-online-vs-on-campus?sr=search&searchI... poll has been very informative and reminded me of a study I was involved in a few years ago.  I thought some in the Community might be interested in what we were examining in relation to student choice and experience in online classes.  At the time a few colleagues and I presented this at a regional conference and intended to publish after gathering more data to strengthen the statistical tests, but other priorities arose, so the results were not distributed as widely as we had hoped.

Background:  I was working at a liberal arts university that began to develop a series of online degrees.  Many courses within the general education core that were being offered face-to-face were also developed online.  Online courses ran eight weeks, while the on-ground campus operated on 15-week semesters.

Online course sections were normally capped at 25 students and the decision was made to allow on-campus students to utilize unused “seats”, believing this would help those with scheduling issues and others like athletes and music majors with heavy travel schedules.  As a result, online GE courses normally had enrollments including both online and on-campus students.

We wanted to study this dynamic to determine why on-campus students chose to take an online course, to understand the experience of students in online sections, and to explore whether grades in online sections were higher or lower than in the same courses delivered on-ground.

The study: In the fall of 2014 and spring of 2015, surveys were given to students in 100- (mostly) and 200-level GE courses where both an online and an on-ground section were offered.  This included an art and music, public speaking, math, English, biology, philosophy, and religious studies course.  Surveys included a section asking students why they chose an online option (if they were an on-campus student) and various questions about how well the course met certain quality criteria and their expectations entering the course.  Responses were generally indicated on a 5-point Likert scale, where 1 represented "very displeased" with that element of the course, 3 was "neutral", and 5 "very pleased". Grades were compared in fall 2014 for matched online and face-to-face sections using data provided by the Registrar.

Results: Between the fall and spring terms, we received 154 survey responses (a 45% response rate).  Of these, 32% were from males and 67% from females.  Forty percent of the surveys came from students taking the online sections and 60% from students taking on-ground classes.  Of those taking an online section, 77% were primarily on-campus students and 23% fully online.  While online classes included students at all stages of their programs, the majority were either freshmen or seniors.

Primarily on-ground students who chose online sections could indicate why they chose an online option from a list or add their own response, selecting as many choices as applied.  Respondents indicated that the largest motivations for taking the online format over a face-to-face alternative were time convenience (67%) and scheduling conflicts with other classes (57%). The perception that the online course would be easier was the third most selected option (33%), followed by the desire to try an online class (27%).

To examine the perception of students in online and F2F sections, we compared student responses to the Likert scale questions.  The following table depicts these results.

Table 1- Comparison of online and F2F student responses

Course element                                 Online   Face-to-Face
Ability of course to meet stated objectives    3.90     3.74
Feedback from instructor                       4.00     3.73
Community building with classmates             3.62     3.50
Spiritual reflection & growth                  3.81     3.56

As can be seen, students in the online classes actually rated courses higher than those in F2F sections.  Feedback from instructor was the only difference found to be statistically significant (p = 0.009), and even there the effect size was small to medium (0.227).
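The post does not say how the effect size was computed, but assuming it was something like a Cohen's d on the Likert responses (a common choice for comparing two group means), a minimal sketch with made-up response data would look like:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means scaled by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical 5-point Likert responses for one survey item (not real study data)
online = [4, 5, 4, 3, 4, 5, 4]
f2f = [3, 4, 3, 4, 3, 4, 3]
d = cohens_d(online, f2f)
```

By the usual rule of thumb, d around 0.2 is "small" and 0.5 "medium", which matches the "small to medium" reading of 0.227 above.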

In the survey, we also asked students about their expectation of the quality of the online course before enrolling, their overall experience in the course, and their confidence in their ability to succeed online prior to starting class.  These responses were correlated with the responses listed in the table above.  Results of these correlations are below.

Table 2- Correlation of overall student feedback and specific course elements

Course element                         Expectation for    Overall experience   Confidence in ability
                                       course quality     in the course        to succeed online
Ability of course to meet objectives   0.186              0.487*               0.346*
Feedback from instructor               0.023              0.519*               0.469*
Community building with classmates     0.157              0.517*               0.463*
Spiritual reflection and growth        0.053              0.384*               0.302*

*Significant at p < .05

Although a number of these findings were statistically significant, the corresponding r-squared values were only moderate, meaning each course element explained a limited share of the variance in the overall measures.
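To make the "moderate r-squared" point concrete: squaring a correlation coefficient gives the share of variance one variable explains in the other, so even the strongest correlation in Table 2 (0.519) explains only about 27% of the variance. A minimal sketch of the underlying calculation, using hypothetical paired ratings rather than the study's data:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired Likert ratings: instructor feedback vs. overall experience
feedback = [4, 5, 3, 4, 5, 2, 4]
overall = [4, 4, 3, 5, 5, 3, 4]
r = pearson_r(feedback, overall)
r_squared = r ** 2  # share of variance explained
```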

We also attempted to determine whether student experience varied by class standing or prior experience with online classes.  Although our sample size for this data was too low to draw any concrete conclusions, initial results did not show either variable impacting students' self-reported class experience.

Finally, after analyzing class grades in the fall, we found that student achievement was highly similar across delivery modes.  While some variance existed, students in online and F2F classes normally received approximately the same proportions of A's and B's, C's, and below-C grades.  Results were similar enough that we chose to stop tracking this data after the fall term and only look at the percentage of students who passed or failed.

Conclusions:

  • Online classes were of benefit to on-campus students who had scheduling conflicts, but were not utilized by athletes as heavily as we anticipated.
  • Students need to be advised effectively to ensure they understand the rigors of online classes and don’t overload their schedules, particularly if these classes use accelerated schedules.
  • Although not a large focus of this study, the grade evaluation aligned with published studies suggesting that when classes are built on a strong pedagogical model, differences in student performance may relate more to factors such as pre-existing academic strength and motivation, and less to course modality itself. We found no reason to believe that grades were higher in online classes or that students struggled due to the self-directed nature of online learning.
  • The fact that student self-reported satisfaction in several important areas was higher in online sections is likely due to the intentional inclusion of these elements during the instructional design process. In the course development process, faculty building online classes worked closely with an instructional designer to ensure alignment between course objectives, content, and student assessment.  Course design also focused on student engagement and interaction with the instructor.
  • Correlations between student confidence in their ability to succeed online and the four elements of course satisfaction studied suggest that efforts to boost student confidence, whether through advising or an orientation course, may be beneficial, although no causal relationship can be inferred from the correlations alone.
  • Similarly, correlation between the aspects of student satisfaction and overall experience in the class suggests a value in both developing a pedagogically sound course and training faculty for effective course facilitation.

The evaluation process led to a number of other benefits around campus.  One example was a reevaluation of our 100-level course offerings.  When we dug deeper into why so many seniors were taking classes normally associated with freshman standing, we learned that we were not offering enough sections of these courses for first-year students, so a number of them had to put them off until later in their programs.

I would be interested in whether anyone else here has conducted a similar study, either formally or anecdotally.  Is your experience similar to what we found a few years ago or different?

Best wishes!

2 Comments
waaaseee
Community Contributor

Your survey has reinforced my assumption that online is preferred over F2F since it offers better scheduling opportunities, and the fact that on-campus and online students basically score the same is reinvigorating (for me at least).

Interestingly, your results did teach me two new things.

The first is that students rated online higher in the 'spiritual/growth/community-building' section, which is a bit counter-intuitive for me. But I realize that the world has changed in the past half a decade and online communication is really a default now, perhaps to the extent that online communication is now just 'communication'. It would be interesting to contrast this survey with a similar one conducted perhaps 5/10 years ago. ericwerth‌ could you expand on the actual questions a bit?

The second is that instructor feedback is rated higher online, which again is counter-intuitive to me. But in my work with researchers/professors I have observed that they have a lot running through their minds, and very few are "extroverts", so I guess feedback is easier to give and take in an online setting (without the fear of being judged).

ericwerth
Community Explorer

Good thoughts waaaseee.  A number of folks we talked to at the time felt those same results were counter-intuitive.  However, this is where I believe course design and faculty involvement play a huge role in student satisfaction.  Particularly for students who are primarily on-campus, I believe there can be a tendency to focus on academics in the classroom and assume that other forms of growth (social, spiritual, etc.) happen as part of the campus experience.  When developing our online classes we tried to plan specifically for student-student and student-faculty interaction because there were no groups specifically planning extracurricular, online student activities.  When it comes to community building, for example, what percent of students in a standard-sized on-campus class participate in classroom discussions?  Faculty can plan activities like "think-pair-share", polling, etc. to try and get everyone engaged, but often even great class discussions leave out a number of students who choose not to respond in class due to their personality or fear of being wrong.  In an online class, everyone is expected to participate and it is much harder to "hide" during discussions.

That being said, I am starting to think that part of engagement in online classes may relate to the students themselves, as you point out.  Later this week I am having a meeting to talk about how to create a Canvas calendar that all students can access which lists campus activities and other events.  This came directly from student requests, since many, according to them, "live" by the Canvas app.  So it may be that our online classes provided students with a semester's worth of deadlines and reminders in a format they valued and found easy to access, where the on-ground courses did not as much, and that this was reflected in our results as part of "community".

Anecdotally, I attribute some of the "feedback from instructor" results to be related to our expectations of instructors.  We asked faculty to respond to student questions within 24 hours even if it was simply to let the student know they received the message(s) and would provide some guidance in the near future.  We also asked faculty to return assignments within a week unless they needed more time to grade and in these situations to just communicate this to students. Too often I feel that students in ground classes don't receive quick feedback on assignments that they can use to improve their skills before being asked to turn in additional work.  However, I agree that instructors online may be able to give feedback more freely, and I certainly heard that many of our faculty were being more aware of how they communicated to students via assignment feedback since they weren't meeting F2F where verbal and non-verbal messages were also part of the conversation.

I am going to go back to the literature we cited and see if anything from 10-12 years ago looked at community building or similar characteristics to see if our results aligned with these.  If I remember correctly (and probably not surprisingly), our results agreed with some published research and not others.  Some researchers found student experience to be equal to or higher than in F2F courses, while others found the opposite.

Thanks again for the thoughts!