
Reading responses to the Which do you prefer: online vs on-campus? poll has been very informative and reminded me of a study I was involved in a few years ago.  I thought some in the Community might be interested in what we examined in relation to student choice and experience in online classes.  At the time, a few colleagues and I presented this at a regional conference and intended to publish after gathering more data to strengthen the statistical tests, but other priorities arose, so the results were not distributed as widely as we had hoped.

 

Background:  I was working at a liberal arts university that began to develop a series of online degrees.  Many courses within the general education core that were offered face-to-face were also developed online.  Online courses ran eight weeks, while the on-ground campus used 15-week semesters.

 

Online course sections were normally capped at 25 students and the decision was made to allow on-campus students to utilize unused “seats”, believing this would help those with scheduling issues and others like athletes and music majors with heavy travel schedules.  As a result, online GE courses normally had enrollments including both online and on-campus students.

 

We wanted to study this dynamic to determine why on-campus students chose to take an online course, to understand the experience of students in online sections, and to explore whether grades in online sections were higher or lower than in the same courses delivered on-ground.

 

The study: In the fall of 2014 and spring of 2015, surveys were given to students in 100-level (mostly) and 200-level GE courses where both an online and an on-ground section were offered.  This included courses in art and music, public speaking, math, English, biology, philosophy, and religious studies.  Surveys included a section asking students why they chose an online option (if they were an on-campus student) and various questions about how well the course met certain quality criteria and their expectations entering the course.  Responses were generally given on a 5-point Likert scale, where 1 indicated students were “very displeased” with an element of the course, 3 was “neutral”, and 5 “very pleased”.  Grades were compared in fall 2014 for matched online and face-to-face sections using data provided by the Registrar.

 

Results: Between the fall and spring terms, we received 154 survey responses (a 45% response rate).  Of these, 32% were from males and 67% from females.  Forty percent of the surveys were from students taking the online sections, while 60% were from individuals taking on-ground classes.  Of those taking an online section, 77% were primarily on-campus students and 23% were fully online.  While online classes included students in all years of their program, the majority were either freshmen or seniors.

 

Primarily on-ground students who chose online sections could indicate why they chose an online option from a list or add their own response.  They could select any or all choices from the list.  Respondents indicated that the largest motivation to take the online format over a face-to-face alternative was either time convenience (67%) or scheduling conflicts with other classes (57%).  The perception that the online course would be easier was the third most selected option (33%), followed by the desire to try an online class (27%).

 

To examine the perception of students in online and F2F sections, we compared student responses to the Likert scale questions.  The following table depicts these results.

 

Table 1 - Comparison of online and F2F student responses

                                               Online    Face-to-Face
Ability of course to meet stated objectives     3.90         3.74
Feedback from instructor                        4.00         3.73
Community building with classmates              3.62         3.50
Spiritual reflection & growth                   3.81         3.56

 

As can be seen, students in the online classes actually rated courses higher than students in F2F sections.  Feedback from instructor was the only difference found to be statistically significant (p = 0.009), and it had a small-to-medium effect size of 0.227.
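
The write-up above doesn't specify which test produced that p-value or which effect-size measure we used, so as an illustration only (not necessarily the exact analysis we ran), here is a minimal Python sketch of comparing Likert ratings between two groups with Welch's t-test and Cohen's d; the ratings are made up.

```python
# Hypothetical sketch: comparing Likert ratings between online and F2F respondents.
# The ratings below are illustrative, not the study's raw data.
import numpy as np
from scipy import stats

online = np.array([4, 5, 4, 3, 5, 4, 4, 5, 3, 4])   # e.g., "Feedback from instructor", online section
f2f    = np.array([4, 3, 4, 3, 4, 3, 5, 4, 3, 4])   # same item, face-to-face section

# Welch's t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(online, f2f, equal_var=False)

# Cohen's d using the pooled standard deviation
pooled_sd = np.sqrt((online.var(ddof=1) + f2f.var(ddof=1)) / 2)
cohens_d = (online.mean() - f2f.mean()) / pooled_sd

print(f"t = {t_stat:.3f}, p = {p_value:.3f}, d = {cohens_d:.3f}")
```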

 

In the survey, we also asked students about their expectation of the quality of the online course before enrolling, their overall experience in the course, and their confidence in their ability to succeed online prior to starting class.  These responses were correlated with the responses listed in the table above.  Results of these correlations are below.

 

Table 2 - Correlation of overall student feedback and specific course elements

                                       Expectation for     Overall experience    Confidence in ability
                                       course quality      in the course         to succeed online
Ability of course to meet objectives        0.186               0.487*                 0.346*
Feedback from instructor                    0.023               0.519*                 0.469*
Community building with classmates          0.157               0.517*                 0.463*
Spiritual reflection and growth             0.053               0.384*                 0.302*

*Significant at p < .05

 

Although a number of these findings were statistically significant, r-squared values were only moderate.
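
For readers who want to see what sits behind figures like these, here is a minimal, hypothetical sketch of a Pearson correlation between two Likert items and its r-squared value; the ratings are invented and are not the study's data.

```python
# Hypothetical sketch: correlating an overall-experience rating with a specific
# course element (e.g., instructor feedback). Values are illustrative only.
import numpy as np
from scipy import stats

overall_experience  = np.array([5, 4, 4, 3, 5, 4, 3, 4, 5, 2])
instructor_feedback = np.array([5, 4, 3, 3, 5, 4, 3, 5, 4, 2])

r, p_value = stats.pearsonr(overall_experience, instructor_feedback)
r_squared = r ** 2   # proportion of shared variance

print(f"r = {r:.3f}, p = {p_value:.3f}, r^2 = {r_squared:.3f}")
```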

We attempted to determine whether student experience varied by class standing or by prior experience with online classes.  Although our sample size for this data was too low to draw concrete conclusions, initial results did not show either variable affecting students' self-reported class experience.

 

Finally, after analyzing class grades in the fall, we found strong similarity in student achievement between delivery modes.  While some variance existed, students in online and F2F classes received approximately the same proportions of A's and B's, C's, and below-C grades.  Results were similar enough that we chose to stop tracking this data after the fall term and to look only at the percentage of students who passed or failed.
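
As an illustration only (this is not necessarily the comparison we performed), a chi-square test of independence on grade-band counts is one common way to check whether grade distributions differ by delivery mode; the counts below are made up.

```python
# Hypothetical sketch: checking whether grade distributions differ by delivery mode.
# Counts are illustrative, not the institution's data.
import numpy as np
from scipy import stats

#                 A/B    C   below C
online_counts = [ 34,   10,    6]
f2f_counts    = [ 52,   17,   11]

table = np.array([online_counts, f2f_counts])
chi2, p_value, dof, expected = stats.chi2_contingency(table)

print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}, dof = {dof}")
```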

 

Conclusions:

  • Online classes benefited on-campus students who had scheduling conflicts, but student athletes did not use them as heavily as we had anticipated.
  • Students need to be advised effectively to ensure they understand the rigors of online classes and don't overload their schedules, particularly when those classes run on an accelerated timeline.
  • Although not a large focus of this study, our grade analysis aligned with published studies suggesting that when classes are built on a strong pedagogical model, differences in student performance may relate more to factors such as pre-existing academic strength and motivation than to course modality itself. We found no reason to believe that grades were higher in online classes or that students struggled due to the self-directed nature of online learning.
  • The fact that student self-reported satisfaction in several important areas was higher in online sections is likely due to the intentional inclusion of these elements during the instructional design process. In course development, faculty building online classes worked closely with an instructional designer to ensure alignment between course objectives, content, and student assessment.  Course design also focused on student engagement and interaction with the instructor.
  • Correlations between student confidence in their ability to succeed online and the four elements of course satisfaction studied suggest that efforts to boost student confidence, whether through advising or an orientation course, may be beneficial, although no causal relationship can be inferred from the correlations alone.
  • Similarly, the correlations between these aspects of student satisfaction and overall experience in the class suggest value in both developing a pedagogically sound course and training faculty for effective course facilitation.

 

The evaluation process led to a number of other benefits around campus.  One example was a reevaluation of our 100-level course offerings.  When we dug deeper into why so many seniors were taking classes normally associated with freshman standing, we learned that we were not offering enough sections of these courses for first-year students, so many had to put them off until later in their educational careers.

 

I would be interested in whether anyone else here has conducted a similar study, either formally or anecdotally.  Is your experience similar to what we found a few years ago or different?

 

Best wishes!

Student retention is a concern at probably every college and university in America. We wouldn't be in education if we didn't feel it is a transformative experience that benefits not only the individual but their family, community, and society as a whole. Many students start college and fail to finish, often leaving them without the benefit of a degree or certificate but with student loans to repay. Institutions frequently spend a great deal of effort trying to identify struggling students and connect them with resources intended to help them succeed, but at some point many turn to what are generally called early alert (EA) systems to increase the efficiency of these efforts.


I have been involved in early alert system efforts at two universities. At the first, I was very involved with evaluating products and ultimately selecting a system, but was not present for the majority of the implementation. At my current institution, I was not as involved in selecting the system, but will be involved in the implementation and future use. Where I am now, we are just beginning the implementation process and have just completed the in-person discovery visit with the early alert system we selected, Nuro. I wanted to blog about my experience implementing the system, not necessarily in relation to a particular product, but to relay what I learned through the process that may be beneficial for others considering or implementing a system as well.


I will attempt not to repeat information that is easily found on the internet. The following sites have valuable information for anyone who is interested:


http://www.ecsu.edu/documents/faculty-staff-conference/bestPractices-CaseExamplesinEarlyAlert.pdf 
https://journals.uncc.edu/facultyguide/article/view/384/381


As we continue our implementation journey, I hope to add additional blogs.


During Selection of a System
Here are a few thoughts about selecting a system that have proven to be very important.

1. Evaluate why a system is being considered. This may seem obvious, but as word spreads on campus that this project is underway, those who may be impacted (faculty, staff, etc.) will begin talking about what a new effort will mean for them and, commonly, why a change is warranted. Having information on retention rates, graduation rates, a clear picture of what resources are currently being spent on retention efforts, and what problems an early alert program could help solve is important for messaging across campus. Clearly mapping the current process and flow of information is valuable.
2. What administrative support exists for the effort? Early alert efforts normally impact the entire campus and require concerted and sustained effort from individuals with different reporting structures. IT, academics, student affairs, and financial aid are a few examples. Often these individuals report to different deans or vice presidents, and if any of these units are not fully committed, it could jeopardize the project as a whole. At our institution, the president and provost are behind the efforts and we have hired a consulting group to help manage and advise during the process. This level of commitment will hopefully ensure that those in each unit collaborate and do what is necessary to get the system up and running as smoothly as possible.
3. Ensure key individuals are involved in the decision-making process. Selection of a system should include frequent meetings and product evaluations by a number of key stakeholders on campus, including those from the units most likely to be impacted (IT, academic affairs, advising, business office, etc.). I encourage making the process transparent so that those not directly involved can still feel connected and have input at some level. Having a clear indication of what needs a product must fulfill helps guide this process. I suggest considering what an early alert system currently does, not what is planned for the future or what it could possibly do, as it is difficult to determine when particular features will be added or whether the system could be altered to meet a need it hasn't met before.
4. How easy is it to use? This goes without saying, but individuals will not use a system that is difficult or time-consuming to learn.


Early Stages of Implementation
We are still in the early stages of implementation, but here are a few things I would suggest based on experience so far.

1- Have subcommittees responsible for various aspects of early alert implementation, with clear responsibilities, reporting, and accountability structures. I am a member of a subcommittee looking at the data that will be included in the system and at how we can improve our current system until the new process is in place.
2- Make a detailed map of current processes, who does what in this flow, and where information or students can fall through the cracks. We had a map, but I wish we had made ours more detailed. Share this with the entire early alert committee, because many on the committee will not know how all of the current pieces fit (or don't fit) together.
3- Make a list of all of the data currently being collected on students, how long that data has been collected, and how complete those data sets are. Also identify where this data is stored (SIS, LMS, other systems, etc.) and how, if at all, it is passed from one system to another. Most EA programs have a predictive element, where existing data is used to identify students at higher risk of not being retained, and a current behavior/achievement element that looks at how students are performing and attempts to identify students in trouble so they can be connected with resources before they reach a point from which they cannot recover. Knowing what data the institution has collected gives insight into how it can be used to identify students at risk (a hypothetical sketch of this predictive idea appears after this list).
4- Specifically for a discovery meeting to kick off formal implementation:

a. If provided with a list of initial questions from the vendor, fill this out with as much detail as possible prior to the meeting. Ensure that everyone on the committee has read this information and has had a bit of time to process the data. This will help time be used more efficiently when working with individuals from the early alert company and make discussions within the committee more meaningful (shift the focus from what data exists to how it can be best used).
b. Ensure the right individuals attend the meeting in person. We had members of the committee attend, but a couple of our faculty members couldn't be there due to class schedules. A few of us have experience as faculty, so we could speak to that aspect a bit, but it would have been better to have current faculty in the meeting themselves. I also suggest having the person who heads institutional research attend. Our institutional researcher helped fill out documentation but was not present at the discovery meeting; in hindsight, it would have been better had she been there to answer follow-up questions. Similarly, many of our students are athletes, but we didn't have someone specifically from athletics as part of the meeting, and it would have been useful. We do have a medical school and an optometry school, and a member of student services from those units attended; this was valuable, as their needs for early alert are unique.
c. Administrative support is important. Our provost and president both stopped by during the on-campus meeting. I believe this showed full institutional support for the effort and set the right tone for everyone on the committee.
d. IT was key. Much of the discussion surrounded how data flows at the institution and what is needed to make all of the key systems speak to one another. IT individuals with knowledge of all of these systems (SIS, LMS, etc.) were vital during this meeting.
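
To make the "predictive element" mentioned in item 3 above more concrete, here is a minimal, hypothetical sketch of how data an institution already collects might feed a simple risk model. The field names, data, and the choice of logistic regression are purely illustrative; they are not how Nuro or any other early alert product necessarily works.

```python
# Hypothetical sketch of a predictive early-alert element: use existing
# institutional data to flag students at higher risk of not being retained.
# All field names and values are illustrative only.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative records pulled from an SIS/LMS export
students = pd.DataFrame({
    "hs_gpa":          [3.8, 2.4, 3.1, 2.0, 3.5, 2.7, 3.9, 2.2],
    "midterm_avg":     [88, 61, 75, 55, 92, 70, 95, 58],
    "absences":        [1, 6, 3, 9, 0, 4, 1, 8],
    "lms_logins_week": [12, 3, 7, 2, 15, 5, 14, 1],
    "retained":        [1, 0, 1, 0, 1, 1, 1, 0],   # 1 = returned the next term
})

X = students.drop(columns="retained")
y = students["retained"]

# Fit a simple model on historical records
model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimated risk of attrition (1 - probability of being retained);
# high-risk students would be routed to advising or other resources
risk = 1 - model.predict_proba(X)[:, 1]
flagged = students.assign(risk=risk).sort_values("risk", ascending=False)
print(flagged[["midterm_avg", "absences", "risk"]].head(3))
```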

 

Next steps
The discovery meeting was a good opportunity to determine steps moving forward, discuss the data that will be included initially, and determine what access is necessary to make the early alert program communicate with other systems on campus. We will have a better feel for the implementation timeline once we know the systems are speaking to each other as needed, but I am confident that we should be able to test the system before the end of our academic year. This is important so everything will be ready for full use in the Fall 2018 term, when most of our students will arrive. Here are a few items that I know we will need to work on to make the system most effective.

1. Continue data determination- A number of common data points such as GPA, gender, and SAT/ACT scores will be included, but we are also looking at including some results from national surveys given to students and involvement in athletics. There is the possibility of including some measurement from our student conduct system as well as information on students traveling to campus. Before long we will launch degree mapping software, so next year this data may be integrated as well.  We have a lot of data that has been collected over time, but some may not have been gathered regularly enough, or have enough value in determining student risk, to include.
2. Grades- As an institution, we will need to look at how we keep grades and how often they are updated. Prior to Canvas, many faculty kept their own gradebooks and entered a midterm score in our SIS, which is too late to identify students with academic issues. When we began using Canvas in Fall 2017, the expectation was that faculty would start keeping gradebooks in Canvas. We will need to determine more clearly the requirements for keeping gradebooks and when grades need to be entered if the EA system is to use this data reliably.
3. Attendance- Similar to grades, attendance has been kept a number of ways in the past. We need to find a consistent and reliable way to take attendance so that this information can be used effectively. Canvas is used, but there are instances in which this has been problematic, such as courses with a combined lecture and lab, where a student can be present for the lecture but absent from the lab (or vice versa) on the same day. There may be ways around the issues we have faced, but they need to be addressed.
4. Publishing courses- We will need guidance on when courses in Canvas should be published. This is straightforward for our online classes, but the majority of our courses are still F2F, and there are questions about whether courses should be published on the first day of the term, after the first class, by the end of the first week of classes, etc.


The early alert committee will be evaluating the need for changes in our policy or procedures and making recommendations to the administration.


We are excited to move forward and about the potential both to use our existing resources more effectively and to help students thrive at our institution. There is much work to be done, but as long as our community works together in this effort, we have the opportunity to make substantive positive change.

 

I would love any insight from those who have been in a similar situation!
