Evaluating Training for Online Teachers
Recently I have been reviewing Kirkpatrick's four levels of training evaluation (reaction, learning, behavior, and results) because we have an online teacher training program at Ventura College that prepares faculty to teach online in Canvas by helping them design an introductory module and an academic module for an upcoming course. As we evaluate our training program, the question arises: how does one measure these levels of organizational change? Would you be willing to share any unique measurements or goals? Do you build the evaluation into the training? Do you strive for level-three and level-four changes, that is, changes in behavior and results? What is a training session that has gone well and reached these higher levels of organizational change? How do you know?
Hi @mmoore1. This is a good question, and as you suggested, reaching the last two levels in Kirkpatrick's model can be difficult to measure. In the past we have tried to gauge behavior change with surveys sent to those who attended training; although we could, we normally do not actually go into the Canvas course and look around. The surveys give us some measure of behavior change, although it is self-reported data. In terms of results, we will at times review a faculty member's course, either in progress or recently finished, using a QM-style checklist. This normally includes reviewing student surveys embedded in the class that gauge their view of course elements from the end-user standpoint. We are also sometimes asked to review courses with a high failure or withdrawal rate to see if we can identify potential improvements. Again, these measures may not be perfect, but they give us useful information.
As for which sessions reach these higher levels, organically it tends to be those that cover technologies or techniques faculty are expected to use. At my institution, faculty must use Canvas to post a syllabus, keep an updated gradebook, and, for some classes, take attendance. Training tied to these tasks, or to improving efficiency in these areas, normally produces direct behavioral change and makes results easier to measure. We also ask departments to tell us what training they would like and when to offer it to their faculty. When the department and the department chair are involved, the training is normally more targeted to immediate needs and behavior change is greater.
I am very interested in hearing the insights of others in the Community, so I hope more folks chip in!
@ericwerth Thanks! These are all great resources that I was overlooking as measurement points, but they could be used for informal evaluation and reflection. I am trying to stay away from formal evaluations; instead, I want to encourage faculty to self-evaluate and reflect each term as they refine a course.
Yes, I am also interested in other comments, best practices, and even policies if you have them.