[ARCHIVED] QA Reviews for Online Courses : Oh The Manual Labor...
Hello Fabulous Community Members,
Happy New Year! I have an interesting question for all of you....
At Touro College, we use an internal rubric to score the quality of our online courses. This process is very manual (factory-like) and presents a variety of challenges, including:
- Discrepancies between ID scores and Evaluator scores, which require IDs to "re-score"
- Iterative back-and-forth between both parties
- Tracking faculty interactions and live updates (e.g., who was met with, what was discussed, etc.)
Our QA reviews are completed each semester. Based on your experiences doing QA reviews for courses, what are some of the methods you use to (1) save time, (2) incentivize faculty to complete necessary modifications, and (3) norm scoring among evaluators? In other words, what does your process look like?
Any information, resources, or guidance you can share is greatly appreciated.
Thanks,
Holly Owens
Assistant Director of Instructional Design, Online Education
Touro College and University System
New York, New York