Hi @krenz053 (and @bbennett2 ),
This post is a few months old, but I can jump in here and answer some of your questions based on my experience at my previous institution. It has been a little over a year since I worked there, but as I brush up on Outcomes, it appears not much has changed since I last used it.
1. Because we layered our Outcomes by School > Program > Dept > Course (e.g., Public Health > Masters of PH > Epidemiology), we could run admin-level reports that aggregated the data at each of those levels, giving us not only course-level competency achievement but also competency achievement across a department and a program. Setting up a consistent naming convention for our outcomes made it much easier to work with the data from the CSV export. We didn't get the opportunity to do much with data visualizations.
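To illustrate the roll-up idea above: if each outcome title encodes its hierarchy with a consistent "School > Program > Dept > Outcome" naming convention, the CSV export can be aggregated at any level with a few lines of scripting. This is just a minimal sketch, not the actual Canvas export schema; the column names (`outcome_title`, `score`) and the sample data are hypothetical.

```python
# Sketch only: assumes a hypothetical CSV with an "outcome_title" column
# that embeds the hierarchy ("School > Program > Dept > Outcome") and a
# numeric "score" column. Real Canvas exports use different columns.
import csv
import io
from collections import defaultdict

sample = """outcome_title,score
Public Health > Masters of PH > Epidemiology > Outcome 1,3.5
Public Health > Masters of PH > Epidemiology > Outcome 2,2.5
Public Health > Masters of PH > Biostatistics > Outcome 1,4.0
"""

def aggregate_by_level(csv_text, level):
    """Average scores grouped by the first `level` segments of the title.

    level=1 rolls up to School, level=2 to Program, level=3 to Dept.
    """
    buckets = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        segments = [s.strip() for s in row["outcome_title"].split(">")]
        key = " > ".join(segments[:level])
        buckets[key].append(float(row["score"]))
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

# Program-level roll-up (level=2) vs. department-level (level=3):
print(aggregate_by_level(sample, 2))
print(aggregate_by_level(sample, 3))
```

This is where the naming convention pays off: because every title splits cleanly on the same delimiter, the same function serves school-, program-, and department-level reports.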
2. There are two admin-level reports that can be pulled: Outcomes Results and Student Competency. Note that the latter report will NOT pull quiz results. Courses also have visual data charts that our faculty found helpful for at-a-glance reviews.
3. Yes, groups met at the program-director level (this was at a state university) to discuss overall curriculum improvements based on competency performance. Our center would help pull and create the data reports that showed where improvement might be needed. We could see how competencies were performing across courses, which gave us some indication of whether poor performance was a result of the subject matter, content, pedagogy, etc. It also helped us redevelop rubrics so that students were assessed directly against the competencies; that way, the grade they received on an assignment was highly reflective of their outcome performance.
4. We looked at the data semester to semester, as many of our courses were offered in spring, fall, and summer. For some courses we had data three times per year; for others, once or twice.