Piloting Quizzes.Next (New Quizzes)

RebeccaMoulder
Community Contributor

New Quizzes will eventually replace the default Canvas quizzing tool, but in the meantime, there's still a lot of development needed to bring it to feature parity. Here's what led The Wharton School to start using New Quizzes sooner rather than later.


Meeting Our Biggest Need

One of the largest core courses taken by all undergraduate students at Wharton is "Introduction to Operations, Information and Decisions," or OIDD 101. Depending on the term, this intro course can have up to 500 students enrolled. The bulk of the course grade comes from six online quizzes, each with ten questions mixing multiple choice and numeric answer types. Often there is more than one way to interpret a question, which creates the need to regrade quizzes after they are submitted and recalculate student scores.


In classic Quizzes, regrading is triggered by certain actions (e.g., changing the correct answer) and is only available for certain automatically graded question types. Unfortunately, classic Quizzes does not allow regrading for numeric question types. While infrequent, when the need to regrade a numeric question does arise, it's a pretty big headache. In the most recent run of this course, even a small handful of regrades resulted in a few hours of manual regrading. And that's just for one course! Even as I was writing this blog post, I received a report of a manual regrade needed for a numeric question in a quiz taken by 240+ students . . . 😱
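
If you find yourself facing a similar pile of manual score changes, much of the work can be scripted. Here's a minimal sketch (not our actual process) of pushing a bulk adjustment through Canvas's documented Submissions API instead of editing the Gradebook one student at a time; the instance URL, token, IDs, and point adjustment are all hypothetical placeholders:

```python
# Minimal sketch: add points to affected students' scores on one assignment
# via the Canvas "Grade or comment on a submission" endpoint.
# All URLs, tokens, IDs, and point values below are hypothetical.
import requests

BASE = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}
COURSE_ID = 12345        # hypothetical course
ASSIGNMENT_ID = 67890    # hypothetical quiz's assignment
POINTS_TO_ADD = 1.0      # credit for the alternate interpretation

def regrade(affected_user_ids):
    """Bump each affected student's score on the quiz's assignment."""
    for user_id in affected_user_ids:
        url = (f"{BASE}/courses/{COURSE_ID}/assignments/"
               f"{ASSIGNMENT_ID}/submissions/{user_id}")
        submission = requests.get(url, headers=HEADERS).json()
        new_score = (submission.get("score") or 0) + POINTS_TO_ADD
        # posted_grade accepts a raw point value
        requests.put(url, headers=HEADERS,
                     data={"submission[posted_grade]": new_score})
```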


Enter Quizzes.Next

If you've reviewed the Quizzes.Next FAQ or Feature Comparison pages recently, or even started exploring the tool yourself, you know that while there are a lot of new features and question types in New Quizzes, several features are still pending development. These include some fundamental ones, such as the Preview tool, the ability to allow additional attempts, LockDown Browser compatibility, surveys, and downloadable student and item analysis reports. After weighing the pros and cons on the feature comparison chart, we were won over by the promise of a more robust regrade tool, which generated interest in piloting the tool for OIDD 101.


We had hoped to start small by migrating a few low-stakes practice quizzes to the new platform first. But when the faculty told us that practice quizzes would be given on paper this year and that New Quizzes would be used for the bulk of the course grade, we quickly went from dipping a toe into the pool to doing a full cannonball. Fortunately, we had the consolation of knowing that if anything did go wrong, we could always revert to classic Quizzes within the same course.


Spring 2019 Pilot

Successes

After securing faculty support (the lack of a numeric regrade had been a major pain point for the three instructors, so they were eager to try something new), we enabled New Quizzes for a single sub-account and also enabled the "Quiz Log Auditing" feature option. This was key to accessing the View Logs, which were extremely helpful in troubleshooting issues later on. Two teaching assistants created the quizzes, after which we checked the settings thoroughly before the quizzes were published (our workaround for the lack of a Preview tool).

Because the quizzes were named "Assignment 1," "Assignment 2," etc., rather than "Quiz 1," "Quiz 2," students were able to find them easily under the "Assignments" page. Students said they liked the look of the new interface, while the TAs and instructors found it intuitive to build new quizzes and add images to questions. The regrade feature correctly recalculated grades for numeric-answer quizzes (hooray!) and even handled multiple regrades of the same question (a problem with classic Quizzes). Based on this success alone, the faculty have already agreed to continue using New Quizzes in the Fall term.


Challenges

1. No Auto-Submit with "Until" Date [FIXED]: Each quiz was available to students for an entire week, and late submissions were not accepted. Expecting the same functionality as in classic Quizzes, faculty told students that any quiz not submitted by the "Available Until" date would be automatically submitted by Canvas. When this didn't happen as anticipated for Assignment 1, and 10-15 students were left with "In Progress" quizzes, faculty felt like they had lied to students. To fix this issue, we re-opened the quiz for the students with an "In Progress" status, masqueraded as them, and then submitted on their behalf the responses they had entered as of the due date (found under "Moderate" > "Attempts in Progress" > "In Progress" log). For a way to find the affected students in bulk, see the sketch below.


For the next quiz, faculty stressed the importance of manually clicking the "Submit" button in order for Canvas to process students' quizzes. While there were still a few students on each quiz who didn't deliberately click "Submit" (or assumed that clicking "Submit" once, without clicking it again when the Submission Confirmation message popped up, was sufficient), these incidents became less frequent over the course of the term.
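
In larger sections, the students left "In Progress" can also be identified programmatically rather than by paging through the Moderate tool. Here's a rough sketch using Canvas's documented Submissions API; the instance URL, token, and IDs are hypothetical placeholders:

```python
# Rough sketch: list students whose attempts were left "In Progress",
# i.e., whose assignment submissions are still unsubmitted.
# IDs are hypothetical; the loop follows Canvas's Link-header pagination.
import requests

BASE = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}
COURSE_ID, ASSIGNMENT_ID = 12345, 67890

url = f"{BASE}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/submissions"
params = {"per_page": 100}
stuck = []
while url:
    resp = requests.get(url, headers=HEADERS, params=params)
    stuck += [s["user_id"] for s in resp.json()
              if s["workflow_state"] == "unsubmitted"]
    url = resp.links.get("next", {}).get("url")  # next page, if any
    params = None  # the "next" URL already carries the query string

print(f"{len(stuck)} students never submitted: {stuck}")
```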


2. No Quiz Log Data Saved: In a small handful of instances, students claimed to have answered all the questions, but their responses were not recorded in the quiz logs. After much troubleshooting, we came to realize that a specific behavior was causing the loss of data. Since these quizzes were available to students for a week at a time with no time limit, many students were leaving the quizzes open in their browsers for extended periods, sometimes several days, without refreshing or closing the page. In that time, the Canvas session was timing out, so that by the time students went to input their responses, the data could not be pushed to the server. Unfortunately, when this happens, little information other than a timestamp for when the student began the quiz is recorded, even in Instructure's server logs. The problem is avoided by refreshing the page often or, preferably, closing out of the quiz any time the student is not actively working on it.


3. On-Time Submissions Marked Late [FIXED]: If a student submitted a Quizzes.Next quiz within a few minutes of the due date/time, a processing lag in SpeedGrader sometimes resulted in the submission being marked late in the Gradebook. This bug could even occur for submissions that were initially marked as on-time but then manually graded after the due date! In our situation, the faculty were very understanding of this bug and knew that students weren't actually submitting quizzes late because of the availability dates. But for courses that have New Gradebook enabled and set to automatically deduct points for late submissions, this would be a more serious concern.
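
While this bug was still live, one possible stopgap, sketched below with hypothetical IDs, would have been to sweep the assignment and clear the erroneous late flags through the Submissions API's late_policy_status field (after confirming that every submission really did fall within the availability window):

```python
# Sketch only: clear erroneous "late" flags on one assignment via the
# Submissions API's late_policy_status field ("none" removes the late mark).
# Hypothetical IDs; a single page of results is fetched for brevity.
import requests

BASE = "https://canvas.example.edu/api/v1"
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}
COURSE_ID, ASSIGNMENT_ID = 12345, 67890

url = f"{BASE}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/submissions"
for sub in requests.get(url, headers=HEADERS,
                        params={"per_page": 100}).json():
    if sub.get("late"):  # marked late despite the availability window
        requests.put(f"{url}/{sub['user_id']}", headers=HEADERS,
                     data={"submission[late_policy_status]": "none"})
```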


Lessons Learned So Far 

With only one course in the pilot and many more developments in the pipeline for New Quizzes, we still have a lot to learn. But we've also gained a lot of experience in this first go-round. Below are some of the things we've discovered along the way:

  • Saving Quiz Logs: For quizzes that are available to students for an extended period of time, instruct students to close out of quizzes any time they are not actively working on them. This helps ensure that their answers are recorded in the quiz logs and not lost due to the Canvas session "timing out" or a disrupted Internet connection. 
  • Auto-Submit [FIXED]: While classic Quizzes would automatically submit when the "Available Until" time passes, this doesn't happen in New Quizzes. Make sure students know that unless there's a time limit for the quiz, they will need to click the "Submit" button and confirm their submission in order for it to actually process. 
  • Question Types: Be sure you're using the right question type when you create a question. The question type can't be changed once you start drafting the question, so if you need to switch types, you'll have to create a new question. 
  • Accessing SpeedGrader: To view all submissions in SpeedGrader, you'll need to access the link through the Gradebook, not in the quiz itself. Only individual attempts are visible within the "Moderate" tool.
  • New Question Types: The stimulus question type is a good replacement for "text only" questions. Note: if you embed an image wider than 600 pixels in the stimulus, students will need to scroll to see the whole image. The word count for essay questions is really helpful, and it's great to finally have ordering and matching question types! 
  • Item Banks [FIXED]: Item banks are tied to user accounts, not courses, so right now only the user who created the bank can view, edit, or use it unless they manually share the bank with other users. This presents an issue for co-instructors who want to share item banks. According to this post, the ability to share item banks is a pending feature.


Thanks for reading about Wharton's initial experience with Quizzes.Next/New Quizzes! I'm looking forward to presenting about New Quizzes at InstructureCon 2019 and sharing follow-up blog posts as we continue this pilot. If you have used New Quizzes before and have other tips/tricks, or are holding off because of pending features, please comment below!
