
New Quizzes will eventually replace the default Canvas quizzing tool, but in the meantime, there's still a lot of development needed to bring it to feature parity. Here's what led The Wharton School to start using New Quizzes sooner rather than later.

 

Meeting Our Biggest Need

One of the largest core courses taken by all undergraduate students at Wharton is "Introduction to Operations, Information and Decisions" or OIDD 101. Depending on the term, this intro course will have up to 500 students enrolled. The bulk of the course grade comes from six online quizzes--each one has a mix of 10 multiple choice and numeric answer questions. Often, there is more than one way to interpret a question, resulting in the need to regrade quizzes after they are submitted and recalculate student scores.

 

In classic Quizzes, regrading is triggered by certain actions (e.g., changing the correct answer) and is only available for certain automatically graded question types. Unfortunately, classic Quizzes do not allow regrading for numeric question types. While infrequent, when the need to regrade a numeric question does arise, it's a pretty big headache. In the last instance of this course, even a small handful of regrades resulted in a few hours of manual regrading. And that's just for one course! Even as I was writing this blog post, I received a report of a manual regrade needed for a numeric question in a quiz taken by 240+ students...

 

Enter Quizzes.Next

If you've reviewed the Quizzes.Next FAQ or Feature Comparison pages recently or even started exploring the tool yourself, you know that while there are a lot of new features and question types in New Quizzes, there are still several pending features for development. These include some fundamental features, such as the Preview tool, the ability to allow additional attempts, LockDown browser compatibility, Surveys, and downloadable student and item analysis reports. After weighing the pros and cons of the feature comparison chart, the promise of a more robust regrade tool won us over and generated interest in piloting the tool for OIDD 101. 

 

We had hoped to start small, by migrating a few low-stakes practice quizzes to the new platform first. But when the faculty told us that practice quizzes would be given on paper this year and that New Quizzes would be used for the bulk of the course grades, we quickly went from dipping a toe into the pool to doing a full cannonball. Fortunately, we had the consolation of knowing that if anything did go wrong, we could always revert to classic Quizzes within the same course.

 

Spring 2019 Pilot

Successes

After securing faculty support (the lack of numeric regrade had been a major pain point for the three instructors, so they were eager to try something new), we enabled New Quizzes for a single sub-account and also enabled the "Quiz Log Auditing" feature option. This was key to accessing the View Logs, which were extremely helpful in troubleshooting issues later on. Two teaching assistants created the quizzes, after which we checked the settings thoroughly before the quizzes were published (our workaround for the lack of a Preview tool). Because the quizzes were named "Assignment 1, Assignment 2," etc., rather than "Quiz 1, Quiz 2," students were able to find them easily under the "Assignments" page. Students said they liked the look of the new interface, while the TAs and instructors found it intuitive to build new quizzes and add images to questions. The regrade feature correctly recalculated grades for numeric answer questions (hooray!) and even handled multiple regrades of the same question (a problem with classic Quizzes). Based on this success alone, the faculty have already agreed to continue using New Quizzes in the Fall term.
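For admins curious about the setup step, here is a minimal sketch of how those two feature options could be flipped on for a sub-account through the Canvas Feature Flags API. The base URL, token, sub-account ID, and the feature names themselves ("quizzes_next", "quiz_log_auditing") are assumptions rather than values from our pilot; confirm the names against the features list returned for your own account.

# A minimal sketch of enabling feature options on a sub-account via the
# Canvas Feature Flags API, assuming a token with admin rights on that
# sub-account. The feature names are assumptions; verify them with
# GET /api/v1/accounts/:id/features before running.
import requests

CANVAS_BASE = "https://canvas.example.edu"   # assumption: your Canvas URL
TOKEN = "REPLACE_WITH_ADMIN_TOKEN"           # assumption: admin API token
SUBACCOUNT_ID = 123                          # assumption: pilot sub-account id

headers = {"Authorization": f"Bearer {TOKEN}"}

for feature in ("quizzes_next", "quiz_log_auditing"):
    resp = requests.put(
        f"{CANVAS_BASE}/api/v1/accounts/{SUBACCOUNT_ID}/features/flags/{feature}",
        headers=headers,
        data={"state": "on"},
    )
    resp.raise_for_status()
    print(feature, resp.json().get("state"))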

 

Challenges

1. No Auto-Submit with "Until" Date: Each quiz was available to students for an entire week and late submissions were not accepted. Expecting the same functionality as in classic Quizzes, faculty told students that any quiz not submitted by the "Available Until" date would be automatically submitted by Canvas. When this didn't happen as anticipated for Assignment 1 and 10-15 students were left with "In Progress" quizzes, faculty felt like they had lied to students. To fix this issue, we re-opened the quiz for the students with an "In Progress" status, masqueraded as them, and then submitted on their behalf the responses they had added as of the due date (found under "Moderate" > "Attempts in Progress" > "In Progress" log).
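Finding the affected students by hand gets tedious in a course this size. Below is a rough sketch, assuming the New Quiz shows up as a regular assignment, of how the Canvas Submissions API could list everyone whose attempt is still unsubmitted so you know whom to check in Moderate and masquerade as. The course and assignment IDs and the token are placeholders, not our actual values.

# A rough sketch: list students whose New Quiz submission is still
# "unsubmitted" after the Available Until date. IDs and token are placeholders.
import requests

CANVAS_BASE = "https://canvas.example.edu"
TOKEN = "REPLACE_WITH_API_TOKEN"
COURSE_ID = 1111
ASSIGNMENT_ID = 2222

headers = {"Authorization": f"Bearer {TOKEN}"}
url = (f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}"
       f"/assignments/{ASSIGNMENT_ID}/submissions")
params = {"per_page": 100, "include[]": "user"}

while url:
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    for sub in resp.json():
        if sub["workflow_state"] == "unsubmitted":
            print(sub["user"]["name"], sub["user_id"])
    url = resp.links.get("next", {}).get("url")  # follow pagination
    params = None  # the "next" URL already carries the query string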

 

For the next quiz, faculty stressed the importance of manually clicking the "Submit" button in order for Canvas to process students' quizzes. While there were still a few students on each quiz who didn't deliberately click "Submit" (or assumed that clicking "Submit" once, without clicking "Submit" again when the Submission Confirmation message popped up, was sufficient), these incidents became less frequent over the course of the term.

 

2. No Quiz Log Data Saved: In a small handful of instances, students claimed to have answered all the questions, but their responses were not recorded in the quiz logs. After much troubleshooting, we realized that a specific behavior was causing the loss of data. Since these quizzes were available to students for a week at a time with no time limit, many students were leaving the quizzes open in their browsers for extended periods, sometimes several days, without refreshing or closing the page. In that time, the Canvas session was timing out, so that by the time students went to input their responses, the data could not be pushed to the server. Unfortunately, when this happens, little information other than a timestamp for when the student began the quiz is recorded, even in Instructure's server logs. The problem can be avoided by students refreshing the page often or, preferably, closing out of the quiz any time they are not actively working on it.

 

3. On-Time Submissions Marked Late [FIXED]: If a student submitted a Quizzes.Next quiz within a few minutes of the due date/time, sometimes a processing lag in SpeedGrader resulted in the submission being marked late in the Gradebook. This bug could even happen for on-time submissions that were initially marked as on-time, but then manually graded after the due date! In our situation, the faculty were very understanding of this bug and knew that students weren't actually submitting quizzes late because of the availability dates. But for courses that have New Gradebook enabled and set to automatically deduct points for late submissions, this would be a more serious concern. 
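If you want to audit your own gradebook for this bug, the sketch below shows one way it could be done with the Canvas Assignments and Submissions APIs: pull the assignment's due date, then flag any submission Canvas marked late even though its submitted_at timestamp falls on or before due_at. All IDs and the token are placeholders.

# A rough audit sketch: flag submissions Canvas marked late even though
# submitted_at is on or before the assignment's due_at.
from datetime import datetime, timezone
import requests

CANVAS_BASE = "https://canvas.example.edu"
TOKEN = "REPLACE_WITH_API_TOKEN"
COURSE_ID = 1111
ASSIGNMENT_ID = 2222

headers = {"Authorization": f"Bearer {TOKEN}"}

def parse(ts):
    # Canvas timestamps look like "2019-04-01T03:59:59Z"
    if not ts:
        return None
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

assignment = requests.get(
    f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}",
    headers=headers)
assignment.raise_for_status()
due_at = parse(assignment.json().get("due_at"))

url = (f"{CANVAS_BASE}/api/v1/courses/{COURSE_ID}"
       f"/assignments/{ASSIGNMENT_ID}/submissions")
params = {"per_page": 100}
while url:
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    for sub in resp.json():
        submitted = parse(sub.get("submitted_at"))
        if sub.get("late") and submitted and due_at and submitted <= due_at:
            print("Marked late but submitted on time:", sub["user_id"], submitted)
    url = resp.links.get("next", {}).get("url")  # follow pagination
    params = None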

 

Lessons Learned So Far 

With only one course in the pilot and many more developments in the pipeline for New Quizzes, we still have a lot to learn. But we've also gained a lot of experience in this first go-round. Below are some things we've discovered along the way:

  • Saving Quiz Logs: For quizzes that are available to students for an extended period of time, instruct students to close out of quizzes any time they are not actively working on them. This will ensure that their answers are recorded in the quiz logs and not lost due to the Canvas session "timing out" or a disrupted Internet connection. 
  • Auto-Submit: While classic Quizzes would automatically submit when the "Available Until" time passes, this doesn't happen in New Quizzes. Make sure students know that unless there's a time limit for the quiz, they will need to click the "Submit" button and confirm their submission in order for it to actually process. 
  • Question Types: Be sure you're using the right question type when you create a question. The question type can't be changed once you start drafting the question, so if you need to switch types, you'll have to create a new question. 
  • Accessing SpeedGrader: To view all submissions in SpeedGrader, you'll need to access the link through the Gradebook, not in the quiz itself. Only individual attempts are visible within the "Moderate" tool.
  • New Question Types: The stimulus question type is a good replacement for "text only" questions. Note: If you embed an image in the stimulus that is larger than 600 pixels wide, students will need to scroll to see the whole image. The word count for essay questions is really helpful and it's great to finally have ordering and matching question types! 
  • Item Banks: Item banks are tied to user accounts, not courses, so right now only the user who created the bank can view, edit, or use it. This presents an issue for co-instructors who want to share item banks. According to this post, the ability to share item banks is a pending feature.

 

Thanks for reading about Wharton's initial experience with Quizzes.Next/New Quizzes! I'm looking forward to presenting about New Quizzes at InstructureCon 2019 and sharing follow-up blog posts as we continue this pilot. If you have used New Quizzes before and have other tips/tricks, or are holding off because of pending features, please comment below!

Hi Everyone,

 

I'm usually not one to write too many blog posts, and I really debated the best place to put this.  As Ally is an accessibility tool, this could certainly have gone in the accessibility group (and having begun my community college career in DSPS, I do have a soft spot for UDL and 508/ADA compliance--so important for student success), but this has more to do with implementation and the challenges around our processes and the complexity of getting a tool of this scope in place at a multi-college district with over 50k FTES.  I believe this is more applicable to the Higher Education group, as there are specific challenges we face in our environment that may not apply as much to some of the other sectors.  Also, please forgive me, as I've left some of this intentionally vague so that I don't identify any specific folks at our district, as everyone is wonderful to work with here.

 

To begin, I was part of a subgroup charged with analyzing which tools we could adopt to enhance accessibility for students, and after looking at a few options it was determined that our best path forward was to explore Blackboard Ally.  We piloted Ally for a semester, and after positive feedback from the small testing group we signed a 3-year contract.  The thinking was that, since the tool proved valuable even in a somewhat limited capacity with that small group, we could begin an opt-in rollout to specific courses where faculty could use the tool the first semester (letting us provide additional training and use those experiences to develop additional resources), then roll it out to all courses the following semester.

 

The main complexity started when we began to look, as a District, at how the content Blackboard Ally identified as needing some level of remediation was actually going to be remediated.  Looking at the sheer amount of content we need to remediate, it is a daunting task.  As I mentioned above, we're a pretty large district, with four colleges and over 50k full-time equivalent students.  Looking back at just one semester, Ally identifies almost 800,000 pieces of content.  While the course numbers are a bit inflated because we create a course shell for every section, the content number accurately reflects what's in Canvas.
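For anyone trying to size a similar backlog without the Ally institutional report in front of them, here is a very rough sketch that counts files per course for one term through the Canvas API as a crude proxy for "pieces of content." It ignores the WYSIWYG page content Ally also checks, and the account ID, term ID, and token are placeholders rather than anything from our district.

# A very rough proxy for the Ally content count: tally files per course for
# one enrollment term via the Canvas API. IDs and token are placeholders, and
# WYSIWYG page content that Ally also scans is not counted here.
import requests

CANVAS_BASE = "https://canvas.example.edu"
TOKEN = "REPLACE_WITH_ADMIN_TOKEN"
ACCOUNT_ID = 1
TERM_ID = 42  # assumption: the enrollment term to audit (e.g., Fall 2018)

headers = {"Authorization": f"Bearer {TOKEN}"}

def paginate(url, params=None):
    """Yield items across all pages by following the Link: next header."""
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        yield from resp.json()
        url = resp.links.get("next", {}).get("url")
        params = None  # the next URL already carries its query string

total = 0
for course in paginate(f"{CANVAS_BASE}/api/v1/accounts/{ACCOUNT_ID}/courses",
                       {"enrollment_term_id": TERM_ID, "per_page": 100}):
    try:
        count = sum(1 for _ in paginate(
            f"{CANVAS_BASE}/api/v1/courses/{course['id']}/files",
            {"per_page": 100}))
    except requests.HTTPError:
        count = 0  # e.g., Files tool disabled or course unpublished
    total += count
    print(f"{course['name']}: {count} files")

print("Total files across the term:", total)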

 

[Image: LRCCD FA18 Ally stats]

 

This leads me to the challenge we are still facing, and why we have had to delay our rollout: we need a comprehensive plan for how this content is going to be remediated.  Right now we have courses containing content that is not fully accessible, and we can see that in the account-level reporting.  We are not looking at or evaluating the course-specific accessibility reports, though they are available.  The challenge is that the content was there before we implemented Ally, just as it is there now with Ally implemented; the only difference is that we can no longer plead ignorance or pass the buck when we have reporting that shows we do have inaccessible content.

 

We are now having to come up, somewhat on the fly, with plans for how to help faculty remediate content.  Many of our courses are fully online (and fully developed) and have been taught, and have continually evolved, for years.  When there are hundreds of pieces of content, each of which can take anywhere from minutes to hours to remediate, it is too large a burden to expect faculty members to fully remediate the content themselves in a timely manner.  We are evaluating options such as hiring more faculty coordinators at each campus to help with remediation, hiring district-wide instructional designers to remediate content, offering stipends to faculty for content remediation above their regular teaching load, etc.  With four colleges, so many decision makers needing to be consulted, and the ultimate decision needing to be negotiated with faculty, this process is not something that can be accomplished in a week or even a month.  It is critical we get this done for students, who need fully accessible content, but there are so many considerations to weigh that it is quite the process.

 

In closing, the main reason for making this post was to inform others about the challenges that arise once you begin identifying inaccessible content.  Hopefully you have a good experience with whatever tool or solution your institution chooses; I just want to make sure that those charged with making those decisions consider the implications when they implement their solution.  Having a comprehensive plan for how to remediate content is very valuable.

 

Thanks for your time reading this.

 

Ken

We made some cool Canvas stickers! 

 

We plan to have a Canvas kick-off event for our students soon. If they show us they have the Canvas Student app installed on their mobile device, we're going to give them one of these cool Canvas stickers as a fun way to help with marketing and get them excited about our new LMS.

 

If anyone wants the Google Drawings files to create (or modify) your own stickers, see below:

 

 

The vendor we used is Sticker Mule; they were fantastic to work with, and the stickers turned out great. These stickers are 2 inches in diameter, but the vendor offers all kinds of different shapes and sizes to choose from.

 

[Animated GIF: colored Canvas stickers]
