
New Quizzes looking forward: 2021

Instructure

As we’re all getting ready to firmly shut the door on 2020, we have a prime opportunity to look forward to all of the changes 2021 will bring. 

One of these changes you may have already noticed: me. Hi! I'm Susan Sorensen, a product manager new to Instructure who will be working on Quizzes. I've spent my career supporting faculty and administrators, and for the past seven years I've led product decisions and worked hand in hand with users to make sure we're delivering successful outcomes.

One of the most critical steps in my onboarding process here is simply to understand you and your users. If you haven’t already joined our New Quizzes User Group in the Canvas Community, please join me there. 

As previously mentioned in the user group, due to the educational changes brought on by Covid-19, the July 2021 enforcement date for New Quizzes has been pushed back to summer 2022 (July for most regions; December for APAC).

We’re in a discovery process to understand the current landscape.  In the past year, your needs have changed, and as those needs evolve, we are excited to evolve with them. 

I’ve been getting to know the Community to better understand how New Quizzes are currently being used and how we might improve. Along the way, I’ve met some amazing admin advocates who like the features and functionality of New Quizzes. In some cases, a few limitations are preventing use in particular circumstances or require workarounds. We don’t want you to struggle under those constraints. Understanding and defining this functionality will be our immediate concern. We have a goal of helping everyone become successful with New Quizzes in demonstrating student understanding of learning materials.

By the start of 2021, we will share the Quizzes plan for 2021 and beyond, which focuses heavily on reducing barriers to using New Quizzes. We have several projects we are exploring in the New Quizzes User Group and invite you to get involved. Your feedback will help determine details and inform the future roadmap for New Quizzes.

 

19 Comments
Adventurer II

Hi @SuSorensen and welcome!  We have shared our needs in multiple forums in the past 2-3 years, including directly with previous product managers and as part of larger groups such as the Higher Ed R1 Peers. Nothing has changed in our needs, or with our dealbreakers for enabling the service. We just need Canvas to move forward on the development and not delay with another discovery process due to a change in Product Manager. No offense intended, as I'm happy to hear you're focused on this and looking forward to working with you in the future, but I just want to share my perspective.

Instructure

I can certainly understand that sentiment!

The good news is engineering and I will be hitting the ground hard at the start of 2021!

I can also assure you that the time I've spent getting up to speed hasn't been lost time. Over the past month and a half, we've been focused on resolving bugs. I hope you will feel a big improvement.

Adventurer

@SuSorensen , You mentioned you're exploring several projects in the user group. Where can we find those conversations?  Thank you for mentioning barriers that prevent us from using New Quizzes; I'm relieved to hear Canvas is going to address those. I'd like to make sure the barriers to use that we face are on your radar. 

Adventurer III

For starters, the fact that new quizzes was built as an LTI tool has presented several issues. I would love a crystal ball to go back to those early discussions to see how this implementation even received approval. As an outsider, it just seems like this caused so much unnecessary technical debt. So much so that the team is still trying to figure some things out that don't work well.

For instance, let's take apart the official FAQ page.

  • Why are ungraded assessments not shown in the To Do List?
    • The product team has been allowed to blame this on the LTI framework. If you see and identify something that is broken, fix it. This means the quiz team needs to work with the team that owns the To Do List and figure something out. This should also give Instructure pause about just how limited external tools that use the LTI framework are. If you want an immersive experience, provide a way to do that.
  • Why does Gradebook show a score even though the assessment is not fully graded?
    • The product team has been allowed to blame this on the LTI framework. I completely disagree with the response in the FAQ. The team can scan a quiz, and if it sees quiz questions that are open-ended, it can wait for a grade to be sent back. Agile product development should not keep you from creating a 'sync grades when ready' tool. The argument then becomes, well, we already have that framework in the gradebook. That's great. Piggyback off that functionality and allow faculty to toggle the 'post grades' feature from the quiz engine.
  • Is the rich content editor missing features?
    • We need one unified RCE experience in Canvas. The fact that this tool was built as an LTI is no excuse. Find a way.
  • Are there partial points grading for items?
    • Every question type with multiple items analyzed in a single question needs a partial credit option. 
  • Can I migrate Question Banks from Classic Quizzes?
    • This one really makes my blood boil. I completely understand that all development work has stopped on classic quizzes. But just because classic quizzes did not have a fully developed API for Question Banks does not mean that something shouldn't be added to solve this. The response to this question shows how little Instructure values my time. Why in the world is the promoted workaround to add all the questions to another quiz? I'm sorry, that's just not going to cut it. If a quiz uses a Question Bank, the team will need to migrate that over successfully. I will not be telling faculty that they have to reinvent the wheel to solve a problem that the product development team won't solve because it falls outside the definition of Minimum Viable Product.

 

Those are the big ones right now. The main thing I've noticed as a failure is the ability to point blame at the implementation. It's been used as an excuse since general release. My hope is that you will not allow it to continue to be a barrier and find ways to make the experience better. 

@SuSorensen 
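The 'sync grades when ready' behavior suggested above can be sketched in a few lines: withhold a submission's score from the gradebook while any manually graded item still lacks a grade. This is a hypothetical illustration only; the data model and type names below are invented and are not the Canvas or New Quizzes API.

```python
# Hypothetical sketch of a "sync grades when ready" rule: a quiz score is
# only posted once every manually graded item (e.g. essays) has a grade.
# ItemResult and MANUALLY_GRADED_TYPES are invented for illustration.
from dataclasses import dataclass
from typing import List, Optional

MANUALLY_GRADED_TYPES = {"essay", "file_upload"}

@dataclass
class ItemResult:
    question_type: str
    score: Optional[float]  # None until a grader assigns a score

def ready_to_post(items: List[ItemResult]) -> bool:
    """Ready once no manually graded item is missing a score."""
    return all(
        item.score is not None
        for item in items
        if item.question_type in MANUALLY_GRADED_TYPES
    )

def score_to_sync(items: List[ItemResult]) -> Optional[float]:
    """Return the total score if ready to post, else None (withhold)."""
    if not ready_to_post(items):
        return None
    return sum(item.score or 0.0 for item in items)
```

The point of the sketch is that the check itself is cheap; the commenter's argument is that the hard part, deferred posting, already exists in the gradebook's manual posting policy and could be reused.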

Adventurer III

@SuSorensen,

I have to agree with @jwadec that I never understood the reasoning behind making New Quizzes an LTI. It just seems to have added many unnecessary obstacles towards parity with classic quizzes.

  • In classic quizzes, we had immediate access to any imported outcomes. In New Quizzes we do not; we have to wait. So if I'm teaching someone how to use New Quizzes, I have to tell them to import their outcomes well before the training so they'll be available to work with.

  • Outcomes results are not relayed to the Learning Mastery Gradebook. 

  • In classic quizzes, we can see item analysis data almost immediately. Now teachers have to wait hours to get their item analysis report. This data needs to be immediately accessible. We use assessment data to inform our instruction. If this is a quick formative assessment, I have to use a different platform that can provide better immediate reporting.

  • There is no API and New Quizzes is not open source. I could script out the creation of quizzes using the old quizzes API very easily. 

  • The new quiz copy feature (and sending to others) is super broken. It gets stuck so often, or just fails and requires a retry, or it copies a completely blank quiz. Our support requests just don't seem to help much; we just end up trying it over and over until it works.

  • Module workflow is broken when a student opens a New Quiz.

  • In agreement with what has already been pointed out, the inability to move question banks from classic to new question banks is a HUGE obstacle to getting a veteran quizzes user to move to the new platform.

  • Also in agreement, the RCE needs to be the same in both platforms, especially the math equation editor.

  • Instructure's acquiring of MasteryConnect and Certica Solutions makes the assessment landscape of Canvas extremely uncertain. It makes me feel like Instructure is abandoning New Quizzes for these platforms, but still stringing us along. I don't want to put in a ton of work the way we did with Gauge only to have to relearn another platform again.

I remember taking a survey to rank the issues with New Quizzes. I think a new survey would be welcome to prioritize our issues. Maybe I'm missing it somewhere, but I really wish there was a lot more transparency about what is actively being worked on. A priority list. 
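The API point above deserves emphasis: classic quizzes expose a documented REST endpoint (POST /api/v1/courses/:course_id/quizzes) that made scripted quiz creation easy, and New Quizzes offers no equivalent. A minimal sketch of what that scripting looks like is below; the domain and token are placeholders, and the parameter set is a small subset of what the Quizzes API accepts.

```python
# Sketch of scripting classic-quiz creation against the Canvas REST
# endpoint POST /api/v1/courses/:course_id/quizzes. The base URL and
# token are placeholders; consult your institution's Canvas instance.
import urllib.parse
import urllib.request
from typing import Optional

def build_quiz_params(title: str, quiz_type: str = "assignment",
                      time_limit: Optional[int] = None) -> dict:
    """Form parameters in the quiz[...] shape the Quizzes API expects."""
    params = {"quiz[title]": title, "quiz[quiz_type]": quiz_type}
    if time_limit is not None:
        params["quiz[time_limit]"] = str(time_limit)
    return params

def create_quiz(base_url: str, token: str, course_id: int, **kwargs):
    """POST the quiz to Canvas; returns the HTTP response object."""
    data = urllib.parse.urlencode(build_quiz_params(**kwargs)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/v1/courses/{course_id}/quizzes",
        data=data,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```

A few lines like this can stamp out dozens of quizzes from a spreadsheet; that is the workflow institutions lose when the new engine ships without an API.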

Adventurer II

I wrote a long post about this back in the old Community, which has not migrated. Its original title was 17 features that still need to be added to New Quizzes (or similar). When I last updated it, the list was down to 13, so an improvement.

If I was only allowed 2 then it would be these - 

My Top 2

1. RCE for all quiz types. Would allow use of tables and multimedia across all question types, e.g. Matching and Dropdown.

2. Partial credit for all quizzes. Not a new feature, as it was in place in Classic Quizzes. Why was it not brought back? Its absence removes the self-marking element of the quiz and hugely increases teachers' marking load and student frustration.

A brief trawl through submitted Ideas will show votes in the 1000s for these!

 

Adventurer

I don't know if this is new, but in testing NQ today I see that you can choose to give - or not - partial credit for the Multiple Answer question type.  Multiple Fill-in-the-Blank appears to always give partial credit, whereas Matching and Categories types are all-or-nothing.  This is an important option for us too.

 


Another (seemingly) missing feature that our faculty members use and I'm sure will expect to find, is the ability to grant more time to quiz takers while the quiz is in progress.  I anticipate some weeping and gnashing of teeth if that hasn't been fixed before NQ is introduced to a wide audience.

Navigator II

The biggest barriers in my testing of new quizzes include:

  1. Accessing assignment settings:  Due dates and total points assigned can ONLY be set from the three dot menu on the assignment tab.  Why can't we access these options from every three dot menu, or why is that assignment page not the default landing page for the quiz with another link set to access the current landing page?
  2. Manually graded questions:  There is NO indication that a new quiz needs grading.  IF the grade posting policy is set to manually post grades then there is at least that, but students still call panicked about the score new quizzes automatically directs them to.  The manual posting policy only helps initially, once zeros have been posted for missing submissions, any make-ups disappear into the gradebook completely.  There needs to be a needs grading indicator regardless of the grade posting policy, and when grades are hidden in the gradebook, new quiz scores should not be accessible elsewhere.
  3. Practice quizzes:  There is no true practice quiz in New Quizzes.  The work-around (setting the quiz to "do not count towards grades") still puts the quiz as clutter in the gradebook, where it is inappropriately subject to the late policy.  That and the other work-around of setting it to be worth zero points prevent it from working correctly with module rules.  Module rules such as score at least and complete in order make practice quizzes requirements.  They need to be displayed to students as required, and late penalties should not interfere with meeting those module requirements.  Late penalties on practice quizzes send the wrong message anyway.
    Ideally, a practice quiz would show up in all student to-do and calendar lists, and have a score that works with module objectives (i.e. exempt from the late penalty for "score at least"). 
    If present in the gradebook, the instructor view needs a filter to remove all practice quizzes from cluttering the display. 
  4. Importing legacy quizzes with question groups.  The question groups (not banks but groups specific to the quiz) need to become new quiz banks with the imported new quiz set to pull the same number of questions from the new quiz bank as were previously pulled from the legacy quiz question group.
  5. Moderate functions failing.  I tried on several occasions to reopen a quiz for students who had a legitimate technical difficulty.  The moderate page and quiz log do not display the same times.  A quiz might display that it took the student the full time on the moderate page, but the quiz log shows they stopped interacting 10 minutes early, at the time they reported the technical difficulty.  Will a reopened quiz allow the remaining 10 minutes, or does the instructor need to add that time?  (There should be an option to restart the clock at the student's next access in a given window.  Coordinating the restart was a nightmare.)  If the first coordination fails, does the added time need to be repeated, or does it need to be doubled now that it has been spent once?
    Another instructor commented that they discovered they cannot allow an extra attempt to a single student unless the quiz was set to allow multiple attempts... Even though that meant the setting was then set to attempts allowed 1, since the rest of the class was not allowed multiple attempts. 
  6. Changing question types after you start writing the prompt should be a few simple clicks.  Currently I cannot find any way to change question types at all.  Each question must be restarted from scratch.
  7. There should be a clear indication of the partial credit options available for each question type.  For example, fill-in-the-blank can almost create a partial credit matching word bank - except that the blanks also have drop down lists with only the choices for that specific blank.  Why can't the matching question type use the same method for partial credit?  and/or Why don't the drop down menus for word-bank style fill-in-the-blank questions list the entire word bank?

 

Explorer

In an effort to reduce clutter, I tried not to repeat issues brought up by others. I would like to echo those concerns though, as each is necessary before the New Quizzes tool is functional. To add to what others have already said here:

The initial setup area is difficult to return to if you need to update anything. It is only accessible through Quizzes>3 dots. Why not through Assignments, and why is there no button inside of the quiz editing area to take you there?

The “Submission Attempts” setting in the initial setup area does not appear to do anything, but it can be manipulated as if it would have an impact.

Changes made when editing a New Quiz do not push to the initial setup area. Two notable examples: 1) Submission Attempts can be set in two areas with different values. 2) The name of the quiz can be changed while adding quiz questions, but that change will not push to the initial setup area, modules, quizzes, or assignments.

New Quizzes cannot be copied between courses across subaccounts, forcing users to recreate content if they work across multiple subaccounts. I am not sure if this could be mitigated by fixing the broken Question Banks functionality mentioned by others.

Why no HTML editor?

If you take a quiz using the Test Student, the quiz always thinks that a student has already taken the assessment, even if the Test Student data is reset. This results in multiple copies of quizzes if you are tweaking your own assessments in advance.

Categorization, Fill in the Blank, and Matching questions do not have access to the Rich Content Editor for their answers.

No warning is given if the point total for the assignment does not match the total number of points of the questions. I like the flexibility to have a 10 point quiz consisting of a different number of questions, but there should be communication to the instructor so they can verify against mistakes.
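The last point above is essentially a one-line validation that the quiz editor could run on save. A hypothetical sketch of that check (the function name and message are invented for illustration):

```python
# Hypothetical check for the mismatch described above: warn the instructor
# when the assignment's total points differ from the sum of question points.
from typing import List, Optional

def points_mismatch_warning(assignment_points: float,
                            question_points: List[float]) -> Optional[str]:
    """Return a warning string if the totals disagree, else None."""
    total = sum(question_points)
    if abs(total - assignment_points) > 1e-9:
        return (f"Questions total {total:g} points, but the assignment is "
                f"worth {assignment_points:g}. Please verify this is intended.")
    return None
```

Surfacing that message in the editor would preserve the flexibility the commenter likes while catching honest mistakes.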

Explorer II

Welcome @SuSorensen. I'm just starting to get a handle on New Quizzes and how my program might use it, and my big concern right now is how student feedback is handled for the multiple answer question type. In classic quizzes, you could provide feedback for each answer option, just like with multiple choice questions. In New Quizzes, you cannot. Feedback per question choice is an essential feature for us, as students need to actually understand what they got wrong. That means there are still a lot of scenarios where using New Quizzes is out. Here's a link to my Idea Conversation about this issue.

In addition, a couple of other issues that have come up so far:

  • In classic quizzes, we use graded surveys to gauge student participation in some activities. There seems to be no option to do this in New Quizzes.
  • There seems to be no easy way to import question banks from classic quizzes.
  • The New Quizzes RCE is *much* less featureful than either the old or new Canvas RCE.
  • Cutting and pasting from a Word document, the New Quizzes interface for creating questions seems to want to add carriage returns before -- and sometimes after -- whatever you've pasted in. But even worse, it doesn't add those carriage returns until after you've clicked away from the input box, so you have to click back in to remove them.

I know, I know -- litany of complaints. But it really is good to meet you and I wish you well!

Learner II

Here are the top requests for New Quizzes improvements from a faculty member here at Georgetown. This prof has been the top advocate FOR New Quizzes, and here's his take on what needs to be changed before NQ are ready for prime time:

  1. There's no way to change an answer to a problem after the quiz has been completed by students without regrading all the problems. One by one.  Manually.

  2. If I make a change to a quiz after students have started it, the students will always see only the original, incorrect version of the quiz if they so much as looked at it before the change. That seems bizarre to me, and it does not seem fixable via Canvas automatic regrading.

  3.  If I add a question, I must go back and change the total quiz points.

  4. Printing quizzes to PDF doesn’t work.  Going to preview mode and hitting print only prints one page.  Seth C J at support says that’s being looked into by engineering.

  5.  Copying problems from one quiz to another is an involved task. Edit each problem. Save to an "item bank". Then import them one by one into the new quiz.  Yes, one-by-one.

Adventurer

Hi @SuSorensen ,

Like many others who have commented here before me, I think many of the core issues with New Quizzes stem from it being developed as an LTI.  I remember back in 2014/2015, there were various people from Instructure touting that new quizzing would be done as an LTI to show off the power and flexibility of the LTI standard.  It seems to me what's actually happened is that new quizzes is showing all of the shortcomings of LTI, and there are no clear workarounds.  There will probably never be a clear answer, but I've always wondered if New Quizzes was developed as an LTI so it wouldn't have to be open sourced with the rest of Canvas and could be potentially sold as an add-on product for schools using an LMS other than Canvas.  Other than those possibilities, I don't really understand why stripping quizzes out of the core product was viewed as a good idea.  Mini-rant over...

As for our specific needs before we'd promote New Quizzes to faculty:

  1. New Quizzes *must* support LTI tools for the RCE.  We use Kaltura for video management, and need to be able to embed videos from Kaltura into new quiz questions (and maybe even answers) just like we can for assignments, pages, discussions, etc.
  2. The RCE must be consistent with core product.  New quizzes feels very disjointed from Canvas, and that is not a good thing overall.  It brings back memories of Sakai, where there were 2 or 3 different RCEs depending which tool you were in, and that was a complete mess for everyone involved.
  3. All content from classic quizzes needs to be able to easily migrate to new quizzes, including question banks.  I don't know all of the tech details, but this is a non-negotiable item.  Asking faculty or admins to perform a bunch of extra steps or telling us certain things won't migrate is not acceptable.
  4. There have been a lot of small nagging errors with new quizzes, and they are just unacceptable.  Some examples include: "?"s being shown instead of student names in the Moderate area, some students' quiz attempts not starting correctly with no effective way to resolve that, etc.

I'm sure there will be more along the road, and I see some of the posts from others here are mentioning some critical issues too (like students being able to know grading is not yet completed).  I'll echo some others in saying we need to see development activity on these items.  There has been a lot of conversation in the past few years, but very little in terms of real changes so far.  The sooner we see things being developed, the more confident I think everyone will become that other issues will get resolved soon.  I think almost everyone is concerned that New Quizzes is going to be forced upon us before all of the issues are resolved, which would be a huge setback for Canvas as a whole.

I hope your new leadership will help push things in a positive direction and put some of our fears to rest!

-Chris

Adventurer

I second many of the concerns above. Additionally, I don't think these dealbreakers below have been mentioned yet. Some of them are serious enough that if NQ were enforced with these issues, we would have to seek out an external quizzing tool. In other words, they make NQ unusable for us in many courses. To be fair, some of these issues may already have been addressed and I missed those updates; we are only using NQ in a couple courses because of its limitations.

  1. Most seriously, question banks belong to a person and not to the course. We are a team of IDs working with a rotating and constantly changing team of instructors; having question banks belong to a person would make it impossible for us to keep courses updated. 
  2. Disconnect between the information (mostly point value) that you see in the list of assignments vs. what you see in the Quizzes tool. In other words, the numbers don't always match, especially if you have to update a question after the quiz is created.
  3. Programming feedback is confusing, especially with multiple attempts. It doesn't seem to work the way it is supposed to. 
  4. Errors when including NQ in course copies--sometimes they copy over as empty quizzes, sometimes they copy over unpublished, sometimes the adjust dates feature with a course copy doesn't adjust those dates. 
  5. Not every question type is accessible 
  6. The last time I checked, it was not possible to use Respondus to import questions. 

I would echo the thoughts above that many of these issues seem related to NQ being an external LTI.

 

 

Adventurer III

@SuSorensen , here is another issue that came in today. Inactive students show up on the moderate tab which can really clutter up a teacher's moderate screen. I see there's an idea up for voting about this one as well:
https://community.canvaslms.com/t5/Idea-Conversations/New-Quizzes-Remove-Inactive-students-from-Mode... 

Explorer III

Thanks for the update @SuSorensen. Can you please remind us where the best place is to keep track of enforcement dates for all features? This delay was a very pleasant surprise, by the way. We have many concerns after we took NQ on another testing run recently. Although we have seen some improvements since our last testing run, they were not enough for us to comfortably recommend it to our academics. Still too risky in our opinion. Happy to share our results if that helps. Perhaps we should share with our CSM, though. Let us know.

Instructure

Thank you all so much for your feedback and support.

 

I can't say much about the reasons behind decisions made in the past, but I can convey that Instructure is serious about making New Quizzes a success, and it will be my sole focus.

Above, some of the comments describe what looks like buggy behavior, which we will absolutely address; some of the team's capacity is specifically reserved for bug fixes. Quiz copy has come to my attention several times recently and is something the team is actively working to get to the bottom of.

I've also added a short novel here so that you and the rest of the user group have some insight into what I've been doing, the themes I'm hearing, and my current assessment of priorities based on your feedback.

 

Community Team

@kirsten_ryall we post enforcement dates in Upcoming Canvas Changes. If you're subscribed to that page, you'll be able to get all the updates as that page is modified. 

Thanks,

Erin

Community Team

[UPDATE 18-DEC: Followup post added to New Quizzes User Group; please see Priority Gathering for the New Quizzes Roadmap]