Quizzes Planned Priorities and Roadmap 2021

The content in this blog is over six months old, and the comments are closed. For the most recent product updates and discussions, you're encouraged to explore newer posts from Instructure's Product Managers.

SuSorensen
Instructure Alumni

[As of July 2021, please post questions about New Quizzes in the New Quizzes user forum. For questions about the Classic Quizzes Timeline, please see Classic Quiz Sunset Timeline (Subscribe!)]

New Quizzes User Forum

This year, Quizzes will take a two-pronged focus in our development. One prong is content migration; the second is continued enhancement of the New Quizzes experience. The two will proceed concurrently to ensure that you see continual improvement if you are currently using New Quizzes. We just finished an improvement to the workflow for adding New Quizzes from Modules; next we'll improve the Item Bank experience.

Content Migration

We will be prioritizing the instructor experience in the data migration. Two important aspects to note here: 1. training and change management, and 2. all Classic Quizzes content (question banks and quizzes) will be moved to New Quizzes without requiring instructor intervention.

It is worth noting that while data migration doesn't make anyone's heart race faster or eyes twinkle, we recognize that it is critical. It will also take time. We know that an incomplete or buggy migration will cause instructors to lose confidence, so we are taking our time to get the experience just right for instructors.

New Quizzes Enhancements

Thank you all for your comments when I asked for your individual priorities a couple of weeks back. The top five priorities for customers were:

  1. Item Banks: expanded permission types and sharing at the course/account level
  2. Integration with the New RCE
  3. Public APIs (for reporting and third-party tools)
  4. Printing quizzes
  5. More support for partial credit


The Quizzes team will address the Item Banks improvements next, which will make it easier to share item banks. The roadmap below communicates the feature stops we plan to make as we evolve New Quizzes. These are meant to illustrate the high-level plans. The plans are not locked, and we will continue to address high-priority needs first, whatever those needs may be. If room in the plan emerges, your earlier feedback will allow us to adjust and continue to provide you with valuable improvements. The short-term commitments can be found here: https://community.canvaslms.com/t5/Roadmap/ct-p/roadmap


72 Comments
kerstin
Community Explorer

Please allow multiple-answer quiz questions to have partial credit with NO penalty. We are currently going in manually to correct points for students. I have seen one workaround that I will test; however, it would make us much happier to have it built into the system!

kerstin
Community Explorer

Please update Quizzes to have multiple-answer questions with partial credit and NO penalty. We are currently having to manually grade these questions, and it is very cumbersome. Please advise what timeframe is expected for this upgrade.

Thank you.

RobDitto
Community Champion

@hesspe, thanks for posting those details; they help me understand the current issues better! My sense is that these two current Idea Conversations might be good for everyone interested in the same capability you're seeking to rate or comment on:

arnold_cassie
Community Participant

Can you give me any hope that partial credit and/or printing might be in by the beginning of August? Any hope at all? I REALLY don't want to show my teachers all these awesome new question types and then tell them BUUUUUTTTT it's now extra work because the questions don't do partial credit automatically, without some sort of info on WHEN we might see it.

SuSorensen
Instructure Alumni
Author

Hello, @arnold_cassie -- we'll be working through the list in the above priority order, which was determined through conversations with customers and the prior community post gathering feedback.

Item Bank sharing at the course/account level and view permissions are coming first, with the rest following in order. I would not expect partial credit to be completed by August.

admin_nolan
Community Participant

Great to see the amount of planning going into the New Quizzes changeover!

For us, we would like to have more users start to use New Quizzes as soon as possible.

Two main logistical issues slowing us down:

1.  Ability for proctoring vendors to integrate their products with New Quizzes.
With COVID, proctoring has become even more integral to administering quizzes online, so it's hard to get many of our users to try New Quizzes until proctoring vendors can get their products working with the new quiz engine. We happen to use Honorlock for proctoring.

2.  Classic question banks and associated quizzes migration tool.
Our heaviest users of Classic Quizzes, who will need the most time to make the transition, rely heavily on quizzes built from classic question banks. We need six months or more of lead time to get items moved over and verify that all went well.

It would help to know a timeline for these items. With just a year to go, we want to get people using New Quizzes as soon as possible.

Thanks,
Kevin Nolan
Colorado State U

joe_fahs
Community Participant

Will this be part of Content Migration?

Many publishers (McGraw-Hill, Cengage, and others) provide QTI-formatted test banks that do not meet the New Quizzes import standard (QTI 1.2 or 2.1). The alternative right now is to import each publisher chapter bank into Classic Quizzes, export the quiz bank as a Canvas QTI, and import the standard QTI into a New Quizzes Item Bank.

There are bound to be publisher test banks in QTI 1.1 (or in some cases labeled 2.1 but still unable to import into New Quizzes Item Banks) that currently can only be imported into Classic Quizzes.
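
For institutions sitting on many publisher banks, the Classic Quizzes leg of that round trip can at least be scripted. Below is a minimal sketch, assuming the standard Canvas Content Exports REST API, that kicks off a QTI export of a course's Classic Quizzes content and polls until the package is ready to download; the base URL, token, and course ID are placeholders, error handling is omitted, and the final import into a New Quizzes Item Bank still happens in the UI.

    import time
    import requests

    BASE_URL = "https://<your-canvas-domain>/api/v1"  # placeholder
    TOKEN = "<api-token>"                             # placeholder
    COURSE_ID = 12345                                 # placeholder
    HEADERS = {"Authorization": f"Bearer {TOKEN}"}

    # Kick off a QTI export of the course's Classic Quizzes content.
    export = requests.post(
        f"{BASE_URL}/courses/{COURSE_ID}/content_exports",
        headers=HEADERS,
        data={"export_type": "qti"},
    ).json()

    # Poll until Canvas finishes building the QTI package.
    while export.get("workflow_state") not in ("exported", "failed"):
        time.sleep(10)
        export = requests.get(
            f"{BASE_URL}/courses/{COURSE_ID}/content_exports/{export['id']}",
            headers=HEADERS,
        ).json()

    if export["workflow_state"] == "exported":
        # The attachment URL points at the QTI .zip to download and re-import.
        print("Download QTI package from:", export["attachment"]["url"])
    else:
        print("Export failed; check the course's export history in the UI.")

This only automates the export half of the workaround described above; spot-check the resulting package before importing it anywhere.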

Sylvia_Ami
Community Contributor

Wow @joe_fahs , this is good to know, thanks for the heads up. We haven't made the switch yet because of a few lingering glitches like this. Is there a better solution ahead besides the workaround you described? Is this something that publishers need to change or does Instructure need to do more work on New Quizzes so that this other format can be used? Are they aware of this issue?

joe_fahs
Community Participant

@Sylvia_Ami ,

I have made it known more than once to Instructure, including a 3-21-21 comment on the Priority Gathering page.

Publishers are slow to respond, and I am not confident that they will convert all their old-formatted QTI banks. In fact, some publishers provide only a proprietary test bank format such as TestGen, which requires many steps to convert into New Quizzes.

It would be great if Instructure provided a converter for legacy QTI test banks after Classic Quizzes is removed.

I worry that EDU instructional technologists supporting faculty will be caught between Instructure support (no, we do not support the import of legacy QTIs) and publishers who do not respond to requests to convert these banks into the 1.2 or 2.1 format.

CarlaD07
Community Explorer

Thank you @SuSorensen for the update. It is helpful and encouraging to see the priority list of planned enhancements for New Quizzes. 

We are currently onboarding all 6-12 staff in preparation for a full implementation of Canvas in Fall 2021. In turn, we are only training teachers to use New Quizzes because we don't want to train them on something that will soon be phased out. Our SPED department is concerned about how best to provide text-to-speech accommodations without access to the audio/video recording feature of the Rich Content Editor for each question. I recognize this is a priority list, and not everything is on the roadmap yet. Is there any hope the RCE will be operational at the question level in New Quizzes before next August? Or should we be planning a workaround? Yes, we have other apps and tools that can provide a text-to-speech accommodation, but our learners, especially ELL students, struggle with how unnatural they sound and with their inability to correctly read complete questions and formulas.

We are also standards-based, but we are not using Outcomes and the mastery gradebook because New Quizzes cannot talk to the mastery gradebook. I agree with @danaleeling that outcomes are a vital component of providing accurate feedback, and I would like to see this improved in New Quizzes.

jonespa1
Community Participant

The equation editor in New Quizzes does not have all the features that Classic Quizzes has, and Classic Quizzes is still missing components that I need (for example, writing a piecewise function using correct notation). There are some instances where I have had to create my equation using MathType, save it as an image, and then import that image to get the correct formatting.
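
For reference, the piecewise notation being described is the standard cases form; in LaTeX it would look like the snippet below (shown only to illustrate the target notation, not the syntax of either equation editor):

    f(x) =
    \begin{cases}
      x^2 & \text{if } x \ge 0 \\
      -x  & \text{if } x < 0
    \end{cases}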

Also, I use test banks in almost every quiz or test that I create, so I have multiple test banks that I will need to convert to item banks. I started this process, but it is going to take HOURS and HOURS to finish converting: creating a new Classic Quiz with test bank items, migrating it, and then editing each individual question to move it to a new question bank. It is very time-consuming and tedious!

One question type that is missing in New Quizzes is fill-in-the-blank with multiple blanks that allow numeric responses. I will likely have to rewrite many of my questions because I do not see a way to have a numeric answer with a margin of error for a fill-in-the-blank question.
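
For illustration, the grading behavior being requested amounts to a per-blank tolerance check along these lines (the function and parameter names are hypothetical, not New Quizzes internals):

    def blank_is_correct(response: str, expected: float, margin: float) -> bool:
        """Accept a numeric response that falls within +/- margin of the expected value."""
        try:
            value = float(response)
        except ValueError:
            return False  # non-numeric input never matches
        return abs(value - expected) <= margin

    # Example: expecting 3.14 with a 0.01 margin accepts 3.141 but rejects 3.2.
    print(blank_is_correct("3.141", 3.14, 0.01))  # True
    print(blank_is_correct("3.2", 3.14, 0.01))    # False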

If you are asking teachers to stop using one tool (Classic Quizzes), then please make sure the new tool (New Quizzes) has at least the same functionality, if not more options, before forcing this switch! This is creating many more hours of work!!

Steven_S
Community Champion

I am concerned to see that not only is there zero progress resolving the difficulties between multiple attempts in New Quizzes and the gradebook's late policy, but Canvas has "shelved" the issue rather than even considering an attempt to address it: New-Quizzes-New-Gradebook-Late-Policies-and-Multiple-Quiz

This was recognized as a bug in response to this idea conversation, which was archived when the known issue page was opened: Idea-Conversations/Make-late-policy-affect-only-attempts-submitted-after-the-due

Now that Canvas engineers are not trying to correct the issue, there are two additional idea conversations ongoing with concerns about it: Quizzes-with-multiple-attempts-should-correctly-calculate-scores and late-penalty-behavior

Specifically, there is a communication failure between the scores calculated within New Quizzes and the late policy triggered in the gradebook.

The late policy should only be applied to those attempts that are actually late. Since the grade is sent from New Quizzes but the late policy is applied in the gradebook, New Quizzes needs to know what percentage will be applied as a late penalty to each attempt.

For the highest score: New Quizzes does not need to resend the score if the highest score is not the latest score after the late penalty, and if an earlier score is resent, the gradebook needs a way to recognize the actual submission time attached to that score rather than treating it as a new score with a later time used for the late penalty.

For the average score: this is more complicated, because the late penalty is applied to the score transmitted. If an average score is transmitted as a late score, then the late penalty applies to the entire average rather than only to those attempts that are actually late. Clearly communicating this situation to faculty and students might be the simplest solution, but it's not impossible to solve. I'm sure New Quizzes could be programmed to calculate the average with the appropriate percentage penalty applied to the appropriate attempts, and then calculate what score to transmit so that the late penalty applied by the gradebook would result in the calculated average. The source of such a score might be difficult to communicate clearly, but the New Quizzes results page could probably display the math, and the same could be shown to instructors on the moderate page. Even if the average score is not fully addressed, the highest score calculation should be.
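
To make the arithmetic in that suggestion concrete, here is a minimal sketch of the back-calculation being described. It assumes a flat late deduction and a gradebook that applies that same deduction once to whatever score is transmitted for a late submission; the names are illustrative, not Canvas internals.

    def score_to_transmit(attempts, late_deduction):
        """attempts: list of (raw_score, is_late) pairs; late_deduction: e.g. 0.10 for 10%.

        Returns the score New Quizzes would need to send so that, after the
        gradebook applies its blanket late deduction, the displayed grade equals
        the average with the penalty applied only to the attempts that were late.
        """
        # Average with the penalty applied per attempt, only where it is deserved.
        penalized = [s * (1 - late_deduction) if late else s for s, late in attempts]
        target = sum(penalized) / len(penalized)

        # If the submission will be marked late, the gradebook deducts again,
        # so inflate the transmitted score to compensate; otherwise send as-is.
        marked_late = attempts[-1][1]
        return target / (1 - late_deduction) if marked_late else target

    # Example: attempts of 80 (on time) and 100 (late) with a 10% deduction.
    # Desired average = (80 + 90) / 2 = 85, so transmit 85 / 0.9 ≈ 94.4,
    # which the gradebook reduces back to 85.
    print(score_to_transmit([(80, False), (100, True)], 0.10))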

An alternative solution would be for New Quizzes to have its own late penalty calculation, and for the gradebook to have a way to exempt specific activities from the gradebook's late penalty. (Of course, that would mean New Quizzes would need information about the due date, and the ability to update calculations if due dates change.) Exempting certain activities from the late penalty would be particularly helpful for practice quizzes, for which it would be better to have no late penalty at all.

I know this is not a simple problem, because any solution appears to require effort from both New Quizzes and Canvas, but "shelving" the issue creates a roadblock for schools that use a late penalty because of other assignments not related to New Quizzes.

DeletedUser
Not applicable

Given Canvas's track record, the problem will most likely be ignored. Or sometimes they'll write back to you: the problem is fixed, check it. Then you check it and the problem is still there... -_-

SuSorensen
Instructure Alumni
Author

@Steven_S This was brought to my attention this month, and I am working with the PM who leads the gradebook to figure out how we should resolve it. There are, it seems, a variety of expectations from teachers.

eddie_bonetdiaz
Community Explorer

Does the item relating to Public APIs (for reporting and third-party tools) potentially address the issue with Outcomes assigned to individual questions in New Quizzes not being recognized by the main Canvas system? Or not being able to assign Outcomes to items in Item Banks?

The RCE item is confusing me a little since I haven't seen any negative news about it. Is there an article you could refer me to so I may inform myself about what the integration issues are?

Also, Copy Item Banks should probably be next. Teachers I may share my item banks with may like them but want to modify them to fit their students' levels. I don't see an easy way for them to do this UNLESS there were an export feature. Hopefully you can see why at least one of the two should be bumped up in importance.

Circling back to Outcomes: they seemed to work nicely enough with Classic Quizzes, and although it's not a critical need, it definitely shaves off hours of analysis for courses and schools that are currently implementing Standards-Based Grading assessment strategies.

adeski
Community Member

Please prioritize Export Item Bank functionality in forthcoming development. Assessment items developed for high-stakes assessments are incredibly resource-intensive to write, standard-set, and performance-assess. It is very unnerving not to be able to back up, archive, or benchmark Item Bank contents and changes for review outside the limits of the Canvas platform, particularly where extramural scrutiny or collaboration is required.

nancym
Community Explorer

Thank you for this blog.  Many items have been well addressed by others. 

A major problem that has been mentioned, but not as frequently, is not being able to bulk download submissions to file upload questions.  

In order to grade PDF files submitted as answers, I believe each must be individually downloaded from SpeedGrader.

dschuma
Community Participant

@SuSorensen Will questions, answers, submissions, grades, and analytics be available to users after the bulk migration?

Thanks.

CanvasJackie
Instructure

@SuSorensen Hi Susan, can you please expand on which question types are going to be included, or are being considered, in the partial credit development on the timeline? Thanks so much!

cvalle
Community Participant

I think Multiple Answer questions should just be treated as a series of true/false questions. You either check the box or you don't, and so you either get the credit for that box or you don't, depending on whether the option is correct.

So, for example, let's say a question is worth 4 points and has options A, B, C, and D. Now suppose A and B are the correct options and C and D are incorrect, but the student selects A and C. They would get 2 points for correctly selecting A and correctly not selecting D. They would not earn the points for B and C, since they should have selected B and left C unselected. Thus they would get 2/4 points.
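
A minimal sketch of that scoring rule, treating each option as an independent true/false decision (illustrative only, not how New Quizzes currently grades):

    def score_multiple_answer(options, correct, selected, points):
        """Award an equal share of the points for each option judged correctly."""
        per_option = points / len(options)
        # An option is judged correctly when it is checked exactly when it should be.
        right = sum(1 for opt in options if (opt in selected) == (opt in correct))
        return per_option * right

    # The example above: A and B are correct, the student selects A and C.
    print(score_multiple_answer(["A", "B", "C", "D"], {"A", "B"}, {"A", "C"}, 4))  # 2.0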

Stef_retired
Instructure Alumni

@cvalle  Have you tried using a stimulus with a series of true-false questions attached? Sounds like that's a perfect fit for your use case.

SuSorensen
Instructure Alumni
Author

Over the past several months, I've been chatting with customers and our users about migration, and we think we have a good idea of the least impactful way to move forward. Please take a look: Possibilities for User Experience of Classic Quizzes to New Quizzes Migration