Quizzes Planned Priorities and Roadmap 2021

SuSorensen
Instructure

This year, Quizzes development will take a two-pronged focus. One prong is content migration; the second is continued enhancement of the New Quizzes experience. The two will proceed concurrently, so that current New Quizzes users see continual improvement. We just finished an improvement to the workflow for adding New Quizzes from Modules; next we'll improve the Item Bank experience.

 

Content Migration

We will be prioritizing the Instructor experience in the data migration. Two important aspects to note here: 1. training and change management, and 2. all Classic Quiz content (question banks and quizzes) will be moved to New Quizzes without requiring Instructor intervention.

 

It is worth noting that while data migration doesn't make anyone's heart race faster or eyes twinkle, we recognize that it is critical. It will also take time. We know that an incomplete or buggy migration will cause instructors to lose confidence, so we are taking our time to get the experience just right for Instructors.

 

New Quizzes Enhancements

Thank you all for your comments when I asked for your individual priorities a couple of weeks back. The top five priorities for customers were:

  1. Item Banks: expanded permission types and sharing at the course/account level
  2. Integration with the New RCE
  3. Public APIs (for reporting and third-party tools)
  4. Printing quizzes
  5. More support for partial credit


The Quizzes team will address the Item Banks improvements next, which will make it easier to share item banks. The roadmap below communicates the feature stops we plan to make as we evolve New Quizzes. These are meant to illustrate the high-level plans; the plans are not locked, and we will continue to address high-priority needs first, whatever those needs may be. If room in the plan emerges, your earlier feedback will allow us to adjust and continue to provide you with valuable improvements. The short-term commitments can be found here: https://community.canvaslms.com/t5/Roadmap/ct-p/roadmap

69 Comments
urbansk6
Community Participant

Thank you so much @SuSorensen ! It's really great to know that our voices are being heard and that New Quizzes are continually improving.

ken_i_mayer
Community Participant

Data migration DOES make my heart race and eyes twinkle. Thank you for committing to it.

I also request that the other side of data migration--exporting and archiving New Quizzes be a priority. 

Administrators are required to store quizzes, student responses, and grades, and that obligation does not end even after the LMS is discontinued. Instructors also want to use the same material in two different Canvas instances. For example, Georgetown professors teach classes to high school students, students abroad, and incarcerated people who cannot access our main Canvas instance. We send them to Canvas Free for Teachers, so they need to export their courses from our instance and reimport into another.
We need a way to export New Quizzes and associated question banks within a course container. 
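As a stopgap for the classic side of this, the public Canvas Content Exports API can already produce a Common Cartridge of a course programmatically. Below is a minimal sketch with a placeholder host and course ID; the helper function name is mine, but the endpoint and `export_type` values follow the documented API. (New Quizzes content is exactly the part this does not yet cover.)

```python
# Sketch only: build the request for the documented Canvas endpoint
#   POST /api/v1/courses/:course_id/content_exports
# Host, course ID, and the helper name are placeholders/assumptions.

def build_export_request(base_url, course_id, export_type="common_cartridge"):
    """Return the URL and form parameters for a course content export."""
    url = f"{base_url}/api/v1/courses/{course_id}/content_exports"
    return url, {"export_type": export_type}  # "qti" and "zip" also accepted

url, params = build_export_request("https://example.instructure.com", 1234)
assert url == "https://example.instructure.com/api/v1/courses/1234/content_exports"
# To run it for real: POST with an "Authorization: Bearer <token>" header,
# then poll the returned export's progress_url until the download is ready.
```

The `qti` export type is the one that targets classic quiz content specifically; either way, this only reaches Classic Quizzes today, which is why a New Quizzes export path still matters.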

SuSorensen
Instructure

Thanks, Ken. Glad I can make someone's heart race with our choices 🙂 

The data exporting context is really great to know. Thank you for taking the time to tell the story of why it is important to your institution. Of course, I can't commit to it at this time. However, knowing this now, I'll be able to talk to engineers about if there is any functionality in our Classic to New Quiz migration to leverage this type of export sooner. When a post for our Migration plans is ready, I'll be sure to update with what I know for exporting at that time. 

jwadec
Community Contributor

Thank you for the update @SuSorensen. This is all great news!

ken_i_mayer
Community Participant

Another context for data exporting is that instructors spend years building and perfecting their courses in Canvas. Really it's only in the 3rd year of teaching that the prep work starts paying off. And then they get their PhD and go to a new institution, or accept a job somewhere else, or their university IT people transition to another LMS. If there is no way for them to export, store, and preserve their quizzes and question banks, no sane faculty member should devote hours upon hours developing them and then be forced to start from scratch. What will happen is few people will use the Canvas quiz engine, and will instead rely on quality third-party products.

l_lucas
Community Participant

What a shame that fixing the navigation is still not on the roadmap. I have been banging on about it to Instructure for about two years now, it seems.

Until we can put New Quizzes into the navigation for a course (with back and next buttons), it's effectively unusable for self-directed learners, and dramatically inferior to other software that LTIs into Canvas.

😞

RobDitto
Community Champion

Thanks so much, @SuSorensen, for this proactive guidance. We will make use of every single one of these improvements at our institution, once they're released. They seem like the best possible choices in light of Community feedback.

I'm hopeful that #3, in particular, will lead to related improvements this year, such as:

jsowalsk
Community Advocate

I second @RobDitto especially about Honorlock, Design Tools, and Panopto.

Steven_S
Community Champion

These are all good priorities. Still serving as roadblocks for New Quizzes with essay questions, however, are: New-Quizzes-Grading-Indication, New-Quizzes-respect-hide-grades, and New-Quizzes-should-not-automatically-grade-written-responses.

A likely simple solution would be for New Quizzes to transmit a "needs grading" symbol (whatever symbol Canvas already uses) in place of the auto-graded zero for all essay questions, and for any true fill-in-the-blank (not word bank or drop-down list) that is not automatically matched to the provided answer. The Grades tab, SpeedGrader, and probably even the to-do list should all restore the expected grading indications with that simple change.

Maybe "needs grading" can be included with the partial credit updates?

Steven_S
Community Champion

@RobDitto "Quizzes.Next should work with LockDown Browser or Proctorio" says it is in beta testing, but it is functional with live students using LockDown Browser. Updates even now allow the use of Chromebooks with LockDown Browser. (I have not tested Proctorio or Honorlock in New Quizzes; maybe those are the ones in beta testing?)

There are improvements needed for the LockDown Browser integration, such as remembering the LockDown settings for copied quizzes the way it does for Classic Quizzes. Eventually, I would love to see default advanced settings at the top of the LockDown Browser tab, and then a simple checklist to decide which quizzes to apply those settings to.

For now, though, LockDown Browser is functional with New Quizzes, so you won't need API adjustments. With New Quizzes, it also does a neat trick where it auto-loads LockDown Browser when students get to a quiz. That simplifies the instructions we need to give students.

rhenri24
Community Member

Thank you @SuSorensen! Keep the improvements and the updates coming!

cvalle
Community Member

Thank you for this update! I am very excited about the ability to print quizzes. 

stephanie_schot
Community Member

Thanks so much, this gives a lot of teachers at my school hope - especially with content migration of formatted/multi-line fill in the blank and fill in the table style questions needing RCE support. Thanks for all you do!

 

mcalhounbsd
Community Member

So glad to see the expansions happening in quizzes. Thank you!

danaleeling
Community Participant

I am surprised that reporting of outcomes to the learning mastery section of the gradebook is not a priority. Assessing learning against outcomes using quizzes and tests is critical to meeting both college and accrediting commission assessment-of-learning requirements. At present, outcomes can be attached to New Quizzes, both to the whole quiz and to individual questions, which is fantastic. But that data is not getting reported to the learning mastery view in the gradebook.

I took a look at dropping back to Classic Quizzes (we are an institution new to Canvas, so using a soon-to-be-deprecated feature seems counterproductive: why train faculty who are new to Canvas to use a tool that is going away in a year?) and found the Classic Quizzes approach to be clunky at best. Quiz questions apparently have to be logged to question banks, which are then attached to outcomes, rather than attaching outcomes directly to a quiz or to individual quiz questions.

Please make outcomes reporting to learning mastery a priority...

nicole_fleetwoo
Community Participant

@Steven_S it's supposed to work with Respondus, but it does not. There were problems with quizzes containing over 50 questions: some students received all questions, others just 50, and others a random number. That issue has supposedly been fixed, but we have opted not to try it, for obvious reasons. Our Nursing faculty have all moved back to Classic Quizzes due to the number of problems they've experienced with Respondus and New Quizzes; those issues were not limited to students not receiving all questions. We have recommended that if you want to use Respondus in any fashion, you use Classic Quizzes.

lenzeb
Community Participant

Thanks for the update. Can we please have an estimated timeline? The phrase "we are taking our time" paired with New Quizzes doesn't inspire much confidence. Will this migration be ready for Fall 2021?

Steven_S
Community Champion

@nicole_fleetwoo That's a good point. Most of my quizzes are 8-10 questions long, and set to ask one question at a time. Those New Quizzes have gone well in LockDown Browser, and have been easier for students to get started with than Classic Quizzes with LockDown Browser. I hope someone has continued testing the long quizzes you struggled with to confirm that the fix Canvas released was successful.

SuSorensen
Instructure
Instructure

@lenzeb this is a list of significant items we are focused on delivering in 2021 to prepare for the July 2022 retirement of Classic Quizzes. Migration will be completed in time to give educators 6 months to review migrated content, as well as to do any change management or training schools would like in advance of the transition. "Taking our time" was my way of indicating this is a high priority; we're not going to rush it just to say it is done. It will continue with high-priority emphasis through testing, customer feedback, and validation prior to full release. We know that first impressions of the migration will be lasting.

lenzeb
Community Participant

@SuSorensen Thanks, I appreciate the timelines, and I wish you and your team success in the endeavor. I can only imagine the amount of stress involved in getting it right.

dkrasne
Community Member

Things that weren't available as options in the poll but that are REALLY important:

- Ability to set when students will be able to see the answers, AND to make this group-dependent; if one student has a different due date, I shouldn't have to delay all other students' ability to see the answers. This was possible in the old quizzes.

- Ability to manually grade/adjust a few questions at a time without students repeatedly being told that their quiz has been graded, before it's actually done. (Also, fixing the bug where the gradebook doesn't accurately reflect/report the grade on a quiz.)

- Ability to modify/add an answer before all students are done with the quiz, rather than it only applying to those that have been submitted. This is just common sense.

meichin
Community Participant

Hi! We really appreciate this blog. It is so helpful as we prepare for New Quizzes.

We have done some testing and found a couple of things we wanted to suggest or get more information about.

  1. We are missing hover tooltips on the main build page. This poses a problem for users who are less comfortable with technology (are screen-readers picking up on these icons? If so, can the icons have a hover tooltip that displays the same text label the screen-reader picks up?). We have instructors who will want to know exactly what they are clicking on and may not feel comfortable with the mostly icon-driven interface. There is a tooltip for the “+” button that adds a new question, but it would be really helpful to have hover tooltips for all the icons.
  2. Different autosave behavior: there is no "save" or "cancel" button at the bottom of the page as there is in Classic Quizzes. Instead, when editing a question and then clicking outside the question box, it autosaves. We're worried this could make it really easy for instructors and TAs to accidentally mess up quiz questions without realizing they've made any changes, especially anyone who might use a touchscreen device, like a tablet. Here are the things we have found that may trip some of our folks up:
    • When first creating the Item (question), the user must click Done.
    • When making edits to the question, clicking outside the Item (question) box will auto-save the changes.
    • Users must click Cancel if they want to void their changes FOR THAT QUESTION. No way to cancel all changes made to the quiz in one fell swoop as we have in Classic Quizzes.
  3. Question: Can you see the quiz revisions? That may help if the instructor accidentally makes changes not realizing it autosaves.

Thank you very much!!!

Steven_S
Community Champion

@meichin The save functions in Classic Quizzes caused their own problems. There we have to click save on each individual question to save it. Once a question is saved (by clicking "Update Question") there is no reverting to the original version, but that does not mean the Classic Quiz is fully saved with the new question; there is no way to revert the quiz to its prior questions using the Cancel button. Even if they cancel out of the quiz changes, teachers will still see the newly saved ("updated") questions, but students continue to see the original questions until the teacher saves the changes to the entire quiz. It is far too easy in Classic Quizzes to save ("update") the changed questions but not save the changes to the quiz. The autosave in New Quizzes completely removes that issue. It would be nice to be able to see the quiz or question revisions, though.

hesspe
Community Champion

@SuSorensen Re: "to prepare for the July 2022 retirement of Classic Quizzes. Migration will be completed in time to give educators 6 months to review migrated content as well as any change management or training schools would like to do in advance of the transition. "

This seems to me to be inconsistent with what appears in the Timeline (2022 July/September) <https://community.canvaslms.com/t5/Releases/Upcoming-Canvas-Changes/ta-p/254349>, where it says "New Quizzes will be enforced for all accounts. However, customers can continue to use Classic Quizzes in conjunction with New Quizzes."

I think the discrepancy comes down to what is meant by "continue to use."  If I didn't know any better I would assume that Classic Quizzes would remain fully functional indefinitely, including the ability to create new "Classic" quizzes.

hesspe
Community Champion

Also, @SuSorensen  I'm surprised the issue of surveys has not yet been raised.  We have 9 years of survey development in classic quizzes.  100s if not 1000s of surveys, with some people collecting longitudinal data.  Our CSM says we can switch to a 3rd party tool when New Quizzes is enforced.  Other than that, do you have anything in the works to ease the pain of that transition?

Steven_S
Community Champion

@hesspe Although New Quizzes does not have a specific setting for surveys (or practice quizzes), there is a way to create a New Quiz that functions similarly to a survey. You can make the quiz worth zero points (in the assignment settings, no matter what the points on the actual quiz are), check the box for "do not count towards grade," and set the grade style to complete/incomplete.

Then, in the quiz settings, you can use a single option to restrict the student results view. If you do not check any option below that restriction, students will have no access to their scores, so it will not matter what is marked correct or incorrect from the student perspective.

You can also use a multiple answer question style instead of multiple choice, so that you can mark all answers "correct" and showing results to students won't be an issue. However, you cannot transform an existing multiple choice question into multiple answer (or any other style), so if necessary, make that change in Classic Quizzes before migrating. I hope New Quizzes will eventually allow question types to be switched, but currently you would have to make a new question in the new style and copy and paste everything from one to the other.

What you cannot do in New Quizzes is set up an anonymous survey; the instructor will be able to see how each student responded. It is possible to set SpeedGrader to hide student names, however, and New Quizzes analytics do not show student names with the statistics of their responses. It would be more of an honor system for instructors not to look at who said what, so rather than call it an anonymous survey, you might have instructors state to students that they will turn off names when viewing individual results, or only look at cumulative results and statistics.

New Quizzes set up as described will still show up in the gradebook even though they do not count towards the grade, which is why I suggested setting the grade type to complete/incomplete and zero points. For graded surveys, use the desired score instead of zero and do not check the "do not count towards the grade" box. Complete/incomplete grading will give students all of the assigned points when they submit the quiz.

Any late policy set in the course will be applied to the displayed score, and the displayed score is used for module rules. That means students who submit a graded survey late will get partial credit (as specified by the late policy), but then they might be blocked from proceeding to whatever module step the survey was a prerequisite for, if you require that they "score at least."

The late policy applying to practice quizzes (set up basically the same way, except they are usually given the same point value as the quiz itself) is also a problem for that reason, and also because students see the lowered score from a "late" response, which lowers the perceived retake value of the practice quizzes we intend students to use for studying.

hesspe
Community Champion

@Steven_S I very much appreciate the time you took to answer this question, and you have cited several workarounds that I hadn't thought of. I think what you suggest will be workable in a lot of cases, though I do not look forward to explaining to faculty all of the adjustments they have to make. A whimsical suggestion that occurred to me: strip out the quiz-specific functionality of Classic Quizzes and rename it Surveys.

jeannette_okur
Community Member

Some of us use Practice quizzes in public Canvas courses.  There seems to be no workaround for those if practice quizzes do not exist in New Quizzes.  How are those of us running open course materials for the public supposed to adjust materials?

Steven_S
Community Champion

@jeannette_okur  Practice quizzes basically only need you to check a box stating "do not count towards grades" in the assignment settings.   Practice quizzes will look the same as regular quizzes to students, so you might indicate that the quiz is for practice in the title. 

Practice quizzes will show up in the gradebook.  I hope there will eventually be a simple filter to only show what is counting towards the grade in the gradebook, but for now you can put all practice quizzes into an assignment group without any graded activities, and use the assignment group filter to look at each graded assignment group one at a time.  Migrated quizzes all go into one migrated quizzes assignment group, so sort out anything in that group into the correct assignment groups, then migrate all of your practice quizzes, and then change the group name to practice quizzes.  If you migrate something later, a new migrated quizzes group will be created.

You have the added option of giving students some set credit for attempting practice quizzes, which is not dependent on the accuracy of their responses through the complete/incomplete option, and then not checking the "do not count towards grade" box.  You also might choose to show the grade as points (instead of complete/incomplete) and set the point value to match the points in the quiz.  That will let students track their progress on the practice quizzes in the gradebook, and checking the box to not count this quiz towards grades will still prevent the practice quiz from impacting their score.

In my practice quizzes I have used question groups divided by topic, with random question selection from multiple groups, to increase the retake value of the quizzes.  I have tested the migrate tool, and these migrate by creating a new quiz item bank for each question group, and automatically set the migrated quiz to randomly select the same number of questions that were selected from the corresponding classic quiz question group.  Starting as a practice quiz does not prevent the migration.

The practice quiz I migrated did not check the "do not count towards grades" box automatically, and it should, because no one starts out expecting a practice quiz to count towards grades at all. Right now that means you will need to select Edit from the three-dot menu of that quiz on the Assignments tab, then select "more options" in the popup to find and adjust the setting yourself. There is already a solution in beta testing to bring you directly to the screen with that setting when you open a New Quiz, so it will not be this complicated for very long.

As I noted above, the points students see in the gradebook, if you do show points, will be impacted by the late policy just like everything that does count towards their grade.  That will negatively impact student perception of the practice quiz retake value, and module requirements to "score at least" can become impossible to meet due to the impact of the late penalty. 

There are conversations open about creating a true practice quiz and addressing the problem of the late penalty with modules that require students to "score at least" and to exclude assignments that do not count towards grades from the late penalty:

https://community.canvaslms.com/t5/Idea-Conversations/New-Quizzes-Practice-quizzes/idi-p/398053

https://community.canvaslms.com/t5/Idea-Conversations/new-quizzes-late-penalties-and-module-requirem...

https://community.canvaslms.com/t5/Idea-Conversations/exclude-assignments-that-do-not-count-towards-...

smurphy12
Community Member

This is great, but while you are enhancing the New Quiz experience, please PLEASE PLEASE add an intuitive way to print that works in any browser. Many accrediting bodies need printed copies of both blank quizzes and some number of completed quizzes for their review. Right now there is no easy way to print. I have a workaround, but it is cumbersome and takes multiple steps to get a printout, and it ONLY works in Chrome with a select-all and print-selection. NOT INTUITIVE.

Anonymous
Not applicable

Please let us create "groups" as in Classic Quizzes, where we can choose questions from multiple item banks. And let us choose a specific set of questions from an item bank that could be all or randomly chosen.

Steven_S
Community Champion

@Anonymous   Right now you can add a select number of random questions (or all questions) from one item bank, and then as another "question" add a select number of random questions (or all questions) from another item bank, and so on.  It's not a group randomly selecting the same number of questions from each bank, but you can create the same effect.  Just click the "+" to add a question, and click the pig icon instead of a question type.  Then the list of banks will open on the right side of the screen.  Select the bank you want to use, and the options for all or random will be a button at the top of the list of questions.  (You can also add specific individual questions one at a time if you choose.)

As for choosing "all or random" from a subset of questions from one bank, right now you need to make individual banks for each subset.  You can put the questions in a quiz, duplicate the question, and then add the duplicate to a different bank.  That is slightly better than having to retype the whole question for a new bank, just to divide the bank into subsets.  (It would be nice to be able to change the bank a question is in, or copy a question directly to a new bank as a duplicate that can be changed without impacting the first copy... and/or to have a list of banks that the question should appear in so that updates apply to all incidents of the same question.)

Anonymous
Not applicable

Hi Steven

Thank you for the advice.  I tried what you said.  But that doesn't produce the effects that I'm describing.  For example:

Suppose I have two item banks.

  • Item bank "BANK 1" has 10 questions.
  • Item bank "BANK 2" has 16 questions.

 

Now I go to make a New Quiz called "New Quiz X".  And then I begin adding questions to "New Quiz X" using only these two item banks.

  • I want to make a "Question 1" that will pull 5 specific questions from "BANK 1" and 8 specific questions from "BANK 2".
  • And I want "Question 1" to pull just ONE question at random from either "BANK 1" or "BANK 2".

 

So in this scenario I'm describing, I just want to make ONE question named "Question 1" that will pull at random specific questions from two different item banks.

Steven_S
Community Champion

@Anonymous   For now you will need to add each of the specific 5 bank 1 questions and each of the specific 8 bank 2 questions temporarily to the quiz.  (Instead of the add/random button, use the plus button next to each specific question you want to use.)  Duplicate each question (click the icon in the top right corner of the question that looks like a "+" in a box with another box behind it).  Delete each original (the first of the two identical copies) by clicking the trash can icon in the same corner.  Now you have identical copies of each question that are not tied to a bank. 

Click the edit pencil in the top right corner of each of those, and find the last option below the question, titled "Item Banking."  Use that to add the first question to a new bank, and then repeat with the others to add them to that same bank.  Now your questions "live" independently in two banks.  (I understand item bank searching is coming soon to help with the large number of banks this will create.)

Click the trash can icon on every question.  Then start adding questions from banks again, but this time select your newly created item bank, and use the add/random button to create your random selection.

Anonymous
Not applicable

Steven, can you make a short video demonstrating what you just described that will also produce the effects I mentioned previously?  Because doing what you just described defeats the whole purpose of my original input.  Now imagine doing what you prescribed repeatedly every time this scenario occurs.

Steven_S
Community Champion

@Anonymous I was trying to help you get the results you want now, instead of waiting for a new feature to be added. It's not a fast solution, but it does give you a random selection of questions from the specific subset you desired.

If my description of how to do it is not clear enough, I'm sorry. The screen recorder I usually use in PowerPoint is broken; it keeps shutting down when I click record. But I sense that all you want to do is make the point that your way would involve fewer clicks and less time, if it could be made to exist. As a teacher responsible for building all of my own courses without a course designer, Canvas training, or warnings about new features, I can tell you I'm already aware of that.

As I noted earlier, I agree that adding options "to change the bank a question is in, or copy a question directly to a new bank as a duplicate that can be changed without impacting the first copy... and/or to have a list of banks that the question should appear in so that updates apply to all incidents of the same question" would be helpful. Those options just are not here yet, and I don't work for Canvas, so all I can do is suggest a way to get it done with the tools that are there now.

urbansk6
Community Participant

@Steven_S Thanks so much for listing the workarounds, but for most of us (especially those of us with more content, more collaboration, and bigger class sizes to juggle) the fact that the workarounds to a bunch of these problems are so complicated, unintuitive, hard to find, non-native to Canvas, and time-consuming is part of the problem—it makes the workarounds just as problematic as if they didn't exist at all. In the meantime, I think some of us don't want to risk the issues outlined here being overlooked by Instructure staff just because there's technically a time-consuming workaround. We appreciate you doing what you can with the resources you have, but we're also trying to alert the Canvas team to the issues with the current system, and I think that might be a bigger priority here.

Steven_S
Community Champion

@urbansk6 I understand that it can sometimes be difficult to communicate an issue to Canvas, but if you do not acknowledge the existing workaround in your comment while saying "I cannot make a quiz that..." (is a practice quiz or survey, for example), the Canvas team is likely to point you to pages where the workarounds were designated as the way to accomplish these types of quizzes, rather than add a new solution. Instead of simply rejecting existing workarounds, communicating where those workarounds fail is more effective at making it clear why solutions are needed.

I agree that those options need to be improved, and the fact that users are asking the question without being aware of the workarounds is evidence of your point that the workarounds are not intuitive. That is why I respond to those questions by describing the workaround. The workarounds have a lot of settings to keep track of, so quick-set links to apply "practice quiz defaults" or "survey defaults" would make things more intuitive. Not being intuitive is only one reason for needing updates, however.

For example, surveys and practice quizzes both clutter the gradebook.  Current gradebook filters only let you look at one assignment group or module at a time. If surveys and practice quizzes must fill up the gradebook there should be a filter to only show items that count towards the grade.  If surveys and practice quizzes end up completely excluded from the gradebook they still need to retain a way to show up in student to-do lists (which already causes a problem for students who do not visit the modules tab when classic practice quizzes are mandatory due to module requirements.)

Once in the gradebook, the late penalty applies, which lowers the perceived retake value for students after the due date. In fact, even if the practice quiz is set to count the highest score, the late penalty gets applied to whichever attempt is transmitted from New Quizzes to Grades, even if earlier attempts were on time. Also, New Quizzes does not consider the late penalty when deciding which score is the high score.

Even though a practice quiz does not count toward grades, seeing their score go down concerns students. A 10%-per-day penalty means that students studying for an exam by retaking a practice quiz from 10 days earlier will earn a score of zero, no matter how well they do on the quiz.
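To make that arithmetic concrete, here is a minimal sketch of this kind of linear late-deduction policy. The function name is mine, and I am assuming the deduction is a fixed fraction of the points possible per day, floored at zero; this illustrates the behavior described above, not Canvas's actual implementation:

```python
def late_penalized_score(raw_score, points_possible, days_late, per_day=0.10):
    """Sketch of a linear late policy: each day late removes a fixed
    fraction of the points possible, and the result never drops below zero."""
    deduction = per_day * points_possible * days_late
    return max(raw_score - deduction, 0)

# A perfect 10/10 on a practice quiz retaken 10 days late:
print(late_penalized_score(10, 10, 10))  # 0 -- full credit wiped out
```

At 10 days late with a 10%-per-day deduction, the deduction equals the points possible, so even a perfect score becomes zero.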

Additionally, module rules set to require students to "score at least," which I use with my practice quizzes to require a certain degree of comprehension before continuing, only use the final grade posted in grades, meaning the grade after the late penalty. So the student who retook a practice quiz 10 days late might no longer meet the module requirement that unlocked the exam they are studying for. (For this reason I cannot migrate my practice quizzes yet.)

An option to exclude everything that does not count toward grades from the late penalty would resolve all of the late-penalty issues for practice quizzes (graded quizzes still need a resolution for multiple attempts interacting with the late penalty). However, the new "first attempt" option will run into a similar issue if teachers apply the suggestion from the deploy notes to let later quiz attempts satisfy module rules to "score at least." That would require New Quizzes to interact directly with the module requirements, separately from its interaction with the gradebook. (This would resolve part of the late-penalty issue, but students would still see their score drop in grades after repeating a practice quiz, so exempting practice quizzes from the late penalty is still necessary.)

Also, for graded quizzes there is no way to provide question-specific feedback (as presets, or even after grading) without displaying at least the questions, which would lower exam security. Allowing "show item feedback" to be selectable separately from "show items and questions" would solve that problem.

And of course, needing the same questions randomly selected in different combinations creates far too many banks, since we do not yet have filtering or searching for item banks. It also negates one of the perks of banking questions: updating a question in one place. If we do need to put the same question into multiple banks, there needs to be a way to link those duplicates for updates. And of course @Anonymous has a point that the workaround is time consuming. If the solution is for the identical question to live in multiple banks, then there should be an option in the item bank to "also link question to __ bank." It is important that linking questions to multiple banks be distinct from duplicating questions, which can be used to create a set of very similar but unique questions (and would be "broken" if editing a duplicated question changed the original).

There are many issues that need to be solved, and I see a lot of good progress toward addressing issues in New Quizzes lately. I don't think you need to be concerned that a time-consuming workaround will cause the problem to be overlooked. It might not be as urgent as a problem without a workaround, but New Quizzes has something new happening almost every time I look at it these days. The problems we've reported are being addressed; a workaround does not prevent that.

Anonymous
Not applicable

@meichin I totally agree with the points you highlighted. Autosave on editing is NOT helpful. Give the user the option to cancel changes they've made, and make the user actively click Save to confirm that they actually want those changes. But make this per question only!

In Classic Quizzes, it was super silly that you had to confirm the edit on the question and then confirm the edit again by saving the quiz itself. Just make the user confirm saving edits on the individual question.

rislis
Community Champion

Would it be recommended to export Classic Quizzes as QTI Files and then import them as New Quizzes and Item Banks into 2021-2022 courses? I had to do a little cleaning on a few multiple choice answers, but for the most part, it worked.

hesspe
Community Champion

@SuSorensen 

From the timeline:  July 2022.

  • Classic Quizzes is deprecated
  • New Quizzes is enforced

I can read that Deprecated means: "The software element or feature has been replaced by newer functionality and should be avoided, as deprecation precedes its complete removal on the specified date. Additionally, support for the element or feature is no longer provided." Enforced means "A previously announced preview feature (feature option) or other feature change is required for use by all Canvas accounts."

So I assume that means - please correct me if I'm wrong - that when you go [+ quiz] after December 2022, you will see something like the current "Canvas now has two quiz engines. Please choose which you'd like to use." Care to guess how many instructors at our institution will choose "New Quizzes" under that scenario? I think I can, with remarkable prescience.

Then, July 2022: “Classic Quizzes is removed from the Canvas interface (end of life)”!?!

A mere 5 months later Whoosh!! Classic Quizzes goes away entirely!!! 

This sounds to me like a recipe for disaster, both for me as a support person and admin, for us as an institution committed to Canvas, and for you as Instructure, if I ever heard one.

Any clarification would be appreciated. In the immortal words of Charlie Owens: "Say it ain't so, Joe." But of course it was.

Anonymous
Not applicable

@hesspe @SuSorensen 

Eh... this should be fixed before shutting off Classic Quizzes.

New Quizzes does not import/export correctly. It's broken. That means none of your New Quizzes can be copied anywhere.

 

Here are the steps to confirm this:

(1)  Create a new course, add a New Quiz, and create some questions directly in the New Quiz.

(2) Export this course directly to another new course or do it by creating an export package and then import it to another new course.

(3)  At this point, the New Quiz in that export will copy correctly into the second new course.

(4)  Now, reset both courses you just made using “Reset course content” button.

(5)  Then use the export package to reimport the course containing the New Quiz, and you'll see that the New Quiz is not copied. All settings on that quiz, and the questions inside it, will disappear.

 

Please fix this behavior. It'll cause a lot of problems for a lot of users, as well as for the support team.

And yes, I turned on the New Quizzes option in Settings before importing.

hesspe
Community Champion

This is my suggestion for a phased "end of life" process for Classic quizzes which I think would be easier (still not easy) on faculty who use Canvas quizzes and those of us to support them.

Phase 1) New Quizzes are enabled for all courses and faculty are given the option of choosing New Quizzes or Classic Quizzes.  Those who choose Classic first see a screen where they are shown the End of Life timeline.

Phase 2) New Quizzes is the default for creating quizzes, but on the settings page there's a button that allows an instructor to revert to creating a Classic quiz.  If Classic is selected, the faculty member first sees a screen showing the End of Life timeline and must acknowledge having read it.  When copying a classic quiz, the default is to convert the quiz to a new quiz, but conversion can be cancelled and the Classic quiz copied as is.

Phase 3) Classic quizzes can no longer be created, but they can still be copied (as above) and taken.  

Phase 4) End of Life for Classic quizzes.

These could be accommodated within the 20-month End of Life timeline, or by extending it only a little.

smurphy12
Community Member

Please, please, please make printing quizzes EASY for instructors and admins! No hoops, no API or other app needed; just a print button. The accrediting bodies (even though we are in a digital age) require both blank quizzes and completed quizzes (adding item banks to this would be awesome as well). An option to print the blank quiz with outcome alignments would also be very helpful. Thanks!

SuSorensen
Instructure
Instructure

Thanks for bringing the confusion around New Quizzes enforcement/ Classic Quizzes deprecation to my attention, @hesspe & @Anonymous . I'll post a more detailed explanation of what we're imagining for sunsetting of Classic Quizzes. I appreciate your suggestions about what would be an easier transition for you with the deprecation of Classic Quizzes and will absolutely consider feedback along these lines. 

m_a_fernandezpa
Community Member

Thanks very much @SuSorensen! Really looking forward to seeing improvements in the Export and Partial Credit features in particular! I'm sure teaching staff will find them very useful. 

David_chem
Community Member

@SuSorensen 

I am very glad that partial credit is included in the roadmap. As I read through other posts on partial credit, I noticed the calculation might be an issue. Here is a suggested mathematical function:

Total points = [(total correct selected) - [(total incorrect selected) - (total incorrect missed)]]*(points per correct answer)

where (total incorrect selected) - (total incorrect missed) >= 0

This works as follows: a question has three right selections, and the student selects two correct and two wrong (four selections in total). If each right answer is worth 1 point, the student is penalized for selecting too many (total points = 2 - (2 - 1) = 1). But if they select only one wrong answer (missing two), the constraint clamps the penalty at zero, so there is no penalty (total points = 2 - 0 = 2).

This solves the problem of over-penalizing without allowing a student to select every answer and still receive full points. It also gives flexibility in the value of questions without forcing a fixed penalty or point value per choice.
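As a sketch, the suggested formula could look like this in code. The function name is mine, and I am reading the stated constraint as a clamp that keeps the penalty from going negative:

```python
def partial_credit(correct_selected, incorrect_selected, incorrect_missed,
                   points_per_correct=1):
    """Suggested partial-credit formula: each correct selection earns points;
    wrong selections in excess of the wrong options left unselected subtract
    from the correct count.  The penalty is clamped at zero per the stated
    constraint."""
    penalty = max(incorrect_selected - incorrect_missed, 0)
    return (correct_selected - penalty) * points_per_correct

# Three correct options; student picks 2 right and 2 wrong (1 distractor missed):
print(partial_credit(2, 2, 1))  # 1
# Same question, only 1 wrong selected (2 distractors missed): no penalty.
print(partial_credit(2, 1, 2))  # 2
```

One could also clamp the final total at zero so a student who selects only wrong answers never scores negative; the formula as proposed leaves that choice open.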

Thank you again for your work on this. I look forward to all the improvements.

PhilJ
Community Member

A big YES to being able to print quizzes; in a hybrid teaching and learning environment, this is essential. 

schroederbill
Community Participant

@SuSorensen 

One feature we would like to see brought back from Classic Quizzes is the ability to add time to an in-progress quiz attempt. We often run into issues in our testing center where students encounter some sort of technical problem and lose time on an in-progress exam while it is being fixed. With Classic Quizzes it is easy enough to add back that missing time. Unfortunately, that is not an option with New Quizzes.

hesspe
Community Champion

@schroederbill @SuSorensen When I click the Moderate button for a student with a quiz in progress, I see this:

hesspe_1-1618690286134.png

And this:

hesspe_2-1618690323390.png

But as the student I see this:

hesspe_3-1618690369214.png

(No additional time given)

And the Quiz is autosubmitted at the expiration of the original time limit:

hesspe_4-1618690544231.png

 

To me, "Manage Current Attempt" clearly implies that you will be giving that student more time on the, um, "current attempt," but in my testing it does not. Please let me know what I am missing here.

 

Thank you.