SuSorensen
Instructure

Classic Quiz Sunset 2024 Feedback

This thread has been opened for comments and suggestions about the updated Classic Quizzes Sunset Timeline. 

128 Replies
ClubPilates
Community Member

A crucial setting in Classic Quizzes is missing from New Quizzes and is preventing our conversion: "Only Once After Each Attempt" under the "Let Students See Their Quiz Responses (Incorrect Questions will be marked in Student Feedback)" option.

 

We are not a traditional school and we do not have deadlines/due dates, because we do not have a set grading period like semesters or quarters. Our students enroll in a one-year program and work at their own pace, with a maximum of a year from the date of their enrollment. So we cannot manually change who can see the quiz results, since the timing differs throughout the course, and we cannot create a new quiz for each cohort, because maintaining them all if something breaks (like when we had an image issue), or just assigning them individually for each new enrollment, is not sustainable or realistic.

 

We launched revamped versions of our courses last month, and I thought it was the perfect time to switch over to New Quizzes and was trying them out, but this issue meant I had to keep the old quizzes instead. I cannot see my organization switching to New Quizzes at all if this isn't addressed, and if Classic Quizzes sunset in 2024, we will take our search for a new LMS a whole lot more seriously. I know we're not very big compared to Canvas' other schools, but it seems Canvas is concerned only with traditional school structures as updates are launched.

gail_bliss
Community Participant

I apologize profusely if this question has already been asked and answered.  We just began the integration to the New Quizzes and I have discovered one very important deficiency.  We used the "survey" option to provide questions that accepted any answers chosen.  There does not seem to be such an option in the new quizzes.  We used this for log submissions and in end of course surveys.  Is there a plan for this type of question to be added in the New Quizzes?

rosie_sweet
Community Member

Is there going to be the ability to record a video or audio response straight into new quizzes like they can in classic quizzes? 


@rosie_sweet yes, you can check out the post about the RCE here.


thank you 

themidiman
Community Champion

@rosie_sweet , when Instructure makes good on their promise to add the common Rich Content Editor to the New Quizzes interface, I imagine that the learner's ability to record into an essay style question will be an option. 


thank you 

sarreguin
Community Participant

Teachers on our campus are so thankful that you are waiting to Sunset the Classic Quizzes.  We have quite a few teachers that make use of the Audio and Video features in Classic Quizzes.  They would be sold on New Quizzes if we had this option available.  Will you be adding this feature to the New Quizzes?  It would greatly benefit the students.

tom_gibbons
Community Contributor

New Quizzes content editing doesn't follow the permissions set up in Canvas for user roles. 

I've just noticed this, because I'm at a new institution that exclusively uses managed curriculum.

Our instructors are not permitted to edit quizzes. This works when instructors try to access a NQ from modules--though it's a bit weird, because they get an "access denied" warning. This is because accessing from Modules would drop the instructor into the Assignment Settings page in Canvas-proper. Since their permission set doesn't allow them to edit assignments or quizzes, this is expected. 

When they open a NQ from the Assignments or Quizzes index, Canvas displays a framed-in version of the quiz if a student has taken the quiz; if not, the instructor gets dropped right into the NQ build environment. All of the quiz edit functionality is available for the instructor in either case.

Is there something that I'm missing? Is there a way to configure things so that NQ respects the established user-role permissions in Canvas-proper? 
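In case it helps anyone else auditing this, the role permissions themselves can at least be scripted against the Canvas Roles API (GET /api/v1/accounts/:account_id/roles). A rough sketch; the role labels, sample payload, and token are placeholders, and this only inspects what Canvas-proper reports, not how NQ enforces it:

```python
# Minimal sketch: list which account roles have a given permission enabled,
# using the Canvas Roles API (GET /api/v1/accounts/:account_id/roles).
# Role labels and the sample payload below are illustrative.
import json
import urllib.request

def roles_with_permission(roles, permission):
    """Return labels of roles whose named permission is enabled."""
    return [
        role["label"]
        for role in roles
        if role.get("permissions", {}).get(permission, {}).get("enabled", False)
    ]

def fetch_roles(base_url, account_id, token):
    """Fetch an account's role list (live call; needs a real API token)."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/accounts/{account_id}/roles?per_page=100",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Offline demonstration with a response-shaped sample:
sample = [
    {"label": "TeacherEnrollment",
     "permissions": {"manage_assignments": {"enabled": True}}},
    {"label": "Managed-Curriculum Instructor",
     "permissions": {"manage_assignments": {"enabled": False}}},
]
print(roles_with_permission(sample, "manage_assignments"))  # ['TeacherEnrollment']
```

Running this against the real account would at least confirm whether the instructor role's permission set looks the way you expect before filing the NQ discrepancy with support.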

hesspe
Community Champion

@SuSorensen Among the items listed on the "Transparency into Quizzes Planning" page, in the "Before July 2023" column, are (I think I'm quoting verbatim):

Student Analysis report to support teachers

Analysis downloadable csv for Teacher manipulation

Does this mean that both Student Analysis and Item Analysis reports will be available for Teachers to download in CSV format?  It would be helpful for clarity if you changed the wording to indicate whether both, or if not both which (if either), will be downloadable by 6/30/2023.

Thanks!

 

talway
Community Contributor

One item that I have seen mentioned only once in the previous 8 pages of posts is the ability to limit when students can see their responses after the completion of a quiz. We have some testing that takes place whereby the entire class tests at the same time. With Classic Quizzes, the instructor could limit students to seeing the answers only one time, at the completion of the exam. This eliminated sharing answers with future testers. Currently, without that feature, students who finish the exam early can leave the testing center, make copies of the answers, and share them with those in future classes. With New Quizzes, the best an instructor can do is remember to disable students' ability to see the answers immediately after the last person has completed the test. By that time, the damage has been done. Please consider adding this feature.

[Screenshot attached: talway_0-1648128905984.png]

 

RobDitto
Community Champion

@talway and @isabel_anievas, if you haven't already, please consider giving a positive rating, and/or a comment about your need, in this long-running Idea Conversation:

isabel_anievas
Community Participant

Great point! That is a very useful feature in Classic Quizzes!

themidiman
Community Champion

I wish there were a way to separate out, in this forum, requests for unique NEW quizzing features that are not part of the transition from Classic Quizzes to New Quizzes. Does anyone from Instructure monitor this thread in this way?

For instance, the assumption is that there will be 100% feature parity between Classic Quizzes and New Quizzes when that milestone is reached. In other words, there shouldn't be a need to ask for a Classic Quizzes feature in New Quizzes in this forum. Right?

I worry that community members sometimes ask for brand-new quizzing functionality above and beyond what both Classic and New Quizzes currently do. Those should go in as feature ideas. Right?

pjonas1
Community Participant

I don’t think Instructure looks at this in terms of classic vs. new features. To use a small example, I need the classic feature of students (if allowed by the instructor) being able to print their quizzes. This is a need both for our approach to tutoring in the context of video proctoring and for certain disability accommodations. With only instructors able to print, our current approaches range from worse to impossible, depending on the scenario.

Instructure has “declined” to include this function for feature parity prior to sunset. They would like me to submit it as a new feature request instead. So I think feature parity is more of a selective concept. My impression is that Instructure’s focus is instead on social features to make the platform stickier. So everything is evaluated through a lens of: does this have a community component?

In turn, that makes this thread the only place where we can make our voices heard and everything and the kitchen sink ends up here. 

themidiman
Community Champion

I suppose my point is we don't know how Instructure's dev team might interpret those requests if they aren't commenting regularly on requests that don't necessarily fall into what I understand to be the feature-parity category. NQ has been a feature release for so long that some community members may have been using it instead of, or more than, CQ, and therefore want NQ to have features x, y, and z by the anticipated 2024 release/sunset of CQ, when it doesn't have features a, b, and c that CQ has had since day 1.

I just want to see the transition made smoothly and on time without making it a more complex task for the dev team, so that the institution I support can be prepared for CQ going away altogether.


I understand what you are saying, but I am not sure what is the reason behind this concern. I have not seen faculty coming up with random requests for features we have never seen or used before in Classic Quizzes. Maybe it is happening. But I have not seen it. 

In my experience, I mostly see current faculty asking for features that we currently use and depend on in Classic Quizzes. 

I see language teachers like myself asking for the ability to embed Canvas Studio videos with full functionality, and to record our own audio and video media in the questions, as we currently do in Classic Quizzes (basically, to have access to the same full rich editor we currently have). We are also hoping for a smoother method of migration to New Quizzes that does not force us to re-record all our media (I have tried the available migration tools and my media gets lost in the migration). Re-recording the media for the hundreds of quizzes we have created over the last few years would be a Herculean task for many of us.

I see math and science teachers asking for certain types of questions, symbols, etc. that they currently use in their math/science classes.

I see faculty concerned about the migration of question banks. 

I see faculty of all disciplines asking for other features we currently have in Classic Quizzes, such as the ability to limit when students can see their responses after completion of a quiz or to be able to keep the Survey option, also currently available in Classic Quizzes (and which apparently has already been decided will not be included in New Quizzes). The argument seems to be that surveys are "not quizzes". However, anonymous surveys are a strong recommendation from the OIE  Course Design Rubric and are recommended as best practice for effective contact in my institution and many others.  

I am personally not in a hurry to transition to New Quizzes, to be honest. Classic Quizzes do everything I need to do and I would be perfectly happy to stay with Classic Quizzes indefinitely. I am not excited about the prospect of spending a huge amount of time migrating all my quizzes into a platform that is currently inferior to what I am using (for the purposes I am using it). 

If and when the transition happens, I would like to see 100% parity first, and a reasonably smooth migration strategy that does not require faculty to rebuild our courses from scratch (or re-record all our quiz media). For me, and I think for most faculty members, at least the ones I know and talk to about this issue, a smooth transition that keeps the current functionality we rely upon is much more important than a "timely" transition that may compromise functionality or fail to provide smooth and effective migration tools.

So, I hear what you are saying, but I am not sure this is an issue right now. At least, not in my experience. I do not know about other faculty, but personally, I am not a web designer or an IT person, so I am not going to be requesting random features I have never seen or used before. Maybe you have seen that kind of thing happening and I have missed it.

If we get 100% parity with Classic Quizzes and an easy, smooth method of migration, I think many faculty would be much more on board with New Quizzes than we currently are.

@isabel_anievas 

The faculty I support aren't asking for new features in NQ either. It's this community forum where I see those coming through. Like many of you, I would love to see NQ flourish with the additional time given for the CQ sunset. I have nursing faculty who haven't been using Canvas at all for quizzes, and they want to dive right into NQ purely for the newer question types. They have a lot of question banks in MS Word format that they would rather not have to type by hand, and I'm finding that the QTI that comes out of Respondus 4.0 going directly into NQ is absolute garbage. It was just this past week that I met with one of them to try this using Instructure's own guide, and my client was patient and understood that NQ is currently a half-baked project. The same goes for transitioning a CQ to NQ, albeit with a bit more functionality. Having these dysfunctional features fixed on a reasonable timeline would be great, but I don't consider them new features. It's a feature-parity aspect that has been working in CQ for ages.

I'd love for the developers assigned to the task to focus on feature parity first before responding to requests to add extra features above and beyond the feature parity category.


I agree. We have been using NQ from the start due to the date we implemented Canvas. Having said that, there have been a few times we had to resort to CQ for specific functions which form integral parts of our assessments. Redesigning these, if the functions are not included in NQ, will be a transition and version-control nightmare. It would be great to have clear guidance from the development team on when and if functions will be added.

pjonas1
Community Participant

Speaking of using CQ until NQ are feature complete: many instructors at my institution started on NQ because we adopted Canvas with less than a year left on CQ according to Instructure's initial guidance. Our lives would certainly be easier if we had started on CQ.

Is there a migration tool to take our item banks and migrate them to test banks, as well as migrating our NQs to CQs? I would hate to have to redo everything from scratch, but after years of troubles with NQ, I think it may be time to consider migrating from New Quizzes to Classic Quizzes.

Jeff_F
Community Champion

@pjonas1 - I was just updating our leadership on the status on NQ development and basically said I could fill in the project timeline once we had progress with a few items:

  1. Confirmation the new batch conversion tool functions without flaw
  2. More details on how CQ question groups convert to NQ.  The NQ FAQ does not provide details
  3. A functional RCE
  4. A survey tool or viable workaround

I also made note that the end of life for CQ is June 2024.  So even if possible, if you were to change from NQ to CQ, that would be perhaps a year or slightly more.  Then you would need to revert again shortly after?  May I ask if you have NQ issues other than the RCE?  Are they so substantial to warrant such an approach? 

 

pjonas1
Community Participant


@Jeff_F wrote:

@pjonas1 - I was just updating our leadership on the status on NQ development and basically said I could fill in the project timeline once we had progress with a few items:

  1. Confirmation the new batch conversion tool functions without flaw
  2. More details on how CQ question groups convert to NQ.  The NQ FAQ does not provide details
  3. A functional RCE
  4. A survey tool or viable workaround

I also made note that the end of life for CQ is June 2024.  So even if possible, if you were to change from NQ to CQ, that would be perhaps a year or slightly more.  Then you would need to revert again shortly after?  May I ask if you have NQ issues other than the RCE?  Are they so substantial to warrant such an approach? 


Thanks for engaging. For me, the issue is students not being able to print quizzes. And at least at the moment, Instructure is saying that they are "declining" to migrate that feature from CQ to NQ.

Migrating my content from NQ to CQ would of course be extra work for me. But until CQ is sunset, even if that's less than two years, my students with computer disabilities would get a fairer chance and all students would be able to get the tutoring they are supposed to. Those benefits weigh heavily for me. Plus, the CQ sunset date has been pushed back so many times in the past that I might be able to secure those benefits for my students even longer than is currently announced. Who knows, NQ might even be declared a failed project and we'll stay with CQ.

Reverse migration is definitely on the table for me.

bdye
Community Participant

We continually get asked about our timeline for rollout at the institution.

It is so frustrating, and I feel it makes me look incompetent to constantly change the project timeline for leadership as things get delayed.

As we approach summer, I could be migrating quizzes over, but the lack of quiz bank migration and the inability to work with proctoring services prevent it.

My latest thought was to check back on the progress in summer of '23 before any big rollout takes place.

Just because so much time and resources have gone into a project doesn't mean it's best to keep pouring resources into it.

david_downs
Community Member

Looking at the Transparency into Quizzes Planning roadmap, I see in the Before July 2023 column an item called "gather anonymous student feedback." Can we learn more about what that entails? Is it intended to address the concerns about surveys not being included in New Quizzes? I note that in the Canvas New Quizzes Feature Comparison document there is now a comment re: surveys that says, "Third-party tools can be connected to Canvas if an institution needs true survey capability. Canvas will meet many needs similar to surveys with future tooling." Thanks for any insight you can offer!

lekern
Community Participant

We are struggling to figure out how to advise our instructors on New Quizzes, and this problem seems to only get worse. Today an instructor who is using New Quizzes was told that they would have to manually re-grade 300+ quizzes in an active quiz that had the wrong answer marked as correct. In reviewing the roadmap I don't see an item that indicates that regrading is being addressed. Any help in understanding when / whether this issue is being addressed would be helpful.

themidiman
Community Champion

I would think something like this would be ready as part of the timeline for feature parity with CQ. There's a certain amount of regrading that works in CQ. One could assume that feature parity goals would bring this in sooner than later and hopefully improve upon it. I was just supporting a CQ user who is frustrated that they cannot regrade questions that come from a linked question bank. 

I wish we could see some more feature parity goals as part of the transition timeline. @SuSorensen do you have anything to update or is the vague timeline still all that we have to rely on? @lekern , It seems like development is concentrating on getting API endpoints working first. I've heard many times that Instructure builds the API it expects other users to access first and then they build user experiences and interfaces on top of the API. 


Hello @themidiman - this transparency post provides a little more detail. We had not yet heard that there was a concern for the way New Quizzes regrade and Item Banks function. The team will explore this priority, thanks for making sure it is on our radar. 

themidiman
Community Champion

Thank you @SuSorensen 

Looking back at the transparency post, it seems we all needed a reminder (myself included) that it is required reading before asking questions in this community forum about when NQ will have feature x, y, or z, or why current feature a, b, or c isn't working.

benjamin_rodrig
Community Contributor

Quick Question,

 

I see admins will have an opt-in to Enforce Migration to New Quizzes. What does that mean exactly?

To me it looks like, if someone copies a Classic Quiz to a new course, it will copy over as a New Quiz if this feature is turned on at the account level. Does that sound about right?

PamelaImperato
Community Participant

Can you please advise when (anticipated month/year) the "delayed release" feature will become available in New Quizzes?  For clarification, this would allow a professor to set the time in which the answers to a quiz would be made available to students. In this way, a faculty member could deploy a particular quiz over the span of several hours (or days) and specify a time in which answers would be viewable.

The present "Settings" in New Quizzes do not allow for this feature. They only allow the answers to be (or not to be) viewable from the point the quiz is created.

In prior communications it was indicated that this feature would be available in New Quizzes but it is not on the present timeline of features. Thank you. 

 

khahn1
Community Participant

When Canvas was pitched to our university, they scoffed at the fact that we were still using SoftChalk for assignments, seemingly because it was created in the 90s and the New Quizzes feature would be able to do the same and more. However, SoftChalk can apply partial credit for all question types while New Quizzes can't, making some features of that 90s-era software superior to New Quizzes, in my opinion.

I know Canvas fixed partial credit for matching, but why is fixing partial credit for ordering and categorization questions still not even listed as a future development feature, despite those of us on the ground actually teaching courses repeatedly asking for it for nearly four years? Software developers have told me more than once that this should be an easy enough fix, since the program already marks which options are correct and incorrect in student responses. I think I can speak for multiple people on this: why would you program New Quizzes to mark which portions of a student's answer are correct and incorrect, but not include the option to give students the partial credit they deserve automatically? Who does that help?

It is frustrating and disheartening for students to miss one out of 10 on a categorization question (for example) and receive a score of 0. It is frustrating and exhausting for those of us teaching to have to go back through every attempt for a quiz and adjust for this manually. It is a waste of valuable time that we could be spending improving our courses or with our families. Is there a reason Canvas keeps ignoring the pleas of hundreds of instructors who have been begging for this for years? Please explain this to us, because those who have to waste hours every semester fixing what should be automatic are incredibly frustrated and do not understand why we are continuously being ignored.

 

Additionally, I would really really like to see the pagination idea put into development. https://community.canvaslms.com/t5/Idea-Conversations/Visual-quiz-structure-in-New-Quizzes-page-brea... 
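To be concrete, the rule being asked for is nothing more than proportional credit over the parts the system already marks right or wrong. A toy sketch of that policy (ours, not Canvas's grading code):

```python
# Illustrative partial-credit policy for ordering/categorization questions:
# award points in proportion to correctly placed parts instead of all-or-nothing.

def proportional_credit(points_possible, correct_parts, total_parts):
    """e.g. 9 of 10 categories placed correctly on a 10-point question -> 9.0."""
    if total_parts <= 0:
        raise ValueError("total_parts must be positive")
    return points_possible * correct_parts / total_parts

print(proportional_credit(10, 9, 10))  # 9.0, rather than the current 0
```

Since the grader already knows `correct_parts` and `total_parts` for every response, this is the "easy enough fix" the developers described.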

gwright
Community Member

While our institution has used Canvas for several years and is satisfied overall, you do raise some valid points about the development roadmap for New Quizzes. It appears that a number of important features were not taken into account when planning New Quizzes. We can only assume that Instructure has ventured down a path that has been more challenging to their developers than they foresaw or that the initial roadmap was not well-planned when it comes to discovering the features most important to educators. The result is that New Quizzes development has been dragging on for years. For the time being, we have just abandoned our efforts to get all faculty to make the transition. Our attempt to get them to migrate at the end of 2021 created a lot of angst because of migration issues and missing features. We don't want to repeat that.

LMacaulay
Community Participant

Quiz Restrict Student Result View Settings

The options in quiz settings for what information is shown to students after taking a quiz are beyond dense in Classic Quizzes, and faculty usually don't understand what checking the boxes actually shows students. In New Quizzes this is a bit better due to the granular checkboxes under the "show items and questions" section of the settings. However, we still need the option for the items selected in the "restrict student result view" section to be shown on a certain date/time, only once immediately after an attempt, or only after the last attempt.

It would also be useful if, when assessments have multiple attempts, the settings for what students are allowed to see could differ between earlier attempts and the last attempt. For instance, I might want to allow students to see the questions and their responses, but not whether they are correct/incorrect, until their last attempt. On the last attempt I may also want to allow them to see the correct answers.

What we had in Sakai was close to what I am asking for, so I have attached a screenshot of those settings for reference.

hesspe
Community Champion


@LMacaulay wrote:

Quiz Restrict Student Result View Settings

The options in quiz settings for what information is shown to students after taking a quiz is beyond dense in Classic Quizzes

I'm surprised by this observation.  I thought the following was pretty straightforward:

NQ options.png

Whereas the options in New Quizzes, in addition to being less flexible and requiring several more clicks to access, seem opaque to me:

NQ options 2.png

I'm very curious to know what others think.
Anya_
Community Member

I do find the settings in Classic Quizzes pretty straightforward (though I can see where some may find the more granular settings in New Quizzes preferable). They also take less time to get to: you just have to click Edit on the quiz, whereas in New Quizzes you have to go to Build and then the right tab. I also don't really like the paradigm in New Quizzes. In Classic Quizzes, the settings are "opt-in" to showing results. In New Quizzes, they are "opt-out." I prefer the former.

The real dealbreaker in New Quizzes settings for me is that Classic Quizzes settings integrate better with the gradebook. In essence, from what I can tell, the gradebook posting policy has no effect on New Quizzes; only the individual quiz settings around showing results do. To me, that makes the gradebook posting policy useless for New Quizzes. It also means that whenever I do want to show quiz results, instead of easily managing that from one place (the gradebook), I have to go into each individual quiz's settings to adjust it.

 

"I also don't really like the paradigm in New Quizzes. In Classic Quizzes, the settings are "opt-in" to showing results. In New Quizzes, they are "opt-out." I prefer the former. "

This is the part that is a deal-breaker for me. I really don't think my faculty even realize there is a "Settings" tab they must navigate to in order to find these settings (I have explained it, but honestly I often forget myself). These are the types of settings that you should be forced to confirm before publishing.

The global default setting for all quizzes must be that students DO NOT see any results after taking a quiz. They can always be changed later but students can't unsee results once they've seen them.

kaas-ku
Community Member

It was nice when feedback on alternatives was added to Multiple Choice questions in New Quizzes, but why was it not added to Multiple Answer questions as well?

In most cases we use Multiple Answer questions with feedback on alternatives in Classic Quizzes. It is an unfortunate setback if that cannot be applied in New Quizzes.

abutton
Community Member

I'm not sure if this is the right place to address our concern, but we would like the option for a model answer (i.e., the general feedback) to be available to students before an educator marks the question. Having some general feedback immediately available to students was functionality in Classic Quizzes that has been removed in New Quizzes, so a teacher has to manually assign a grade before students see anything. This seems counterintuitive for preset feedback.

Nancy_Webb_CCSF
Community Champion

Hi, I have some questions about "Export a new quiz outside of an account" on Transparency into Quizzes Planning - Instructure Community (canvaslms.com).

  • Will this include ability to export item banks, one or a selection?  
  • I am assuming there will also be the ability to import?
  • Do we know whether these will be csv, imscc or some other format?
  • Any chance we can get an actual date on the timeline for this feature?  It is sorely needed.

This should be the path for giving a copy of a bank or quiz to another instructor or for use in a different school.  At least that's my interpretation, and I hope I'm right.

Thank you.
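For comparison, classic quizzes can already be pulled out as QTI through the Content Exports API (POST /api/v1/courses/:course_id/content_exports); whether item banks get an equivalent is exactly the open question. A sketch of building the request parameters, following the public API docs (the quiz ids are examples):

```python
# Build form parameters for a QTI export of selected classic quizzes
# (POST /api/v1/courses/:course_id/content_exports). Quiz ids are examples;
# whether/how New Quizzes item banks will be exportable remains unanswered.
from urllib.parse import urlencode

def qti_export_params(quiz_ids):
    """export_type=qti plus one select[quizzes][] entry per quiz."""
    params = [("export_type", "qti")]
    params += [("select[quizzes][]", str(qid)) for qid in quiz_ids]
    return params

print(urlencode(qti_export_params([42, 43])))
```

If the promised "export a new quiz outside of an account" feature mirrored this (a selectable export in a documented format), that would answer most of the questions above.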

ericmaso
Community Member

We just ran into an issue where New Quizzes doesn't seem to support some basic functionality we used in Classic Quizzes, and which we are hoping will be available in the future. We have a non-term Canvas course we use to manage the training and communication with our university's writing center staff (approx. 75-100 people). As part of the onboarding process, we have consultants take a survey that gives us more granular insight into what skillsets our consultants have.

Here is our issue: in "Classic Quizzes," we could ask employees which citation styles they were familiar with, allowing them to check each style they knew.  Later, when we needed to know which consultants were familiar with a specific citation style (like Turabian or AP or AMA), we could go into the survey statistics and click on that answer and see which specific students had answered that they knew that style. Using the "New Quizzes" tool, we can see overall indexes and performance metrics, which are not useful to us in any way in this context. We can also see that 10 students said that they know AMA style. This is nice, but we have no way to identify those students except to go through speed grader and search manually (which, obviously, takes forever when you have 100 people having taken the survey).

We don't need or want to grade the individual questions for these surveys (I mean, surveys don't really have correct answers), except to be able to note that the survey has been completed. Perhaps we need to look into seeing if there is an external survey tool that truly treats things as surveys rather than "ungraded quizzes." 

As a side note, my university uses non-term Canvas courses for many purposes (grants education, program information for current students, college faculty info) and sending ungraded surveys to these groups, anonymous and identified, is a common desire. And I can see many uses for them in classes as forms of low-stakes writing and assessment where all that's needed, as other individuals on this forum have stated, is to be able to give students credit for submitting it, and be able to view the collective and individual results easily. None of this seems that complicated compared to the new quizzes interface and integrations, so we hope some sort of simple survey option is made available that still gives you granular insight into how submitters responded. 
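In the meantime, for classic-quiz surveys this lookup is scriptable: request a Student Analysis report (POST /api/v1/courses/:course_id/quizzes/:quiz_id/reports with quiz_report[report_type]=student_analysis), download the CSV, and filter it. A rough sketch on made-up data; the column header is illustrative, not the exact format Canvas emits:

```python
# Filter a (simplified) Student Analysis CSV to the respondents who chose a
# given answer to one question. Column names here are illustrative.
import csv
import io

def students_who_answered(csv_text, question_column, wanted):
    """Names of respondents whose answer to one question contains `wanted`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row["name"]
        for row in reader
        if wanted in row.get(question_column, "")
    ]

sample = """name,Which citation styles do you know?
Ada,"APA,Turabian"
Ben,"MLA"
Cam,"AMA,APA"
"""
print(students_who_answered(sample, "Which citation styles do you know?", "AMA"))  # ['Cam']
```

It works, but it is exactly the kind of thing the old survey statistics view did with one click, which is why we'd still prefer a real survey option.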

Thank you for your time.