SuSorensen
Instructure

Classic Quiz Sunset 2024 Feedback

This thread has been opened for comments and suggestions about the updated Classic Quizzes Sunset Timeline. 

128 Replies
themidiman
Community Champion

I think I might be speaking for a lot of people when I say thank you, Instructure, for listening. I want New Quizzes to be successful and the added time will make a lot of people happy. I look forward to seeing others' comments and feedback.

I do have one concern... will there be bug fixes and support for Classic Quizzes (short of adding new features that will appear in New Quizzes)? For a time it seemed like the dev team was more interested in getting New Quizzes into production than in fixing potential issues with Classic Quizzes.

Hello @themidiman. Great question. I'm sure this thought is on the minds of many. We will be maintaining an experience similar to that of the past year.

Fixes that critically impact workflows and are broad in the scope of users affected will be addressed. For example, last week we fixed an issue in Classic Quizzes where only one answer was showing for multiple-dropdown questions after the teacher refreshed the page. Not being able to see all of the possible answers is certainly a big problem!

We are fully committed to addressing the outlined needs of New Quizzes and will therefore devote most of our efforts there. It is a balancing act, and we do want to be good stewards. When filing support requests, please include context that helps us understand the impact beyond a description of what happened.

addisoncombs
Community Member

We started Canvas in 2020 and were informed that Classic Quizzes were retiring within the year and that New Quizzes would be fully launched in the same timeframe. With that info, we launched straight into New Quizzes. Now it's almost 2 years later and we are still dealing with bugs and incomplete features in New Quizzes, and the timeline for a complete quiz format keeps getting pushed back further and further. Now we are looking at 2024 for us to have a New Quiz format that is fully functional, and this is frustrating. We are already in New Quizzes due to the information from Canvas, but I really wish we could migrate OUT of New Quizzes back to Classic.

 

Setting realistic timelines, especially when discussing the retirement of a quiz format, is important, and the communication and repeated pushing of deadlines by Canvas have been frustrating from a customer point of view. We chose New Quizzes based on that communication, and are now stuck for another 2 years on an incomplete quiz format.

@addisoncombs I can't speak to those concerns directly, as you've been with the Canvas family longer than I have. Shaun, the Product VP for Canvas, adds some perspective on why there have been challenges and on the vision for the future here: A few thoughts on Classic and New Quizzes in a time of change.

atcarver
Community Contributor

Thank you, Instructure team!

One question I have about the adjusted timeline: what is the new date after which Classic Quizzes can no longer be created? This was called out on the prior timeline, but I don't see it on the new one. Is that just the 06/2024 date?

Thank you again for listening - it is very much appreciated.

@atcarver we no longer believe it is in the joint best interest to have Canvas dictate this experience. We will provide you with a feature flag to use as part of your transition process if you desire. The "Opt-in Available" row on the timeline lists the earliest the flags will be available for you to turn on. However, the only enforcement we are applying is that Classic Quizzes will be turned off by June 30, 2024.

aaron_bahmer
Community Contributor

I didn't see any reference to Respondus on the timeline. What is known about the endeavors to allow full integration of test import via Respondus?

Following, as we are interested in Honorlock support for New Quizzes. This is very important for a number of our faculty. Thanks for pointing this out.

I emailed our Respondus rep several weeks ago about NQ and Respondus 4.0. Here's what was shared with me:

"We do not have a true integration with Respondus 4.0 and New Quizzes yet. We are waiting on Instructure to build the appropriate API’s and web services so we can do so.  I don’t have an update of when that will be.  See link below for a workaround. 

https://support.respondus.com/support/index.php?/Knowledgebase/Article/View/604/1/publishing-to-inst..."

This seems to imply that during the transition period (happening now or very soon), the APIs need to be finished before Respondus can build tooling in their product. This is somewhat alluded to on the transparency page:

https://community.canvaslms.com/t5/Quizzes-Transition/Transparency-into-Quizzes-Planning/ta-p/502615 

For what it's worth, this was the same with respect to Respondus Lockdown Browser working with NQ. There was a waiting period for Instructure to get the APIs finished before RLB+Monitor could be used to proctor a quiz authored in New Quizzes.

aaron_bahmer
Community Contributor

"...somewhat alluded to in the transparency page..." and thereby not transparent. 😉

Thanks for your response regarding Respondus. It makes sense - let me rephrase.

Once APIs for NQ are complete, then integration with some 3rd party applications will be possible.

Precisely! I have nothing more to add at the moment. @themidiman covered it all. 😉

As we continue to make progress, we'll post updates in this transition user group.


Greetings, @aaron_bahmer. We'll leave communication about third-party tooling to those vendors. However, Respondus has expressed what they need for Respondus 4.0, and those needs will be satisfied by the work the team is currently doing on APIs for quiz building and third-party proctoring support.

brycezmiller
Community Member

@SuSorensen Please elaborate on how the "Possible Later" section of the "Transparency into Quizzes Planning" image is different from the "Before July 2023" section. Some of the "Possible Later" items (such as knowing when a quiz needs manual grading and the grade-by-question feature) are crucial to certain groups of my instructors, so knowing how likely it is that these features will be implemented vs. being scrapped or put off for many years is very important to our institutional planning.


Hi @brycezmiller, the timeline progresses in columns from left to right. Specifically, "Solving Now" is what is immediately underway; "Planned Next" is what we are interviewing users about, making prototypes for, and so on; and "Possible Later" is what we're considering taking on next. It is titled "Possible" because there might be movement between that column and the "Before July 2023" column based on feedback and what teachers and students really need right away.

PamelaImperato
Community Participant

It would be helpful to have both the "End of Life 2024 Quiz Timeline" and "Transparency into Quiz Planning" available in a font size and resolution that is readable. Can a version be made available for our low-vision users or those who need a bit more magnification? Many thanks!


Ouch. I'm so sorry about that! I'll brainstorm with the team about how we can share this in a better format. It's definitely not available to screen readers right now either.

venitk
Community Champion

I'm looking at this document and I think I see items that could replace surveys, but I'm not sure I see anything that could replace practice quizzes, i.e., something that gives students points and can be used to unlock modules but doesn't show in the gradebook. Is that accurate?

Case study: we use a code of conduct in each of our classes that's generated from a practice quiz. Students have to get the full 10 points on the quiz (i.e., agree to follow the academic code of conduct by checking "I agree" next to each statement) in order to unlock the course materials in the modules. Those points currently don't show in the gradebook because that would be confusing. It's not such an obvious big deal if a course has 1000 points, but it would be more noticeable in a course with 20 total points if 10 of the points shown in the gradebook aren't being counted towards the grade.

venitk
Community Champion

Sorry, I meant I'm looking at the Trello-like timeline document. For some reason I couldn't edit my comment.

 

Jeff_F
Community Champion

@SuSorensen, all:

I've exported the blog posts from "New Quizzes: We're listening", cleaned the text, and started to summarize it in the attached file. SPSS has a fine text-analytics tool set for sentiment analysis, but I'll need to get that reinstalled on my new computer. Something for another day.

Anyhow, when reviewing keywords I noted that 'item banks' and 'question banks' were prominent concerns communicated in the posts; however, neither appears to be addressed in the planning items posted here in this new forum. Or I just cannot find it. Where might I find information that will address the issues for these items?

 

Word cloud (attached image)
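For anyone who wants to do a quick keyword check on the exported posts without SPSS, a rough sketch of the idea is below; the file name and phrase list are placeholders, not the exact ones used for the attached summary.

```typescript
// Count how often a few phrases of interest appear in the exported post text.
import { readFileSync } from "node:fs";

const text = readFileSync("exported-posts.txt", "utf8").toLowerCase();
const phrases = ["item bank", "question bank", "migration", "partial credit"];

for (const phrase of phrases) {
  // Non-overlapping occurrence count of each phrase.
  const count = text.split(phrase).length - 1;
  console.log(`${phrase}: ${count}`);
}
```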

Jeff_F
Community Champion

@SuSorensen - I have remaining questions about item banks and question banks. Several comments relate to these in the prior forum and I cannot find where these issues are to be addressed. For example, ownership of the item banks is set to be with the bank developer and not the course.

I've attached the full list of comments filtered to show only those comments concerned with banks.

Thanks all, Jeff

 

Jeff_F
Community Champion

@SuSorensen - as I asked the question a while back, I thought it best to circle back and ask whether these items are under consideration.

What suggestions are offered to mitigate the challenges presented by this pending change?

 

Many thanks, Jeff

Jeff_F
Community Champion

@SuSorensen, here is the question I posed:

I have remaining questions about item banks and question banks. Several comments relate to these in the prior forum and I cannot find where these issues are to be addressed. For example, ownership of the item banks is set to be with the bank developer and not the course.

I've attached the full list of comments filtered to show only those comments concerned with banks.

Thanks all, Jeff

k_oconnor2
Community Participant

What is meant by the "Dec 2022 Planning" caption? It would be good to have slightly more transparent target dates for the Transparency feature list.

@PamelaImperato You can click the image, or download it and zoom in.

pjonas1
Community Participant

Hi! It looks like I missed the previous thread about the top three features that need to be ready for the Classic Quiz sunset, so I'll throw mine in here. This is coming from the instructor perspective. I adopted New Quizzes in 2020 because, at the time, the guidance was that Classic Quizzes would be sunset in 2021.

  1. Reports are Broken
    a. Item Analysis cannot be printed or exported to CSV. This makes it impossible for instructors to document results.
    b. The Item Analysis point-biserial correlation (rpb) doesn't calculate correctly. It is arguably the most important item analysis metric (the standard formula is sketched after this list).
    c. Outcomes Analysis is not available for outcomes attached to questions in item banks. How are you supposed to do institutional assessment without that?
  2. Respondus Lockdown Browser Integration is Partially Broken
    a. Respondus Dashboard Settings do not carry over on course copy. Assuming a dozen assignments, that's about 250 clicks per course to set back up.
    b. The print dialog in Respondus Lockdown Browser does not correctly print the quiz or attempt history. This is related to the larger "students can't print" issue, but is acute in Respondus since screenshots etc. are disabled.
  3. Item Banks are a Pain
    a. No mass editing of, e.g., outcomes attached to questions.
    b. No good import function that preserves outcomes attached to questions.
    c. Fixing an error in a question creates a copy instead of letting you update the live course.
  4. Bonus: Gradebook Problems Not Specific to New Quizzes
    a. Late penalties are calculated wrong on multi-attempt quizzes. The high score is selected first and the late penalty applied second: e.g., if the late penalty is 3 points and Student A scores 7 and later 8, their score is 5, while if Student B scores 7 and later 6, their score is 7 (see the second sketch after this list).
    b. Late penalties can only escalate per hour or per day. There is no option to set a flat penalty, or to use a longer timeframe such as per week/month/year.
    c. Default grades for missing assignments and late-penalty settings do not carry over on course import.
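For context on 1.b, here is the point-biserial correlation as it is usually defined for item analysis. This is only an illustration of what the metric measures, not a claim about how New Quizzes computes it; the function and variable names are my own.

```typescript
// Standard point-biserial correlation for a single item (illustration only).
// totalScores[i] is student i's total quiz score; gotItemRight[i] is whether
// student i answered this particular item correctly.
function pointBiserial(totalScores: number[], gotItemRight: boolean[]): number {
  const n = totalScores.length;
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  const right = totalScores.filter((_, i) => gotItemRight[i]);
  const wrong = totalScores.filter((_, i) => !gotItemRight[i]);
  const p = right.length / n;                                    // proportion answering correctly
  const mu = mean(totalScores);
  const sd = Math.sqrt(totalScores.reduce((a, x) => a + (x - mu) ** 2, 0) / n); // population SD
  // r_pb = ((M_right - M_wrong) / SD_total) * sqrt(p * (1 - p))
  return ((mean(right) - mean(wrong)) / sd) * Math.sqrt(p * (1 - p));
}
```

And to make the ordering problem in 4.a concrete, here is a rough sketch of the two orders of operations using the hypothetical numbers above (3-point late penalty, "keep highest" policy, first attempt on time, second attempt late). This is my reading of the described behavior, not Canvas source code.

```typescript
type Attempt = { score: number; late: boolean };

const LATE_PENALTY = 3;

// Described (buggy) order: keep the highest raw score, then apply the penalty to it.
function describedScore(attempts: Attempt[]): number {
  const best = attempts.reduce((a, b) => (b.score > a.score ? b : a));
  return best.late ? best.score - LATE_PENALTY : best.score;
}

// Expected order: penalize each attempt first, then keep the highest penalized score.
function expectedScore(attempts: Attempt[]): number {
  return Math.max(...attempts.map(a => (a.late ? a.score - LATE_PENALTY : a.score)));
}

const studentA: Attempt[] = [{ score: 7, late: false }, { score: 8, late: true }];
const studentB: Attempt[] = [{ score: 7, late: false }, { score: 6, late: true }];

console.log(describedScore(studentA), expectedScore(studentA)); // 5 vs. 7
console.log(describedScore(studentB), expectedScore(studentB)); // 7 vs. 7
```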
pjonas1
Community Participant


@pjonas1 wrote:
2. Respondus Lockdown Browser Integration is Partially Broken
  a. Respondus Dashboard Settings do not carry over on course copy. Assuming a dozen assignments, that's about 250 clicks per course to set back up.
  b. The print dialog in Respondus Lockdown Browser does not correctly print the quiz or attempt history. This is related to the larger "students can't print" issue, but is acute in Respondus since screenshots etc. are disabled.

I have had two support cases open on these problems. Case 07727405 for 2.a. hasn't had any movement since June 2021. But last night I got an amazing update for issue 2.b., case 08336739.


Brandon, L2 Instructure Support, wrote:
[W]e are declining to resolve this behavior.

I don't even know what to say to that.


Hello Philipp,

I'm following up to let you know that we have been made aware of the case file you submitted with Respondus about the problem. This will allow us to provide guidance about New Quizzes and allow them to resolve the problem. I know bouncing between companies can be frustrating, and I appreciate that you've kicked off the process for us by providing each party the details of the situation. We have reached out to our point of contact, and Instructure will be in touch when we have news to report back.

pjonas1
Community Participant

Hi Su,

Thank you very much! I don't mind being bounced between companies. That's the nature of the beast when it comes to these sorts of integrations. It's all good, as long as the vendors end up talking to each other, rather than trying to push the problem off on the other party.

I appreciate that Instructure is working with Respondus on finding a solution. I did get an email from Instructure support about 10 hours after your post.

Monica, L2 Canvas Support wrote:
Unfortunately, our Partnerships Team has declined to solve the behavior at this time because Canvas does not support students printing quizzes.

So I'm not sure if everybody is on the same page yet.


@pjonas1,
While Canvas allows teachers to print a new quiz, we don't have that feature available for students. Respondus has communicated that unless we have student printing available, they will also not have student printing available.

We've outlined an idea for Student Printing here, but it wasn't clear whether this was of interest. If you would like to lend your voice to the cause, please add your story here.

pjonas1
Community Participant

@SuSorensen 
Well, yes. The problem at a deeper level is, of course, that the way New Quiz content renders breaks the standard browser printing options. Respondus won't be able to address that deficit from their end all by themselves. Instructure would first have to render the content in a normal way, and then, if for some reason student printing is still not supposed to be available in Canvas, disable printing using the usual JavaScript workarounds (sketched below). I mean, let's face it: students can "print" quizzes at the moment by taking screenshots anyway. It's not like the broken rendering is an intentional security feature.
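For illustration only, here is a minimal sketch of the kind of "usual JavaScript workarounds" I mean, assuming the content itself renders normally. The .quiz-content class name is hypothetical, this is not Canvas code, and none of it is real security (screenshots and photos of the screen still work).

```typescript
// 1. Hide the quiz content in print media so "Print to PDF" yields a blank page.
const style = document.createElement("style");
style.textContent = "@media print { .quiz-content { display: none; } }";
document.head.appendChild(style);

// 2. Intercept the keyboard shortcut for printing.
document.addEventListener("keydown", (e) => {
  if ((e.ctrlKey || e.metaKey) && e.key.toLowerCase() === "p") {
    e.preventDefault();
  }
});

// 3. Blank the content while the native print dialog is open.
const quizContent = () => document.querySelectorAll<HTMLElement>(".quiz-content");
window.addEventListener("beforeprint", () => {
  quizContent().forEach((el) => (el.style.visibility = "hidden"));
});
window.addEventListener("afterprint", () => {
  quizContent().forEach((el) => (el.style.visibility = ""));
});
```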

However, taking screenshots doesn't work within Respondus, because the whole purpose of Respondus is to lock down that functionality. That presents a conundrum for a student viewing their attempt history. Even if an instructor allows viewing the attempt history, if the quiz is within Respondus, the student cannot share that content with an instructor, tutor, etc. Fundamentally, there are two ways to address this:

  1. Redesign the attempt history so that it doesn't require opening within the Respondus Lockdown Browser. Only active quizzes would open within the lockdown browser. Attempt history would open without it. Then, without the Respondus restrictions, students would be able to screenshare and screenshot their attempt history.
  2. Fix how content is rendered so that standard browser printing works with New Quiz content. Then instructors can allow printing in Respondus (if they wish), and students can print to PDF or print to paper in order to share with instructors, tutors, etc.

The benefit of option 2 would be that it would also fix other problems with New Quizzes. For example, right now you can't print item analysis, which is terrible if you have to document your work on that front as an instructor. Why can't you print it? Because it doesn't render correctly, just like the quizzes themselves. If you try to print it, you only get a weirdly truncated version of the first page. I have attached an example of how item analysis fails to print correctly, just like all other New Quiz content.

The quiz printing that was added for instructors is a workaround: if you go into that menu, it renders a whole new version of the quiz content correctly. It didn't actually address the underlying rendering problem with content in New Quizzes.

nburden
Community Participant

Thank you, the sunsetting of CQs being pushed back is very helpful! Looking at the transparency document, though, it seems like the option to export a course using Canvas Common Cartridge isn't going to be available until very late in the timeline (last, in fact), and that is a significant issue for content providers trying to re-deliver fully migrated courses to the schools that need them. If we can't deliver courses until CQ reaches end of life, there is no transition time for us to redeliver and for the schools to transition too.

Please consider moving the exportability of a course with NQs up in the timeline. It would be great if Common Cartridge export were available before July 2023, along with the individual quiz exports.

ruh
Community Participant

Thank you for listening.

I have two questions:

(1) At the University of Copenhagen, we are desperately waiting for the ability to export New Quizzes via CSV. If I read the timeline correctly, it seems that the export will not be possible before 2023. Is this correct? If yes, this is rather late and, hence, a bit disappointing.

(2) Will it at some point be possible for an admin to bulk-migrate all classic quizzes? Is this functionality envisaged?

Kind regards,

Ruth

Thanks for the feedback, Ruth; we will continually review this type of feedback from users to reassess what is needed and when.

The migration tooling is available on beta instances. We're encouraging admins to give us feedback before we make it available in production.
https://community.canvaslms.com/t5/Canvas-Releases/Beta-Release-Notes-New-Quizzes-Migration-2021-12-...

 

TamasBalogh
Instructure

@aaron_bahmer @asaylor @themidiman,

In addition to what @SuSorensen said, we are working closely with third-party vendors and proctoring tools to ensure we are providing the right set of APIs for them to be able to integrate their solutions with New Quizzes. However, most of the tools won't need the full API to be finished in order to work, so the vendors will be able to start implementing their solutions as soon as our team releases the appropriate API endpoints.

khackman
New Member

@SuSorensen I'm not sure if this is the correct place to address my concern, but I was wondering if EITHER Classic Quizzes or New Quizzes will have the option to select "First Response" for grading when allowing multiple attempts at a quiz. I allow students access to the quiz responses on secondary or tertiary attempts, but NOT the first attempt. Some students simply enjoy going over the homework quizzes to help them study, though I believe this to be a misguided way to study, as my questions are designed to make students use critical reading and thinking skills before answering. This prevents memorization of definitions and encourages more focus on true conceptual problem solving.

That is a long way of pointing out that many of us would like to allow multiple attempts at certain quizzes, with the ability to select the FIRST response rather than the HIGHEST or the AVERAGE of the attempts for an assignment. In the educational field, flexibility is a very important characteristic, and it often seems there is little flexibility in our options without a VERY convoluted series of steps to arrive at an extremely simple outcome.

JocelynThamma
New Member

I have only used New Quizzes and I really love them. The only thing I would really like to see updated is for a submission to show up on the to-do list when a New Quiz is taken by a student. I rely on that to-do list for grading and it's wonderful. I really want to see New Quizzes submissions on the to-do list... please!!!!

KarenGundal
New Member

Once I learned how to use New Quizzes, I much prefer them over Classic. They have so many great features, like hot spot questions and the stimulus for multiple questions. The one thing I prefer about Classic, however, is the log feature. It was much easier to read and to know when a student left the platform.

amg10k
Community Participant

@SuSorensen I think this is a really good topic to discuss. Our institution handles our own Tier 1 support for students and instructors. One request that we commonly receive is to investigate student claims about why they didn't take a quiz, or claims about errors during the test that caused hardships. For these investigations, we have used the admin's ability to see student activity, such as when students click the button to start an exam (URL ends with /take), as opposed to when they look at the quiz overview page. Unfortunately, using an LTI tool means that we will not be able to see this and other similar kinds of interaction, which is completely understandable. Would it be possible for your team to look into ways that admins can track student activity beyond the Moderation log, or whether it's possible to relay more information in the Moderation log? (I don't mean to imply this should be a "right now" goal, but I would appreciate it if this could be investigated sometime in the future.)

AmandaIngram
New Member

New Quizzes needs to have the monitoring feature that Classic Quizzes has. I began using New Quizzes because I assumed Classic would be ending soon, and when a colleague told me what I was 'missing out' on due to using New over Classic, I was quite disappointed. 

I would love to see that feature added to New Quizzes sooner rather than later. I am very disappointed, like others, that we began working with New Quizzes due to an announced deadline that has been continually pushed back, and therefore features are not the same in the two quiz types. 
