I think I might be speaking for a lot of people when I say thank you, Instructure, for listening. I want New Quizzes to be successful and the added time will make a lot of people happy. I look forward to seeing others' comments and feedback.
I do have one concern: will there be bug fixes and support for Classic Quizzes (short of adding the new features that will appear in New Quizzes)? For a time it seemed the dev team was more interested in getting New Quizzes into production than in fixing issues with Classic Quizzes.
Hello @themidiman. Great question. I'm sure this thought is on the minds of many. We will be maintaining a similar experience to the past year.
Fixes that critically impact workflows and affect a broad set of users will be addressed. For example, last week we fixed an issue in Classic Quizzes where only one answer was showing for a Multiple Dropdowns question after the teacher refreshed the page. Not being able to see all of the possible answers is certainly a big problem!
We are fully committed to addressing the outlined needs of New Quizzes and therefore will devote most of our efforts there. It is a balancing act, and we do want to be good stewards. I would urge anyone filing a support request to include context that helps us understand the impact of the problem, not just what happened.
We started with Canvas in 2020 and were informed that Classic Quizzes would be retired within the year and that New Quizzes would be fully launched in the same timeframe. With that info, we launched straight into New Quizzes. Now it's almost 2 years later, we are still dealing with bugs and incomplete features in New Quizzes, and the timeline for a complete quiz format keeps getting pushed back further and further. Now we are looking at 2024 before we have a New Quizzes format that is fully functional, and this is frustrating. We are already in New Quizzes because of the information from Canvas, but I really wish we could migrate OUT of New Quizzes back to Classic.
Setting realistic timelines, especially when discussing retirement of a quiz format, is important, and Canvas's communication and repeated pushing back of deadlines have been frustrating from a customer point of view. We chose New Quizzes based on that communication and are now stuck for another 2 years on the incomplete quiz format.
@addisoncombs I can't speak to those concerns directly, as you've been with the Canvas family longer than I have. Shaun, the Product VP for Canvas, adds some perspective on why there have been challenges, and on the vision for the future, here: A few thoughts on Classic and New Quizzes in a time of change.
Thank you, Instructure team!
One question I have about the adjusted timeline: what is the new date after which Classic Quizzes can no longer be created? This was called out on the prior timeline, but I don't see it on the new one. Is that just the 06/2024 date?
Thank you again for listening - it is very much appreciated.
@atcarver we no longer believe it is in our joint best interest to have Canvas dictate this experience. We will provide you with a feature flag to use as part of your transition process if you desire. The "Opt In Available" row on the timeline lists the soonest the flags will be available for you to turn on. The only enforcement we are applying is that Classic Quizzes will be turned off by June 30, 2024.
Following, as we are interested in Honorlock support for New Quizzes. This is very important for a number of our faculty. Thanks for pointing this out.
I emailed our Respondus rep several weeks ago about NQ and Respondus 4.0. Here's what was shared with me:
"We do not have a true integration with Respondus 4.0 and New Quizzes yet. We are waiting on Instructure to build the appropriate APIs and web services so we can do so. I don't have an update on when that will be. See link below for a workaround."
This seems to imply that during the transition period (happening now or very soon), the APIs need to be finished before Respondus can build tooling into their product. This is somewhat alluded to on the transparency page.
For what it's worth, this was the same with respect to Respondus LockDown Browser working with NQ. There was a waiting period while Instructure finished the APIs before RLB + Monitor could be used to proctor a quiz authored in New Quizzes.
"...somewhat alluded to in the transparency page..." and thereby not transparent. 😉
Thanks for your response regarding Respondus. It makes sense; let me rephrase.
Once APIs for NQ are complete, then integration with some 3rd party applications will be possible.
Greetings, @aaron_bahmer. We'll leave communicating about third-party tooling to those vendors. However, Respondus has expressed what they need for Respondus 4.0, and those needs will be satisfied by the work the team is currently doing on APIs for quiz building and third-party proctoring support.
@SuSorensen Please elaborate on how the "Possible Later" section of the "Transparency into Quizzes Planning" image differs from the "Before July 2023" section. Some of the "Possible Later" items (such as knowing when a quiz needs manual grading, and the grade-by-question feature) are crucial to some of my instructors, so knowing how likely these features are to be delivered versus scrapped or put off for many years is very important to our institutional planning.
Hi @brycezmiller, the timeline progresses in columns from left to right. Specifically, "Solving Now" is what is immediately underway; "Planned Next" is what we are interviewing users about, making prototypes for, and so on; and "Possible Later" is what we're considering taking on next. It is titled "Possible" because items might move between that column and the "Before July 2023" column based on feedback and what teachers and students really need right away.
It would be helpful to have both the "End of Life 2024 Quiz Timeline" and "Transparency into Quiz Planning" available in a font size and resolution that is readable. Can a version be made available for our low-vision users or those who need a bit more magnification? Many thanks!
Ouch. I'm so sorry about that! I'll brainstorm with the team about how we can share this in a better format. It's definitely not available to screenreaders right now either.
I'm looking at this document and I think I see items that could replace surveys, but I'm not sure I see anything that could replace practice quizzes, i.e., something that gives the students points and can be used to unlock modules but doesn't show in the gradebook. Is that accurate?
Case study: we use a code of conduct in each of our classes that's generated from a practice quiz. Students have to get the full 10 points on the quiz (i.e., agree to follow the academic code of conduct by checking "I agree" next to each statement) in order to unlock the course materials in the modules. Those points currently don't show in the gradebook because that would be confusing. It's not an obvious big deal in a course with 1000 points, but it would be more noticeable in a course with 20 total points if 10 of the points shown in the gradebook aren't counted toward the grade.
@SuSorensen , all:
I've exported the blog posts from New Quizzes: We're listening, cleaned the text, and started to summarize it in the attached file. SPSS has a fine text-analytics tool set for sentiment analysis, but I'll need to get that reinstalled on my new computer. Something for another day.
Anyhow, when reviewing keywords I noted that 'item banks' and 'question banks' were prominent concerns communicated in the posts; however, neither appears to be addressed in the planning items posted here in this new forum. Or perhaps I just cannot find it. Where might I find information that addresses these items?
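While SPSS is unavailable, a tally along these lines can be scripted in plain Python. A minimal sketch (the sample posts below are hypothetical stand-ins for the exported comments, not real data):

```python
import re
from collections import Counter

def count_keywords(posts, keywords):
    """Tally case-insensitive occurrences of each keyword phrase across posts."""
    counts = Counter()
    for post in posts:
        lowered = post.lower()
        for kw in keywords:
            counts[kw] += len(re.findall(re.escape(kw), lowered))
    return counts

# Hypothetical posts standing in for the exported blog comments.
posts = [
    "Item banks need course-level ownership, not developer ownership.",
    "Question banks and item banks should migrate together.",
    "Sharing an item bank across sections is still unclear.",
]
print(count_keywords(posts, ["item bank", "question bank"]))
# Counter({'item bank': 3, 'question bank': 1})
```

A plain substring count like this is crude next to a real sentiment pass, but it is enough to confirm which topics dominate the thread.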
@SuSorensen - I have remaining questions about item banks and question banks. Several comments relate to these in the prior forum, and I cannot find where these issues are to be addressed. For example, ownership of an item bank is set to be with the bank developer and not the course.
I've attached the full list of comments filtered to show only those comments concerned with banks.
Thanks all, Jeff
@SuSorensen - since I asked the question a while back, I thought it best to circle back and ask whether these items are under consideration.
What suggestions are offered to mitigate the challenges presented by this pending change?
Many thanks, Jeff
@SuSorensen and here is the question I posed:
I have remaining questions about item banks and question banks. Several comments relate to these in the prior forum, and I cannot find where these issues are to be addressed. For example, ownership of an item bank is set to be with the bank developer and not the course.
I've attached the full list of comments filtered to show only those comments concerned with banks.
Thanks all, Jeff
What is meant by the "Dec 2022 Planning" caption? It would be good to have slightly more transparent target dates for the Transparency feature list.
@PamelaImperato You can click the image, or download it and zoom in.
Hi! It looks like I missed the previous thread about top three features to be ready for Classic Quiz sunset. I'll throw them in here. This is coming from the instructor perspective. I adopted New Quizzes in 2020 because at the time the guidance was that Classic Quizzes would be sunset in 2021.
2. Respondus Lockdown Browser Integration is Partially Broken
- Respondus Dashboard settings do not carry over on course copy. Assuming a dozen assignments, that's about 250 clicks per course to set them back up.
- The print dialog in Respondus LockDown Browser does not correctly print a quiz or attempt history. This is related to the larger "students can't print" issue, but it is acute in Respondus since screenshots, etc., are disabled.
I have had two support cases open on these problems. Case 07727405 for 2.a hasn't had any movement since June 2021. But last night I got an amazing update for issue 2.b, case 08336739.
Brandon, L2 Instructure Support, wrote:
[W]e are declining to resolve this behavior.
I don't even know what to say to that.
Following up to let you know that we have been made aware of the case you submitted with Respondus about the problem. This will allow us to provide guidance about New Quizzes and allow them to resolve the problem. I know bouncing between companies can be frustrating, and I appreciate that you've kicked off the process for us by providing each party details of the situation. We have reached out to our point of contact, and Instructure will be in touch when we have news to report back.
Thank you very much! I don't mind being bounced between companies. That's the nature of the beast when it comes to these sorts of integrations. It's all good, as long as the vendors end up talking to each other, rather than trying to push the problem off on the other party.
I appreciate that Instructure is working with Respondus on finding a solution. I did get an email from Instructure support about 10 hours after your post.
Monica, L2 Canvas Support wrote:
Unfortunately, our Partnerships Team has declined to solve the behavior at this time because Canvas does not support students printing quizzes.
So I'm not sure if everybody is on the same page yet.
While Canvas allows teachers to print a New Quiz, we don't have that feature available for students. Respondus has communicated that unless we have student printing available, they will also not have student printing available.
However, taking screenshots doesn't work within Respondus, because the whole purpose of Respondus is to lock down that functionality. That presents a conundrum for a student viewing the attempt history: even if an instructor allows viewing the attempt history, if the quiz is within Respondus the student cannot share that content with an instructor, tutor, etc. Fundamentally there are two ways to address this: 1. add a student-facing print feature that Respondus can then expose, or 2. fix the underlying rendering of New Quizzes content so it prints correctly.
The benefit of number 2 would be that it would also fix other problems with New Quizzes. For example, right now you can't print item analysis, which is terrible if you have to document your work on that front as an instructor. Why can't you print it? Because it doesn't render correctly, just like the quizzes themselves. If you try to print it, you only get a weirdly truncated version of the first page. I have attached an example of how item analysis fails to print correctly, just like all other New Quizzes content.
The quiz printing that was added for instructors is a workaround: when you go into that menu, it renders a whole new version of the quiz content correctly. It didn't actually address the underlying problem with how New Quizzes content renders.
Thank you, pushing back the sunsetting of CQs is very helpful! Looking at the transparency document, though, it seems the option to export a course using Canvas Common Cartridge isn't going to be available until very late in the timeline (last, in fact), and that is a significant issue for content providers trying to re-deliver fully migrated courses to schools that need them. If we can't deliver courses until CQ reaches end of life, there is no transition time for us to redeliver and for the schools to transition too.
Please consider moving the exportability of a course with NQs up in the timeline. It would be great if Common Cartridge export were available before July 2023, along with the individual quiz exports.
Thank you for listening.
I have two questions:
(1) At the University of Copenhagen, we are desperately waiting for the ability to export New Quizzes via CSV. If I read the timeline correctly, it seems the export will not be possible before 2023. Is this correct? If so, this is rather late and, hence, a bit disappointing.
(2) Will it at some point be possible for an admin to bulk-migrate all Classic Quizzes? Is this functionality envisaged?
Thanks for the feedback, Ruth; we will continually review this type of feedback from users to reassess what is needed and when.
The migration tooling is available on beta instances. We're encouraging admins to give us feedback before we make it available in production.
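Until a one-click bulk option exists, admins could script something similar with the Content Migrations API, which documents an `import_quizzes_next` setting that converts Classic Quizzes to New Quizzes during a course copy. A minimal sketch that only builds the request payloads (course IDs are hypothetical; verify the endpoint and setting names against the current API docs before sending anything):

```python
def bulk_migration_payloads(course_ids):
    """Build one payload per course for POST /api/v1/courses/:id/content_migrations.

    Copying a course onto itself with settings[import_quizzes_next]=true is the
    documented route for converting its Classic Quizzes during import; confirm
    against Instructure's Content Migrations API docs before relying on this.
    """
    return [
        {
            "migration_type": "course_copy_importer",
            "settings[source_course_id]": cid,
            "settings[import_quizzes_next]": True,
        }
        for cid in course_ids
    ]

# Hypothetical course IDs an admin might enumerate via the Accounts API.
payloads = bulk_migration_payloads([101, 102])
print(len(payloads), payloads[0]["migration_type"])
```

Run against beta first, exactly as suggested above, since a failed conversion is much cheaper to discover there than in production.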
In addition to what @SuSorensen said, we are working closely with third-party vendors and proctoring tools to ensure we provide the right set of APIs for them to integrate their solutions with New Quizzes. However, most of the tools won't need the full API to be finished in order to work, so the vendors will be able to start implementing their solutions as soon as our team releases the appropriate API endpoints.
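For readers curious what "appropriate API endpoints" looks like in practice: a vendor's integration boils down to authenticated REST calls. A minimal sketch that builds (but does not send) such a request, assuming the course-level quiz-listing path from Instructure's published New Quizzes API; the hostname and token are placeholders:

```python
from urllib.parse import urljoin
from urllib.request import Request

def new_quizzes_request(base_url, course_id, token):
    """Build (but do not send) a GET request for a course's New Quizzes.

    The /api/quiz/v1/... path follows Instructure's public New Quizzes API
    docs; verify it against the current documentation before relying on it.
    """
    url = urljoin(base_url, f"/api/quiz/v1/courses/{course_id}/quizzes")
    return Request(url, headers={"Authorization": f"Bearer {token}"})

# Hypothetical institution hostname and access token.
req = new_quizzes_request("https://canvas.example.edu", 1234, "TOKEN")
print(req.full_url)
# https://canvas.example.edu/api/quiz/v1/courses/1234/quizzes
```

Each endpoint the team releases widens what a proctoring vendor can do with calls like this, which is why partial API availability is enough for vendors to start building.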
@SuSorensen I'm not sure if this is the correct place to address my concern, but I was wondering whether EITHER Classic Quizzes or New Quizzes will have the option to select "First Response" when grading multiple attempts on a quiz. I allow students access to the quiz responses on secondary or tertiary attempts, but NOT the first attempt. Some students simply enjoy going over the homework quizzes to help them study, though I believe this to be a misguided way to study, as my questions are designed to make students use critical reading and thinking skills before answering. This prevents memorization of definitions and encourages more focus on true conceptual problem solving.
That is a long way to point out that many of us would like to allow multiple attempts at certain quizzes with the ability to keep the FIRST score rather than the HIGHEST or the AVERAGE of the attempts for an assignment. In the educational field, flexibility is a very important characteristic, and it often seems there is little flexibility in our options without a VERY convoluted series of steps to arrive at an extremely simple outcome.
I have only used New Quizzes and I really love them. The only thing I would really like to see updated is for a student's New Quiz submission to show up on the to-do list when it is submitted. I rely on that to-do list for grading, and it's wonderful. I really want to see New Quizzes submissions on the to-do list, please!
Once I learned how to use New Quizzes, I much preferred them over Classic. They have so many great features, like hot spot questions and the stimulus for multiple questions. The one thing I prefer in Classic, however, is the log feature. It was much easier to read and made it easy to know when a student left the platform.
@SuSorensen I think this is a really good topic to discuss. Our institution handles our own Tier 1 support for students and instructors. One request we commonly receive is to investigate student claims about why they didn't take a quiz, or claims about errors during a test that caused hardships. For these investigations, we have used an admin's ability to see student activity, such as when a student clicks the button to start an exam (the URL ends with /take), as opposed to when they look at the quiz overview page. Unfortunately, using an LTI tool means that we will not be able to see this and other similar kinds of interaction, which is completely understandable. Would it be possible for your team to look into ways that admins can track student activity beyond the Moderation log, or to relay more information in the Moderation log? (I don't mean to imply this should be a "right now" goal, but I would appreciate it if this could be investigated sometime in the future.)
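For context on the kind of check described above: Canvas exposes per-user page views via its REST API, and the quiz-start moment can be picked out by the URL ending in /take. A minimal sketch of that filter, run here over hypothetical sample records shaped like page-view API output (fields trimmed):

```python
def quiz_start_events(page_views):
    """Return page-view records whose URL ends with '/take', i.e. the
    moment a student actually started a Classic Quiz attempt, as opposed
    to merely viewing the quiz overview page."""
    return [
        pv for pv in page_views
        if pv.get("url", "").rstrip("/").endswith("/take")
    ]

# Hypothetical records shaped like GET /api/v1/users/:user_id/page_views output.
views = [
    {"url": "https://canvas.example.edu/courses/1/quizzes/42",
     "created_at": "2022-05-01T09:58:00Z"},
    {"url": "https://canvas.example.edu/courses/1/quizzes/42/take",
     "created_at": "2022-05-01T10:00:00Z"},
]
print(quiz_start_events(views))
```

This is exactly the signal that disappears once the quiz lives inside an LTI tool, since the student's interactions then happen outside Canvas's own page-view stream.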
New Quizzes needs to have the monitoring feature that Classic Quizzes has. I began using New Quizzes because I assumed Classic would be ending soon, and when a colleague told me what I was "missing out on" by using New over Classic, I was quite disappointed.
I would love to see that feature added to New Quizzes sooner rather than later. Like others, I am very disappointed that we began working with New Quizzes because of an announced deadline that has been continually pushed back, and that as a result the features are not the same in the two quiz types.