Is there a plan to make the new Quizzes LTI compatible with LockDown Browser?
I think you missed this part
the User Group: New Quizzes has a clear timeline that is also being updated more frequently!
* By the way - Respondus Lockdown Browser is on that Timeline *
I don't know if there are any Respondus representatives in the Canvas Community, but I'll share what Respondus' Nick Laboda has said via email when I asked him a few weeks ago:
"Right now, the ball is in Instructure’s court. We are ready to do the development work but are waiting on them. I would encourage you to reach out to your Instructure rep and let them know how important this is to you. Hopefully it will push things up on their development roadmap."
We are aware of this need and are working to finalize some large features, and to put the right pieces in place for third-party integrations with our new assessment engine. We have had many discussions with Respondus and other partners about what we need to do to get the right end-points in place for their use. This is on our radar, but it does take some time. We appreciate your patience as we continue to bring you a world-class assessment engine.
Assoc. Product Manager, Assessments
Any updates? It has almost been a year. I work at the largest university in the USA, and someone made the foolish decision to switch from Blackboard to Canvas, which caused enough migration headaches just getting everything into Quizzes. Formula questions were not handled properly and had to be fixed manually, as did any questions that used images or mathematics in the answers or in the student feedback text. We're all dreading the forced move to Quizzes.Next, which will be yet another migration. At the moment, Quizzes.Next is a non-starter for a wide range of reasons. For our online portfolio (which is also very large), that long list of reasons ALSO includes this issue of not being able to properly lock down the browser. Given how much of a headache the botched Blackboard-to-Canvas migration was, we would all LIKE to just start using Quizzes.Next so that we don't have to worry about another botched migration from Instructure, but we can't even do that, because we cannot ensure that our online students won't abuse this particular hole in Quizzes.Next.
Canvas largely seems half-baked. Does it really take this long for the product to become usable at scale? Is this not a priority at Instructure? What is more important? Being able to drag cards around the Dashboard was never something I cared about... and yet that's the big feature we get, while these other feature requests and bugs just become an everyday part of life working with Canvas...
Yeesh, Ted... sounds like you got off to a bad start with Canvas. Hopefully, the past 8 months have improved things. We love Canvas, and this is far and away the most negative thing I've heard about Instructure and their products.
Cheers to a better Canvas future!
All of my old opinions still stand. Things have not improved. Those who prefer Canvas simply must not know what they are missing.
I'm wondering, Ted, if you and your institution have a communication channel to your Customer Success Manager (CSM). We meet with ours as a consortium fairly regularly, and I've been asking them for regular updates about New Quizzes. She made us aware of the slated Q2 timeline for RLB + Monitor support in New Quizzes several weeks before this updated timeline was published: User Group: New Quizzes
All the CSMs we've had over the course of our license for the past 8 years have been fabulous. We've been with Canvas/Instructure since almost the earliest days of the product. There have been some bumps along the way, but they've always strived for transparency when bumps have been encountered.
The support our institution receives (which, again, is the largest university in the USA and certainly a significant client of Instructure, I would imagine) seems to be limited to CSMs suggesting that faculty continue to submit feature requests (even when the requests are clearly bug reports) and hope those requests get enough votes to even land on Instructure's radar. Knowing timelines for future deployment is simply not useful. We teach classes now, and we are not going to go back and overhaul a class just to take advantage of some feature we asked for two years ago. The problem seems to be that Instructure has dumped all of its investment into public and customer relations and apparently very little into development. I would be happy hearing even less from Instructure if it meant that "features" were implemented (and bugs were fixed!) in a timely fashion. I don't want to be handled by Instructure; I want Instructure to handle continuous development and improvement of their software.
I've been sent down the "request a feature" route before too. That's no fun if that's what your CSM is constantly doing. This specific issue with New Quizzes is frustrating because, in order to get a not-fully-baked feature developed, you have to generate some real user usage so the product team gets feedback.
New Quizzes has created this love-hate relationship for Canvas users because, on the one hand, it solves some shortcomings of the original quizzing engine, but on the other, it has a feature parity gap that has taken much too long to close. In my opinion, Instructure put this on themselves and made their clientele pay for it by putting this out to users as part of the main feature set, rather than as a 100% complete solution. I see lots of questions in the community about fixes needed in Classic Quizzes, only for people to be told to wait until New Quizzes is complete.
I can only repeat what our CSM has stated: that hopefully the Respondus feature will be ready sometime in the April, May, June time frame. This will make our nursing instructors feel more confident in the way they prepare their students for NCLEX testing.
There are certainly a lot of practical problems when a development team does not include instructional professionals and requires its paying users to do the testing and ideation for it; that's why other software developers include users among their ranks and require a bit of "dogfooding." Which suggests the question: why aren't there instructional professionals on the development side of Instructure?
In this particular case, being able to lock down Quizzes Next/New Quizzes is not an advanced feature; it is so very basic that not having it is a non-starter.
There are minor flaws in New Quizzes – like the fact that general feedback on an essay question is unreadable until that essay question is graded – but there are workarounds for those. There are no workarounds for New Quizzes being unable to be secured in testing environments.
Two other related headaches on the horizon that I am surprised I have heard very little about:
You would think that Instructure is listening to academic professionals when it comes to initiatives like New Quizzes. I was invited to their headquarters for their Khaki 2017 event. This was a focus group drawn from several customer demographics (K-12 and Higher Ed) that was given some insight into the development process in April 2017. We were asked to create user stories for existing features that needed enhancement or features that were missing. We came up with 50 things that have all since had some developer attention or have since become new features. At that time, I don't think New Quizzes was on the table as a top priority. In fact, I can't even remember if it existed yet. The Khaki 2018 event (which I wasn't invited to) seems to have spawned twice that many: Khaki 2018 Update
Possible (but not confirmed) delays in the current development process may be related to Instructure's engineers having to be quarantined, and the recent sale to Thoma Bravo: Thoma Bravo Instructure Acquisition
We greatly appreciated your participation in Khaki II (Khaki 2018 Update), and we appreciated the diverse perspectives from those who attended the first Khaki (Memories from Khaki 2015). It was so fun to be a part of those!
We are always seeking to improve feedback opportunities! No system is ever perfect, but as you've experienced, we enjoy trying new things. There are pros and cons to a robust feature idea system like the one we moderate, but the pro that outweighs them all is the opportunity to hear ideas and problems directly from the actual users! Votes are good and dandy, but as we've said before (and as you saw at Khaki), the use cases and stories are what matter most!
Our Product and Engineering teams are doing some pretty awesome work as they all work from home (fun fact: some are remote even when our offices are open). The Ideas page has been getting more regular updates, more frequent blogs are being posted to help explain the 'why' of features, and the User Group: New Quizzes has a clear timeline that is also being updated more frequently! I think you already know all this, but it's nice to see it in one place!
Ted, it is correct that your direct lines to share your ideas are either through the feature ideas process and/or through your local Canvas Admin. I hope you find power in knowing that your voice gets to be heard directly and individually through this awesome Community and not filtered!
This response illustrates exactly the kind of public/customer relations I was referring to above. It's unclear why Instructure needs more feedback to understand the importance of Respondus support. But apparently we just need to keep voting?
At no time did I say that Instructure was not planning on getting New Quizzes and Respondus LDB working together. The point here is that it is unclear why it has taken this long for the integration to happen. Respondus has stopped implementing features and fixing bugs in classic Quizzes because it has shifted its development effort to New Quizzes, but it should have started with this feature -- not left it until the end.
RLDB compatibility should have been a blocking item on Instructure's schedule, because it is definitely blocking implementation for the rest of us.
In this case there are two parties involved: Respondus as the third party, and Instructure. For a while we were going back and forth with both of them, trying to get a sense of whose court the ball was in. It wasn't helpful when each pointed the finger at the other company.
Anyhow, it's a tough sell when this feature was left out of the first available version of New Quizzes, which came out of 'beta' status early last summer. I'm not sure why it had to happen then. Like I said, Instructure sort of brought this issue on themselves and hurt their customers' expectations by releasing it as unfinished and as early as they did. But I think they are more on track to get it finished now that they've had some user feedback, by letting actual assessments take place that didn't need LockDown Browser support.
Just adding here. The relationship between Instructure/Canvas and third parties is critical to the success of a Canvas deployment. Based on our experience so far, that relationship needs more work. This statement is made against the backdrop of the message we receive that Canvas is a 'low feature' platform, created to deliver what the customer needs through third-party integrations and local development ('Build your own awesome').
If there isn't rock-solid alignment between feature releases and the associated third-party tools, this will fail.
Equally, we seem to sit in a group who have struggled to 'Build our own awesome' because of poor, outdated, or missing documentation. But that's another discussion altogether.
In addition, unless I'm missing something, why should basic quiz functionality be implemented as an LTI tool at all? If this were built into Canvas like Classic Quizzes, wouldn't we be in good shape? The LTI bottleneck makes interacting with quizzes while building them ugly and far less seamless, and it introduces this problem with Respondus that certainly should have been anticipated ahead of time.
I agree with you that creating it as an LTI tool was maybe not the smartest move. My guess, and what I think I've heard from others, is that by creating it as an LTI tool the team hoped to push the development of the APIs within Canvas needed to support it. Building it this way would then benefit other LTI integrations in the future as well.
I'm also thinking it was done for scalability. I know Canvas already does a bit of spinning servers up and down through AWS to match current demand, and I imagine having the quiz engine built outside of Canvas core could also help with scalability during peak traffic. I don't know if these two reasons are true, but if so, I can understand the desire to stay the course with keeping it an LTI tool. Then again, I also wish it had been ready, as in really ready, two years ago, and that we were not being pushed to something many of us can't adequately use yet.
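For anyone wondering what the "LTI bottleneck" discussed above actually involves: classic LTI 1.1 launches work by having Canvas POST a set of launch parameters to the external tool, signed with OAuth 1.0a HMAC-SHA1, every time a user crosses over into the tool. Here is a minimal sketch of that signing step; all keys, URLs, and IDs below are invented for illustration, and a real integration would also normalize the URL and add user/course claims per the LTI spec.

```python
# Hedged sketch of an LTI 1.1 launch signature (OAuth 1.0a HMAC-SHA1).
# This is the handshake that happens on every hand-off from the LMS to an
# external tool. All values below are made up for illustration only.
import base64
import hashlib
import hmac
import urllib.parse


def sign_lti_launch(launch_url: str, params: dict, consumer_secret: str) -> str:
    # 1. Percent-encode each key/value pair and sort them (OAuth 1.0a rules).
    encoded = sorted(
        (urllib.parse.quote(k, safe=""), urllib.parse.quote(str(v), safe=""))
        for k, v in params.items()
    )
    param_str = "&".join(f"{k}={v}" for k, v in encoded)

    # 2. Build the signature base string: METHOD & encoded-URL & encoded-params.
    base_string = "&".join([
        "POST",
        urllib.parse.quote(launch_url, safe=""),
        urllib.parse.quote(param_str, safe=""),
    ])

    # 3. HMAC-SHA1 keyed with "consumer_secret&" (no token secret in LTI 1.1),
    #    then base64-encode the digest to get the oauth_signature value.
    digest = hmac.new(
        (consumer_secret + "&").encode(), base_string.encode(), hashlib.sha1
    ).digest()
    return base64.b64encode(digest).decode()


# Hypothetical launch parameters the LMS would POST to the tool.
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "quiz-42",          # made-up resource ID
    "oauth_consumer_key": "example-key",    # made-up credentials
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1600000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}

sig = sign_lti_launch("https://tool.example.com/launch", params, "example-secret")
```

Every quiz view, edit, and submission has to round-trip through a launch like this into the external tool's own frame, which is part of why embedding the quiz engine feels less seamless than a native feature, and why a lockdown browser has a harder time reasoning about what is being displayed.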
Sharing your frustrations and observations. We began running large-scale (1000+), LMS-based, BYOD, wifi, in-class midterm and final exams in 2012 in WebCT Vista. We survived migrations to Blackboard relatively unscathed, and during our RFP selection process the assessment engine was of highest priority; we were assured that all would be fine. It is not. No doubt Instructure changed the market with their front-end UX and instructor 'ease of build,' but the 'back of house' functionality, such as the assessment tools and the gradebook, now needs the same attention. If I could wave a magic wand, I would resurrect the WebCT Vista assessment engine. Many may shudder at that, as it wasn't easy, but once you figured out how to fine-tune it, it was solid.
Hey Ted, that's pretty much it in a nutshell for us too. Enough already with the 'shiny shiny' feature releases - and start dealing with the fundamental system-level issues (particularly in assessment) - that mean the platform can be used robustly, programmatically and at scale.
When you consider that our business basically boils down to 1) teaching and 2) assessment - it's amazing how far behind 2) is.
Is there any update to this issue? It would be nice to roll out New Quizzes along with New Grades this winter, but without LDB New Quizzes isn't a consideration.
I share the apprehension our Canvas trainer reported: that starting this fall the standard quiz may not work as Canvas pushes for adoption of the Quizzes LTI. Honestly, from what I've seen it's a great program, but without LDB no one on campus will use it. I'm in nursing, and our department is in full-blown panic. The online quizzes have helped our students score higher on the NCLEX by getting them used to computerized testing, but students are so likely to screenshot an exam, or to try to log in from home or from another computer on campus during a review or while the exam is open, that we can't risk losing the integrity of our exams. My dean is even talking about looking for other options.
David, you may rest assured that this:
starting this fall the standard quiz may not work as canvas pushes for adoption of quizzes lti
is not accurate. We have not yet set a date for when Classic Quizzes will no longer be available, but we will provide plenty of notice and lead time for this change. Generally speaking, Upcoming Canvas Changes is a good resource to follow to stay on top of major changes.
Looks like they'll need to sort it out before then:
Whenever possible, please use links over images so that people always have the most accurate information. Here are two helpful links as people seek to know more on the New Quizzes timeline.
User Group: New Quizzes
New Quizzes to replace Classic Quizzes July 2021
Fair point - it was taken from the Upcoming Canvas Changes link Stefanie posted. Deprecation point is the same.