I am the project manager for our transition to Canvas @ UW-Madison. We have identified some gaps in question types and behaviors in the Canvas quiz tool and were wondering if anyone is using another tool integrated with Canvas that meets these needs. We have been in contact with Instructure about these needs but wanted to ask the Community as well.
Thank you for your help.
Non-STEM Question Types
Audio/Video Recording questions
Instructor asks a question in audio (or video) format and the student must answer in the same format. Used by foreign language classes.
Drag and Drop questions
Student must drag some object onto a target surface. For example, dragging a marker onto a map or a text label onto a mathematical graph.
STEM Question Types
Numeric answer questions
Questions where the answer is a number. A range of correct answers may be specified, e.g. within 1% or +/- 10.
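Just to make this requirement concrete, here's a minimal sketch of tolerance-based grading (a hypothetical helper, not any particular tool's API):

```python
def within_tolerance(student, correct, pct=None, abs_tol=None):
    """Accept a numeric answer within a percent or absolute tolerance."""
    if pct is not None and abs(student - correct) <= abs(correct) * pct / 100:
        return True
    if abs_tol is not None and abs(student - correct) <= abs_tol:
        return True
    return False

# e.g. within 1%:
print(within_tolerance(101.0, 100.0, pct=1))      # True
print(within_tolerance(98.0, 100.0, pct=1))       # False
# e.g. +/- 10:
print(within_tolerance(95.0, 100.0, abs_tol=10))  # True
```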
Formula answer questions
Questions where the answer is a mathematical expression or equation. The quizzing engine must correctly evaluate the student's answer even if it is stated in a very different but mathematically equivalent form. The system must also include some kind of "equation editor" for instructors to create, and students to answer, these questions.
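One common way engines check "mathematically equivalent" answers is numeric spot-checking: evaluate both expressions at many random points and compare. A rough sketch of that idea (this is an illustration of the technique, not how any specific product implements it):

```python
import math
import random

def equivalent(f, g, trials=100, tol=1e-9):
    """Spot-check two expressions (given as functions of x) for
    mathematical equivalence by comparing values at random points."""
    for _ in range(trials):
        x = random.uniform(-10, 10)
        try:
            if not math.isclose(f(x), g(x), rel_tol=tol, abs_tol=tol):
                return False
        except (ValueError, ZeroDivisionError):
            continue  # skip points outside either expression's domain
    return True

# (x + 1)^2 and x^2 + 2x + 1 are different forms of the same expression:
print(equivalent(lambda x: (x + 1) ** 2,
                 lambda x: x * x + 2 * x + 1))  # True
print(equivalent(lambda x: (x + 1) ** 2,
                 lambda x: x * x + 1))          # False
```

A real engine would pair this with a parser for the student's typed (or equation-editor) input; the sampling step is the part that makes "equivalent but differently written" answers gradeable.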
Calculated questions with variable datasets
These questions require a numeric or formula-type answer, but the specific question asked is based on one or more randomly chosen input variables. The quizzing engine must use the values of these variables when calculating the correct answer to the specific question asked. Variable values can be generated from a range, a sequence, or a table in which each row represents a "sensible" selection of related variables.
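The mechanics of this requirement look roughly like the following (a toy sketch with made-up variable ranges, not a real tool's authoring format):

```python
import random

def make_calculated_question(seed=None):
    """Generate one instance of a calculated question: draw the
    variables, then compute the correct answer from those same values."""
    rng = random.Random(seed)
    distance_km = rng.randrange(100, 500, 10)  # drawn from a range
    speed_kmh = rng.choice([40, 60, 80, 100])  # drawn from a value table
    question = (f"A car travels {distance_km} km at {speed_kmh} km/h. "
                f"How many hours does the trip take?")
    answer = distance_km / speed_kmh  # engine grades against this value
    return question, answer

question, answer = make_calculated_question(seed=42)
print(question)
print(round(answer, 2))
```

Each student (or attempt) gets a different seed, so everyone sees structurally the same question with different numbers.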
Calculated questions with shared variable datasets
These questions are like calculated questions with variable datasets, except that two or more questions need to share the same randomly selected variables, e.g. a "story" problem where multiple questions are asked about the same story.
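The distinguishing feature here is that one random draw feeds several questions. A toy sketch (hypothetical structure, invented numbers):

```python
import random

def make_story_problem(seed=None):
    """One randomly chosen dataset shared by several questions, so
    every part of the 'story' uses the same numbers."""
    rng = random.Random(seed)
    price = rng.choice([5, 8, 10, 12])  # dollars per item
    quantity = rng.randrange(3, 20)
    return [
        (f"Items cost ${price} each. What do {quantity} items cost?",
         price * quantity),
        (f"With a 10% discount, what do {quantity} items cost?",
         price * quantity * 0.9),
    ]

for text, answer in make_story_problem(seed=7):
    print(text, "->", answer)
```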
Question Behaviors / Options
Award partial credit on multiple attempts
Student gets full credit for correct answer on first try and reduced credit for correct answer on later tries.
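As a concrete example of one possible scoring rule (the per-attempt penalty fraction here is an arbitrary illustration, not a value from any tool):

```python
def partial_credit(points_possible, attempt_number, penalty_per_attempt=0.25):
    """Full credit on the first correct try; each later try loses a
    fixed fraction of the points, never going below zero."""
    reduction = penalty_per_attempt * (attempt_number - 1)
    return max(0.0, points_possible * (1 - reduction))

print(partial_credit(10, 1))  # 10.0 -- full credit on the first try
print(partial_credit(10, 2))  # 7.5
print(partial_credit(10, 3))  # 5.0
```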
Reattempt individual questions
Student can re-attempt questions without starting the whole quiz over. Important when quiz is assigned for homework and student needs to understand one question before moving on to later questions.
Regrade after submissions
When an instructor discovers a mistake in the correct response for a question after some students have taken the quiz, they need to be able to fix that mistake and have the system re-grade all completed quizzes using the corrected response.
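Conceptually this just means stored responses are re-scored against a corrected key, something like this sketch (hypothetical data shapes, simple one-point-per-question scoring):

```python
def regrade(submissions, corrected_key):
    """Re-score every completed quiz against a corrected answer key.
    submissions:   {student: {question_id: response}}
    corrected_key: {question_id: correct_response}"""
    return {
        student: sum(1 for qid, resp in answers.items()
                     if corrected_key.get(qid) == resp)
        for student, answers in submissions.items()
    }

submissions = {"ann": {"q1": "B", "q2": "C"},
               "bob": {"q1": "A", "q2": "C"}}
# instructor fixes q1: the correct answer was "A", not "B"
print(regrade(submissions, {"q1": "A", "q2": "C"}))  # {'ann': 1, 'bob': 2}
```

The key requirement is that the system keeps the original responses, so re-grading needs no action from students.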
Grade scale or Pass / Fail
Instructor supplies a grading scale so students see a letter grade or pass/fail in the gradebook instead of a raw score.
Organize question banks
Organize question banks into hierarchical system of categories. Hierarchy can be as deeply nested as needed.
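For illustration, an arbitrarily deep category hierarchy is just a tree; a nested-dict sketch (hypothetical structure and names, not any tool's data model):

```python
# Question ids can live at any level; subcategories nest as deeply as needed.
banks = {
    "Chemistry": {
        "_questions": [],
        "Organic": {"_questions": ["q101", "q102"],
                    "Alkenes": {"_questions": ["q103"]}},
    },
}

def all_questions(node):
    """Collect question ids from a category and every nested subcategory."""
    found = list(node.get("_questions", []))
    for name, child in node.items():
        if name != "_questions":
            found.extend(all_questions(child))
    return found

print(all_questions(banks["Chemistry"]))  # ['q101', 'q102', 'q103']
```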
Control of feedback
Feedback may be provided to students based on their specific response to a question, on whether they got a question correct, or on their performance on the whole quiz. For homework, early and frequent feedback is appropriate; for exams that the instructor wants to reuse in subsequent semesters, feedback must be tightly controlled.
Quiz statistics
Instructors are able to see a breakdown of the quiz scores across a quiz, including some visual representations of the distribution and other significant statistics. May include a question-by-question breakdown as well. Students may be able to see (under instructor control) some statistical information (class average, high/low score, etc.) when they look at their own grades.
Quiz behavior analytics
Instructors are able to access data suggesting how students approached the quiz including:
-attempt information (how many attempts it took students to complete the quiz/how many attempts they have remaining and the grade history for those attempts)
-timestamp information for questions/quiz (how long did students spend per question, per whole quiz, when did they start the quiz)
-order of answering (students generally skipped question 4 until the very end)
-progression of answers for a single question (first the student answered B which was correct, then they changed their answer to C which was incorrect)
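To illustrate the timestamp item above: given per-question focus events, time-per-question is a simple fold over consecutive timestamps. A toy sketch (hypothetical event format, invented numbers):

```python
def time_per_question(events):
    """Given ordered (timestamp_seconds, question_id) focus events plus
    a final ('end') marker, total the seconds spent on each question."""
    totals = {}
    for (t, qid), (t_next, _) in zip(events, events[1:]):
        totals[qid] = totals.get(qid, 0) + (t_next - t)
    return totals

# student opened q1, moved to q2, came back to q1, then submitted
events = [(0, "q1"), (40, "q2"), (65, "q1"), (90, "end")]
print(time_per_question(events))  # {'q1': 65, 'q2': 25}
```

The other analytics items (attempt counts, answer order, answer-change history) fall out of the same kind of event log, as long as the quiz tool records it.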
@nicole_olthafer , welcome to the Canvas Community! While you're in the process of evaluating the current quiz tool, I hope you're also following the massive quiz refactor project underway at https://community.canvaslms.com/community/ideas/quizzesnext?sr=search&searchId=da6b7824-66e0-4dd5-bb....
I do all sorts of crazy things to adapt to the current limitations of Canvas. I'll start with one issue, then see if I can come back and address more.
One of the problems you mention, carrying a variable through multiple questions, is addressed by MyOpenMath:
The tool has common kernel questions (allowing you to use a common variable through multiple questions), and the items created can be easily integrated with Canvas. You lose a little of the analytics within Canvas this way, but you can see the data in MyOpenMath. MyOpenMath is specifically geared toward math-based fields, and it may also meet more of the STEM-type question issues you are trying to address.
Here are also some of the things you can do now (perhaps you can clarify how the current behavior doesn't match your needs, so we can brainstorm solutions):
Our foreign language teachers do the recorded questions with an essay question. Although students could type an answer, they just have to follow the instructions and use the media tool instead, which lets them record their answer that way.
I would love to see Canvas support H5P (h5p.org) in the Gradebook. What needs to happen for that to happen? Is it just an LTI that someone needs to build? @nolthafer , have you looked at this as a potential solution?
Thanks @MattHanes ! This will be an interesting area to watch! I agree that the openness makes it a beautiful opportunity! Do you (or anyone?) know if Instructure is working with H5P to integrate, rather than compete, with similar interactions?
FYI @nolthafer I'm following a great discussion on Moving from the Transactional LMS to the Transformational LRM in which @kmeeusen and @dwillmore talk about how Canvas is working as a shell that plays well with others:
the LMS should simply be a shell that provides a simple way to post content of various types, keep a list of students and teachers, provide security to the LMS course shell, a grade book, simple assignment, and quizzing. Keep it as simple as possible, but then open the LMS to integration with other web tools. Allow that integration to provide single-sign-on (SSO) and grade push back to the LMS. Make that integration secure and easy.
This is, and always has been, one of the primary goals of Canvas, and to date nobody does this better. Canvas is even using the LTI tool model to build its new quizzing engine. This is one of the features that sold me and our state system on Canvas: the ability to customize Canvas quickly and easily using integrated tools. Now almost every education tech vendor in the country is clamoring to integrate their tools with Canvas, and the ones who aren't are watching their market share plummet.
In my mind, this de-coupling and decentralizing of "big [built-in] solutions" can offer a lot of agility and flexibility, provided it stays open (vs proprietary). Definitely worth a read!