This idea was first requested in 2015, and five years later it still doesn't seem to have been implemented.
When submitting a quiz, a confirmation warning seems to appear only when some questions are unanswered. It would also be very helpful if Canvas could confirm submission even when ALL questions HAVE been answered, because students (like myself) may want to go back and check their answers while there is time remaining on the clock. This would greatly help when a student inadvertently clicks Submit. Let's face it: in a face-to-face exam, a student isn't going to 'accidentally' hand in their exam!
If this has already been implemented and I have missed it please let me know.
Thanks so much.
Basically just the title. If you've already written the question and formula and generated answers, you should be able to go back and make a small edit to the question stem without pressing Generate again, as long as you don't edit any of the formula parameters in the stem.
Sidebar: Why isn't the list of generated answers in a scrollable window? That functionality was lost in the transition from Classic Quizzes and it's incredibly annoying. Instructure is going to owe me a new mouse for all the extra scrolling.
Idea suggestion: Create a way to edit question settings in bulk so that we can set all multiple-answer question types to exact match.
Background: One of the benefits of New Quizzes is the ability to make multiple-answer questions exact match, as opposed to awarding partial credit when a student selects some but not all of the correct answers. Currently you have to edit each multiple-answer question individually to change the setting from partial credit (the default) to exact match. The workflow is tedious and time-consuming if you are using item banks with hundreds of questions.
Worse, when you open your item bank, filter by multiple-answer questions, edit a question, and return to the item bank, there is no way to see which questions you have already changed to exact match and which you haven't.
There should be a way to set all multiple-answer question types to exact match.
Hello,
I'm a university student, and I recently took the DAT exam and stumbled upon a feature that would greatly improve Canvas's test/quiz system. The DAT allows students to right-click an answer choice in multiple-choice questions, which "eliminates" that choice by striking through the text. This feature would be incredibly useful, especially now during COVID-19, when almost all tests and quizzes are online nationwide. Students have lost the ability to eliminate wrong answers, which is a key test-taking strategy.
Here is an example of what this could look like:
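For instance, a minimal browser-side sketch might look like the following (the ".answer-choice" selector and "eliminated" class are placeholders I invented for illustration, not Canvas's actual markup):

```typescript
// Right-clicking an answer choice toggles a strike-through instead of
// opening the browser context menu. Pair the class with CSS such as:
//   .eliminated { text-decoration: line-through; opacity: 0.6; }
document.querySelectorAll<HTMLElement>(".answer-choice").forEach(choice => {
  choice.addEventListener("contextmenu", (event: MouseEvent) => {
    event.preventDefault();                // suppress the context menu
    choice.classList.toggle("eliminated"); // strike/unstrike the choice
  });
});
```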
Thank you for your consideration.
Please fix the numbering of the answers shown when looking at a student's submitted quiz. The relationship to the names/numbering in the problem seems random, and some answers are simply omitted! E.g., on one question there are 16 matrix entries to fill in, yet I am shown only "Answer 1" through "Answer 12", and their relationship to the 16 entries that should be shown is totally unclear.
Here is a screenshot of a much smaller example: this question has three answers, which the student answered "a", "a+b", and "c". The grading view then shows "Answer 1" and "Answer 2" only.
These were actually the second and third answers to be entered.
(screenshot displaying the Classic Quizzes interface removed to preserve student privacy)
Now that New Quizzes and Respondus LDB are working together, more faculty are trying the Quizzes->New Quizzes migration tool. This is uncovering many interesting hidden features.
When you create a Matching question in old Quizzes, it is possible to have multiple question stems match to the same answer. That is done by simply repeating the answer across the different stems. Then, when the question is rendered, old Quizzes only shows the repeated matching answers once in the list.
Faculty have learned that when the migration tool creates a New Quiz question out of these matching questions, the New Quiz will repeat these answers multiple times.
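To illustrate with a made-up example: old Quizzes effectively deduplicates the visible answer list before rendering, and that is the behavior the migration tool fails to preserve. A sketch of the idea:

```typescript
interface MatchPair { stem: string; answer: string; }

// My reading of old Quizzes' behavior: every stem keeps its match, but
// the visible list of answers is deduplicated before it is displayed.
function renderAnswerList(pairs: MatchPair[]): string[] {
  return Array.from(new Set(pairs.map(p => p.answer)));
}

// Hypothetical question with a deliberately repeated answer:
const pairs: MatchPair[] = [
  { stem: "Water (H2O)",  answer: "liquid at room temperature" },
  { stem: "Mercury (Hg)", answer: "liquid at room temperature" },
  { stem: "Oxygen (O2)",  answer: "gas at room temperature" },
];
console.log(renderAnswerList(pairs));
// -> ["liquid at room temperature", "gas at room temperature"]
```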
This is obviously a bug in the migration tool. But I am pretty sure a customer support rep on this forum is going to ask whether I have submitted a support request through my institution, so that my institution can lobby Instructure through back channels with an official support request. Why Instructure cannot accept bug reports directly here (i.e., why reps don't have the ability to file action reports themselves) seems like a larger problem. For now, I post this here to warn people who have matching questions in their old quizzes and plan to use the migration tool to create matching questions in New Quizzes.
Hello,
With respect to Canvas Quizzes, I really like using the multiple-answer format in my university class. The problem is that I don't like the grading methodology, which I understand works as explained below. Here is an example: I give five possible answers, two of which are correct. If a student chooses one correct answer and one wrong answer, then Canvas grades as follows:
1. Since there are two correct answers, each answer is worth 0.5 points. The student gets 0.5 points for every correct answer selected and has 0.5 points deducted for every wrong answer selected.
2. So in this case, the student's grade is 0.5 - 0.5 = 0 points, as sketched below.
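In code form, my understanding of that current rule looks something like this (my own sketch of the behavior described above, not Instructure's actual implementation):

```typescript
// Current rule as I understand it: each correct answer is worth
// points/numCorrect, and each wrong selection deducts the same amount
// (I am assuming the per-question score is floored at zero).
function currentScore(points: number, numCorrect: number,
                      correctChosen: number, wrongChosen: number): number {
  const unit = points / numCorrect;
  return Math.max(0, (correctChosen - wrongChosen) * unit);
}

console.log(currentScore(1, 2, 1, 1)); // 0 -- the 0.5 - 0.5 example above
```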
I would argue that the student got the question half right. If it were up to me, I would give this student half credit (0.5 points) for this question. I would appreciate an option allowing the instructor to choose this grading option.
To elaborate, this is how I think this alternative grading option should work using the same example from above:
Recall that there are five possible answers and two are correct. If a student chooses one correct answer and one wrong answer, then the grade should be computed as follows:
1. Since there are two correct answers and the student chose two answers, the student gets credit for every correct answer: 0.5 points for the correct answer, with no penalty for the wrong answer.
2. So in this case, the student's grade is 0.5-0=0.5 points.
3. However, it is important to penalize the student for selecting too many answers. So the rule would be: as long as the student selects no more answers than there are correct answers (i.e., two in this case), there is no penalty for a wrong answer. However, if the student selects more answers than there are correct answers (i.e., three answers in this example), then points are deducted as follows:
1. The student gets 0.5 points for the correct answer and no penalty for the first incorrect answer, but loses 0.5 points for each answer selected beyond the number of correct answers. So a student who picks one correct answer and two incorrect answers would score zero (0.5 - 0 - 0.5 = 0), as sketched below.
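Putting the whole proposal together as a small sketch (the function and parameter names are just for illustration):

```typescript
// Proposed rule: full per-answer credit for each correct selection; wrong
// selections are free until the total number of selections exceeds the
// number of correct answers, after which each extra selection deducts one
// answer's worth of points.
function proposedScore(points: number, numCorrect: number,
                       correctChosen: number, wrongChosen: number): number {
  const unit = points / numCorrect;
  const excess = Math.max(0, correctChosen + wrongChosen - numCorrect);
  return Math.max(0, correctChosen * unit - excess * unit);
}

console.log(proposedScore(1, 2, 1, 1)); // 0.5 -- one right, one wrong
console.log(proposedScore(1, 2, 1, 2)); // 0   -- one right, two wrong
```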
I would be happy to discuss this further with a member of your development team or product management team. I would be grateful for your consideration of this alternative grading approach.
Thanks very much,
David (david.geffen@eccles.utah.edu)
Thank you, Canvas, for deploying scientific notation in Numerical Answer questions. However, the system is rounding and truncating in a way that turns too many values into zero. For example, create a numerical question and enter "1E-5" as one of the Exact Answers: Canvas will substitute zero for the answer. It is common in physics and chemistry for answers to be much smaller than this. For example, the gravitational force between two humans separated by 1 meter is on the order of 4E-7 Newtons, and other questions might yield answers much smaller than that, say in the range 1E-20 to 1E-30 (I would suggest supporting values down to somewhere between 1E-40 and 1E-50). Can Numerical Answer calculations be changed to double-precision variables, or whatever it takes? Thanks!
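For reference, a standard double easily covers these magnitudes; for example, in TypeScript, where numbers are IEEE 754 doubles:

```typescript
// JavaScript/TypeScript numbers are IEEE 754 doubles, which cover the
// magnitudes mentioned above with plenty of room to spare.
console.log(1e-5);             // 0.00001 -- should not become zero
console.log(4e-7);             // 4e-7    -- the gravitational-force example
console.log(1e-30);            // 1e-30   -- still representable
console.log(Number.MIN_VALUE); // 5e-324  -- smallest positive double
```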
Hello,
Since it can be difficult to maintain test-bank integrity in an online program, it would be wonderful if students could receive only the comments on the quiz questions they missed, and not the entire question and answer. Currently, students do not get a test review unless they request one; for each missed question we provide only the topic/content area, along with the book and the page number where the answer can be found.
I would love for all students to receive, once item analysis is complete, only the comments on the questions they missed rather than the entire question and answer, to help maintain test integrity.
Is there a way this can be done in Quizzes?
Thank you.
I teach Field Biology, where students learn to identify different species of plants and animals. There are many times when I would like to add images as answer choices in multiple choice or matching questions. For instance, having students match pictures of animal tracks with the names of the animals who made them. Or, give them a question about plant adaptations and then have multiple pictures for them to choose from in order to answer the question.
Actually, this would be cool in categorization questions and sorting questions too! It would be helpful to be able to insert images that students have to sort or drag and drop into set categories.
- Tags:
- images
- new quizzes
