New Quizzes: Let students have a choice of essay questions

As a faculty member in the humanities, I always pose essay questions on exams. I prefer to pose more questions than students need to answer and to let students choose which questions they would like to write about. Right now I can post, say, 5 possible questions, each worth 100 points, and instruct students to choose 3 to answer. But Canvas assumes that if I post 5 questions, the grading scale will be 500 points. Canvas does allow me to have it randomly select 3 of the 5 questions for each individual student to answer, which creates a 300-point grading scale for the exam. But I would like students themselves to have that choice, and for Canvas to recognize it, so that the grading scale is correct.

5 Comments
Stef_retired
Instructure Alumni
Status changed to: Open

@eandersn Thanks for sharing this idea. We've moved it forward for further discussion after making a slight edit to the title to align it with New Quizzes, as our product teams are no longer developing new functionality on the code base for Old (Classic) Quizzes.

I too used to teach humanities, and like you, I liked to give students a choice of questions from which to choose in quizzes as well as assignments. The interim solution for users of Classic Quizzes is to place, say, five essay prompts in a Text (no question) item type and include instructions in the text entry to respond to three of those prompts; the instructor would then create only three blank essay questions where the students would submit their responses. This results in accurate grading, but only works when an instructor chooses not to randomize questions, as the three blank questions need to display immediately after the Text (no question) item.

Users of New Quizzes could do something similar by using a stimulus that contains five prompts but only three attached essay responses—and in this case, questions could still be randomized, because the attached essay responses will stay with the stimulus in the randomization process.

Hope this helps a bit. 🙂

eandersn
Community Member

Thanks! I did consider your suggestion, but from a grading perspective it is suboptimal. The best way to grade a batch of essay exams is to grade all the answers to question 1, then all the answers to question 2, and so on, across all students. That way your mind stays focused on a single question for a decent stretch, and you get a good sense of what the average answer looks like, and hence what an outstanding or a subpar answer is for that class. This helps maintain consistency and fairness in grading standards. With each question given its own essay box, it is easy to scan each exam to see who answered which question. When students write, say, 3 essays in 1 box, it takes more hunting to determine which questions they answered, especially since they might answer them in a different order.

Stef_retired
Instructure Alumni

@eandersn To clarify, the students would write each essay in a different box (as you'd create three, each of which contributes 100 points to the score), so grading the questions one at a time is still feasible under this workaround scenario.

Stef_retired
Instructure Alumni

...but on re-reading your comment, I have more fully absorbed what you're saying. Point taken. 🙂

ProductPanda
Instructure
Status changed to: Archived
Comments from Instructure

As part of the new Ideas & Themes process, all ideas in Idea Conversations were reviewed by the Product Team. Any idea associated with an identified theme was moved to the new Ideas & Themes space. Any idea that was not part of the move is being marked as Archived. This preserves the history of the conversations while letting Community members know that Instructure will not explore the request at this time.