I'm looking for the best way to extract all user quiz responses across my entire organization. For example, I have a post-course survey that collects feedback in the form of multiple choice questions and essay questions. I want to see the individual user responses per user and course. I've tried using {{domain}}/api/v1/courses/:course_id/quizzes/:quiz_id/submissions/:id/events and {{domain}}/api/v1/courses/:course_id/quizzes/:quiz_id/statistics to pull info out but it seems like there must be a better way to get to this data. Any help would be appreciated, thanks!
I am having the same problem. I know this thread is old, but did you ever get an answer?
I've not been able to do this using Canvas Data 1 (CD1), so where needed we also use the API method to extract the data for the student analysis report and place it into a table.
Alternative ideas include using a tool designed for end-of-course surveys, such as EvaluationKIT by Watermark, or even a Microsoft Form. Since we have Office 365 for everyone, a link to such a form is easy to distribute. The Watermark route is a more secure way of ensuring that only people enrolled in the course are posting responses. Another benefit of Watermark is that the responses are anonymous.
To retrieve quiz results, I needed to use the API:
- List Terms
- List Courses for the desired Term
- For each course, get the list of Quizzes
- (Determine if the Quiz matched the criteria)
- For each quiz, get the submissions
- For each submission, get the details (events). Note that there might be multiple responses for a question
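The walk above can be sketched roughly as follows. This is a hypothetical outline, not tested against a live instance: `BASE`, `TOKEN`, and the `quiz_matches` filter are placeholders you would supply, and the endpoint paths follow the Canvas REST API (note that the terms and quiz-submissions endpoints wrap their lists under the `enrollment_terms` and `quiz_submissions` keys, while courses and quizzes return bare arrays).

```python
import json
import re
import urllib.request

BASE = "https://canvas.example.edu/api/v1"   # placeholder instance URL
TOKEN = "YOUR_ACCESS_TOKEN"                  # placeholder API token

def api_get(url, token=TOKEN):
    """GET one page; return (parsed JSON body, Link header for pagination)."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp), resp.headers.get("Link")

def next_url(link_header):
    """Extract the rel=\"next\" URL from a Canvas Link header, if present."""
    if not link_header:
        return None
    m = re.search(r'<([^>]+)>;\s*rel="next"', link_header)
    return m.group(1) if m else None

def get_all(url, key=None):
    """Follow Canvas pagination; some endpoints wrap the list under `key`."""
    items = []
    while url:
        page, link = api_get(url)
        items.extend(page[key] if key else page)
        url = next_url(link)
    return items

def quiz_submissions_for_term(account_id, term_name, quiz_matches):
    """Walk terms -> courses -> quizzes -> submissions, as in the steps above."""
    terms = get_all(f"{BASE}/accounts/{account_id}/terms?per_page=100",
                    key="enrollment_terms")
    term = next(t for t in terms if t["name"] == term_name)
    courses = get_all(f"{BASE}/accounts/{account_id}/courses"
                      f"?enrollment_term_id={term['id']}&per_page=100")
    for course in courses:
        for quiz in get_all(f"{BASE}/courses/{course['id']}/quizzes?per_page=100"):
            # quiz_matches is your own filter, e.g.
            # lambda q: "survey" in q["title"].lower()
            if not quiz_matches(quiz):
                continue
            subs = get_all(
                f"{BASE}/courses/{course['id']}/quizzes/{quiz['id']}"
                f"/submissions?per_page=100",
                key="quiz_submissions")
            yield course, quiz, subs
```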
I've replied to this question in other places, but missed seeing it here. Sorry for the delay; to help make up for it, I'll try to include a more complete response than I've given elsewhere.
For classic quizzes, all of the responses for each quiz, and for multiple students, can be obtained using the Submissions API. The trick is to add the query parameter include[]=submission_history and to use the assignment_id, not the quiz_id.
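As a minimal sketch of that call: the fetch itself is shown only as a commented-out usage line (the `get_all` pagination helper, `course_id`, and `assignment_id` are placeholders), while the flattening helper is pure and mirrors the response shape described in this thread.

```python
def iter_answers(submission):
    """Yield (user_id, attempt, question entry) for every answer recorded
    in a submission's submission_history (one history element per attempt)."""
    for attempt in submission.get("submission_history", []):
        for entry in attempt.get("submission_data", []):
            yield submission["user_id"], attempt.get("attempt"), entry

# Usage (assuming a get_all() helper that follows Canvas pagination):
# subs = get_all(f"{BASE}/courses/{course_id}/assignments/{assignment_id}"
#                "/submissions?include[]=submission_history&per_page=100")
# for sub in subs:
#     for user_id, attempt, entry in iter_answers(sub):
#         print(user_id, attempt, entry["question_id"], entry.get("text"))
```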
The answers are coded: multiple-choice responses don't give the actual response text, they give a code for the response, which is then compared against the question's list of possible answers to identify the response.
When you include the submission history, you get an array called submission_history with one element for each submission made (multiple attempts means multiple submissions). Each element in that array includes a property called submission_data, an array with one entry per question. The entries appear to be in the order the questions were asked on the quiz, but I won't state that definitively, because in the example I'm looking at I answered the questions in the order they were presented.
For example, here's what a response looks like:
{
  "submission_data": [
    {
      "correct": true,
      "points": 2,
      "question_id": 61125947,
      "answer_id": "5172",
      "text": "red",
      "more_comments": ""
    },
    {
      "correct": "partial",
      "points": 2,
      "question_id": 61125950,
      "text": "",
      "answer_for_verb": "saw",
      "answer_id_for_verb": "1423",
      "answer_for_noun": "texas",
      "answer_id_for_noun": null,
      "more_comments": ""
    },
    {
      "correct": "partial",
      "points": 2.4,
      "question_id": 61125956,
      "text": "",
      "answer_3288": "1",
      "answer_6891": "0",
      "answer_549": "1",
      "answer_5821": "0",
      "answer_5900": "0",
      "more_comments": ""
    },
    {
      "correct": false,
      "points": 0,
      "question_id": 61125957,
      "text": "",
      "answer_for_noun": 1572,
      "answer_id_for_noun": 1572,
      "answer_for_food": 6061,
      "answer_id_for_food": 6061,
      "more_comments": ""
    },
    {
      "correct": true,
      "points": 3,
      "question_id": 61125983,
      "answer_id": 4705,
      "text": "0.1250",
      "more_comments": ""
    },
    {
      "correct": true,
      "points": 2,
      "question_id": 61125986,
      "answer_id": 8243,
      "text": "11.0000",
      "more_comments": ""
    },
    {
      "correct": "defined",
      "points": 0,
      "question_id": 61125987,
      "text": "<p>peace</p>",
      "more_comments": ""
    },
    {
      "correct": "defined",
      "points": 0,
      "question_id": 61125988,
      "attachment_ids": null,
      "more_comments": ""
    }
  ]
}
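Since a coded entry carries an answer_id rather than the text (and, as the sample shows, the id may arrive as either a string or an int), decoding it takes a small lookup against the question definitions. A hedged sketch, assuming `questions_by_id` was built from question objects that carry an "answers" list of {"id", "text"} choices (the shape the Quiz Questions API returns):

```python
def decode_answer(entry, questions_by_id):
    """Map a submission_data entry's answer_id back to the answer text.

    `questions_by_id` maps question_id -> question dict whose "answers"
    list holds the possible {"id": ..., "text": ...} choices.
    Falls back to the entry's free-text response when there is nothing
    to decode (essay questions, unmatched ids).
    """
    answer_id = entry.get("answer_id")
    if answer_id is None:
        return entry.get("text", "")
    question = questions_by_id.get(entry["question_id"], {})
    for choice in question.get("answers", []):
        # answer_id shows up as a string in some entries and an int in
        # others, so compare both sides as strings.
        if str(choice.get("id")) == str(answer_id):
            return choice.get("text", "")
    return entry.get("text", "")
```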
Here is what was in the quiz (screenshot of the quiz questions not reproduced here):
I just realized that my quiz, which was designed to test all the different question types, doesn't have any multiple choice questions on it. Can you tell I teach math and not a social science? Anyway, a multiple choice response includes an answer property with an answer_id value that matches the list of possible responses.
All of what I wrote is explained in the documentation, but in the appendix at the bottom of the Quiz Submission Questions API page. The placement of the information is odd, since you can't actually get the student responses from that API. You can get the ID of the quiz submission from the Quiz Submissions API (use the id field) and then use that ID with the Quiz Submission Questions API.
A much easier way to get information about each question is to use the Quiz Questions API and its endpoint for listing the questions in a quiz or submission. It operates on the entire course rather than individual students, so it involves fewer API calls. This one uses the quiz_id, not the assignment_id. It works most of the time, but it has issues showing information for questions linked to a question bank. It may also have problems if the responses were modified (a mistake was corrected), but it does allow you to specify the quiz_submission_id and quiz_submission_attempt in those cases.
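To make that last point concrete, here is a small hypothetical helper that builds the Quiz Questions endpoint URL, including the optional quiz_submission_id and quiz_submission_attempt parameters mentioned above. The parameter names come from the Canvas Quiz Questions API; the helper itself is just an illustration.

```python
from urllib.parse import urlencode

def quiz_questions_url(base, course_id, quiz_id,
                       quiz_submission_id=None, quiz_submission_attempt=None):
    """Build the Quiz Questions API URL (note: quiz_id, not assignment_id).

    Pass quiz_submission_id together with quiz_submission_attempt to list
    the questions exactly as a particular submission attempt saw them,
    e.g. when a question was later corrected.
    """
    url = f"{base}/courses/{course_id}/quizzes/{quiz_id}/questions"
    params = {"per_page": 100}
    if quiz_submission_id is not None:
        params["quiz_submission_id"] = quiz_submission_id
        params["quiz_submission_attempt"] = quiz_submission_attempt
    return f"{url}?{urlencode(params)}"
```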