Quiz API missing functionality for setting HTML answers

Adventurer II

The current quiz API appears to be missing the ability to insert HTML answers: one can only set an answer_text, not an answer_html. This is despite the fact that the answers themselves support an html element, as can be seen below:

"answers": [ { "id": 3603, "text": "answer1", "html": "", "comments": "", "comments_html": "", "weight": 0 }, { "id": 3070, "text": "answer2", "html": "", "comments": "", "comments_html": "", "weight": 100 }, ... ]

It is possible via the RCE GUI to cut the text, switch to HTML mode, and paste it in, as shown in the example below:

[
  {'text': '', 'id': 2400, 'html': '<pre>sum(5,10)</pre>',
   'weight': 0.0, 'comments_html': '', 'comments': ''},
  {'text': '', 'id': 7243, 'html': '<pre>from math import *<br>sum(5,10,0)</pre>',
   'weight': 0.0, 'comments_html': '', 'comments': ''},
  {'text': '', 'id': 7766, 'html': '<pre>from math import<br>sum(5,10)</pre>',
   'weight': 0.0, 'comments_html': '', 'comments': ''},
  {'text': '', 'id': 5120, 'html': '<pre>from math import *<br>sum(5,10)</pre>',
   'weight': 100.0, 'comments_html': '', 'comments': ''}
]


I think it should always be possible to set values via the API rather than having to use the GUI. While doing this in the GUI might take only a few minutes for a single question, with large numbers of questions it should be possible to apply power tools.

[I am aware that there will be a new quiz engine, but as I do not have it - I have to work with what I have.]


 @maguire ,

I think what you're asking for is there, but perhaps not clearly documented. I only tried it with a multiple choice question and you mentioned the answer_html, so I may be missing something. Let me know if I am.

The trick is to use answer_html instead of answer_text when you send your data. It's a bit confusing because when you get the data back, the answer_ prefix is not part of the field name, but when you send it as part of the POST, it needs to be.

I just created this multiple choice question using the API.


Here is the payload I sent to do it.

  "question" : {
    "question_name" : "q1",
    "question_text" : "This is a test question",
    "question_type" : "multiple_choice_question",
    "points_possible" : 10,
    "answers" : [ {
      "answer_html" : "<p>This is <em>emphasized</em></p>",
      "answer_weight" : 100
    }, {
      "answer_text" : "<p>This should not be <em>emphasized</em></p>",
      "answer_weight" : 0
    }, {
      "answer_text" : "This is plain text",
      "answer_weight" : 0
    } ]

Here is the answers portion that was returned when I made the POST


As you can see, there is no automatic conversion between HTML and text. Furthermore, you should only put a non-empty value into one of the two. I did some testing: if you supply both answer_html and answer_text, then answer_html wins out. Both get put into the question when you send the POST through the API, but if you pull the question up in the GUI and save it again (without making any changes), the answer_text will be lost.
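To make this repeatable for many questions, the payload above can be built and POSTed from a short script. This is a sketch using only the standard library; the base URL and token are placeholders, and the endpoint is the Canvas quiz-questions one:

```python
import json
import urllib.request

def make_mc_question_payload():
    """Build the payload shown above; note that the HTML answer uses
    'answer_html' while the plain ones use 'answer_text'."""
    return {"question": {
        "question_name": "q1",
        "question_text": "This is a test question",
        "question_type": "multiple_choice_question",
        "points_possible": 10,
        "answers": [
            {"answer_html": "<p>This is <em>emphasized</em></p>",
             "answer_weight": 100},
            {"answer_text": "<p>This should not be <em>emphasized</em></p>",
             "answer_weight": 0},
            {"answer_text": "This is plain text",
             "answer_weight": 0},
        ]}}

def post_question(base, token, course_id, quiz_id, payload):
    """POST the payload to /courses/:id/quizzes/:id/questions."""
    url = "{}/courses/{}/quizzes/{}/questions".format(base, course_id, quiz_id)
    req = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```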

Adventurer II

 @James,

Thanks - this works well! I used it for the multiple choice questions and have tried it for all other forms of questions; however, I have not yet tested it very much for these alternative question types.

I'm sorry to be slow in responding, but I have been having a lot of fun with the package js2py (GitHub - PiotrDabkowski/Js2Py: JavaScript to Python Translator & JavaScript interpreter written in 1... ). I have been using this package to execute some of the JavaScript that has been written (by others) to randomize their questions.

I can now automatically generate numerous variants of a problem. I augment the code so that it returns the values of all of its variables to my Python script, then use those values either to automatically generate a set of acceptable answers (in the case of multiple choice questions and short answer questions with a variable defined as the answer) or to mark the question with two characters, Caution and Umbrella-with-rain-drops (hoping that these characters are unlikely to occur in any questions). In this way an instructor can easily search for questions that I have not generated answers for and manually write answers for them.

I think I can automate some of this further; having been inspired by the progress I made with some of the questions containing JavaScript, I am going to augment the JavaScript that I execute with some additional functions, to see if I can handle a large subset of those that other instructors have used in their questions.


There's a lot of amazing stuff out there. I never would have imagined that. Given that I don't know Python, I would have imagined that you would have had to execute some external code and capture the output.

Adventurer II

The process turns out to be rather easy once you learn how js2py works. I want to acknowledge the help of the package's author, Piotr Dabkowski, for answering two of my questions about it.

I read the JavaScript for the question into javascript_for_question, do pattern matching with a regular expression to find all of the variables (putting them in all_variables), then build (in Python) a line of JavaScript that will return the variable values:
return_variable_values = '\nreturn {'
for i, v in enumerate(all_variables):
    if i > 0:
        return_variable_values = return_variable_values + ', '
    return_variable_values = return_variable_values + "'" + v + "': " + v
return_variable_values = return_variable_values + '}'
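As a self-contained sketch of these two steps, here is one way the variable extraction and the return line could look. The 'var'-declaration pattern is an assumption on my part - the actual questions may declare variables differently, so the regular expression would need adapting:

```python
import re

def variables_in(javascript_for_question):
    """Find names declared with 'var' in the question's JavaScript.
    Assumes simple 'var x = ...' declarations."""
    return re.findall(r'\bvar\s+([A-Za-z_$][\w$]*)', javascript_for_question)

def return_line(all_variables):
    """Build the JavaScript 'return {...}' line described above,
    mapping each variable name to its value."""
    pairs = ", ".join("'{0}': {0}".format(v) for v in all_variables)
    return '\nreturn {' + pairs + '}'
```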

Define some extra JavaScript functions that might be used by the code, in a string extra_javasscript.

Define an initial string to turn the existing JavaScript into a function:

initial_string = extra_javasscript + '\n' + "function gqmjr() {\n"

Combine the line that collects the values of all of the variables and close the function's definition:

Cut off some parts of the JavaScript that are used for checking the answers in the earlier LMS, as these are not useful for generating different randomized instances of the problem:

Then paste the pieces together:

Convert it all to a raw string:

js="{}".format(j1) # needed to convert to raw text for evaluation

Finally call the js2py eval_js function to execute the created javascript and get the values:
result = js2py.eval_js(js)
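Putting the steps above together, the assembly can be sketched as follows. The helper name build_runnable_js is mine, gqmjr matches the function name used above, and the js2py call itself is shown as a comment since it requires the third-party package:

```python
def build_runnable_js(extra_javascript, question_js, return_line_text):
    """Wrap the question's JavaScript in a function, append the line
    that returns the variable values, and call the function, following
    the steps described above."""
    return (extra_javascript + '\n'
            + 'function gqmjr() {\n'
            + question_js
            + return_line_text + '\n'
            + '}\n'
            + 'gqmjr()')

# To execute the result (requires the third-party js2py package):
#   import js2py
#   result = js2py.eval_js(build_runnable_js(extra, js_src, ret_line))
```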

I then simply loop: copy the question data structure to make a new instance of the question, execute the code, and replace the variables in the instance's question and answer elements with the values I got from executing the code.
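A minimal sketch of that substitution step, assuming variables are referenced in the text as [name] (Canvas formula-question style; the pattern would need adapting to whatever convention the questions actually use):

```python
def substitute_values(text, values):
    """Replace variable references in question/answer text with the
    values obtained from executing the question's JavaScript.
    Assumes references are written as [name]."""
    for name, value in values.items():
        text = text.replace('[' + name + ']', str(value))
    return text
```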

I feed each resulting new question instance to the earlier code that I wrote to process questions, which inserts it, with answers (when I can calculate/guess them), into a Canvas course.