Learner II

Will any of our custom CSS or JS be ported over to Quizzes.Next?

Currently, themes and custom CSS and JS do not carry over into Quizzes.Next, although several elements from the main Canvas instance clearly do (assignment descriptions, for example).

It seems wrong to let admins customize the user experience everywhere except in quizzes and tests.

We wanted custom JS to add a button next to every question item, so students and tutors could easily click it to report an error. Unfortunately, our custom JS can't reach inside the LTI tool, and the questions don't display their item numbers when students view them anyway.

It would also be nice to stamp quiz items with a trackable number.
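To make the idea concrete, here is a rough sketch of what our custom JS was meant to do. Everything in it is hypothetical: the `.question-item` selector, the `data-item-id` attribute, and the `quizbank-errors@example.edu` address are assumptions, since Quizzes.Next neither exposes a stable item number in its rendered HTML nor loads our custom JS today.

```javascript
// Hypothetical sketch of a "report a problem" button for each question.
// The selector ".question-item", the data-item-id attribute, and the
// reporting address are all assumptions, not real Quizzes.Next hooks.

function buildReportUrl(itemId, courseName) {
  const subject = encodeURIComponent(`Question error report: item ${itemId}`);
  const body = encodeURIComponent(
    `Course: ${courseName}\nItem #: ${itemId}\nDescribe the problem:`
  );
  return `mailto:quizbank-errors@example.edu?subject=${subject}&body=${body}`;
}

function addReportButtons(root) {
  root.querySelectorAll('.question-item').forEach((el) => {
    const itemId = el.dataset.itemId || 'unknown';
    const link = document.createElement('a');
    link.textContent = 'Report a problem';
    link.href = buildReportUrl(itemId, document.title);
    el.appendChild(link);
  });
}

// Would run on page load if custom JS were allowed inside the LTI iframe.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => addReportButtons(document));
}
```

A mailto link is just the simplest possible reporting channel; a real module could POST the item number and comment to an institutional endpoint instead.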

5 Replies
Learner II

Furthermore, even in the hidden HTML coding, there doesn't seem to be any mark or traceback to the question's item bank or item #, except in an encrypted form only parseable by the LTI server. If we are ever allowed JS plugins into the LTI, it would be very useful to flag items in a way so instructors, authors, and admins can find them if there's a reported problem. 

Instructure

Hi ken.i.mayer@gmail.com,

As you pointed out, some things do come across from Canvas during the LTI launch, but custom JavaScript from Canvas does not. Please share a bit more about how you want to use this functionality. If you have a sample quiz at your institution that you are comfortable with me looking at, please direct message me with the URL.

We wanted to have JS create a button by every question item, so students and tutors could easily click if they wanted to report an error. Unfortunately, the custom JS can't reach the LTI, and the questions don't have their Item # when students view them anyway.

  • What types of errors have been reported?
  • What types of errors are you most concerned with?
  • How is the report being logged?
  • What do you mean by "Item #"?
  • What fundamental problem was this JavaScript implemented to address?

It would also be nice to stamp quiz items with a trackable number.

  • How would this number help and be used?

I hope you are able to share a bit more about your use case. Thank you for bringing your concern to the community.

-

Kevin Dougherty

Assoc. Product Manager, Assessments

Instructure

Kevin,

Thank you for your thoughtful response.

The errors and suggestions we want to collect concern whether our questions have typos or outright mistakes in them. We are developing a huge number of question banks to enable drilling and self-study. Over time I imagine (if Quizzes.Next reports out item-level data) we would discover that an item has something wrong with it based on the responses (for example, an incorrect distractor accidentally marked as correct, or the Greek typed wrong so that no correct answer is listed).
However, we would like to empower our students and tutors to report to us directly when they see a problem with the questions we have authored. If we make reporting easy, we'll get more and better data. Since each bank will have 30-50 question items but students will see only 5-8 of them each time they do the assignment, we might otherwise have a hard time finding the specific question. When we edit a question in Quizzes.Next, a convenient (hopefully unique) item number appears (see screenshot below). It would be helpful to transmit that number, so we can be sure we're fixing the right question item.

Thanks

[Screenshot: the Quizzes.Next item number in edit mode]

Hi ken.i.mayer@gmail.com,

we would like to empower our students and tutors to report to us directly when they see a problem with the questions

I think this is a good idea. Instead of relying on someone to communicate that number and the issue, let's say you could capture the issue on button click and the system could do something with that. Where would you surface the information?

-

Kevin Dougherty

Assoc. Product Manager, Assessments

Instructure

Well, I was thinking of letting institutions carry their Canvas custom JS over to Quizzes.Next, so they could build whatever reporting module they like.

But if Instructure wants to build a feature for everyone, then every institution will have different wants and needs. We are a rather small college, so I would have just put in an email interface that reports to the Canvas admin, to the quiz bank owner, and perhaps to the instructor of the course in question (hopefully some of the data involved populates into Quizzes.Next, but I'm guessing not all of it?).

Even a small institution, though, would benefit from a dashboard within Quizzes.Next that keeps an action-item list of:

  • issues reported by students and instructors, and
  • item analysis that flags questions that consistently fail to discriminate (everyone gets them wrong or everyone gets them right).

A decade ago I set up a similar reporting feature in a homegrown LMS we ran, and it was very useful. Even in a quick run (with N>10, barely), we found bad questions that had been live for 4-5 years in hundreds of high schools, and no one had reported them. Again, it would be great if the issues could bubble up to the dashboards of the Canvas admin, a designated chief designer, the "owner(s)" of the quiz banks in question, and perhaps the instructors of the courses using those banks.
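The item-analysis flagging described above can be sketched in a few lines. The data shape (an array of items with 0/1 response vectors) and the thresholds are illustrative assumptions, not anything Quizzes.Next provides today.

```javascript
// Sketch of the "fails to discriminate" filter: flag items where nearly
// everyone answers right or nearly everyone answers wrong. The minimum
// sample size and the cutoff proportions are illustrative defaults.

function flagLowDiscrimination(items, { minN = 10, low = 0.1, high = 0.9 } = {}) {
  return items
    .filter((item) => item.responses.length > minN) // need enough attempts
    .map((item) => {
      const correct = item.responses.reduce((a, b) => a + b, 0);
      return { id: item.id, pCorrect: correct / item.responses.length };
    })
    .filter(({ pCorrect }) => pCorrect < low || pCorrect > high);
}
```

With real response data, a point-biserial correlation would discriminate better than raw proportion-correct, but even this crude filter surfaces the "everyone right / everyone wrong" items for a dashboard.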