Item analysis results unreadable/unusable

The item analysis results I get are unreadable/unusable. Some examples:

[Four attached screenshots (Oct 18, 2018) showing the garbled item analysis output]

In all of these cases I created the questions and answers entirely in Canvas, and they show up correctly when students take the quiz. As you can see from the questions I ask, given that it is a Discrete Mathematics class, I make heavy use of Canvas' math equation tool in its editor. The answers also contain math equations, and the item analysis appears to have three seemingly random ways of displaying the results:

  • showing the image link (annoying, but at least I can tell from the alt tags which answer choice they correspond to)
  • showing "No answer text provided" (useless)
  • showing absolutely nothing (blank - also useless)

Note that the second and third ways give me completely useless information, since I have no way to match the order in which Canvas lists the possibilities to the order I used when I created the question. For example, in the question with the blank answer options, "none of these" is the last option in the version of the question in the quiz itself.

This happens in both Firefox and Chrome.

Is this a known bug of Canvas? Is there any plan to fix this? Is there any workaround for this, e.g., can I download the item analysis so I can try to match up what Canvas is showing me with actual answer options (if so, where do I find that within Canvas)?

1 Solution

Accepted Solutions

This is a bug in Canvas. No one knows how to fix this.

9 Replies
Community Coach


Interesting. I've never heard of this being an issue, but I can understand now that the equation editor functionality might not have been built into the item analysis view. Have you tried clicking the Item Analysis button at the top of the page to generate and download the report in CSV format? It wouldn't be as pretty as the quiz statistics page, but I'm curious how the answer choices show up in that export.




Clicking the Item Analysis button produces the results I showed (parts of) above. As far as I can tell, there is no way to download the Item Analysis information. Clicking the Student Analysis button generates a CSV file, but that just gives me each individual student's responses, not the cumulative analysis of all student responses.
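One possible workaround, given that only the per-student Student Analysis export is downloadable: rebuild the per-answer counts from that CSV yourself. A minimal Python sketch, assuming (this layout is not confirmed in the thread) that each quiz question appears as a column whose header is an id followed by the question text, and that each cell holds the answer text the student chose:

```python
# Aggregate per-question answer counts from a Canvas Student Analysis CSV.
# Assumption: question columns look like "<id>: <question text>" and cells
# hold the chosen answer text.
import csv
import io
from collections import Counter

def answer_counts(csv_text):
    """Return {question_column: Counter(answer -> count)}."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = {}
    for row in reader:
        for column, answer in row.items():
            if ":" in column:  # heuristic: question columns carry an id prefix
                counts.setdefault(column, Counter())[answer] += 1
    return counts

# Hypothetical miniature export for illustration:
sample = """name,101: Is 7 prime?,102: Is 9 prime?
Alice,yes,no
Bob,yes,yes
Carol,no,no
"""
print(answer_counts(sample)["101: Is 7 prime?"])  # Counter({'yes': 2, 'no': 1})
```

The same `Counter` approach extends to grouping by score bands if the export also includes a total-score column.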


I wasn't able to get the Item Analysis button to create a CSV file when I tried it last fall, but I just tried it again (on the same quiz from last fall) and did download the Item Analysis file Canvas provides. It is not of any help; it shows no information about individual answer choices and instead has only high-level statistics for each quiz question:

Question Id
Question Title
Answered Student Count
Top Student Count
Middle Student Count
Bottom Student Count
Quiz Question Count
Correct Student Count
Wrong Student Count
Correct Student Ratio
Wrong Student Ratio
Correct Top Student Count
Correct Middle Student Count
Correct Bottom Student Count
Standard Deviation
Difficulty Index
Point Biserial of Correct
Point Biserial of Distractor 2
Point Biserial of Distractor 3
Point Biserial of Distractor 4
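For reference, two of the exported fields above can be reproduced from raw responses using the standard textbook definitions. This is a sketch of those conventional formulas, not necessarily Canvas's exact computation:

```python
# Standard item statistics: difficulty index and point-biserial correlation.
# These follow the textbook definitions; Canvas's implementation may differ.
from math import sqrt
from statistics import mean, pstdev

def difficulty_index(correct_flags):
    """Proportion of students answering correctly (higher = easier item)."""
    return sum(correct_flags) / len(correct_flags)

def point_biserial(correct_flags, total_scores):
    """Correlation between getting this item right and the overall quiz score.
    Assumes at least one correct and one incorrect response."""
    p = difficulty_index(correct_flags)
    q = 1 - p
    right = [s for f, s in zip(correct_flags, total_scores) if f]
    wrong = [s for f, s in zip(correct_flags, total_scores) if not f]
    return (mean(right) - mean(wrong)) / pstdev(total_scores) * sqrt(p * q)

# Hypothetical data: 1 = answered this item correctly, plus total quiz scores.
flags = [1, 1, 1, 0, 0]
scores = [9, 8, 7, 5, 4]
print(round(difficulty_index(flags), 2))        # 0.6
print(round(point_biserial(flags, scores), 2))  # 0.92
```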

Community Coach

Hi there,

I have been reviewing older questions here in the Canvas Community, and I stumbled upon your question. I thought I'd check in with you because there hasn't been any new activity in this topic since your reply on October 22, 2018. Are you still having issues with this? Have you been able to come up with any solutions on your own since October that you'd be willing to share with us here in the Community? Also, have you reported this to the folks at the Canvas Help Desk so that their engineers might be able to investigate further? If so, what was the outcome of that conversation? Would you be willing to share that here, too?

For the time being, I am going to mark your question as "Assumed Answered", simply because there hasn't been any new activity in this topic for almost six months, not because we've necessarily found an answer for you as of yet. (However, if you have found an answer, please feel free to provide an update below.) This won't prevent you or others from posting additional questions and/or comments below that are related to this topic. I hope that's alright with you, Beck. Looking forward to hearing back from you soon.



I only teach the class that those quizzes are from in the fall, so I have no current quizzes to compare against.

When I go back to my Fall 2018 class, the bad output is still there.

Downloading the "Item Analysis" just gives me overall statistics; it does not give me any info on particular answer choices. Moreover, the CSV file generated by the Item Analysis has the very unhelpful field "Question Title", which for the quiz I showed above looks like this:

[screenshot of the garbled "Question Title" values]
As far as contacting the "Canvas Help Desk" to get some engineers on the case, I wish I had that option. Here are my help options:

[screenshot of the available help options]
DoIT Help Desk just searches the guides for answers (and their search techniques are no better than mine) and maybe escalates the question to another UW person. This is a Canvas problem, so "local UW-Madison assistance and documentation" is not appropriate. I was hoping that by interacting with the Canvas Community, the actual Canvas people (i.e., the ones paid to do this) would become aware of it.

Maybe after the semester is over, if my eyes will still focus, I'll have some time to start following the trail of consecutive contacts to bring this horrible bug to someone's attention (who maybe might know someone else, who can talk to someone else, who can put a message out for someone else...).

In the meantime, once again, a question I've asked will get marked "Assumed Answered", not because it actually was answered, but because there's a problem with Canvas that cannot or will not be addressed. There really should be some other status option like "Not answered and, after several months, no-one has any idea what to do" (which is actually more accurate).

Community Coach

Hey there,

Thanks for replying back. Based on the screenshot you've provided, it appears that UW-Madison has chosen to use their own help desk for Canvas support? (You could check with your school's local Canvas administrator to be sure.) Maybe UW-Madison chose not to get that level of support through Instructure when they decided to partner together. I'm not sure that I have an answer for you, but I wanted to let you know that I'll share your question with a handful of groups here in the Community in hopes that it will get some additional exposure: Data and Analytics, Canvas Admins, and Canvas Developers. If you aren't following these groups, please use the links that I've provided, and then click on the "Follow" button at the top right corner of each group page. Also, next to each of these buttons is another button, "Actions"; click on it, and then choose "Join group". Hopefully someone with more experience with analytics will be able to chime in soon. Best of luck to you!



Just to build on the previous reply: if your institution does not purchase Canvas Tier 1 Help Desk support, then end users are not able to submit tickets or bug reports to Instructure directly. In that case there are normally two or three people at the university (sometimes more) who are authorized in the contract to contact your account manager and/or field support. If you want to pursue this, you'll need to identify who those people at your institution are and contact them. Academic Technology or Instructional Design is often the point of contact; if not, they can usually point people in the right direction.

That said, what you're asking for may not be possible. Since equations cannot be rendered reliably in plain HTML, the equation editor inserts JavaScript that renders each equation as an image, and the actual LaTeX formula is hidden behind that script. That is not something that will ever display in things like aggregate data (which relies on plain-text SQL indexes for speed) or CSV exports (which do not support any kind of scripting).

Whether specific bits of code are rendered in the output as nothing, as "No answer text provided", or as an image with an alt tag seems to be related to how the LaTeX was designated in the advanced editor (e.g., inline \(…\) or display \[…\], etc.).

While it's certainly worthwhile to put in a feature request, the most expedient approach may be to prefix your answers with text that makes them identifiable in the output, or to construct the equations themselves in a way that ensures a useful alt tag. There is a useful discussion on options for building equations here:
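If the raw answer HTML can be retrieved (for instance through the Quizzes API), the LaTeX stored in each equation image's alt attribute is recoverable with only the standard library. A hedged sketch: the `equation_image` class and the sample markup below are assumptions about what the editor emits, not details confirmed in this thread:

```python
# Collect the alt attributes (the LaTeX source) from equation images
# embedded in a fragment of quiz-answer HTML.
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" in attrs:
                self.alts.append(attrs["alt"])

# Hypothetical markup approximating the equation editor's output:
answer_html = '<p><img class="equation_image" alt="\\frac{n(n+1)}{2}" src="..."></p>'
collector = AltCollector()
collector.feed(answer_html)
print(collector.alts)  # ['\\frac{n(n+1)}{2}']
```

Mapping those recovered alt strings back onto the item analysis rows would still require the answer order to match, which is exactly what the original post says is unreliable.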


Every single one of my math formulas embedded in Canvas (whether in a Canvas page, a quiz question, or a quiz answer option) is generated the same way: using Canvas' equation editor. And those math formulas DO have a useful alt tag: the LaTeX code. That's what happens automatically when you use Canvas' equation editor; it generates an image with the LaTeX code as the alt tag and MathML embedded in it for the use of screen readers, i.e., for accessibility (which is why I always use Canvas' equation editor).

In the examples I posted above, every single answer choice for every question shown was generated in exactly the same way, and every answer choice has a meaningful alt tag. Yet in some cases Canvas shows the entire image tag, in some cases literally nothing, and in some cases "No answer text provided". I would be ecstatic if the quiz analysis results showed the alt tags for answer options that are not text but are, in fact, images, but alas, as my examples above show, they only sometimes do that.

