New Analytics: How well do you know us?

The content in this blog is over six months old, and the comments are closed. For the most recent product updates and discussions, you're encouraged to explore newer posts from Instructure's Product Managers.

carlycurran
Instructure

TL;DR: New Analytics gives instructors some amazing, detailed course-level information. We want to know: Have you seen it all? How do you use it?

I’m relatively new around here — yes, I’m going to invoke that newbie card for a few more months! — and I recently had the opportunity to watch one of our solutions engineers demo New Analytics for a new customer. WOW, I thought, there’s a ton of data here. Do our users know about everything they can access?

So, I’m going to give you an overview of what data you can find with your New Analytics tools. In the comments, I’d like to hear: 

  • Did you know you could find all of this?
  • How are you using these data visualizations and tables? 
  • If you aren’t using New Analytics yet, why not?
  • If you’re a power user, please point out anything I missed that you love.

A caveat: I'm using a course with demo data. Hopefully, your students are more engaged than "mine."

Course Grade

In the Course Grade tab, you see a visualization of average grades on assignments, discussions, and quizzes for all sections in the course. Using the search bar, you can compare section averages and individual student grades with the course average for each assignment. You can also select which assignment types to display.
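
By the way, if you'd rather pull these numbers into a script: as far as I know, New Analytics itself doesn't expose a public API, but the documented (legacy) course Analytics endpoints return similar per-assignment score summaries. A minimal Python sketch, where the domain, token, and course ID are placeholders you'd swap for your own:

```python
import requests

BASE = "https://YOUR_DOMAIN.instructure.com"  # placeholder: your Canvas host
TOKEN = "YOUR_API_TOKEN"                      # placeholder: an API access token
COURSE_ID = 1234                              # placeholder: a course ID

headers = {"Authorization": f"Bearer {TOKEN}"}

# Per-assignment score summaries (max, min, median, quartiles) for the course.
resp = requests.get(
    f"{BASE}/api/v1/courses/{COURSE_ID}/analytics/assignments",
    headers=headers,
)
resp.raise_for_status()

for assignment in resp.json():
    print(assignment["title"], assignment.get("median"))
```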

 

To view a student's grade summary for an assignment, click the corresponding point on the graph.

Try searching for a specific assignment by typing its name into the search bar. Clicking the corresponding point on the graph shows you the grade distribution for that assignment, overall or by section.

You can also message students who meet certain criteria for an assignment by clicking the Message Students Who button. Note that students will not be able to see the other students included in the message.
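
If you ever want to automate that kind of targeted message, the Conversations API can do something similar: by default, Canvas creates a separate private conversation per recipient, which matches the privacy behavior described above. A rough sketch with placeholder IDs:

```python
import requests

BASE = "https://YOUR_DOMAIN.instructure.com"  # placeholder: your Canvas host
TOKEN = "YOUR_API_TOKEN"                      # placeholder: an API access token

# Hypothetical user IDs for the students who met your criterion
student_ids = ["101", "102", "103"]

# POST /api/v1/conversations; with group_conversation left at its default
# (false), each recipient gets an individual private conversation, so
# students can't see who else received the message.
resp = requests.post(
    f"{BASE}/api/v1/conversations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "recipients[]": student_ids,
        "subject": "Checking in",
        "body": "I noticed you haven't submitted Assignment 1 yet. Need a hand?",
    },
)
resp.raise_for_status()
```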

Weekly Online Activity

In the Weekly Online Activity tab, you see average page views and average participations on a weekly graph. You can compare sections or students to the course average by typing the section or student name in the search bar. Hover over a point on the graph to view details about the data. To download weekly activity data as a CSV file, click the download button.
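
If the built-in CSV doesn't fit your workflow, the legacy Analytics API exposes comparable activity data you can roll up yourself. Note that it reports by day rather than by week, so the weekly totals are your own aggregation. A sketch, with the same placeholder setup as above:

```python
import csv
import requests

BASE = "https://YOUR_DOMAIN.instructure.com"  # placeholder: your Canvas host
TOKEN = "YOUR_API_TOKEN"                      # placeholder: an API access token
COURSE_ID = 1234                              # placeholder: a course ID

# Daily page views and participations for the whole course.
resp = requests.get(
    f"{BASE}/api/v1/courses/{COURSE_ID}/analytics/activity",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

# Write one row per day; aggregate to weeks however you prefer.
with open("course_activity.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "views", "participations"])
    writer.writeheader()
    for day in resp.json():
        writer.writerow({
            "date": day.get("date"),
            "views": day.get("views"),
            "participations": day.get("participations"),
        })
```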

Click the point for a given week to see total page views and participations, including how many students participated or viewed a resource.

You can also view this weekly activity data by resource. Click the number of students who viewed (or didn't view) a resource to message those students; the same goes for participation.

Students

In the Students tab, you can view a table summary of student grades and participation data. This overview gives you an idea of how well students are doing in relation to their activity in the class.
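
This grades-versus-activity pairing is also scriptable: the legacy Analytics API's student summaries endpoint returns page views, participations, and a tardiness breakdown for each student. Another sketch with placeholder credentials; the endpoint is paginated like most of the Canvas API:

```python
import requests

BASE = "https://YOUR_DOMAIN.instructure.com"  # placeholder: your Canvas host
TOKEN = "YOUR_API_TOKEN"                      # placeholder: an API access token
COURSE_ID = 1234                              # placeholder: a course ID

headers = {"Authorization": f"Bearer {TOKEN}"}

# One summary row per student: page views, participations, and counts of
# on-time / late / missing submissions.
url = f"{BASE}/api/v1/courses/{COURSE_ID}/analytics/student_summaries?per_page=100"
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    for student in resp.json():
        tardiness = student.get("tardiness_breakdown", {})
        print(student["id"], student.get("page_views"),
              student.get("participations"), tardiness.get("missing"))
    # Follow the Link header for the next page, if any.
    url = resp.links.get("next", {}).get("url")
```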

Click on a student’s name to view individual student data, including course grade, weekly online activity, and communication. 
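
The per-student drill-down has a rough API counterpart too: the user-in-a-course activity endpoint returns hourly page-view counts plus a list of participation events. One more sketch with placeholder IDs:

```python
import requests

BASE = "https://YOUR_DOMAIN.instructure.com"  # placeholder: your Canvas host
TOKEN = "YOUR_API_TOKEN"                      # placeholder: an API access token
COURSE_ID = 1234                              # placeholder: a course ID
STUDENT_ID = 101                              # placeholder: a student's user ID

# Hourly page-view buckets and individual participation events (each with a
# timestamp and the URL that was acted on) for one student in one course.
resp = requests.get(
    f"{BASE}/api/v1/courses/{COURSE_ID}/analytics/users/{STUDENT_ID}/activity",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

data = resp.json()
print("page-view buckets:", len(data.get("page_views", {})))
print("participation events:", len(data.get("participations", [])))
```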

Interested in more details about New Analytics and the new Attendance reports we’re rolling out next month? Subscribe to the New Analytics user group. And don’t forget to let me know in the comments how you use these data points. 

4 Comments
venitk
Community Champion

Thanks for asking this. We don't advertise the use of New Analytics to our instructors because it's not 100% accurate. 

This is compounded by the fact that it's hard to tell which items in the known issues list have been resolved and which are still open. For example, the known issue below seems to be tagged both Completed and Open.

[Screenshot: a known issue tagged both Completed and Open]

So the problem is twofold: being wary of whether the data shown in NA is accurate, and being wary of whether the known issues list itself is accurate. That's too many uncertainties for something as important as data.

If instructors are going to base grades or disciplinary decisions on what's shown in NA, it has to be 100% accurate. There are other ways for instructors to use NA, of course, but we're a little afraid that if we just tell them about those "safe" ways, they'll explore the tool and start taking what it reports as gospel without realizing the room for error in the data reported. We want to avoid a situation where they might know enough to be dangerous, but not enough to know what not to do.

So we just haven't told them about it, and if they ask us about it, we tell them not to rely on any of that information. Luckily, the ID admin team has a lot of oversight over our courses, so we can have this control over what's used for grades, discipline, etc. 

venitk
Community Champion

I've been thinking about this more, and it would be helpful if, in addition to exquisitely accurate data, NA also added some disclaimers to help instructors, who are unlikely to check out the Canvas Guides to figure this stuff out. 

Example 1

This is a screenshot from a recent course. It says there have been 0 participations in this discussion board, but the board shows more than 100 messages were posted. I'm guessing those participation numbers don't show up in NA because this is a small-group discussion and participation in the small groups is broken out separately, but the instructor might need some help realizing that. Perhaps the rollover tip should carry a disclaimer that it doesn't include small-group discussions.

 

[Screenshot: discussion analytics showing 0 participations]

 

Example 2

Same course. The same (whole class) graded discussion is shown twice, once with an assignment icon and once with a discussion icon, with different numbers. This is confusing and, if I were an instructor, I wouldn't know what to make of this. Actually, I don't know what to make of this as an instructional technologist, either. Some sort of tooltip explaining this would be helpful. Not saying it's a bug, just that I don't understand what I'm seeing, so I can't use this data for anything.  

[Screenshots: the same graded discussion listed twice, once with an assignment icon and once with a discussion icon, showing different numbers]

 

Example 3

NA shows page views for pages. But no matter how accurate the data is on Canvas's end, it can't record page views if a student downloaded the course and viewed it offline. The instructor may not realize students can do that, or, even knowing it's possible, may not remember it when using page views to gauge participation. A tooltip or statement warning instructors about this would be helpful.

erinhmcmillan
Instructure Alumni

Hi @venitk 

Regarding the labels, they are accurate. This article is a carryover from our previous platform, and it included the open tag (which I believe you are referencing). Unfortunately, we are unable to remove the tags. As soon as we get past all these older articles, we'll get past the carryover of the old tags as well. We won't be putting statuses in tags anymore.

Thanks,

Erin

dsweeney2
Community Participant

Thanks for the questions.

We also don't actively promote New Analytics, but for scale reasons. We make heavy use of sections at our institution, where each student will have several enrolments in a single course. In many of our courses, the loading time for New Analytics means significant delays between each action as the system tries to load what are essentially multiple copies of the same data. It would be helpful to default to 'All students' rather than 'All sections', while keeping the option to search for specific sections as required.

This would also be helpful in the Course Activity report. For many courses, this function simply doesn't work: the underlying query fails, and the option to filter by date or section never appears. If we could pre-filter the query, that would help.

Also, we find 'Message Students Who' limiting in that we can only select one criterion as the basis for the message. Instructors often want to provide progress feedback in a course based on a combination of factors rather than a single mark or progress score.

When messages fail, it's also unclear whether they were sent to some students but not all, or whether sending failed for everyone.