A Major Step For Better Media Analytics


AkosFarago
Instructure


If you follow the Release Notes or have already looked into the insights for one of your media in Studio today, chances are you noticed that something has changed considerably. We made this change in the belief that a better understanding of how students watch course videos contributes directly to student success in many respects. And there's more: looking further ahead, we reworked the underlying pipeline as well. In fact, a significant part of what the Studio engineering team spent the last few months on works behind the scenes. You won't see it, but it is the foundation of any media analytics coming to Studio later.

But let’s not jump the gun. Let’s go through the redesigned page first.

table.png

 

First, the previous representation of viewership data took an unreasonable amount of time to interpret student by student. We did not change the way this data is structured, but we moved viewers into a table, extended it with completion rates, and made it sortable by both columns. Had trouble finding students who missed course videos? Tick. Had trouble gathering those who watched less than 80% of each video and so might not pass the course? Tick. Struggled to find a certain student when looking at the insights of weekly announcement videos sent out to 400+ students? We covered that too!

export.png

 

Second, we know that in some cases teachers prefer to work with such reports using their own tools. Export the table, and you'll have access to this data in CSV format, including a few more details on the viewers (role, email address).
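If you'd rather dig into the export with your own tools, a minimal sketch is below. The file name and the column names ("Name", "Role", "Email", "Completion Rate") are assumptions for illustration only, so check them against the headers of your actual export.

```python
import pandas as pd

# Minimal sketch: load one exported insights CSV and list viewers who
# watched less than 80% of the video. Column names are assumed, not official.
df = pd.read_csv("video_insights_export.csv")

# The completion rate may be exported as text such as "75%"; normalise it to a number.
df["Completion Rate"] = (
    df["Completion Rate"].astype(str).str.rstrip("%").astype(float)
)

below_80 = df[df["Completion Rate"] < 80].sort_values("Completion Rate")
print(below_80[["Name", "Role", "Email", "Completion Rate"]])
```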

metrics.png

Third, usage metrics are key to understanding and comparing the consumption of different videos in a course, but they can also be crucial if such numbers must be reported to a governing body. Even though these metrics are only available for each media item separately today, they will serve as the basis of course-level usage analytics in the future. Let me walk you through the three new metrics; a small worked example follows the list:

  • Views: the approximate number of times viewers have interacted with the selected video, based on viewing patterns. Keep in mind that this is not based on an exact condition like YouTube's rule that a view counts once someone watches a media item for at least 30 seconds in one session. Educational materials are different, so we estimate the number of views from viewing patterns. It's a metric we'll constantly look for feedback on
  • Time Viewed: the total amount of time viewers spent watching the video
  • Unique Viewers: the number of distinct students who watched a video. Interpret this number together with the two metrics above and you'll get a grasp of how your video performs among students
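To make the relationship between the three numbers concrete, here is a tiny illustration with made-up figures. The derived ratios are not Studio metrics, just one way of reading the three together.

```python
# Purely illustrative numbers for a single video.
views = 120            # Views
time_viewed_min = 300  # Time Viewed, in minutes
unique_viewers = 40    # Unique Viewers

# Roughly 3 plays and 7.5 minutes of watch time per student who opened the video.
views_per_viewer = views / unique_viewers
minutes_per_viewer = time_viewed_min / unique_viewers
print(f"{views_per_viewer:.1f} views and {minutes_per_viewer:.1f} minutes viewed per unique viewer")
```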

charts.png

Fourth, the chart of the number of plays. Some parts of a video might be consistently skipped as seemingly not relevant to students, or the opposite: revisited multiple times when, say, a complicated math equation is explained. This chart aims to answer the question: how did students interact with a particular video? You will see peaks where students revisited a section for some reason, and dips where students felt those parts were less important for their learning.

Frankly, this is a big change for us too, and there are still caveats we will address later. One of the first will be a filter showing only students with active enrolments in the Canvas course. We are aware that viewership data of students remains visible even after they have completed a course, but that's something we already have on our radar.

 

So why is this a big change for us? 

Think about the last time you wanted to cross-check a student's performance with their video usage patterns. Was her average completion rate across all the videos in the course below 30%? Or was it high, and you wanted to share positive feedback with her to reinforce the habit of watching lecture videos? Or did you wonder whether you could structure your course differently so that students in general spend more time watching your presentations? Today, we can't give you a straightforward answer to these questions in Studio (though you can still gather the necessary data with some manual effort). But answering most of them will rely on the newly introduced metrics, so we must be absolutely sure we get this right. For that very reason, this is not only a redesigned Insights page, but the foundation of student-level, course-level and account-level media analytics.
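Until those course-level answers arrive, the manual effort mentioned above could look roughly like the sketch below: export the insights CSV for each video into one folder, then aggregate the completion rates per student. The folder name and the "Name" and "Completion Rate" columns are assumptions for illustration, not the exact export format.

```python
import glob

import pandas as pd

# Rough sketch: average completion rate per student across every exported video.
frames = []
for path in glob.glob("exports/*.csv"):  # one insights CSV export per video
    df = pd.read_csv(path)
    df["Completion Rate"] = (
        df["Completion Rate"].astype(str).str.rstrip("%").astype(float)
    )
    frames.append(df[["Name", "Completion Rate"]])

avg_completion = (
    pd.concat(frames).groupby("Name")["Completion Rate"].mean().sort_values()
)
print(avg_completion[avg_completion < 30])  # e.g. flag students below 30% on average
```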

One more behind-the-scenes improvement is tracking usage in places with poor network coverage, rural areas for example. In the past, viewership tracking could stop when the device lost its connection; sometimes it did not even resume for a particular video without refreshing the page. In January, the engineering team did some magic to gather viewership data in the browser while offline and send it through once the connection is restored. Yay!

 

Let us know what you think!

Akos

 


11 Comments
James
Community Champion

Akos ( @AkosFarago ) this covers many of the problems I had mentioned to you regarding the previous insights. Really impressive job.

Previously I had to download all this stuff myself through the API (one student/video per second) and prepare the reports and I still wasn't able to tell which portions of the videos were watched multiple times.

I love that it only shows the students who have watched a portion of the video.

I love almost everything about it except that I will still need to download all of the information through the API (if it's available) to figure out which students watched portions of the videos multiple times. That is, the thing you are calling "number of plays" is only provided for the whole class and not for each student. When a student tells me "I watched it multiple times", there's no way to independently verify that.

It would also be nice if the CSV export included information about the video. If a student watches 100% of a 30 second video or 50% of a 5 minute video, it looks like they've spent more time on the first video when that isn't the case. The CSV right now is largely useless to me.

Great first step.

Jeff_F
Community Champion

@AkosFarago - this is a very nice development. I tip my hat to the team.

Note that this page in the Studio Guide needs an update: https://community.canvaslms.com/t5/Studio/How-do-I-view-user-insights-and-analytics-for-Canvas-Studi...

When reviewing please make note of the initial sentence on that page as I believe this is not fully accurate:

If you are the owner of a video or audio file, you can view analytics and see how many users have viewed the video or audio file.

I believe that if a video or audio file is embedded in a course page, etc., and the Media Tabs are not visible, the course instructor, even if they are not the media owner, can access the media analytics by going to Studio > select the hamburger (settings/navigation) button > locate the course (newer courses are at the bottom of the list) > click to access the course's media files > click 'view' to open the media > and lastly, click the Insights tab. Am I correct?

Accessing Studio Course Media

AkosFarago
Instructure

@James - we found the individual engagement data difficult to represent in a CSV-like data structure. We are probably going to make this data queryable through the API in the future!

@Jeff_F - you have a valid point: you don't have to be the owner of the video to see these Insights. Let me reach out to the documentation team right away!

Feel free to send me a direct message anytime you have feedback from teachers who might not be active on the Community. Thank you!

James
Community Champion

@AkosFarago 

Even though individual engagement information is difficult to represent in a CSV, here are some things that could make it more helpful.

  • Identifiers. The filename contains the date that the insights were generated, but there is no course or video information in the CSV itself. Presumably, someone wanting to analyze this would want to combine multiple CSV files together to get a complete picture of engagement, not just for a single video. Right now, that would have to be manually added to each file and then copy/pasted down for each student. Studio uses video IDs and perspective IDs, not sure which one is more appropriate. Both would be mostly useless to the typical user (but then so is the Canvas ID that most CSV files from Canvas come with) but it's a unique way of identifying the video so results can be combined. The title is not necessarily unique.
  • The length of the video so that we can find the amount of time a student spent. Decimal minutes could be okay, I think more people would find that helpful than seconds (although I like the accuracy of seconds). As I mentioned before, percentages are not directly helpful without knowing the time. This would be the same for each student, but then so would the course information.
  • The time that the student spent watching the video. Again, this could be in decimal minutes. This is a big one that is missing right now. I looked at one of my videos and saw that at the 2:40 mark there was a huge spike in the number of plays (14.4 plays vs 8 unique views). My question, which I cannot answer, is whether that was six students who each played it twice or one student who played it 7 times. Adding the time each student spent viewing would not directly answer what happened at the 2:40 mark, but it would let me know whether multiple students struggled or just one. This is useless without the length of the video (well, a lot less useful: we could scan through all of the data looking for people with 100%, take the smallest time spent, and use that as the length of the video -- hoping that at least one student didn't replay any of it -- but it would be easier for you to just put the length of the video in there). A rough sketch of this kind of computation is at the end of this comment.

One could make the case that the individual binary completion graph showing engagement over time is less helpful than a number-of-plays chart for the individual student. If you showed the number of plays per student, we could still tell whether the video was viewed at all (a non-zero play count), and we could also see how many times each part was played, which would help answer whether a significant number of students struggled or just one.
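To make the request concrete, here is a rough sketch of the kind of computation this would enable. The column names and the video lengths are made up, and because completion percentage ignores replays, the minutes derived this way are only a lower bound, which is exactly the limitation described above.

```python
import pandas as pd

# Hypothetical video lengths in minutes, keyed by a per-video export file.
video_lengths_min = {"week1_lecture.csv": 15.0, "week2_lecture.csv": 22.5}

frames = []
for filename, length_min in video_lengths_min.items():
    df = pd.read_csv(filename)
    pct = df["Completion Rate"].astype(str).str.rstrip("%").astype(float)
    df["Video"] = filename
    # Approximate minutes viewed; replays are not reflected in the completion rate.
    df["Minutes Viewed (approx.)"] = pct / 100 * length_min
    frames.append(df[["Video", "Name", "Minutes Viewed (approx.)"]])

combined = pd.concat(frames)
print(combined.sort_values(["Video", "Minutes Viewed (approx.)"]))
```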

James
Community Champion

A question came up today as I was listening to someone play a video at 1.5 times normal speed.

You define "Time Viewed" as the total amount of time viewers spent watching the video. That implies that if a student watches a 15 minute video twice that it would register 30 minutes of time viewed.

The question is about the person who spends 10 minutes watching a 15-minute video. Does that count as 10 minutes or 15 minutes in the Time Viewed total?

AkosFarago
Instructure

Great question, @James

The "Time Viewed" is always representing the part of the video that was being played for rather than the actual time the user spent watching the video. If a student watches the 15 minutes long video in 10 minutes on 1.5x speed then it will be added as 15 minutes to the "Time Viewed" metric. 

It is probably worth updating the corresponding Studio guides with this.

Thank you! 

SpiceWang
Community Member

Thanks for the update @AkosFarago 

Will the team work on developing a single-page summary of the viewing data for the class as a whole?

I also wonder whether it is possible to have subfolders within a course so we can analyse the view data for different types of videos, e.g. views on videos for learning materials vs. views on videos for assessment explanations.

Thanks

AkosFarago
Instructure

@SpiceWang 

In the long term we'll certainly focus on course-level analytics; that was one of the main goals of this change. One possible approach is a single-page view, but let's talk about it once we get there!

imerricks
Community Participant

Hi @AkosFarago,

Colleagues have told me they can no longer easily identify which students failed to watch a video. That is, the list of viewers excludes those with a completion rate of 0%. Is that right? If so, is this something that could be rectified in future releases?

Kind regards, Isobel

AkosFarago
Instructure

Hi @imerricks,

Even before we implemented the new Insights page, there were some inconsistencies in the list of viewers: some students with 0% appeared while others who failed to watch the video did not show up at all. Your colleagues are correct about how we changed it. For now, we decided to implement a consistent solution where only students with viewership data appear.

Our mid-term plan is to regularly synchronise course data with media insights and therefore show all students with active enrolments regardless of completion rate (perhaps with additional filters). That will eventually provide the full picture with the course context and open up a lot of new possibilities for interpreting media analytics at the course level.

Thank you for calling this out!

Best,

Akos

imerricks
Community Participant

Thank you @AkosFarago for explaining the reason for the change; I will share this news with my colleagues. Otherwise, we are very pleased with these improvements.