How we're protecting privacy with the Canvas Skill for Alexa

The content in this blog is over six months old, and the comments are closed. For the most recent product updates and discussions, you're encouraged to explore newer posts from Instructure's Product Managers.

jared
Instructure Alumni


I've talked with some folks, both face-to-face and in the Community, about the new Canvas Skill for Alexa (included in our InstCon0017 Product Announcements) and how it might impact student privacy (e.g. FERPA in the USA). Privacy is a hot and important topic in modern society, so it's worth thinking through deeply.

 

tl;dr

Normal usage of the Canvas Skill for Alexa will not violate student privacy laws or policies, primarily because students have choices in whether and how they use the Skill.

 

--

 

Before we get into the question of privacy, let me share some background on the Canvas Skill for Alexa:


How the Alexa Skill fits The Canvas Way


We want all users of Canvas to be free to access Canvas through the devices and services that are a natural and important part of their everyday lives. This is why we integrated Canvas Notifications with email, SMS, and social media services like Twitter, giving the users themselves choice over how Canvas uses those services. This is why we've been progressive with native mobile apps and integrations with Google Docs and Office 365. We live in a connected, integrated world, and we believe people will use Canvas more -- and subsequently be more engaged in teaching and learning -- if Canvas is also connected and integrated.

 

That's our philosophy, but we are also committed to protecting users' privacy and helping customers adhere to laws and institutional policies as we design, develop, and deliver software. And I think that shows in how the Canvas Skill for Alexa works.


How it Works

As with every Canvas integration, Canvas passes only the minimum amount of data required to accomplish a user's request, when they make the request. Canvas does this securely and in a fashion that allows the user to revoke access to the third-party application at any time.
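The revocation mentioned above maps onto a documented Canvas endpoint: a third-party token can be invalidated with a single `DELETE /login/oauth2/token` call. Here's a minimal sketch in Python; the base URL and token value are placeholders, and the helper names are my own:

```python
import urllib.request

def build_revocation_request(base_url: str, access_token: str) -> urllib.request.Request:
    """Build the request that invalidates a single Canvas access token.

    Once Canvas processes this DELETE, the integration holding the token
    (here, the Alexa middleware) can no longer read any data on the
    user's behalf.
    """
    return urllib.request.Request(
        url=f"{base_url}/login/oauth2/token",
        method="DELETE",
        headers={"Authorization": f"Bearer {access_token}"},
    )

def revoke(base_url: str, access_token: str) -> bool:
    """Send the revocation; True on a 2xx response."""
    with urllib.request.urlopen(build_revocation_request(base_url, access_token)) as resp:
        return 200 <= resp.status < 300
```

Of course, no student needs to write code for this: the same revocation is available in the Canvas UI under Account > Settings, where approved third-party integrations can be removed with a click.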

 

The Canvas Skill for Alexa is a new service that we've built and will maintain, acting as "middleware" between Alexa and Canvas. When a user asks the Canvas Skill a question, that service asks Canvas via our existing open API. There are no new, secret endpoints, and there are no direct hooks into Canvas. The Canvas Skill uses the same method of getting data from Canvas that our mobile apps use and, indeed, that many core Canvas functions use as well.
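To make that middleware pattern concrete, here is a sketch of what such a service could do with a "what are my grades" question, using only the documented courses endpoint of the open Canvas API. The handler and response shapes are illustrative assumptions, not the Skill's actual internals:

```python
import json
import urllib.parse

CANVAS_COURSES_ENDPOINT = "/api/v1/courses"

def grades_query_url(base_url: str) -> str:
    # The same documented endpoint the mobile apps can use;
    # include[]=total_scores adds the current score to each course record.
    params = urllib.parse.urlencode(
        {"enrollment_state": "active", "include[]": "total_scores"}
    )
    return f"{base_url}{CANVAS_COURSES_ENDPOINT}?{params}"

def speakable_grades(courses_json: str) -> str:
    # Reduce the API response to the minimum the user actually asked for:
    # course names and current scores, nothing else.
    courses = json.loads(courses_json)
    parts = []
    for course in courses:
        score = course.get("enrollments", [{}])[0].get("computed_current_score")
        if score is not None:
            parts.append(f"{course['name']}: {score}")
    return "; ".join(parts) or "No current scores found."
```

The point of the sketch is the shape of the data flow: one authorized request to an existing public endpoint, then a spoken answer carrying only the requested fields.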


Student Privacy and FERPA

Does the Canvas Skill for Alexa violate student privacy laws, specifically FERPA in the US?

No. We can discuss different laws or privacy protections in different regions case-by-case, but I think FERPA is a great starting point. FERPA says that parents or eligible students have the right to...

  1. review the student's education records
  2. request that a school correct records
  3. approve the release of education records (with some exceptions)

 

The Canvas Skill for Alexa doesn't touch items 1 or 2, and it does not release education records (nor enable anyone else to release them) without the student's express permission. We designed the Canvas Skill so that...

  • Users must both choose to use Alexa and also enable the Canvas Skill, making them the agents of any data exchange.
  • Users have the power to disable the Canvas Skill and even revoke the Canvas Skill's access to Canvas at any time.
  • Users have other options of accessing the same information without the Canvas Skill (browser, mobile, etc).

 

I know this is where we get into some nuance, so let's explore a hypothetical scenario:

A student has an Echo and adds the Canvas Skill to their Alexa account. Anyone (roommates, friends, colleagues) who is in proximity of that Echo can then ask Canvas about that student's grades.

Yeah, this could happen. But though Alexa may be new technology, this hypothetical problem isn't itself new. And there are ways to avoid it:

 

On the one hand, I think it's fair to expect the owner of the Echo in that scenario to have thought about this on their own. If they are concerned about this, they can simply choose to not use the Canvas Skill for Alexa. If they've already enabled it, they can at any time disable it or revoke permissions.

 

Even if students do choose to use Alexa in that situation, there are options that can help students maintain control, e.g. using an alternate wake word, securing their Echo when others are around, even if just by muting the mic. 

 

I think the Alexa hypothetical is not very different from a student who has left their computer on and logged in to their campus portal in a place where others might see or access it. Let's go low-tech, too: Students may leave a printed copy of their transcript on their desk where others can see it. Have you ever walked into a Starbucks and seen a student looking at their grades on their laptop? That's a less likely scenario, perhaps, but one that deserves similar consideration.

 

In short, students have control over if and how Alexa is used. They also have the same personal responsibility to protect their data and information that they would with all other media. This is also why it's important -- for education staff, for institutions, even for teachers and students -- to engage with the companies and organizations that offer these services when we are concerned about how our data is being used. It's true that the Amazon Echo challenges us to think deeply about personal privacy and how we interact with information and our online accounts. The way the world is changing, we need the kind of discussions we're having here : )


11 Comments
kyle_johnson
Community Novice

We've been discussing this on Twitter.  While I appreciate this post, I think it's incomplete, as it doesn't address faculty enabling the service.  In addition, I think saying "students are opting in, so we're fine" is maybe technically true but ethically insufficient.  Here are some thoughts I had when this was first announced:

Big Data, Little Ethics

kmeeusen
Community Champion

I actually strongly agree with @kyle_johnson on this.

As a privacy advocate I believe in informed opt out, rather than opt in, and I think this product should come with a strong advisory disclaimer describing the privacy risks associated with the use of this product.

This really sounds too much like the same kind of sideways rationalization and illogical spin-control reasoning we hear from too many IT tech companies these days. Instructure is not just a one-of-the-pack IT company, and it should do better than this.

I actually like this product, and plan to use it, but my students will be very well informed in advance so that they can choose wisely before playing.

Kelley

scottdennis
Instructure

Hey Kyle,

For more specific information on what teachers can do, please see: Digital Assistant: Canvas Skill for Alexa 

You will also find additional resources here: https://community.canvaslms.com/community/instcon/2017/blog/2017/08/08/instcon0017-product-announcem... 

jared
Instructure Alumni

Thanks for the feedback, Kelley. I like your distinction between informed opt-out and simple opt-in.

My explanation isn't just an attempt to rationalize this or spin-control; we did think through these issues and understood that there was no perfect answer as we considered different student use scenarios.

That said, we will figure out how to get better at ensuring students are well informed about potential pitfalls in using the Skill, whether that's simply communicating more directly during Skill installation or adding another layer of protection with a PIN capability.

jared
Instructure Alumni

Hi Kyle, I read your post. A few quick responses:

  • Even though it did originate as a Hack Week project, the Canvas Skill for Alexa was in fact managed by a Canvas product manager, and we did have very direct conversations with the Alexa team around data privacy, FERPA, etc.
  • I also agree that we can do better about informing students of what happens with their data. Even if we believe it to be both legal and ethical, yes, students should be well informed.
  • The faculty capabilities are deliberately more limited specifically because we judged some privacy-related issues at greater risk when faculty use the Skill.

Happy to talk about this more. We know that this kind of new technology presents new challenges, and so we have weighed, and will continue to weigh, the benefits against the risks.

Thanks for your input,

Jared

jbrady2
Community Champion

@kyle_johnson I found your post very informative. I saw the Amazon Alexa tent at InstructureCon this year, but I did not fully understand the services that they are now offering or the implications thereof.

One question that your article made me consider: since, to my understanding, Alexa does not perform voice recognition (I believe this to be true, as my girlfriend uses the Alexa I have at home even though it is tied to my Amazon account), anyone who gained access to a student's Alexa could query that student's Canvas data without the student's permission. If this is true, then it is imperative that the student fully understand what he or she is opting into by enabling these features.

jared
Instructure Alumni

Yes, that's true. I think we all agree that there are some risks associated with this kind of technology, though we may disagree on the severity of the risks vs the benefits for students.

But as I suggest in the post above, we will be looking for ways to increase student control over access to their information, whether by adding a PIN feature or student toggles for data delivery -- several ideas for stop-gaps until voice recognition is perfected and delivered in Alexa : )

We do also like the idea in this thread of at least adding some additional language to the Skill itself in the Skill store to increase awareness, and we will make that happen in the next update.

kmeeusen
Community Champion

Thank you @jared

These are very thoughtful responses. I know I was a tad direct in my earlier responses (okay, I was a total curmudgeon), but this is a federal law that educators and educational institutions are mandated to support, enforce, and inform students about. While I can advise our faculty to inform students of the risks associated with this Alexa Skill, I cannot mandate them to do so, and the reality is that many won't. Let's face it, many won't even open or read my advisory. Communicating it through Canvas itself is also problematic.

So I really do appreciate these new responses, and especially this:

we will be looking for ways to increase student control over access to their information, whether by adding a PIN feature, or student toggles for data delivery -- several ideas for stop-gaps until voice recognition is perfected and delivered in Alexa

Thanks, Jared!

jared
Instructure Alumni

Based on your constructive feedback, we've just updated the Canvas Skill for Alexa so that users can set a personal PIN for greater security. The Canvas Skill will then require that a user provide that PIN before Alexa answers any requests.

It's pretty cool. Hope this helps!
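The Skill's actual internals aren't published, but the PIN gate described above can be sketched as a check that runs before any intent is fulfilled; the function names here are hypothetical. A constant-time comparison is used so the check doesn't leak information through response timing:

```python
import hmac

def handle_request(intent: str, spoken_pin: str, stored_pin: str) -> str:
    """Refuse to fulfil any intent until the user's PIN matches.

    hmac.compare_digest takes the same time whether the PINs differ in
    the first digit or the last, which avoids a timing side channel.
    """
    if not hmac.compare_digest(spoken_pin, stored_pin):
        return "Sorry, that PIN doesn't match. Please try again."
    # PIN verified: proceed to fulfil the original intent.
    return f"OK, fetching your {intent} now."
```

In this shape, a roommate within earshot of the Echo can still invoke the Skill, but gets nothing back without the PIN, which addresses the shared-room scenario discussed earlier in the thread.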

robmoores
Community Novice

Alexa can now recognise voice profiles

scottdennis
Instructure

If you are following this topic, you might want to check out the blog post posted today:

https://www.instructure.com/canvas/blog/data-privacy-our-current-and-future-commitment 

Thanks