ORIGINAL ISSUE, AS IT STILL STANDS — The Canvas Google Apps LTI authorization is failing for me in Chrome in one course (but not others), and faculty have observed the same. Anyone else? Any workarounds? A chat with Instructure support helped determine that it worked in Firefox but not Chrome. They suspected it might have something to do with being logged into multiple Google accounts. The agent's boss suggested that the Google Apps LTI might need to be set up again (by our admin).
John Martin wrote:
And forgive me for hoping that the Community forum was a place that Instructure monitored, or that student accessibility was a thing they cared about.
I'm sad to read this. I can tell you from my own experience that Instructure employees DO monitor many parts of the Canvas Community, and they do respond. (Personally, I've received feedback from Community Managers, Project Managers, and other Instructure staff on many of my postings, including questions here in the Community and Feature Ideas that I've proposed.) For example, I can see that you've been participating in a recent conversation, "Idea System Broken?". Any time you see a red panda icon (and, I believe, a gray panda, too) next to someone's name, that person is an Instructure employee. I'm not sure if you've had a chance to read email@example.com's response in that thread (also found here: https://community.canvaslms.com/thread/16197-idea-system-broken#comment-67091), but at the very end she says "We read them all" (speaking of Feature Ideas).
Regarding this "Find Answers" space of the Community website, it's been my experience that, from time to time, Instructure employees will chime in here, too. And, as you've noticed, there are many, many people (non-Instructure employees) who will reach out and help if they can provide input. That help may include things like links to Guides, how-to steps, and workarounds. Now, there are also times when a question like yours may sit unanswered for quite a while. In those cases, it's not necessarily that Instructure employees haven't seen the question; it's just that there may not be many people in the Community who have a solution at the time. In those cases, I've come back to those threads maybe a month or two later to check in and say something like, "It looks like you may have stumped the Community with your question. Have you come up with a solution on your end since your initial posting?"
Instructure takes accessibility very seriously in their development process. Check out firstname.lastname@example.org's comment here: https://community.canvaslms.com/thread/16197-idea-system-broken#comment-66970
I hope that you will continue to come back to the Community whenever you are seeking assistance, John.
Thanks, email@example.com. I do know that Instructure employees monitor the forums. They're put in a terrible position by their management: they need to placate posters but apparently aren't empowered to change the Canvas UX.
What Canvas decision-makers (as opposed to the forum monitors, who can't do this) don't do well enough is share evidence that issues are being taken seriously. When my faculty members run into significant issues and post on the Community forums (as I encourage them to do), and their questions and issues are ignored for months, they feel unheard, ignored, and abused by an uncaring system.
Here are three things Instructure can do to alleviate this:
I'll stay on the forums. I tend towards optimism even when I express the pessimism I hear. I see the forums as a great way to find workarounds and tips and hints from other users. I am not finding it to be a place where Canvas decision-makers listen and act.
Let me assure you that we do read the things people post in the forums and we do work together to address issues. We highly value these open and honest interactions in the community. We have ample evidence that Canvas as a product and Instructure as a company are better because of it. As an initial example, I worked with both our Director of Product and the related Product Manager to gather the information for this response. I’ll start by saying...we agree that the issue you have identified should be better and we are working to make it better.
First, a bit of context…
As you know, every three weeks we release new updates to our code that can include:
...all three largely directed by the input we receive from users through various channels.
We released the new Google LTI integration with the January 7 release. In the four releases since then, we have shipped 14 user-reported bug fixes and enhancements to the Google LTI.
Part of our methodology is to get functionality into the hands of users and then iterate based on user feedback. It’s a constant challenge to find the right balance between releasing too soon (not enough refinement) and waiting too long only to find out it’s not right and having to start all over again. For this priority (a larger project) specifically, it’s important to keep in mind that it is a collaboration between Instructure and Google (we also worked on the Microsoft 365 integration priority at the same time). Because this priority is a collaboration, iterations become necessarily more complex than they would be were we writing all the code ourselves. However, we’re confident it’s worth the extra effort because the possibilities with combining Instructure and Google functionality are super exciting.
Our long-term plan is to replace the older, more-limited Google Drive integration (aka Docs) with this new LTI that more fully takes advantage of Google’s capabilities for education. The expectation in January when we initially released this LTI (please see the January 7 release notes) was that it was good enough to add value for many users and could be used as an option for those interested in trying it and providing feedback. For this priority, that’s the balance I mentioned earlier.
With the continuous release of fixes for user-reported bugs and of enhancements to date, we're making progress, and we continue to review feedback and act on it. As it happens, just today our product manager for this LTI was meeting with her counterpart at Google about the very authentication issue you referenced. They are both actively working with their respective teams to determine where the issue resides. The number of possible scenarios across our two systems, combined with browser variation, means there are many possibilities that must be considered and tested. She also mentioned that another aspect we want to improve is making our error messaging more explicit, such as for problems related to multiple Google accounts, in case we can't solve every possible scenario.
As I mentioned before, this is an ongoing priority. As we work through the issues and get to solutions you will be able to read about them in the release notes. I hope this explanation gives you a better picture of what is happening behind the scenes.
Thank you, Scott! This is a fantastic example of the kind of information I had hoped to see in February or March! I know employees read the forums, but I kept hearing from level 1 and 2 support that it worked fine and was working as designed. I felt brushed off and wasn't sure how to respond to faculty who were complaining to me about it. I couldn't even say, "Yes, Canvas knows about it and is doing something about it; please be patient." I *had* been telling faculty that the Community and Canvas were responsive and listen, but the evidence for that had dried up. And I couldn't make the case at all to our Computer Science folks, many of whom were on here and left in frustration (my sense is that they know which changes are easy and which are not).
Anyway, I appreciate the information you shared, and I really do appreciate that Canvas has people who pay attention to their users. Naturally, I want Canvas to be even more forthcoming and responsive than it is. Thanks again!
Reading your reply this morning, it occurs to me that 'functioning as designed' and 'definite room for improvement' are not necessarily contradictory. After talking with multiple people in Product on Friday and Saturday, I know that we are continuing to develop this integration. As for communication and transparency, we put a high priority on both, and there is always room for improvement.
Per your request, I have removed the "Assumed Answered" label from this question for the time being. However, just so you know, my statement above from April 19th would still hold true even if we kept the "Assumed Answered" label:
Because there hasn't been any new activity in this particular discussion topic for over a month, I'm going to mark your question as "Assumed Answered", but that won't prevent you or others from posting additional replies below.
But I do understand why you would like it removed for now.
Thanks for understanding, Chris! It still shows up as "Assumed Answered" for me. I worry that people will see this thread and, per the label, assume it has been answered rather than assume that it has been 1) hard for Instructure to solve, 2) not deemed worth Instructure's time to solve, or 3) simply ignored by the fix-makers. "Assumed Answered" isn't nearly correct. Maybe other labels could be developed and used?