Canvas Release -- Sour Notes

Community Member

Before I begin this blog, I do want to send huge kudos to erinhallmark who kept me sane through a more-than-trying Canvas release week.  Erin, you are a hero who stuck with us and helped throughout the week. 

However, I must articulate my frustration and concern over the latest Canvas release – which I can only consider a debacle.

I spent almost every waking hour of last week with my finger in the dike, so to speak, trying to help users at all four of the institutions I support. It seemed that every time we identified an issue and created a workaround, another issue sprang up. Although engineers worked diligently on resolutions, the issues were too far-ranging and too impactful, affecting both old and new gradebook users as well as students. Not one communication about the new release gave any indication whatsoever that students (or old gradebook users) would be impacted in any way, and yet they were. We are still waiting for a resolution to the broken Plagiarism Framework, which is not working as I write today.

I realize that sometimes things do not work out as planned, and I have a very strong heart for the challenges of software engineering. However, I need to articulate some issues that I truly believe Canvas must address if they wish to maintain their customer loyalty.

  • Listen to your Beta Testers. Reading back through the comments from beta testers (https://community.canvaslms.com/docs/DOC-16958-canvas-release-notes-2019-07-13), the issues with this release were identified well in advance. Tester kmatson@uoregon.edu even said, “Why did you make muting grades so complicated? It was simple: they are on or off. Now we have so many layers we can't figure it out. Please remove this feature.” Why did no one really listen to the feedback received throughout the beta testing process? What purpose does beta testing serve if the comments you receive are not taken seriously? Folks even knew there were problems with the Plagiarism Framework. Why did this release get pushed to production when issues were not fully addressed? Is there anything that can stop a Canvas release?
  • Do NOT go dark when being transparent is truly important. How frustrating it is when avenues for communication are closed or hidden. You can see from the Release Notes site that commenting was turned off. The ostensible reason was that people needed to submit tickets. What that actually did was close off a single avenue of communication and force comments out to myriad different threads, so the right hand did not know what the left hand was doing. This was exactly the wrong thing to do. There needed to be a single place where issues were uncovered, workarounds posted, and communications recorded. Your greatest ally in service is this community, and you sent us all out on wild goose chases trying to figure out what others were finding and get answers to questions. Again, I can only shake my head and wonder what you were thinking. This is not the transparency one would hope for.
  • Be open to the WHY question. I feel very strongly that Instructure is operating on some hidden logic rather than using the community to determine direction. This latest release does not seem to have added any value to the student or instructor experience, and it turned a simple and elegant piece of functionality into something convoluted and ungainly. It is very hard for us to understand WHY this change was more important than the myriad requests voted up by the community that never seem to get done. Just explain yourselves. I don't even care what the reason is; I just want to know that someone at Instructure is actually thinking about whether what they are doing makes a positive impact on the user experience in any way.
  • End-user test your customer communications. I do not know who writes the notes called “Canvas Status Updates” that I receive in my inbox. What I do know is that these communications rarely articulate the issue in a way our end-users could possibly understand. For example: “Allow instructors to grade in the submission details page with manual post policy.” There is no way any of the users I actually work with will understand what this means. I have to go through each item and explain exactly how it impacts them. For my users, something like this would work far better: “The grade entry box you used to see on the submission details page is now restored. You may once again enter grades by clicking on a student's assignment” (probably with a screenshot). I think this relates to the post I saw suggesting that Instructure would be well-served by having its customers provide user stories. These updates seem to have no connection to the user stories surrounding these issues, and therefore I have to spend large amounts of time translating and explaining.

 

I still believe that our organization made a good decision in moving to Canvas. I just want Instructure to realize how disturbing things are looking to some of us out here working with real students and real professors and trying to hold our fingers in a dike that is growing increasingly scary. 

Please, others, reassure me by commenting on this blog.

Image: Photo by Nery Montenegro on Unsplash

21 Comments
Surveyor

Well put.  We switched to Canvas a year ago, and we love it.  However, this past week has made us hope that the nature of the company and customer support is not changing.  Beta testers were ignored, and nothing extra was sent out to warn us of the implications of the change.  We were not prepared, and it hit us during finals week for the summer term.  Not happy.

Learner II

Thanks nancy.lachance@adtalem.com, this captured a lot of what I was feeling and dealing with over the past week. I also second the kudos for erinhallmark; her comments were lifesavers and helped us stay on top of what was going on.

I agree that having a central location for communication would have been really helpful. One area that comes to mind is the Known Issues list that school admins have access to; I don't know whether this was used in the communication strategy here. For the most part, I felt like I was jumping between different threads on the community while playing "whack-a-mole" with different issues as they appeared.

One useful technique we use when responding to major incidents is to split out the roles of the people involved, where one person leads the technical/operational response (working with developers, server admins, DBAs, vendors, etc. to get things fixed) and a second person leads communication about the response. The communications lead is in charge of communicating in every direction across the organization (upwards, outwards, downwards, laterally, to end users, etc.) and is in the loop on all of the technical/operational aspects of resolving the problem. This division of duties helps the technical/operational people focus on fixing the problems, while the communications lead fields questions about status updates and connects information coming from end users with those responding to the incidents.

Community Member

Thanks dave-long@uiowa.edu. This is why I love the community - such helpful and proactive suggestions. :)

Community Member

YES! YES! YES! ♥

I cannot stress enough how much I agree with EVERYTHING stated in nancy.lachance@adtalem.com's post! I'm not sure why this wasn't rolled back immediately when the first "problems" were identified, instead of allowing the problems to continue and compound. We're still dealing with fallout, and as far as I'm concerned, the problem has not been fixed. The so-called "solutions" (having to manually adjust every assignment/quiz) are not manageable for faculty.

I agree that Canvas is a good LMS choice; however, I hope this situation is treated as an opportunity to learn and doesn't become the standard practice for problems of this significance.

Community Advocate

I must say that I completely agree with this. Beta testing--especially during the summer for those of us in the northern hemisphere--may not be as complete as at other times due to limited numbers on some campuses. But even with the likely limited numbers doing active testing, the fact that b.w.reid@ljmu.ac.uk, bless her, submitted a total of 7(!) cases indicated that things were clearly amiss. On the other hand, thank goodness this wasn't done during the fall term! Indeed, kudos to erinhallmark for being on top of this, but clearly this release should have been put on "hold" for a bit given the number of issues that were brought forward. As Nancy said: "Listen to your Beta testers."

Surveyor II

Wow, this is a disturbing thing to see as we begin our Canvas implementation and move away from Blackboard. A big reason we are switching is because Blackboard didn't test their updates and new releases well; in fact, their updates frequently "broke" our production instance.

Please, Canvas. Don't become the next Blackboard.

Adventurer III

Spot on, Nancy.

For the past year or so, I've felt Canvas has done a good job with the production releases and roll out of new features with limited issues and breakage. In the year prior (my first year with Canvas), there were several production updates that caused an unforeseen break somewhere.

Right now, we are fortunate that we're only running a hundred or so summer school courses in Canvas and have not seen any impact from this change. Through the chatter last week, I came to understand that there were a lot of issues in production at other institutions. Given the extensive feedback in the release notes regarding this feature, it's frustrating that it was promoted to production. I would say that the gradebook is the most critical part of Canvas for end-users, and Canvas must be extra judicious when promoting feature updates that affect it. But really, whenever there are that many concerns from your engaged admins about any new feature, it's time to reconsider whether it's ready to go into production, regardless of the part of Canvas that's impacted.

I also disagree with the decision to turn off commenting on these release notes. For users like myself that were engaged in or following that conversation, we were left with unanswered questions, and were in the dark as to what other users were encountering in their testing. 

Community Advocate

Such a well-written post, which very eloquently expresses the frustrations of many. It comes in a week when many community members felt very disappointed at the latest decisions made concerning a number of long-standing, heavily voted-for/commented-on/popular Ideas.

The strength of Canvas is the Community. The Community is largely made up of educators and instructional designers willingly giving up their time (and sharing their resources) in support of others, and it is important that communication shared with them is transparent and timely. Sometimes there does seem to be a disconnect between the ideas and thoughts of the Community and the developments/improvements made. As someone who has spent over 15 years with a number of different (open source) learning platforms, I can say Canvas is a great product and its Community second to none. It can be even better, and will only be so by listening to and engaging with the Community.

Community Team

Comes in a week when many community members felt very disappointed at the latest decisions made concerning a number of long standing, heavily voted for/commented on/popular Ideas.

Gideon.Williams@britishschool.nl

Just wanted to clarify that in no way has any of our process changed for noting ideas that are prioritized for the next quarter or not.  The same process occurred this week as has occurred since the Adaptation: Feature Idea Process Changes‌ blog post of last fall.

The only difference is that the Community platform now sends out a notification whenever an idea is edited.  It's a lot of manual edits that we do each quarter in order to be very transparent - so it does feel overwhelming (for all of us) to get each one of those as a notification.

Community Team

Hi, everyone,

Thank you for sharing your thoughts with us. We recognize this release was definitely abnormal, and we have already had several discussions about what went wrong—and are still discussing. :( We have called this release our perfect storm—too many things going on at the same time, including InstructureCon. In retrospect, yes, we should have done some things differently. We cannot take back what happened, only try to make things right the best we can. Situations like these are exactly what we want to avoid in the future. I'd be happy to address a few of the questions presented in various replies.

Re: Beta testing. Each ticket noted in the comments for the last release was triaged. The Community is only one avenue through which we receive feedback, so as helpful as receiving several cases from one amazing person is, it doesn't show the breadth of the potential problem (which is why we usually encourage multiple people to report a concern, even if one person already has).

However, most of the cases related to information in the release notes that caused confusion about the intended functionality—we concluded the information was incorrect and immediately updated the details in the release notes. Most of the concerns related to the behavior of the visibility icon for assignments using the manual post policy. Additionally, there was a concern about the Total Grades column being hidden, which was ultimately corrected by our engineers, and one of the student view concerns turned out to be caused by custom CSS. No other tickets had been brought to engineering's attention by support or the community in numbers sufficient to warn us about what would ultimately happen with the deploy from the beta to production environments. One lesson learned is that the code base needs extra scrutiny when various teams are working together on related projects, such as when quizzes interact with the Gradebook. Our leadership teams have taken note of this experience for future communication and code improvement.

Re: Release notes comments. Release notes are designed to announce the functionality coming in the release, and we do keep comments open to capture feedback about your thoughts. To help channel comments, we hope they'll go in one of three directions. The first is the comments themselves, where you can tell us if you love or hate something. Second, for ideas you have for enhancing an introduced feature ("I like the concept but X and Y would also be cool"), we direct you to the Ideas space. And third, for contrary behavior—aka bugs, where features are not cooperating as intended—the best outlet is our support team, as cases not only channel you to the people employed specifically to be on the front lines of helping you out, but also provide a more accurate count of how many customers are actually being affected by a specific use case. More importantly, we don't have adequate means to triage bug reports via release notes (support cases also include helpful information such as your institution, course URL, and other important details that we'd never ask you to post publicly in the notes). Having said that, we realize that if we've set up an intended workflow path that doesn't seem to be followed naturally, we probably need to revisit that idea! So release notes comments are now on that list.

Additionally, please know that closing comments is a rare event. We infrequently close release notes, and the only reason we do is a limitation of our platform where comments become unthreaded after 100 replies. We've learned that having comments unthread is more disruptive to everyone involved, so when a thread gets close to 100, we pay particular attention. The direction in the comments to file support cases was intentional, as support cases will always give you updates on what is happening, but we soon recognized that this situation was different and immediately changed our internal communication efforts. Again, this is a learning experience we'll apply to our future communication plans.

Re: Communication. David, I appreciate you sharing your communication strategies. This situation has raised a new level of concern, and we are already reviewing and creating a new plan for communication, both internally and externally.

Let me share a few things previously in the works and how our new plans will apply:

  • Status page. A month or so ago, we began a project to renew the status page, which will include domain-specific historical status reports for specific services. We don't have a timeline for the completion of that project, but it is in process. Currently the status page is only used to note outages to the platform and is managed by our support team, but we now want to incorporate the status page into our communication strategy for situations such as this (though, we hope, infrequently).
  • Release notes page design. As part of updating our release process, our Release Notes page will be getting a facelift to differentiate information for both releases and deploys. That change will happen next Wednesday, July 31.
  • Release data protection. We are encouraged that our new Canvas release schedule and feature flag strategy will help us be more aware and specific as an organization about features and how they are functioning before they are announced. Feature options as they existed before July 13 worked fine for most use cases, but our learning with them has evolved. With our new strategy, our teams will be able to manage them more specifically and strategically. If we had had the new flag capability available five months ago when the Grade Post Policy was initially built, our engineers would have been able to turn that flag off immediately after we realized the Gradebook was amiss. However, in the existing state of feature options, enabling that feature also required our engineers to run backend migrations and data changes that could not be pulled back, which is why so many additional fixes were needed to correct what had occurred. The new release strategy offers more protection for your data and allows engineers to turn flags off just as easily as turning them on.
  • Release communication. The Community team is making some adjustments in our structure so I can give more time to product and engineering and be more involved in our feature processes earlier. Our teams also intend to introduce more beta testing opportunities to ensure proper rollouts. As a company, we will ensure additional preparation for features, both internally and externally, and we have plans for additional resources to help everyone feel more prepared for feature rollouts. We'll also work more closely with our support team to know what concerns are being reported so we can see how to help as well. And, as I also noted above, we'll be revisiting how to better manage release notes comments and give you the feedback you need to help avoid ever approaching 100 comments! We don't want anyone to feel that we aren't listening because of that infrequent limitation. I'll announce updates to all of these lofty ambitions as they're available. :)
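The data-protection point above can be sketched in a few lines (a purely hypothetical illustration; the names and structure are invented, not Instructure's actual code). A feature gated only by a flag keeps the old code path intact, so disabling the flag is a complete rollback; a rollout that also ran irreversible data migrations cannot be undone by the flag alone.

```python
# Hypothetical sketch: why a pure feature flag is safer than a
# migration-coupled rollout. All names here are illustrative.

class FeatureFlags:
    """Minimal in-memory flag store; a real system would use a
    database or configuration service."""

    def __init__(self):
        self._flags = {}

    def enable(self, name):
        self._flags[name] = True

    def disable(self, name):
        # Reversal is trivial: no data was changed, so turning the
        # flag off instantly restores the old behavior.
        self._flags[name] = False

    def is_enabled(self, name):
        return self._flags.get(name, False)


def post_grades(grades, flags):
    # The new behavior is gated behind the flag, and the old code
    # path is left intact, so disabling the flag is a full rollback.
    if flags.is_enabled("grade_post_policy"):
        return {"policy": "manual", "grades": grades}
    return {"policy": "legacy", "grades": grades}


flags = FeatureFlags()
flags.enable("grade_post_policy")
print(post_grades([90, 85], flags)["policy"])   # prints: manual
flags.disable("grade_post_policy")
print(post_grades([90, 85], flags)["policy"])   # prints: legacy
```

The key design property is that both code paths stay live, so the flag can be flipped either way at runtime with nothing to unwind; a flag that also triggers a one-way data migration loses exactly that property.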

Thank you again for your feedback. It makes us continually push to be better, and we want you to keep having confidence in us and our commitment to support you. Please let me know what other suggestions you may have at any time in this post or via direct message if you're more comfortable.

Thanks!

Erin

Explorer

Nancy,

I totally echo your comments. We spent a significant amount of time testing last week and raised a question about the TII issue early on, but were told: "Post policies doesn't affect any current behavior associated with LTI tools." We use the TII framework tool alongside the TII LTI, and we have previously been in communication with Canvas and TII about the issue of muted assignments and students' ability to access their originality reports while assignments are muted. We weren't able to test this until the system was released. Do Canvas not have the ability to test against LTIs in their instance?

 

We felt blindsided by the release and were unsure how the new grade post policies improved the previous workflow. I understand this came from a request to mute for sections? We don't use sections within our institution, and if we could, we would switch this new functionality off via a feature option. Changes around assignments and grades are high impact, and we were baffled as to why this had not been flagged with users earlier, before it appeared in the release notes; advance warning of changes like this is critical to impact management. If, as Erin points out, this was a particularly busy period in the Instructure calendar, why was more thought not given to postponing the release?

 

After commenting on the thread was forcibly closed, we also felt cut off from discussing issues with other institutions. Each issue we encountered was diligently reported to support, but we also used this space to communicate with other institutions to confirm problems were not local before raising them with Canvas support. If comments hit a 100-reply maximum, that's a good indicator that your users aren't happy and the functionality should be pulled or postponed.

 

I think we must have spent about five days testing last week. A mixture of issues and misinformation in the release notes caused us to raise 9 separate tickets. Some of these turned out not to be issues, but I still feel that grade post policies adds a really confusing layer to what used to be a really simple workflow. We moved to Canvas because it was intuitive and easy to use; we don't feel that grade post policies aligns with this ethos.

 

We have generally been happy Canvas users to date, but we are customers, and our understanding was that beta testing was designed for us to explore new functionality and fixes; we were not aware that we were to be employed as full-time beta testers. We only had four days to test this before release because it was unexpectedly pulled.

Bethan

Community Advocate

Well said nancy.lachance@adtalem.com‌ and all,

Problems happen, we all accept that, it's how they are dealt with and what's learned from them that makes the difference.


Would it be useful, going forward, if the known issues section of the New Feature User Group was updated to include reported issues? For example, the issues over the past week could have been added to the New Gradebook Users Group known issues list (which states "The New Gradebook doesn't currently have any known issues notable to the community" and was last modified in Dec 2017).

Again, thank you erinhallmark‌ for all your efforts keeping us up to date (as best you could) with the issues.

Thanks,

m.mccooey@qub.ac.uk

Explorer

Hi m.mccooey@qub.ac.uk‌,

I'd totally agree; it's how Canvas moves forward that counts. We can all appreciate that mistakes happen from time to time. Thanks for putting forward your suggestion, but I'm not sure how easy it would be to manage different 'new features' issues across multiple pages for a particular upgrade. Communication needs to be simple and straightforward, and I'd be concerned that conversations could get lost or diluted across multiple locations. I'm guessing the 100-reply limitation would continue to be an issue within the community regardless of where the conversation sits? I'd maybe recommend fixing this bug, or perhaps opening a new page specifically for the upgrade where comments can continue.

Thanks b.w.reid@ljmu.ac.uk

Community Member

m.mccooey@qub.ac.uk -- Using the "Known Issues" area to form a community response to a release is a great idea. When we have high-impact releases, we generally take a "War Room" approach that allows us to manage feedback and issues. I am not sure every release needs this, but I love the idea of having a good way to form a community response to a new release that has not gone as well as planned, with Canvas having a place to keep everyone informed as issues are resolved. I know that tickets are critical and are a way to provide concrete information to those working on an issue, but having a community that can put all of this into user-centric words is also critical.

Surveyor

We have been using Canvas for 2+ years, counting implementation time, and this is the first issue we have had with any of the updates.  I guess that is why this issue caught us by surprise -- normally it is very smooth, good communication, etc.  I would not worry about it moving forward, but we are paying much closer attention to the release notes!

Surveyor

Would it be possible to bring beta release notes back? For the limited time Post Policies was in beta, I tried to submit several tickets based on assumptions we had from conversations with our CSM and Product folks. The support team had no documentation and dismissed my issues because they said there was no expected behavior without the documentation. 

Community Team

Hi, Diane,

My previous comment in this thread noted that I'm going to be helping teams internally prepare better in advance for feature changes. We'll be adjusting several of our internal processes to enable more advanced preparation for support in particular. Again, post policies was not exactly our best feature rollout and we're working hard to make changes to ensure we do not repeat that type of situation!

Hope that helps,

Erin

Learner II

Wonderfully written post with lessons for Instructure and any ed tech company. Canvas QA has been reliable for us since our migration began in 2017. It's clear here that internal deadline rules were bent to ship features in a rush. This has put Grades into a worsened state that cannot be reverted. We were planning New Gradebook workshops in time for September and will need to waste time going over more "gotchas" than features. Faculty are asking us why we did this to them, something I've never heard since switching to Canvas.

Surveyor

Poking around in beta today, I noted that the gray "eyeball" is now being used on the left course menu to indicate hidden links. That is, instead of a grayed-out link indicating hidden, it is now using the gray eyeball to indicate hidden.

Unless something has changed in the new gradebook symbols, using the "manually" posting policy puts the gray eyeball at the top of each grade column to mean the column is NOT hidden, while the gray eyeball on the left menu items means it IS hidden.

This is not consistent.  Or maybe I have missed something.  All these gray and orange symbols for show and hide are getting confusing!

Community Team

kburkes, the icons you're seeing in beta relate to a11y (accessibility). If the change remains in beta, it will be in the notes that come out this Monday, August 26; keep in mind there is a one-week delay between when beta deploys changes and when release notes come out (the schedule is in https://community.canvaslms.com/docs/DOC-14787-what-is-the-canvas-release-schedule-for-beta-producti... ).

Community Team

Hi, Kate,

Our team isn't finished reviewing the current UI with post policies; they do have some recommendations that they'd like to put to the community soon.

Thanks,

Erin

About the Author
Nancy began her career with DeVry 25 years ago as an English teacher with a bachelor's degree in education and a master's degree in English from Arizona State University. In 1985, she became interested in the potential of the Internet for the practice of English teaching, became the webmaster for the DeVry University-Phoenix campus, and then moved rapidly into technology management, serving as Dean of Information Technology at the Phoenix campus. In 2001, Nancy was promoted to the role of Director of Academic Technology Services for the DeVry Education Group's Information Technology department. As Director of Academic Technology Services, Nancy managed the development and delivery of student lab experiences, as well as various centralized eLearning resources. Notable projects were the development of a Citrix server farm for remote access to student lab applications, development and management of a student software program, system-wide adoption of a centrally managed eLearning platform (eCollege), and management of the Level 2 student technical support team. In this role, Nancy managed the initial research and development of content management processes, a project that was awarded an IMS Learning Impact Gold award for research and development in 2010. In 2011, Nancy moved to the Online Services organization. She led a team of instructional technologists, course producers, and multimedia developers. During this time, Nancy oversaw her team's development of a collaborative tool for faculty input into online coursework, the integration of a new, updated web conferencing system, universal implementation of a new e-book reader, and the development of HTML5-based tools for students to self-assess their knowledge. In July 2013, she returned to her focus on content management, becoming the technical lead of Project Independence, a project that extracted course assets from over 800 unique courses.
Her team rebuilt them using web-accessible HTML-based course content templates, and published them to a searchable content management system. She now works with the Course Development Strategic Projects Team, currently focusing on the migration of 1500 Master Courses, about 4,000 faculty, and about 50,000 students to the Canvas LMS in July 2017.