Key Takeaways from the R1 Customer Discovery Session in Salt Lake City

annalindsay
Instructure

Last week, we held a Customer Discovery Session (CDS) in Salt Lake City, bringing together a diverse group of attendees from various R1 institutions. The event covered a range of topics, aiming to gather feedback on their Canvas LMS experiences. This blog post summarizes what we learned.

Roadmap and Top Priorities:

The day began with a review of the Canvas LMS roadmap updates. Participants expressed excitement about upcoming features, such as API and Canvas Data support for New Quizzes, Peer Review, Grading Enhancements, Differentiation Tags, and Accessibility, and they remain committed to ongoing collaboration on initiatives that align with their additional priorities.

Exploring Assessment Data:

A dedicated assessment data session explored how assessment and quiz data are utilized across different institutions. It became clear that institutions access this data through various methods: directly through the UI, via Canvas Data extracts, and through custom reports or dashboards. These varied practices require different approaches to data extraction and highlight the need for consistency in where assessment and complementary data can be found. Continued investment in these capabilities is essential to support the diverse ways customers access data and to drive adoption of New Quizzes.

STEM Item Types:

During the Future of Assessment session, users tested upcoming STEM item types by building and taking assessments using new STEM-focused question types within the New Quizzes experience. Participants were excited about the potential applications, particularly for math, science, and economics departments. Advanced equations, graphing, and highlighting features were well-received. Through our discussions, we learned that institutional leaders continue to prioritize accessibility and ease of use, stressing the importance of keeping even complex item types simple and accessible for all learners. CDS participants also highlighted the need for clear guidance, practice tests, and the ability to limit available question types by department to avoid overloading faculty. These insights will inform the continued design and development of the STEM experience. 

Design Feedback: Differentiation Tags and Peer Review Enhancements

Once available, Differentiation Tags will enable educators to create custom tags for learners, allowing them to assign differentiated content, organize learning experiences more effectively, and filter the traditional gradebook based on those tags. The demonstration of Differentiation Tags sparked conversation around potential use cases such as managing accommodations, creating ad-hoc small groups, identifying student athletes, and aiding in rolling enrollment courses. 

Some participants also noted that introducing a third grouping option could confuse users, a concern that underscores the importance of clear communication and a strategic release plan. Our communications must clearly articulate the benefits and use cases of each feature: Differentiation Tags, Groups, and Sections.

We also explored Peer Review prototypes and gathered feedback on allocation methods. CDS participants offered a variety of suggestions for enhancing the Peer Review assignment experience, but all emphasized the importance of maintaining a cohesive and familiar interface consistent with the rest of Canvas. As we continue developing areas like Peer Review, aligning with established workflows from recent improvements helps familiarize faculty, reduces the need for extensive training or change management, and ultimately delivers a more intuitive experience.

Enhancing System Functionality:

A hands-on workshop focused on refining how Instructure handles usability issues that directly and negatively impact the Canvas teaching and learning experience. We discussed the concept of user experience (UX) bugs and gained insights into processes that would enhance system transparency and functionality. Stay tuned for more updates on this process, coming soon!

Overall, the Customer Discovery Session provided Instructure with valuable feedback on the future of peer review, differentiated learning, assessment, and LMS stability, directly from key stakeholders. We extend a special thanks to the attendees who joined us virtually and onsite at Instructure’s HQ, and we appreciate their time and willingness to meet. Attendees reported that they value the opportunity to connect with one another and to contribute to Canvas LMS product strategy and development, with one attendee stating, “I’ll always rearrange and reschedule meetings to make this visit work—I always get so much from these meetings.”

We will continue the discussion and look forward to iterating on the features covered to position Canvas LMS as the best Learning Management System.