“Ask Technology & Engineering Leadership” has been a popular and highly attended session at InstCon for the past 2 years, so let’s do it again. This session is your opportunity to ask your burning questions of Instructure’s engineering leadership team.
Deactivated user, VP of K12 Strategy
@jared , VP of Higher Education Strategy
Deactivated user, VP of Quality Assurance
@mloble , SVP of Customer Success & Partnerships
@sptownsend , VP of Canvas Engineering
This year, just like years prior, you'll have the opportunity to submit your questions in advance, as well as vote on submitted questions that you’d like to see answered. Here’s how:
Submit a Question:
Vote on a question you’d like answered:
About one week before InstructureCon we will post a survey of all questions. You’ll be able to rank the questions listed by importance to you.
Attend the Session:
Check the schedule for the time, date, and location—and we look forward to seeing you there!
What are some question examples?
The leadership panel is a high-level strategy conversation and is not intended to address tactical questions about specific product development. So what are some examples of strategic, high-level questions? Here are a few:
Strategic: Why do you release as frequently as you do?
Overly tactical: Our school needs you to release code every 6 weeks instead of 3; will you make that change, please?
Strategic: I hear you do hackweeks. What are they, and what has come from them?
Overly tactical: Will you please assign one of your engineers to develop [this one feature] during their hackweek?
Strategic: I hear a lot about cyber attacks around the world. What does Instructure do to prevent unauthorized access? Have there been any breaches?
Overly tactical: How do I sign up to participate in your Bug Crowd hacking program?
Strategic: How much money and time do you spend on R&D? How much is spent on new features? How much on maintenance? How much on HigherEd? How much on K12?
Overly tactical: How many engineers do you have on the Gradebook project?
Strategic: How do you prioritize and allocate resources for your product roadmap?
Overly tactical: Why isn’t Quizzes.Next at parity with Canvas quizzes yet?
We know that many of our students, particularly disadvantaged students, are mobile-only or mobile-first, so a tool that doesn't include mobile on equal footing is a non-starter on our campus. Why weren't interactions from mobile included in Analytics 2 when it was first released? And when will they be included? How are decisions about releasing product that's mobile-ready (or not) made?
This is a great set of questions that many would like clear answers to.
The closest thing I've seen is a series of answers to questions about reporting and mobile activity that was given June 11-13, 2019, by Kevin Turco, Director of Product - Data and Analytics.
I think the specifics in that thread are a great start to the answer. One take-away is that mobile is hard. I get that, and it raises the larger question around balancing "hard to do" and "worth doing" when building out features in Canvas. Mobile is a Big Deal, so is it worth the effort (which means hours and funds, of course) to be mobile-first with features? I think I'd like to hear more about the philosophy that drives decisions like that.
Thanks so much for linking to these @James . It's nice to know where Instructure is with this. We have so many mobile-first students, and when we're pulling reports in Tableau, I certainly hope Instructure sees how valuable accurate data on our end will be.
Considering we have an entire new online college launching in California this year that is designed to be mobile-first (Online Community College - California), I would hope Instructure does see the significance of this data being accurate.
I have a question!
Given the nature of feature ideas and voting for product development, how are you prioritising development ideas for Canvas that support the different and unique education standards of newer, smaller customer bases (from your recent rapid global growth) when those ideas potentially compete with those of your most extensive customer base in North America?
I have a question!
We have recently added features like Arc and Catalog on to our Canvas instance. How does leadership decide the development process and engineering load for these types of products that are tangential to the main Canvas system?
That is an excellent question, and good to know for clients who are deciding whether to use Instructure for a particular service vs. buying a license from another firm that occupies a similar space (Arc/Studio is what I have in mind). Cost vs. integration vs. feature set: that equation becomes easier to calculate if we have a better idea where the Instructure tools are going.
How does leadership justify allocation of Canvas licensing dollars to initiatives unrelated to maintenance and innovation of your flagship application (LMS)? It seems that more resources are being put into new services requiring additional licensing instead of improving the core product used by the majority of your customers.
That's an interesting question @millerjm , I would be curious to hear a response on the prioritization of dollars from Instructure. Reading their year end disclosure, for 2018 I see revenue:
$188.5M -- subscription and support (at a cost of $46.7M)
$21M -- professional services and other (at a cost of $15.1M)
So that's $209.5 million in revenue. How much of that is generated, directly or indirectly, by the Canvas LMS alone? At Los Rios we have hosted Canvas Data and Canvas Studio (Arc), which are costs directly correlated to Canvas, and both would fall into the $21M category (and shoutout to Jason Rock, who is not in the community to link but is awesome to work with). I would love to see a breakout of exactly what percentage of that $209.5M is directly attributable to Canvas (as well as what percentage of the $61.8 million cost of revenue is attributable to Canvas).
Conversely, we can see in the disclosure that the operating expenses were $192.4 million: $97.4M -- sales and marketing, $59.3M -- R&D, $35.6M -- general and administrative. Of that $192.4M in operating expenses, how much can be correlated to Canvas, and how much is used for activities not specific to the Canvas LMS (such as marketing and sales for Bridge)?
It would be nice to have a bit more clarity on exactly how big a piece of the pie Canvas is for generating Instructure's revenue and for allocation of dollars spent.
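As a quick sanity check on the figures quoted above, here is a small sketch that re-adds the disclosed line items. All values are in millions of USD and are copied directly from the posts above; the variable names are mine, and nothing here is new data.

```python
# FY2018 figures as quoted in this thread, in millions of USD.
revenue = {
    "subscription_and_support": 188.5,
    "professional_services_and_other": 21.0,
}
cost_of_revenue = {
    "subscription_and_support": 46.7,
    "professional_services_and_other": 15.1,
}
operating_expenses = {
    "sales_and_marketing": 97.4,
    "research_and_development": 59.3,
    "general_and_administrative": 35.6,
}

total_revenue = sum(revenue.values())         # 209.5
total_cost = sum(cost_of_revenue.values())    # 61.8
gross_profit = total_revenue - total_cost     # 147.7
total_opex = sum(operating_expenses.values())
# Note: the three expense lines sum to $192.3M; the $192.4M total
# quoted above presumably reflects rounding in the underlying disclosure.

print(f"Total revenue: ${total_revenue:.1f}M")
print(f"Gross profit:  ${gross_profit:.1f}M")
print(f"Gross margin:  {gross_profit / total_revenue:.1%}")
print(f"Total opex:    ${total_opex:.1f}M")
```

The totals match the thread's figures (within rounding), but the question stands: none of these line items separate Canvas from the rest of the portfolio.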
Thanks for that information, Ken. We don't subscribe to Arc and consider it one of those "extra services" like Gauge and Bridge. I'm sure they have the funding to support it, but it has seemed like the bulk of development resources goes into these new services.
Quizzes.Next has been in development for years and still doesn't have feature parity and still is not included in Canvas Data. Canvas's main quiz engine is very outdated and improvements and bug fixes have been on the back burner for quite a while because "Quizzes.Next" will fix that.
Other feature improvements important for the day-to-day work of admins, while not fancy and shiny, are really important. Despite years of asking for Granular Permissions, we still do not have this feature that's essential to ensuring the security of our system and courses.
Thank you, Joni, for mentioning Quizzes.Next in your reply. I am teaching a class this summer and had gone through the steps of using that tool until... I realized that the Previous and Next buttons do not appear when a Quizzes.Next assignment is part of a module. And even if students figure out that they have to click the button to go back, it goes to the Modules page and not the next module item. I too find it frustrating that ideas to improve Quizzes are rejected because of Quizzes.Next, but Quizzes.Next does not yet have feature parity.
I'm with you on how long Quizzes.Next progression has taken @GregoryBeyrer . I still have my Quizzes.Next Beta Phase II acceptance email from Jason Sparks that was sent 4/19/2017....
That said, I'm really happy that Instructure didn't do what many companies do and just kill the product because it wasn't working out. They dramatically underestimated the complexity, and had some turnover from what I understand, but they are sticking with the tool and working on getting it better. We all know that they need the new quizzing engine, so hopefully at InstructureCon 2019 they will provide some further updates on the progress and a little bit more clarity on the road map. I know I am definitely going to be attending most of the Instructure led sessions and the product team always does a good job of saying what they are allowed to say.
+1 @millerjm for mentioning granular permissions. We've been waiting and waiting for this, and it will be a huge benefit when we can drill down a little deeper into the permission sets. Coming from D2L/Moodle to Canvas, that was (and still is) one of the biggest drawbacks: we have so little control when configuring new roles, because so many permissions group add/edit/delete together, in addition to the dependencies that aren't readily visible on the permissions screen.
Even something like cross-listing courses requires turning on the add/edit/delete course sections permission. I like my faculty to have the ability to manually cross-list prior to the term start, but I wouldn't necessarily want them to be able to add or delete sections without some additional training (they should go to one of our faculty coordinators if they aren't familiar with the process). Our SIS process loads into the sections, so there are just some considerations we'd like to make.
I am happy Instructure is still working on this, but I'll agree the pace has been exceedingly slow.
Many customers have come to rely on Instructure's Professional Services due to individual institution needs and custom reporting. Has any consideration been given to improving the ProServ experience?
And give clients an option to be identified as references (with client characteristics like student/course count, type of institution, etc.) so that those of us interested in purchasing a service can do a reference check.
I have a question about Instructure data-mining, and it comes in two parts:
1. How has the new push for machine learning and AI as announced by Dan Goldsmith in March 2019 altered Instructure's data collection and retention practices? What changes do you expect to see in data collection and retention going forward as a result of the new emphasis on data mining and predictive algorithms?
e.g. Dan Goldsmith's claim that machine learning and AI "has the potential to double our TAM in education" as reported here: Instructure: Plans to expand beyond Canvas LMS into machine learning and AI
2. What options will users -- students, parents, instructors, institutions -- have available so that we can retain control over our data? Just speaking for myself, I am hoping for some kind of opt-out that will allow me to withdraw my data from Instructure's data-mining efforts so that my work as an educator will not be exploited in the development of predictive algorithms and other efforts to automate education.
I spoke with Jared Stein and Cory Edwards about this back in May, and they had mentioned the possibility of some clarifying blog post at the Instructure blog; I've been watching the blog, but nothing has materialized yet. This is an important issue; I hope it can be addressed in some way before the school year begins because I need to let my students know that Instructure is collecting data about their learning and how it now plans to use that data.
My question is instead about the exploitation of student and instructor data to build predictive algorithms, and what students and instructors can do if they do not want their data to be used for those purposes. In other words: my question is about control of our data; even if the data you have gathered about me has been de-identified, it is still my data (data about me, my work, my behavior). I would like to be able to opt out of the use of my data in Instructure's new emphasis on machine learning and AI products, and I would like for my students to also have the ability to opt out of Instructure's use of their work to build new products.
I also have a follow-up question from last year's panel re: search.
At the Project Khaki event back in 2017, Instructure made a big commitment to improving search options in Canvas. In the Khaki update from Mitch Benson in March 2018, we learned that search was "deferred" with no further details provided. When I asked about this at InstructureCon 2018, Chris Hunter explained that they ran into problems with searching and course permissions.
Other companies have tackled the problem of searching and permissions, so I am sure there is some kind of solution, and I would like to ask if any progress has been made re: search in the past year. In particular, I would like to know if there is any prospect in the near future of students and instructors being able to search course content, i.e. Pages.
Thank you all for your submissions. We'll be taking the questions from this thread, adding them to a few others, and returning with a survey that will allow you to rank them in order of importance to you.
Just a reminder: the panel is Engineering and Technology, so very product-specific questions might not make the cut, or we may need to modify them a little to make them applicable. Thanks.
The ranking survey has closed. We appreciate all of the feedback. We'll post the top 5 ranked questions here in this thread and we'll be sure to answer them during the session. There will be another opportunity for attendees of the session to rank the remaining questions!
Here are the top 5 questions. The entire ranking has been posted below for those who are curious.