Bodankers and Botanical Gardens – Why innovative products can’t be parity products
A few years ago, a colleague and I were brainstorming how we could help the product we owned be more innovative. At one point, she said that we were going to have to cut the lines to some of the bodankers in the product. The term “bodanker” was not one that I’d heard before. I was confident that it wasn’t a new software engineering term. I knew our product well enough to know that “bodanker” wasn’t a descriptor of anything in the code. I deduced that it must be some sort of colloquialism, and from the context, I interpreted it to mean something that gets in the way of progress. After she finished articulating her line of thinking, I said, “Before we carry on, I want to make sure I understand something particular you talked about. I’ve never heard the term bodanker. What does that mean?” She looked at me, a bit confused, and responded that she had no idea what I was talking about. I recounted that she said we were going to have to cut the lines on many of the bodankers, which I interpreted to mean things that were holding us back from innovating. I felt more than a little sheepish when she laughed and laughed, saying, “I said ‘boat anchor’! Y’know, like an anchor...on a boat!”
I still think that bodanker is a much more evocative turn of phrase. And it’s funner to say — bodanker.
But for the purposes of thinking about creating innovative products, botanical gardens could be a much more instructive analogy than boat anchors.
Gardens have many of the same characteristics as software products. I want to focus on just two of those here:
- Gardens typically require gardeners to care for them.
- Gardens typically have boundaries.
Botanical gardens need gardeners to take care of them. “Taking care” of a garden can include obvious things, like removing unwanted plants, sowing seeds and setting bulbs, thinning out garden beds when appropriate, and pruning trees and shrubs. Taking care may also require building infrastructure—for watering, for retaining soil, for helping people move through the garden.
Gardeners are constrained by boundaries around a garden. They also constrain themselves through the layout of boundaries within a garden.
Because of boundaries, gardeners have to make trade-offs when evaluating how much to change a garden. If a gardener has a goal to help visitors learn about plants native to the area, they might explore the costs required to enlarge the garden's total footprint. They might look to wholly rearrange some parts of the garden by moving garden beds and moving plants from one section to another.
Gardeners AND Boundaries
When a garden changes, either by expanding the area that is cared for and cultivated, or by rearranging the boundaries of areas and planting beds within the existing garden space, or by changing which plants are contained within which areas, gardeners will have to learn new techniques, or hire expertise (temporarily or permanently), in order to make desired changes AND to then care for the “new” garden in new ways.
We are always looking for ways to make the Instructure garden better. However, “better” can’t always be bigger. We cannot always be adding more capability because for every feature, service and component, we need people to ensure its security, scalability and utility. The more features, services and components we have, the more capacity we must give, as a whole, to maintenance--to making sure our garden is well tended. In order to be able to appropriately maintain and innovate, sometimes we have to rearrange the “garden beds” of our products. And sometimes we will make decisions to remove features of the current garden in order to accommodate new elements that can provide more value to more students, more instructors, more leaders.
We are committed to making a better garden for the whole Instructure community, which is going to mean that the products in 2023 and 2025 and 2030 are not going to do the same things they do in 2021. In the coming months and years we will be doing a lot of planting as well as some pruning throughout our products. Canvas in 2023 will not be Canvas from 2021 with more stuff. It will not be “parity plus more stuff”.
I recognize that “pruning” will not be welcomed by all. Invariably, Product leaders will make some decisions to remove capabilities and features that some of you use regularly. We will do our best to make things go as smoothly as we can, balancing many considerations. We are invested in the future of teaching and learning and we will always strive to deliver products that support and inform an evolution toward better learning outcomes for all.
If you have questions or comments about principles or pragmatic considerations that inform product decisions related to “parity or not parity,” please feel free to engage in the comments section below. I’ll ask that you hold feature-specific questions such as, “Will feature ‘A’ ever be ‘pruned’ from a product?” for discussion in posts related to that particular feature and product.
Unfortunately, not all innovations are improvements. In EdTech, innovations that gain acceptance and move onward and upward to their final destination of "good old tech" must improve, or contribute to the improvement of...
- Learner opportunity to learn,
- Learner ability to engage in that opportunity,
- The educator's ability to provide that opportunity, and
- The educator's ability to support the learner's engagement with that opportunity!
Anything less is a software engineer's self-aggrandizing pat on the back for finding themselves so clever!
@kmeeusen, I appreciate the reminder to distinguish between improvement and innovation. I'm often wont to collapse the two ideas. You're reminding me of a statement by Raymond Loewy that:
The smart industrial designer is the one who has a lucid understanding of where the shock-zone lies in each particular problem. At this point, a design has reached what I call the MAYA (Most Advanced Yet Acceptable) stage.
If I try to synthesize what you're saying with Loewy's language, innovation is often just advancement without impact because it hasn't figured out how to match with opportunity and ability.
I believe, to follow along with the garden analogy, that you are headed "down the garden path." The behavioral economics principle of "loss aversion" is pretty well established - that people feel losses ("pruning" in your analogy) much more deeply than equivalent gains. I have at times offered gmail as an example of software that undergoes continuous improvement with incremental changes that are unobtrusive. The evolutionary changes don't get in the way, but are helpful when they are discovered (if they are even noticed, since they are often integrated so seamlessly). And I don't recall any feature or function ever having been "pruned." If that happened, it was done in such a way, or to a feature so little used, as to make it unnoticeable. As a counter-example, there is New Quizzes. I believe you are due for an awakening in July '22 (if you are able to meet that milestone) when the cutover is enforced. For just one example, each semester our faculty creates hundreds of surveys. The day they discover those are broken or missing, I expect my inbox to be smoking. It seems to me that your exposition gives insufficient regard to the importance of legacy and tradition. And academia (in some cases for better, in others for worse) is well known for its adherence to legacy and tradition.
I can certainly understand and appreciate the garden/gardening analogy, as it is certainly appropriate for software development and evolution, as well as many other things in life.
I certainly would not like to use or support software that becomes bloatware, where the core ages poorly with oh too many bolt-on shiny rhinestones. For me, I certainly hope that the team at Instructure can focus on the intrinsic functionality that we fell in love with and have come to appreciate over the years. A garden is useless unless you have quality soil, watering infrastructure, shovel, trowel, hoe, and so on.
Make the core tools amazingly functional, beat out any remaining bugs or deficiencies, and tastefully grow the functionality. Avoid acquiring companies that have no hope of adding value. Teachers, faculty, support staff, and to some extent students all appreciate predictability and consistency. They all expect that there will be change, but are not ready to switch overnight to hydroponic gardening.
Thank you Shaun, for being a voice of reason and adding soul to the products.
"If I try to synthesize what you're saying with Loewy's language, innovation is often just advancement without impact because it hasn't figured out how to match with opportunity and ability."
What I understood from this statement is that you believe innovation is the precursor to advancement, but this is not always the case. Innovation is certainly important, but I interpreted @kmeeusen's statement differently. I interpreted their statement as a reminder that even though innovation is needed for improvement, it does not always lead to improvement.
I am reminded of a silly viral video I saw yesterday. The video creator showed a cucumber field with small, new cucumbers starting to develop. They then took plastic molds, enclosed a few small cucumbers in these molds, and left them to grow. The final result was a perfectly living, fully developed cucumber in the shape of the mold. Whoever created this mold was certainly innovative - but will this product actually lead to agricultural improvements or advancements? I do not believe so. This product is an innovative novelty and, while I could be wrong, I do not think the future of food production will turn towards food grown in novel shapes.
I do not in any way mean to imply that future developments in Canvas will be innovations without advancement in EdTech. Rather, I wanted to emphasize what I thought kmeeusen's post was trying to convey: as Instructure continues to innovate in EdTech, the development teams should carefully consider the whole scope of the project. Specifically, there should be serious consideration of how the end-users will react, what pain points might develop due to changes in workflows, and how those who dislike change can be convinced to adapt.
If an analogy of cucumbers is too far afield, consider Google+. While it had many new and subjectively better features than Facebook, it faded into oblivion because it did not give end users what they wanted and/or what they thought they needed. End users can be forced to lose access to old or outdated features, and while they can be given access to new, innovative features, they cannot be forced to use them. Is innovation actually advancement if the end users don't adapt to it?
Yes, that is exactly what I meant, but you said it much more clearly than I.
Bottom line, I feel this is nothing more than thinly disguised social manipulation in the form of, "If you don't like New Quizzes and if you expect it to function as well holistically in Canvas and for learners using Canvas as Classic Quizzes did, then you must be against change and innovation." Propaganda in the form of shaming.
Sorry @shaun_moon , but that is how I read it. However, not being a person who is against innovation and change, I invite you to change my mind!
Yes, new versions can't always be 100% backward compatible with old versions. We don't need a lot of cruft.
But it's also vital that new versions provide the core features of old versions. Otherwise, why should we stick with Canvas at all? If we need to continually be on the hunt to make sure features we've come to expect are still present, it would make sense to migrate to new (and, I'd suspect, more feature-stable) LM systems. Put another way, there's no such thing as brand loyalty if the brand is completely protean.
I'll point to the go-to example, New Quizzes. The entire point of giving quizzes is to gain understanding of our students' knowledge. We need the flexibility to analyze that data in whatever manner makes sense to the instructor, and a key part of that is data portability, so that we can analyze the data using whatever tools work best. Classic Quizzes allows the downloading of student responses as CSV files. New Quizzes does not allow the downloading of student responses in any format. This basic feature has been missing despite two and a half years of increasingly desperate pleas by the actual users of Canvas, the instructors. This isn't "pruning" old unwanted features. It's a deliberate crippling of features actively sought.
By the way, I wouldn't bank too much on that "50% by volume is New Quizzes" stat. New Quizzes is now the default, so casual users opt for it whether or not they understand its features. And support calls might be down because it's truly much more stable. Or they might be down because lots of people are like me, who -- after encountering missing features -- will simply punt and go back to Classic without necessarily flagging it for the support team. After all, if the feature is missing on purpose, then it's not really a bug.
I have heard that the new engineering team working on New Quizzes actually cares about the users and about making New Quizzes work for them. However, I hear that through a back door to a back room. I would be more inclined to believe it if I heard it in a banner splashed across the home screen of this Community! I would be more inclined to believe it if it were touted to clients by their CSMs, and reverberated through the venue at an InstCon. Canvas has a history of being upfront, open, and transparent!
Instead, we are given this "Bodacious Bunglers" (or whatever the hell it is called) propaganda posting telling us our expectations are too high, that if we really understood "innovation" we would not have such high expectations, and if we really understood product development in the 21st Century, we should have no expectations at all!
YES, MY EXPECTATIONS ARE HIGH! And, I am dag-nab-it dangdoodle proud of that. Those same high expectations are what took me to Canvas in the first place. Those high expectations are what drove me to try to convince our entire state consortium to go to Canvas. Those same high expectations are what keep me a fan of Canvas, but not a fan of this truly garbage new quizzing tool.
I expect better from Canvas! Much better! And yes, I expect INNOVATION! Honest innovation that improves teaching and learning!
I'm with @kmeeusen & @b_lockhart-gilr - New Quizzes is not innovative, it's a step in the wrong direction. As the lead admin, I'm not encouraging our faculty to use it. It's cumbersome and unintuitive, and it lacks some of the key features of Classic that faculty have stated are vital to them. How is it innovative to remove key features and functions of something as vital as quizzes? You have heard folks screaming on all of your forums, yet your answers are all the same - "Trust us - you'll love it! Once we get it all fixed, that is." How is it innovative to push something out that STILL does not work as expected after years of tweaking? We have been promised great things with quizzes and yet we still sit here with nothing but talk and folks telling us we're not innovative enough.
I guess you are going to continue following in Microsoft's steps and testing your "improvements" on the end-users. Our faculty are not going to stand for that. All I can say is that when the tickets start rolling in with complaints next July, I will not be responding to them. They will all be escalated to Instructure and you can explain this innovative mess to my faculty.
I agree with most of you that the author of this piece simply likes to hear himself talk, so to speak. And I will reiterate my greatest EdTech fear: the new Instructure reminds me an awful lot of the old Blackboard. You remember them, right? They're the guys we left because they stopped listening to their clients and started making changes the clients didn't want (or withholding the changes they did). Instructure promised us that they would always listen and would strive to work for the instructors and for the students. Now we can't even get them to answer our questions.
I am an instructor, and although the new question options from New Quizzes look very appealing, I am extremely worried about and disappointed with the lack of compatibility between the two: we are unable to transfer Question Banks and make the transition as seamless as possible. I have put many weekend hours for years into Question Banks, and I cannot imagine going through the same thing again in Canvas. I like technology and how it serves instructors and students, but right now, you are just making yourselves (Canvas) sound good to new customers, while disregarding those who have worked with you for many years. Right now, the thought of Canvas creates disgust and disengagement in the platform. I agree with previous users who mention that it is time to look for a more reliable LMS.
After attending InstructureCon, I have hope for a smooth migration-transition from Classic Quizzes to New Quizzes. I wish the CanvasLMS Community site presented topics more clearly, with the most up-to-date news/posts showing first. It is very hard to find updates on New Quizzes; it would be nice to organize by topic and have something with keywords that relate to Canvas topics. I will keep searching for updates and better ways to transfer questions to Item Banks that do not involve adding questions one by one.
@aleverton I'm sorry you've got questions that we haven't addressed. I'd like to help out there. Could you please share some of those questions here? Depending on the topics, we might branch into some other posts, but I'd like to help out.
@ldietz I'm glad that there were some good things for you at InstructureCon. I'm sorry about the level of frustration you've felt. I hope we can make things better for you and your many, many people using Canvas every day. On the topic of bank migration, there are people working on tools to streamline the transfer of Question Banks so that content can move with you when the time comes for sunsetting Classic Quizzes. I'm with you 100% that I hope this can be smooth. We are trying. I assure you, we are really trying. Susan Sorensen, the Product Manager in charge of quizzing in Canvas, has this post about migration that could be helpful. She is also keeping watch, and responding to comments, in this post about the Classic Quiz Sunset Timeline.
@nicole_fleetwoo I think your frustrations with New Quizzes are completely legitimate. I recognize that this has been a very frustrating experience and I hope we on the Instructure side can remove as much friction as possible from the experience of sunsetting Classic Quizzes when that does happen. May I ask that you take a look at the things that Susan is sharing and working on, including the posts linked above with @ldietz as well as the Sunsetting Classic Quizzes Q&A post and the Priority Gathering for the New Quizzes Roadmap to see if there's a topic there where we could collaborate?
And as I look back at all the thoughts, frustrations, expectations shared, I recognize that this is a vast and diverse community of use and while it sometimes seems like it'd be lovely if we could all agree, we know that's not going to happen. I hope that our diversity of perspectives and opinions can create a richer landscape for everyone. Where there are topics and issues where we can constructively and respectfully work together to come to better understand the variety of needs and hopes, I'm all in.
@shaun_moon first of all, when will the bank migrator be ready? Why can't we have parity in the question types, at the very least? Will we get anything like the blank question back (we use it to add "use this image for the following 5 questions" scenarios)? Will NQ ever show up in the TO DO list? Why is NQ showing ungraded when all items are auto-graded? Why don't late submissions show in the grade book as late? They show in the slide-out panel and in SpeedGrader, but unless I mark it late in the slide-out panel it doesn't show in the grade book. There are more, but they are written in a notebook on my desk, so I'll have to add those Monday.
@shaun_moon You are coming off as condescending again. You ignored most of @aleverton's comment and focused solely on their last sentence. You ignored @ldietz's concerns and focused on the one statement of hope in their follow-up comment. You seemed to completely disregard @nicole_fleetwoo's concerns with a sympathetic sounding statement and a bold assumption that they easily found your post through means other than the prominent link in the Sunsetting Classic Quizzes Q&A post.
Finally, in your closing paragraph, you completely disregarded the concerns that have been continually brought to you. This is especially obvious when you typed "while it sometimes seems like it'd be lovely if we could all agree, we know that's not going to happen. I hope that our diversity of perspectives and opinions can create a richer landscape for everyone."
You basically just told us that Instructure and the rest of us won't all agree and, since we don't work for Instructure, we don't matter. You're not hearing the concerns behind the complaints in these replies, so your actions make it apparent that the "diversity of perspectives" only extends as far as Instructure's payrolls. (Based on what you say and how you say it, I wouldn't be surprised if the diversity was limited to your own team.)
I get that some of these replies to you can be a bit surly, but the replies that we are hearing from you are not giving us confidence in the New Quizzes transition. If I can make a recommendation: hold back on the incendiary phrasing and maybe stop assuming that you know what is best for the clients your organization serves. (Also see my prior comment about Google+'s innovations and how well that worked out without consumer buy-in.) Please save the condescension for the break room.
@amg10k So well put. I have what I consider the (almost) perfect solution to the "diversity of perspectives" matter - if Instructure really cared about meeting diverse needs: continue to support both Classic and New Quizzes in perpetuity; allow instructors to select one or the other to use at all times, or allow those who want to switch between them to do so. Since it's possible to do that now, it can't be too heavy an engineering lift. I have virtually no hope of this happening unless the multitudes rise up, as they well may, when the forced transition takes place.
That's not quite what I had in mind. I was thinking more along the lines of some earlier commenters who expressed worry that the product was being developed with developers in mind, not end users, and how the author seems to be ignoring anyone who brings up that concern. Such an attitude can make the workflows unwieldy and will make instructor/user buy-in that much harder to accomplish. There are small things here and there that demonstrate just how little the end-users are considered in the development process.
For example, one of my personal pet peeves is going through the Assignment edit page, scrolling to the bottom, and clicking "Build" to edit something in the Quiz. I hate that scrolling, even though it is a small thing that I can easily do. Many users will have trouble adjusting to these workflows because they are just not designed with end-users in mind.
Another example: many of the icons within New Quizzes are missing tooltips. We may live in a society that relies heavily on technology and internet access, but there are still plenty of people who have had fewer resources to learn what many of us take for granted. Assuming that everyone should know x, y, or z about computers, or web design, or even common icons is the fastest way to alienate users and create a vocal group who are against a new product.
Adding the tooltips (or even better, clear labeling) could be an easy way to influence user buy-in. Unfortunately, that would not mesh well with Canvas' design, which seems to adhere to current trends in graphic and web design. (Those trends being minimalism at the expense of accessibility.)
[Please note these are only a couple examples to try to make a point and not a complete list of possible UX improvements or New Quizzes improvements. I do not mean to diminish any other user's comments or opinions, but to clarify my previous thoughts.]
Excerpt from an article just posted for faculty/staff consumption at Stanford University about our concerns regarding the transition to New Quizzes. Full article here. Posting it here in hopes that someone ( @shaun_moon ) from Instructure might actually pay attention to them.
Transition Timeline May Be Disruptive
Our biggest concern for the transition to New Quizzes is Instructure’s approach to managing major upgrades. Instructure’s development roadmap makes it clear that some important components are planned to be completed shortly before the enforcement date of July 2022 (which is when quiz creation can only be done in the New Quizzes tool). This roadmap means the tool design and functionality will be in flux during our entire transition period, and that makes preparing the work (e.g., workarounds/customizations, user outreach/training, etc.) we may need to do for New Quizzes very difficult.
Instructure has indicated they may push back the enforcement date to some time in Fall 2022 or Winter 2023 if they don’t make their designated milestones. In our view, it’s not feasible to make such a major transition in the middle of an academic year as it would be highly disruptive for faculty and students. Pushing back the enforcement date could result in a delay of their entire timeline and important features not being completed before Fall 2022. As a result we may be forced to choose between making the transition in Summer 2022, when some important components are still missing, or wait until mid-year, when it would be highly disruptive to make the change.
Important Features Missing
There are also many (missing) features that are a high priority to Stanford schools and departments which are not even on the roadmap to be completed before the enforcement date (e.g., student analysis download, bulk file download). These omissions will leave some departments scrambling to find solutions other than Canvas for very important high-stakes activities and/or the Stanford Canvas team scrambling to create custom solutions. Instructure has given no rationale for why these important features have been given low priority. Considering the development of New Quizzes has been a long term development project, the Stanford Canvas team feels there should be no high-priority features that are dropped or unavailable by the enforcement date.
Confusing User Experience
Finally, the user experience of New Quizzes is of great concern. Because New Quizzes was built as an LTI (external tool), the design is very disjointed and will likely be confusing for users. New Quizzes plugs into the existing Assignments tool and uses the Assignments interface for some quiz settings, then uses the New Quizzes build interface for other settings and features. It also has the submissions/grading buried in the Grades interface with no link or information within Quizzes to make clear what instructors should do to view submissions. It will likely be a very difficult task for faculty and staff to learn how to use this tool, made much more difficult by the fact that the interface will be changing in potentially significant ways during the transition time.
@cdoherty , I don't know that Shaun is on the team for New Quizzes. If you'd like to get your concerns directly in the hands of the NQ PM, I'd suggest posting here, in the new quizzes group: https://community.canvaslms.com/t5/New-Quizzes-Users/gh-p/quizzes
I've found the New Quizzes PM, Su Sorenson, to be responsive to feedback about the tool. I know she can't make all the changes people have been suggesting, of course, but she's been receptive to learning about user experiences.
I'm also concerned about the timeline. Our courses are administered by a team of instructional designers, so we're going to have to make all the changes ourselves, and right now each of our 100+ courses uses a quiz type that won't be available in NQ. I brought my concerns to Su, who listened to my feedback. She made no promises (as she couldn't), but thanked me for my thoughts.
Where did you see instructure saying the timeline might be pushed back? I've been advocating for major changes to be completed at least 6 months before they're enforced (a year would be better) for the reasons you stated.
@venitk Shaun Moon is the VP of Product. He doesn't need to be part of the New Quizzes team. The main complaint is Instructure's deployment practices and handling of changes, so it's relevant to this post. This needs to be addressed at a high level. The pushback of enforcement date was mentioned by Su Sorenson in the R1 Peers conference call on Oct 8, as well as a business school meeting earlier. It's not documented anywhere, because they don't seem to have an official contingency plan if they miss a milestone. It's not enough for me that they thank us for our feedback. I need Instructure to take action to ensure this transition won't be a nightmare for our users.
@aleverton I'm going to try to answer your questions below.
When will the bank migrator be ready?
We are working to have the bank migration ready by the end of this quarter (Q4 2021). The experience of migration and the scope of capabilities are best articulated by Susan in Possibilities for User Experience of Classic Quizzes to New Quizzes Migration.
Why can't we have parity in the question types, at the very least? Will we get anything like the blank question back (we use it to add a "use this image for the following 5 questions" scenarios)?
Multiple Fill in the Blank and Multiple Dropdown are the two question types from Classic Quizzes that have a substitute, though not as high fidelity as would be ideal. For both of these, our recommendation is to use the (Single) Fill in the Blank Question Type.
For the Text/Blank Question, we recommend using the Stimulus/Passage Question, which allows a text to be the prompt for one or multiple questions that follow. As @cdoherty notes in her article for Stanford University Faculty and Staff, "The stimulus function can be used, but a question (or questions) must be attached for it to display. The workflow is more awkward."
Will NQ ever show up in the TO DO list?
We are looking at ways to have New Quizzes show up in the TO DO list for learners and instructors, but we do not have a plan finalized for how that might happen.
Why is NQ showing ungraded when all items are auto-graded?
Why don't late submissions show in the grade book as late? They show in the slide-out panel and in SpeedGrader, but unless I mark it late in the slide-out panel it doesn't show in the grade book.
We have to enrich grading statuses for New Quizzes to work more seamlessly with Gradebook, SpeedGrader, and other components of Canvas, like the TO DO list. This is an area where we want to make improvements in the coming year.
Hopefully some of this is helpful.
@amg10k we are trying to make the Classic Quizzes sunset experience better. We are trying to do a better job at taking into account the user experience.
@hesspe we are not going to support Classic Quizzes in perpetuity. It isn't because we don't care, but because it isn't sustainable.
If you, or any of the folks here would like to have a conversation with me, I hope you'll reach out to your customer success manager so we can coordinate some time together.
@cdoherty @venitk we are not going to force a transition from Classic Quizzes to New Quizzes in the middle of the academic year. Academic years around the globe start at different points in the calendar year and we are planning to accommodate those differences between early-calendar-year starts and middle-calendar-year starts.
Based on the voices here in the Community and through customer success channels, we are working to provide more detail and bring more clarity around the transition plans.
Again, if you need more attention from me, please reach out. Thank you.