
Include a Keep Highest Scores Rule (instead of Ignore Lowest) in Assignment Groups 



This is not a new idea (see below). However, since many of the responses appear to suggest creating a Feature Idea, I have tried to title this so that it includes words that will get picked up by multiple searches. Interestingly, when I pasted my title, I saw a message indicating there were 'no similar ideas'. My 'idea' is, obviously, to provide an option that lets instructors 'Keep Highest X Scores' when assigning rules to an Assignment Group. The 'ignore lowest' option does not function the same as a 'keep highest' would in relation to maintaining an accurate running total of accumulated points in the Gradebook. Below are a few links to others who have asked about this. My personal reasoning is presented in "Student Gradebook Issues With 'Ignore Lowest Scores' in Assignment Group".

Tonight I watched Jason Sparks's presentation, InstCon16 | Quizzes.Next: Modern Quizzing in Canvas (YouTube).
It was a great presentation. It seems lots of excellent changes are coming to Quizzes (soon?). I was clapping even when the audience wasn't. The one thing I didn't see (that I was very much hoping to) is this one here. Hopefully it is included and was just not specifically addressed in the presentation. If someone happens to know whether that option will be coming with Quizzes.Next, I would love to hear from you. Thank you.


Assignment Group settings - *keep* the highest _n_ scores

Question asked by Cordah Pearce on Jun 10, 2015



Add a "Keep Highest" Option to the Groups in Assignments

Idea created by Melissa Ritter on Jan 8, 2016



Assignment Group Settings: Option for "Keep Highest" Scores

Idea created by Emily Hunt on Jan 8, 2016



Keep Highest Scores ???

Question asked by George de Falussy on Sep 16, 2016



IF/THEN Logic on Drop Lowest Score

Idea created by Neal Legler Champion on Nov 3, 2016



Drop lowest grade?

Question asked by Benjamin (Ben) Croucher on Nov 10, 2016



Keep Best Scores Instead of Ignore

Idea created by Kimberley Patterson on Mar 22, 2017



Student Gradebook Issues With 'Ignore Lowest Scores' in Assignment Group

Question asked by Kimberly Smith on Sep 8, 2017

Community Team

 @kimberly_smith2 ‌

Thank you for resubmitting this idea. As you have found, this idea has come in a few times, but never garners a lot of votes (comparatively). Maybe this is its season?

Community Member

I asked Canvas support about doing this exact thing years ago and was met with resistance. They didn't seem to care that the current "drop X lowest" rule causes X number of grades to be greyed out throughout the semester, creating confusion and alarm for students concerned about losing credits. For example, after completing just 3 out of 12 quizzes, the two lowest are greyed out as students take them, even though the 10 highest will be kept when all is said and done. If the rule were switched, then the X highest grades would always show up as active grades, as they should.

Community Participant

Hello Jennifer,

Thank you for responding!

Yes, many of us who allow students some flexibility (drop lowest/missed) in an assignment group have been frustrated by the lack of a 'keep highest' option (myself included). One of my posts in the Community includes a list of previous requests for this (since archived/cold storage, etc.), along with the number of views for each. I have been trying to find out whether the upcoming Quizzes.Next has addressed this issue, but have no answer yet. If it hasn't, then hopefully enough votes on this most recent version will move the request along. Fingers crossed!


Kimberly Smith

Community Coach

 @kimberly_smith2 ‌, I just ran into this while working with a faculty member today. This individual gives weekly quizzes (normally) and wants to keep the highest 10 scores. The problem is that she doesn't know for sure how many quizzes there will be, so she can't say how many of the lowest scores would need to be dropped in order to ensure the highest are kept. Either she needs to keep changing the drop rule, or wait until the semester is near its end to set it up. Either way, the students can't use the gradebook as effectively as they could if there were a way to simply indicate that the highest 10 scores should be kept. I really hope that this can be implemented with the new quizzing system!

Community Participant

Thanks for your votes!

Community Participant

Hello out there...
I notice that this idea has been viewed 687 times, but only has 91 votes. I am trying to figure out why this might be.

- Is it that viewers feel implementing this idea is problematic in some way? If so, I would really like your feedback.

- Is it that those viewing the idea are not yet familiar with how to log in and participate in voting? If so, and you would like to participate, you just need to verify your Canvas account; it takes less than 60 seconds (honest). The easiest way to do this is to: 1) open a different browser window, 2) log into your Canvas LMS and then come back to this page, 3) click on Log In (top right), 4) verify your account, 5) participate. :)

- Is it that those viewing have found a workaround to the issue that this idea seeks to address? (e.g., if you have used the Canvas Gradebook and allowed students to drop their lowest score(s) from within a given assignment group, then you are likely familiar with this system's inability to maintain accurate gradebook totals until the deadlines for all of the assignments in that assignment group have passed. This issue arises because the only assignment group 'rule' Canvas offers to accomplish this is the 'Ignore Lowest X Scores' option. A solution would be for Canvas to add a 'Keep Highest X Scores' assignment group rule.) I know of no viable workaround for this (other than applying the rule at the end of the semester, which is still a 'not really accurate' problem, and comes with a different set of problems). If you have found a solution, could you please please please share it here for us?

- Is there some other reason why viewing does not lead to voting (either up or down)?
Thank you for your time!!

Community Coach

Good questions,  @kimberly_smith2 ‌. I would like to see this functionality.

I don't know for sure, but maybe to some, if you have a group of 10 assignments, then keeping the top 8 seems the same as dropping the lowest 2 scores. The issue we run into is when you don't know how many assignments you will end up with in a particular group. Some of our faculty know they want to keep the top 8 assignments but don't know if they will end up assigning 10, 11, 12, etc., so unless you continually change the rule you can't say for sure how many need to be dropped in the end.

Community Champion

It may also count multiple views by the same person. I have personally viewed some ideas upwards of 10 or 20 times (usually ideas with an active comments section where I keep going back to read and add to a comment thread).

Also, since Canvas recently changed to keep the top percentage of ideas within each period and close the rest (rather than keeping those that met a static vote threshold), there are really three "vote choices" for each idea now: Up, Neutral, Down. Voting Up moves the vote-count up, not voting keeps the vote-count the same, and voting Down lowers the vote count. This means that if a single person voted uniformly across every Idea on the Canvas Community (regardless of whether that vote was up or down as long as it was the same each time) it would have no effect on the Idea process since it looks at relative rather than absolute votes. This in turn motivates some Canvas Community members to vote for fewer things they wouldn't actually use a lot since they'd rather the ones they support the most strongly are in the top whatever percent and they don't want to "dilute" their other votes. So far, I haven't noticed it leading to a lot of new down-voting this year, which would be another logical consequence of the current incentives, but not voting at all may be up.

Community Participant

Thank you Linnea. I hadn't considered the multiple views by the same person - which, of course, seems obvious to me now :-/. However, I was not at all aware of the new 'top percentage' or the 'neutral voting' factor. It seems you are saying that the 'neutral vote' is implied (behind the scenes), by virtue of someone just viewing an idea but not voting on it? So then 'relative' votes means relative to the number of views? I really hope not, but am I getting that right? If so, this seems very problematic to me.

I have more comments on this, but I will start a new conversation elsewhere, since it now seems that broadening the conversation (more comments on my comments) may effectively reduce the likelihood that this feature idea moves forward (lower % due to views without votes)? Thank you for the heads up. Would you mind if I pasted your explanation into the new thread I create?

Community Champion

 @kimberly_smith1  ‌,

I believe you have misunderstood what I said. The current Canvas idea process does not, to my knowledge, take "views" into account as part of their metric. I was trying to explain why people who felt neutral-to-slightly-positive (rather than passionate) might choose to read an Idea and neither up nor down vote it.

What Canvas does (as far as I can tell) is keep the top 10%-ish of ideas, as measured by vote score, and archive the rest to Cold Storage‌ on a regular basis (I think it's a certain number of months after submission). (They used to keep just the ones that got 100 votes or more rather than a percentage, but changed the process this year.) 

Let's look at a simpler scenario: A world in which there were only 20 ideas (there are a LOT more ideas than this in the real Canvas Feature Ideas) and the top 10% would be kept. That would mean that only the top 4 vote-getters (since 10% of 20 is 4) could move on, no matter how many good ideas there were. If the scores were:

Idea #1: Score of 300

Idea #2: Score of 250

Idea #3: Score of 249

Idea #4: Score of 150

Idea #5: Score of 125

Idea #6: Score of 100

Idea #7: Score of 100

Idea #8: Score of 100

Idea #9: Score of 100

Idea #10: Score of 100

Idea #11: Score of 100

Idea #12: Score of 100

Idea #13: Score of 100

Idea #14: Score of 20

Idea #15: Score of 15

Idea #16: Score of 5

Idea #17: Score of 2

Idea #18: Score of 2

Idea #19: Score of 0

Idea #20: Score of -2

Then Ideas 1, 2, 3, and 4 would be kept, even though Ideas 5-13 still had a lot more support than Ideas 14-20. 

In "only 20 ideas world", voting for more than 4 ideas would effectively be saying "I don't care which four ideas from this list of 5-plus things that I voted for move on", so people might choose to not vote for something if there were 4 things they liked more.

We live in a world with LOTS more than 20 ideas, so the optimal voting strategy is more complicated, but the current idea promotion system incentivizes not voting for everything that sounds vaguely good but instead voting for fewer things that matter more to you since it takes a relative (percentage) rather than absolute (score) point as its cutoff.

For what it's worth, I don't currently see a lot of the petty strategic nonsense that you'd see in a community where people were being highly strategic with their votes and trying to game the system. With the current system, that would look like doing a lot of downvoting of things they don't care about one way or the other (instead, downvotes are rare and usually seem to be accompanied by comments explaining why, which means people are treating it as a strong objection rather than "part of the game") and creation of lots of junk ideas (since more total ideas would make the top ten percent if there were 20,000 ideas, 15,000 of which were poorly thought out and unpopular, than would make the top ten percent if there were 20 ideas that were all actually good ideas). I do suspect people may be more likely to pass by without voting at all on ideas since every highly-ranked idea displaces another in a percentage-based system like this, but I have no idea if this is actually the case because I don't have access to that data to see if the policy change resulted in community members upvoting fewer ideas.

Community Participant

Hello Linnea,

Thank you so much for clarifying this for me!! I was feeling sad, and am very glad to hear that I misunderstood. I do understand, particularly from a triage kind of perspective, why incentivizing 'not voting for everything that sounds vaguely good but instead voting for fewer things that matter more to you' may be necessary. I am still a bit concerned that this new process may (as more become aware) lead to increased down-votes. But if that were to happen, it sounds like it would be easily detectable. Thank you again!

Community Participant

Hello  @mfp11  

I can't add to/edit the feature idea now that people have voted, but I thought I would add more details (with examples) here near the top, for both individual users and anyone from Canvas who might be making the decision about adding this feature...

The Continued Plea for a 'Keep Highest' Assignment Group Rule

When instructors allow students to 'drop' n scores in an assignment group (by applying the 'ignore lowest n scores' rule), the running total in a student's Canvas Gradebook does not accurately reflect their grade. I (and others) have been advocating for a 'keep highest' rather than 'ignore lowest' n scores rule for Assignment Groups for some time now. Most of us seem to have envisioned this as something that would be applied when creating assignments, as that seems the easiest fix. But perhaps it might also be approached from the Gradebook perspective?

A number of people have offered ‘work-around’ comments/suggestions (see more below). I very much appreciate their taking the time to offer these options, but it occurs to me that I may not have clearly articulated the complications I find associated with using the ‘ignore lowest’ option. I have endeavored to do so here.


Confusion about grades, especially early in the semester, and in particular for those who struggle with adapting to Canvas, can cause students to become discouraged at the very beginning. In my own courses, I have four assignment groups in which students are allowed to choose a given number of the assignments to complete (e.g., 'ignore lowest' n scores in each group). The number of scores dropped varies by assignment group (3 lowest quiz scores dropped, 1 lowest exam score dropped, etc.). As such, the current 'ignore lowest' option becomes especially problematic.

Here is how the progression of IGNORE LOWEST works…

In the example below, the assignment group rule is to ‘ignore lowest 3 scores’, and students are told that the lowest 3 scores will be dropped. Note that it isn’t until Week 6 that the Gradebook Total begins to reflect an accurate running total of a student’s grade in their Gradebook.

Weekly Quizzes

Quiz1: score = 8: score used in calculation... Gradebook Total @ 8/10

Quiz2: score = 7: score is ignored in calculation... Total @ 8/10 w/Qz1 score displayed as part of total (Qz2 ignored)…

 Student: "Where are my 7 points for Qz2?”

Quiz3: score = 0: score is ignored in calculation... Total @ 8/10 w/Qz1 score displayed as part of total (Qz2 & Qz3 ignored)…

Quiz4: score is 9: score is used in calculation... Total now @ 9/10 w/Qz4 displayed as part of total (Qz1, Qz2, Qz3 ignored)…

Student = “Wait… now I’m not getting my 8 points – or those other 7 points?”

Quiz5: score is 10: score is used in calc... Total @ 19/20 w/Qz4 & Qz5 displayed as part of total (9+10)/20 (Qz1, Qz2, Qz3 ignored)…

Quiz6: score is 9: score is used... Total @ 28/30 w/Qz4, Qz5, Qz6 displayed as part of total (9+10+9)/30 (Qz1, Qz2, Qz3 ignored)

With a rule of 'ignore lowest 3 scores', it is only after there are six scores that the running total begins to make clear sense.

Quiz7: score is 10: score used... Total @ 38/40 w/Qz4, Qz5, Qz6, Qz7 displayed (9+10+9+10)/40 (Qz1, Qz2, Qz3 ignored)

Quiz8: score is 0: score is ignored... Total @ 46/50 w/Qz1 and Qz4 thru Qz7 displayed (8+9+10+9+10)/50 (Qz2, Qz3, Qz8 ignored)

Quiz9: score is 10: score is used... Total @ 56/60 w/Qz1, Qz4 thru Qz7, and Qz9 displayed (8+9+10+9+10+10)/60 (Qz2, Qz3, Qz8 ignored)
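To make the pattern above concrete, here is a small Python sketch (my own illustration, not actual Canvas code) of the apparent behavior: with k scores entered, min(3, k-1) of the lowest are ignored, so at least one score always counts. Since every quiz here is worth the same 10 points, sorting by raw score suffices; as explained further down the thread, Canvas's real drop logic compares grade impact rather than raw scores.

```python
def ignore_lowest_total(scores, drop_n, points_each=10):
    """Running total under an 'ignore lowest drop_n' rule.

    Hypothetical model of the behavior in the example above: with k
    scores entered, min(drop_n, k - 1) of the lowest are ignored, so
    at least one score always counts.  All quizzes are assumed to be
    worth the same number of points.
    """
    keep = max(1, len(scores) - drop_n)          # how many scores count
    kept = sorted(scores, reverse=True)[:keep]   # highest ones remain
    return sum(kept), keep * points_each         # (earned, possible)

# Reproducing the progression above (drop lowest 3):
# after Quiz3: (8, 10)  -> "Where are my 7 points for Qz2?"
# after Quiz6: (28, 30) -> the total finally starts to make sense
```

Note how the denominator stays stuck at 10 through the first four quizzes, which is exactly the source of the student confusion described above.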


The confusion is further perpetuated by the fact that students will understand from the Syllabus that the ‘Rule’ is to be that their lowest THREE scores in this group are dropped. So, intuitively, one would expect the first 3 scores to count, and then for scores to be dropped (one at a time) only after the 4th score is entered. If this were the case, it would actually make sense to students because 1) their Gradebook totals would be accurate at the onset and 2) they would later see their 4 scores, compare them, and understand that the lowest of those has been ‘dropped’. In a situation where the ‘rule’ is to ‘ignore lowest 3 scores’ I, myself, do not really understand why the Gradebook keeps ONE highest and drops ONE lowest score when there are two scores… keeps ONE score and drops TWO scores when there are three scores, etc. If nothing else, when the ‘ignore lowest’ is needed, might it not be better if NO scores were counted in the Gradebook Total until after the 4th score was entered? The running total in the Gradebook would still not be accurate, but I think it may be less confusing for students.

Here is a specific student example…

In a low stakes assignment group (Reading Quizzes) there are 15 quizzes across the semester; students are required to complete their choice of 12/15. Though all have different due dates (based on when we cover the material in class), all of these quizzes are available (open) at the beginning of the semester... in an effort to offer students some time management flexibility. I have set the assignment group ‘Rule’ to “ignore lowest 3 scores".

The images below are from my Gradebook on 01/21. The due date for Quiz#1 had passed, and the deadline for Quiz#2 is the next morning (1/22 @ 8am). Some students have already completed Quiz#2. This, of course, is a behavior I want to encourage (as opposed to their waiting until the last minute). A perfect score on each quiz is 10.



The student shown above has already completed both quizzes. The "x" to the right indicates that the score is currently being "ignored" (dropped). If the student hovers over the 'x', s/he will see the message: "This assignment is dropped and will not be considered in the total calculation." If we pretend for the moment that this is the only assignment group in the course, Student #1's running total in the Gradebook would be 10/10, giving credit for the 1st (higher-scored) quiz and ignoring the 2nd (lower-scored) quiz. As a student, the first reaction is usually something like "What! I don't get those 9 points?"

Here is another example…



I gave this student a +1 'fudge point' for being the very first of my 100+ students in this course to submit our first quiz. On 01/20 this student completed the second quiz (2 days early), but what she sees in her Gradebook is that her perfect score of 10 on Quiz#2 "…is dropped and will not be considered in the total calculation." This student's running total would be 11/10. From the student's perspective, the perfect work she did on Quiz#2 is not being credited. Pedagogically speaking, this seems a bad idea.



Here is how a KEEP HIGHEST might work…

In the example below, the assignment group rule is now to ‘keep highest n scores’, and students are told that the lowest 3 scores will be dropped. Note that the progression of scores posted would provide an accurate ‘running total’ in the Gradebook right away, and then also across the semester. The pattern displayed is also likely to make more sense to students.


Using the same scores as before…

Weekly Quizzes

Quiz1: score is 8: score is used in calculation w/Gradebook Total @ 8/10

Quiz2: score is 7: score is used in calculation w/Total @ 15/20, (8+7)/20 (Qz1 & Qz2 scores kept)…

Quiz3: score is 0: score is used in calculation w/Total @ 15/30, (8+7+0)/30 (Qz1, Qz2, Qz3 scores kept)…

THEN, only after there are more than three scores, would any scores be ‘dropped’…

Quiz4: score is 9: highest 3/4 used in calculation w/Total @ 24/30, (8+7+9)/30 (lowest 1/4 dropped, e.g. Qz3)

Quiz5: score is 10: highest 3/5 used in calculation w/Total @ 27/30, (8+9+10)/30 (lowest 2/5 dropped, e.g. Qz2 & Qz3) 

Quiz6: score is 9: highest 3/6 used in calculation w/Total @ 28/30, (9+10+9)/30 (lowest 3/6 dropped, e.g. Qz1, Qz2, Qz3)

Quiz7: score is 10: highest 4/7 used in calculation w/Total @ 38/40 (9+10+9+10)/40 (lowest 3/7 dropped, e.g. Qz1, Qz2, Qz3)

Quiz8: score is 0: highest 5/8 used in calculation w/Total @ 46/50 (9+10+9+10+8)/50 (lowest 3/8 dropped, e.g. Qz2, Qz3, Qz8).

and so on…

Thus… the running total is now working as it should, managing the ‘drop’ rule while also maintaining an accurate Gradebook total across the semester. Again, I think this would make much more sense to students. Their Gradebook totals would be accurate from the start and, when scores did begin to ‘drop’ from their Gradebook, students would see their 4 scores, compare them, and more easily understand that the lowest of those 4 is being dropped.
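The proposed progression can likewise be expressed as a small Python sketch (again my own hypothetical illustration, not a Canvas implementation): no score is dropped until there are more than 3 scores, and then the number dropped grows one at a time and caps at 3, exactly as in the worked example above.

```python
def keep_highest_total(scores, drop_n, points_each=10):
    """Running total under the proposed 'keep highest' behavior.

    Hypothetical model matching the worked example: nothing is dropped
    until more than drop_n scores exist; after that the drop count
    grows one score at a time and caps at drop_n.
    """
    k = len(scores)
    dropped = max(0, min(k - drop_n, drop_n))        # 0,0,0,1,2,3,3,3,...
    kept = sorted(scores, reverse=True)[:k - dropped]
    return sum(kept), (k - dropped) * points_each    # (earned, possible)

# Reproducing the progression above (drop lowest 3):
# after Quiz3: (15, 30) -- every score counts, total is accurate
# after Quiz8: (46, 50) -- lowest three (Qz2, Qz3, Qz8) dropped
```

Because the denominator grows with every quiz taken, the running total is meaningful from the very first score, which is the whole point of the request.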

A bit more clarification?

I am no programmer, and it may be that creating the code for this in Canvas wouldn't work for some reason, or that it is somehow too complex for the Assignment Group Rules or the Canvas Gradebook to manage. But however the 'ignore lowest n scores' rule is implemented, it seems the Gradebook 'looks' at the scores for graded assignments in that group and decides which scores from that group to ignore. So I wonder if a 'keep highest n scores' rule (perhaps as an additional option) could be created to instruct the system to 'look' at the scores for graded assignments from that assignment group and then decide which scores in that group to keep. Something like…

- IF {current # of scores} in the Gradebook for an assignment group <= {# of scores to keep per group},

THEN do not drop any scores

- IF {current # of scores} in the Gradebook for an assignment group > {# of scores to keep per assignment group},

THEN keep up to highest n scores

I imagine the problem here then becomes defining 'n' before there are 'n' scores available. That is, in Excel, SUM(LARGE(B3:P3,{1,2,3,4,5,6,7,8,9,10,11,12})) will keep the highest (largest) 12 of the scores in Columns B-P of Row 3. However, if there are not at least 12 scores available in the row, I believe that Excel returns "#NUM!". So perhaps Canvas would need some kind of nested 'AND' logic to deal with the case where {not enough scores are available yet}? Again, not a programmer – obviously.
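For what it's worth, in a general-purpose language the 'not enough scores yet' guard is a one-liner, which suggests this edge case need not be a blocker. A hypothetical Python equivalent of the Excel formula:

```python
def sum_largest(scores, n):
    """Python analogue of Excel's SUM(LARGE(range, {1..n})).

    Unlike Excel's LARGE, which returns #NUM! when n exceeds the number
    of scores available, this simply sums whatever scores exist so far.
    """
    return sum(sorted(scores, reverse=True)[:n])  # slicing handles short lists

# With only 2 of the eventual 15 quizzes taken, 'keep highest 12'
# just keeps both: sum_largest([8, 7], 12) -> 15
```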



Though it would not be my preference, an alternative to how the current ‘ignore lowest’ option works might be helpful - that is, to not drop any scores until the point at which there are only n scores left in the assignment group to earn (in the example above, 3). Though this is similar to waiting until the end of the semester to ‘drop’ scores (see also below), it would at least not be a sudden disappearance of points - and students could identify changes as they progress. Plus, the accuracy of the running total in the Gradebook would be maintained.

Below are some of the work-arounds that have been suggested. I do appreciate the time others have spent in offering these suggestions, but there are a number of reasons why these do not really address some core issues:

"A 'keep highest' is the same as 'ignore lowest'." As detailed above, this is not the case, at least not in terms of providing an accurate running total in the Gradebook. When using 'ignore lowest n scores', Gradebook totals are not accurate until the deadlines for all of the assignments in all the assignment groups have passed – usually not until the very end of the semester.

Replace the ‘dashes’ in the Gradebook with ‘zero’ scores for all assignments at the very beginning of the semester – so that the Gradebook interprets those zero scores as ‘lowest’ and drops them right away. I did actually try this, but it doesn’t fix the inaccurate ‘running total’ issue – since adding these ‘soft’ zero scores to all available assignments increases the denominator in the Gradebook to the total number of points available for the entire semester. For example, say the total number of points available in a semester is 1000; adding ‘zero’ scores for all assignments at the beginning of the semester makes the running total in the Gradebook = 0/1000 points. A student then completes two assignments: earning 45/50 points and 10/10 points. Their Gradebook total now shows 55/1000 points (or a grade of 5.5%). The running total offers no ‘real time’ assessment of how students are doing in the course.

Hide the Gradebook total from student view. This is what I had been doing. Unfortunately, in addition to offering no ‘real time’ assessment for students as to how they are doing in the course, when Gradebook totals are hidden students cannot use the Canvas ‘What-If-Grades” feature in their Gradebook.

Don't apply these rules until the end of the semester; students won't mind that some points are dropped because it will make their grade go up (since the denominator is decreased at the same time). This may be true, but it presents some problems as well: 1) retention is important in higher education, so I would not want students to think they are doing worse in the course than they actually are (and perhaps needlessly withdraw as a result); 2) in a grading system based on points accumulated (rather than a %), students would see points that they had previously counted as part of their final course grade suddenly disappear; 3) this would make the grades that advisors access across the semester inaccurate.


PLEASE, PLEASE, PLEASE add a “keep highest n scores” rule to assignment groups. Since code for ‘ignore lowest” n scores is already available in the assignment group rules, might there be some way to modify that code – in order to allow instructors to choose from either option? Thank you!

Community Champion

Those are some wonderful examples of the far-reaching effects of the current limitations,  @kimberly_smith1 . I'm hoping that this need will be re-examined as part of the newly announced Priority: Assignments 2.0.

Community Participant

Since this has reached the 100 up-votes threshold, what next?

Community Participant

Hello all,

The examples above, though long-winded, describe the problem EXACTLY!

I keep stressing the point that this does not belong in the FEATURE column, but in the "Functionality of the LMS" column. A feature is something like "I'd like to add a special font to make my Quiz more fun." The Grading process does not work correctly in CANVAS. It should be fixed without adding it as a Feature, voting on it, etc. Instructure should just fix it.
Community Participant

Hello  @scottdennis ,

I wonder if you might be able to shed a bit of light on something for me - or might direct me to someone who could?

I was scrolling through the list of feature ideas that are open for voting on the Canvas Studio site. I do this from time to time to look for ideas I may want to vote on that I haven't noticed before, and to see where my feature idea requests stand (by comparison) in terms of votes.

As I understand it, the new goalpost for feature idea consideration is no longer 100+ votes; rather, the focus is on those reaching the 'top 10% of votes'. Is this correct? I ask because I notice that the Canvas Studio site indicates there are currently 1951 ideas open for voting, but only 157 ideas (8%) are available to view when scrolling through them (e.g., pages 1-8 only, with the lowest vote count ending at 205). Is the new goalpost actually 200+ votes, and not the 'top 10%'?

Importantly, I am concerned that ideas with fewer votes, like mine here (obviously near and dear to my heart), are not accessible to viewers scrolling through that site unless the viewer actually does a 'search' for the idea... which seems to me a bit prejudicial against ideas with fewer votes. Is it not possible to allow the Canvas Studio site to display the entire list of ideas that are open for voting just by scrolling through them?

Community Champion

 @kimberly_smith1 ,

It sounds like you're looking at the top 10%. To get the full list, choose "Open for vote" instead of "Top 10% By Vote".


I can get all the way to page 74 before it runs out of ideas. That's 1468 ideas that are open for voting and available for viewing -- although it takes a lot of scrolling. That's not quite 1951, but the 1951 refers to the total number of ideas, not the number of ideas open for voting. The number Open + In Development + On Beta + Completed + Archived comes out really close to the 1951.


There are 155 items in the top 10% list, that's a little more than the 147, which would be the 10% of 1468. If you throw in those in development or on beta, then you're looking at 1501, of which 10% would only be 150.

If you go to the list of those open for voting, you will see that there are some ideas with more than 200 votes that are not in the top 10%. So it's not using 200 as the cutoff, and it's showing slightly more than 10% of the ideas that are open for voting.

Community Champion

A bit more clarification?

I am no programmer, and it may be that creating the code for this in Canvas wouldn’t work for some reason, that it is somehow too complex for the Assignment Group Rules, or for the Canvas Gradebook to manage?

I don't disagree with the idea that you should be allowed to keep the highest grades, and I think it makes a lot of sense. I just want to explain some of the math involved.

"This is more complex than it seems." That's what I thought when I started writing this. As I did the analysis, I found out that it's not nearly as complicated as it originally seemed. I've put that conclusion at the end, but left the analysis in place in case someone wants to know what's involved and why it isn't harder than it seems.

Canvas does not compute the drop rules based on the percentage received for that assignment or for the points missed on the assignment. It attempts to calculate the grade without an assignment and then drops the assignments that have the biggest impact on the grade.

For example, let's say that your gradebook looked like this:


Now let's look at what happens when you drop each assignment individually.


  • The first row (yellow) was the original score before any grades were dropped and the student gets 77%.
  • The last row (gray) was what happened when you dropped the quiz with the lowest percentage and the student gets 78.89%. 
  • The middle row (green) is what Canvas drops. Even though Quiz 2 had a higher percentage (70%) than Quiz 4, the student does better (80%) by dropping Quiz 2 than by dropping Quiz 4.

Determining which grade to drop is not as simple as dropping the grade with the lowest percentage.

The astute observer will notice that the student missed more points on Quiz 2 than any other quiz. They missed 9 points there while missing 2, 8, and 4 on the others. You might be inclined to think that Canvas just drops the one where the student lost the most points. That would be the incorrect logic as well.

Let's say that the student got 8/20 (40%) on Quiz 1, missing 12 points, and 1/10 (10%) on Quiz 4, missing 9 points. They still missed 9 points on Quiz 2 and 8 points on Quiz 3.

| Quiz   | Score | Out of | Percent | Points missed |
|--------|-------|--------|---------|---------------|
| Quiz 1 | 8     | 20     | 40%     | 12            |
| Quiz 2 | 21    | 30     | 70%     | 9             |
| Quiz 3 | 32    | 40     | 80%     | 8             |
| Quiz 4 | 1     | 10     | 10%     | 9             |

Dropping Quiz 1 leaves 54/80 = 67.5%, while dropping Quiz 4 leaves 61/90 = 67.78%.
Now Canvas drops Quiz 4, even though the student missed only 9 points on it, rather than Quiz 1, where the student missed 12 points. The points missed on Quiz 1 were spread over a larger point total, so they hurt the overall percentage less; the student benefits more from dropping Quiz 4 than from dropping Quiz 1.

Okay, now you know what's going on. Canvas checks every grade individually to look for the best benefit, and it only has to make four comparisons, so it doesn't take very long.

Now let's say that you want to drop the two lowest grades, going back to the original scores:

| Dropped        | Remaining points | Percent |
|----------------|------------------|---------|
| Quiz 1, Quiz 2 | 38/50            | 76%     |
| Quiz 1, Quiz 3 | 27/40            | 67.5%   |
| Quiz 1, Quiz 4 | 53/70            | 75.71%  |
| Quiz 2, Quiz 3 | 24/30            | 80%     |
| Quiz 2, Quiz 4 | 50/60            | 83.33%  |
| Quiz 3, Quiz 4 | 39/50            | 78%     |
There are six possible combinations of assignments that could be dropped. Here we find that dropping Quiz 2 and Quiz 4 gets the student 83.33% and so the best combination is to keep Quiz 1 and Quiz 3.
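That best-subset search can be brute-forced in JavaScript. Everything here (the reconstructed scores and the helper names) is illustrative rather than Canvas's code:

```javascript
// Quiz scores reconstructed from the percentages in the running example.
const quizzes = [
  { name: 'Quiz 1', score: 18, possible: 20 },
  { name: 'Quiz 2', score: 21, possible: 30 },
  { name: 'Quiz 3', score: 32, possible: 40 },
  { name: 'Quiz 4', score: 6,  possible: 10 },
];

// Percentage of the quizzes that remain after dropping two indices.
function percentKeeping(quizzes, dropA, dropB) {
  let score = 0;
  let possible = 0;
  quizzes.forEach((q, i) => {
    if (i === dropA || i === dropB) return;
    score += q.score;
    possible += q.possible;
  });
  return (100 * score) / possible;
}

// Enumerate all C(4,2) = 6 pairs and keep the drop that scores best.
function bestPairToDrop(quizzes) {
  let best = { pair: null, percent: -Infinity };
  for (let a = 0; a < quizzes.length; a++) {
    for (let b = a + 1; b < quizzes.length; b++) {
      const percent = percentKeeping(quizzes, a, b);
      if (percent > best.percent) best = { pair: [a, b], percent };
    }
  }
  return best;
}

const best = bestPairToDrop(quizzes);
```

Dropping Quiz 2 and Quiz 4 (indices 1 and 3) keeps 50/60 of the points, about 83.33%.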

Most people still don't see a problem. That sounds pretty painless. Six comparisons, that's only two more than it took for dropping 1 assignment.

The numbers 4 and 6 are what mathematicians call combinations: C(4,1) and C(4,2). C(n,k) represents the number of ways to pick k items out of a group of n items without repeating yourself and without caring about the order. It is based on the factorial (!): C(n,k) = n! / ( k! * (n-k)! ). Luckily, most scientific calculators have a built-in function for it. C(4,1) is the number of ways you can pick one assignment to drop out of four assignments; C(4,2) is the number of ways you can pick two assignments to drop out of four.

Let's make it more realistic. I have 34 reading quizzes in my Finite Mathematics course.

  • If I want to drop 1 grade, there are 34 scenarios it has to run to find the best case. This is C(34,1) = 34 ways
  • If I want to drop 2 grades, then there are C(34,2) =  561 ways it could play out.
  • I actually drop the 3 lowest grades, so there are C(34,3) = 5984 comparisons to make to decide which three to drop.
  • If I wanted to drop the 4 lowest grades, there would be C(34,4) = 46,376 scenarios to check.
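Those counts are easy to verify with a small JavaScript function. The `combinations` helper below is illustrative; it uses the multiplicative form of the formula so the intermediate values stay small instead of forming huge factorials:

```javascript
// C(n, k) = n! / (k! * (n-k)!), computed multiplicatively.
function combinations(n, k) {
  if (k < 0 || k > n) return 0;
  k = Math.min(k, n - k); // symmetry: C(n, k) = C(n, n-k)
  let result = 1;
  for (let i = 1; i <= k; i++) {
    // Each partial product is an integer, because the product of
    // i consecutive integers is always divisible by i!.
    result = (result * (n - k + i)) / i;
  }
  return result;
}

// combinations(34, 1) === 34
// combinations(34, 2) === 561
// combinations(34, 3) === 5984
// combinations(34, 4) === 46376
```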

That number is smaller at the beginning of the semester when there are fewer assignments, but as the semester goes along, it takes longer and longer to compute those.

Those calculations need to happen for each assignment group with a rule and each student.

When displaying the gradebook, the grades to drop or keep are not delivered from the server; they are calculated inside the gradebook itself.

  1. The gradebook display process starts with a list of assignment groups that contain the weights and the rules.
  2. Then it gets a list of the students, which includes the current grade, final grade, current unposted grade, and final unposted grade. There are actually 8 values here, because each is delivered both as a score (a number) and as a grade (when there is a grading scheme). However, this does not include the scores in each category for the students.
  3. Then it fetches the submission information for the students. This happens in batches of students so that it can quickly display the results instead of making people wait. There is nothing in this information to indicate that a grade is getting dropped according to a rule.

The browser computes the grades on the fly once all the information is delivered. Then, when you enter or change a grade, it can compute the effects without having to ask the server to send all the information back.

There are other approaches. There is a note in the source code about the inefficiency of this approach and needing to try something different when there are lots of assignments dropped:

I am not going to pretend that this code is understandable.

The naive approach to dropping the lowest grades (calculate the grades for each combination of assignments and choose the set which results in the best overall score) is obviously too slow.

This approach is based on the algorithm described in "Dropping Lowest Grades" by Daniel Kane and Jonathan Kane. Please see that paper for a full explanation of the math.

I haven't read and digested the full paper, so I'll start by describing a simpler approach that is faster than trying every combination but occasionally returns wrong results. Then I'll comment briefly on the effectiveness of the Canvas algorithm.

You can find the worst offender and drop it, then look for the next worst offender once the first one is dropped. 

In multiple regression from statistics, we would call this backwards elimination. You eliminate one thing at a time and repeat that process as many times as needed. If I had 34 reading quizzes and wanted to drop 3, then I would need to make 34 initial comparisons. Then I would drop that assignment and make 33 comparisons to find the next worst offender. Then I would make 32 comparisons to pick the 3rd grade to drop. That means 34+33+32 = 99 scenarios to compare, which is much faster than making 5,984 comparisons.  Remember this needs to happen for every student and every assignment group that has a drop rule.
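A JavaScript sketch of that backwards elimination idea might look like this. The function name and data shape are assumptions for illustration, not Canvas's implementation:

```javascript
// Greedy "backwards elimination": repeatedly drop the single grade whose
// removal raises the overall percentage the most. Fast, but not guaranteed
// to find the best overall combination.
function greedyDrop(quizzes, numToDrop) {
  const kept = quizzes.slice();
  for (let round = 0; round < numToDrop; round++) {
    let bestIndex = -1;
    let bestPercent = -Infinity;
    for (let i = 0; i < kept.length; i++) {
      // Compute the percentage if quiz i were dropped this round.
      let score = 0;
      let possible = 0;
      kept.forEach((q, j) => {
        if (j === i) return;
        score += q.score;
        possible += q.possible;
      });
      const percent = (100 * score) / possible;
      if (percent > bestPercent) {
        bestPercent = percent;
        bestIndex = i;
      }
    }
    kept.splice(bestIndex, 1); // drop this round's worst offender
  }
  return kept;
}
```

With the four-quiz example and two drops, this greedy pass drops Quiz 2 and then Quiz 4, which happens to match the best-subset answer, though as the surrounding text notes, it is not guaranteed to in general.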

This approach uses conditional choices: the second grade eliminated is picked on the condition that the first one eliminated was the best one to eliminate. Sometimes that doesn't produce the best overall solution; the tradeoff is speed.

Let's look at keeping the best grades instead of eliminating the worst grades.

At the beginning of the semester, I don't know how many quizzes I'm going to have, but I do know that at the end, I want to keep 10 of them. For the first 10 quizzes, there is no problem, all of them are kept.

With quiz 11, I need to drop 1. If I could tell Canvas to drop 1, then it would make 11 comparisons. However, I'm telling Canvas to check which 10 I should keep. Now, if Canvas were smart about it, it would figure out which one to drop, and that would probably make most people happy. Mathematically, that's sound, since combinations are symmetric and keeping 10 out of 11 occurs the same number of ways that dropping 1 out of 11 does: C(11,10) = C(11,1). However, if it kept to the existing logic, it would go through and make C(11,10) = 11 checks.

But then there's that programming decision that there are too many items to check them all, so it would fall back to the forward selection approach, which is roughly the opposite of backwards elimination. There are 11 ways to pick the best assignment, then 10 ways to pick the second best, 9 ways to pick the third best, and so on down to 2 ways to pick the 10th best. That's 11+10+9+8+7+6+5+4+3+2 = 65 comparisons, which is worse than the 11 comparisons if you just drop 1.

If there were 15 quizzes, then keeping the top 10 would be C(15,10) = 3,003 combinations to try out. If you do forward selection, you would have 15+14+13+12+11+10+9+8+7+6 = 105 comparisons. If you were to tell Canvas to drop the lowest 5 grades instead, then it would do C(15,5) = 3,003 combinations or 15+14+13+12+11 = 65 comparisons, depending on the technique used.
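The comparison counts for these greedy strategies follow a simple pattern that can be computed directly. `greedyComparisons` is an illustrative name, not Canvas code:

```javascript
// Picking d grades one at a time out of n candidates takes
// n + (n-1) + ... + (n-d+1) evaluations.
function greedyComparisons(n, d) {
  let total = 0;
  for (let i = 0; i < d; i++) {
    total += n - i;
  }
  return total;
}

// With 15 quizzes: forward-selecting the 10 to keep takes
// greedyComparisons(15, 10) = 105 evaluations, while backwards-eliminating
// the 5 to drop takes only greedyComparisons(15, 5) = 65.
```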

If I had 34 quizzes and wanted to keep 30 of them, then I would have C(34,30) = C(34,4) = 46,376 combinations; forward selection would give 34+33+32+...+5 = 585 comparisons. Doing backwards elimination (dropping the 4 lowest) would give 34+33+32+31 = 130 comparisons.

I would guess that most people are going to keep more than half of the assignments in a category, so backwards elimination is going to be better than the forward selection process. The problem with both is that they miss some of the comparisons. There is another approach in multiple regression called best subsets. That's the approach I was using with combinations, and it's the better way to guarantee the best solution for the student. However, it's the more costly process (in calculation and time), and people don't like to wait for their grades to appear.

When you're keeping more than half the grades, it's faster to compute the grades not to keep than it is to calculate which grades to keep. The "drop highest" rule makes sense in that same context. Both are about dropping grades, not keeping grades. As long as you're dropping less than half the grades, it's faster to compute the ones to drop.

Canvas doesn't want to take the longer, slower route if it can avoid it.

Thankfully, none of what I've explained so far is the way Canvas does it. They're using a mathematical algorithm described in the paper which is way more efficient.

There's a note at the end of the Dropping Lowest Grades paper that is linked to in the Canvas source. It says that even with 1000 grades and dropping 300 of them, it never took more than 5 iterations to get the optimal set. That's pretty impressive and a more realistic application of Newton's method from calculus 1 than just finding the x-intercept of a function.

The easiest way to implement this feature idea is to determine the number of grades to drop by subtracting the number of assignments to keep from the number of eligible assignments. That is, let the number to drop be computed on a per-user basis for each assignment group, rather than coming straight from the group rules.

Currently, line 239 of the code has this statement

const submissionsToKeep = dropAssignments(relevantSubmissionData, group.rules);

To implement this feature, change group.rules to rules, where rules is an object that starts off as group.rules. If the box to keep the highest scores is not checked, it would act as it does currently. If that box is checked, you could compute the number to drop from the number of submissions present. Alternatively, you could extend the group rules to include keep-the-highest and then just do the subtraction within the function itself.
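A hypothetical sketch of that change in JavaScript might look like the following. The `keep_highest` rule field and the `deriveRules` helper are invented here for illustration; only `group.rules`, `relevantSubmissionData`, and `dropAssignments` come from the Canvas source quoted above:

```javascript
// Illustrative sketch: translate a hypothetical "keep the highest k" rule
// into the existing "drop the lowest d" rule on a per-student basis.
function deriveRules(group, relevantSubmissionData) {
  const rules = { ...group.rules };
  if (rules.keep_highest != null) {
    // Compute the number to drop from the submissions actually present,
    // so the rule adapts as more assignments are graded.
    const eligible = relevantSubmissionData.length;
    rules.drop_lowest = Math.max(0, eligible - rules.keep_highest);
    delete rules.keep_highest;
  }
  return rules;
}

// The call on line 239 would then become something like:
// const submissionsToKeep =
//   dropAssignments(relevantSubmissionData, deriveRules(group, relevantSubmissionData));
```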

If you accept that when faculty say "Keep the 10 highest grades" they'll be happy with "Drop the lowest grades so that I end up with 10 grades altogether," then there isn't much needed in the way of code changes.

There would still be the added complexity of adding it to the interface and Canvas wants simple without a lot of clutter.

Just because the code changes would be small doesn't mean it would get implemented. There's still the question of whether it's worth it. Is this a fringe request that would only help a few people? Does it hit the middle 60% of users? We math folks feel that pain all the time when we go to work with quizzes and they just don't have the functionality that we need. There's also the question of whether it would be more confusing to the masses, who think there's no difference between dropping 5 and keeping 10 when there are 15 grades.

Community Participant

Hello @James,

You totally rock! Thanks so much for taking the time to lay this out, and to perhaps help others see how a 'keep highest scores' rule is doable. As to this being a 'fringe request', I think the idea would have more votes if more people understood that 'keep highest' and 'drop lowest' are not the same thing in the Gradebook. Your post may help others get that. Conversely, if people are not looking for it... Smiley Sad. As to whether it might confuse the masses, I think that it might - at first. But if there were an option to select 'keep highest scores' in Canvas, then those who currently see no difference would have an opportunity to explore the idea (box checked versus unchecked) and then decide which works best for them. Thank you again!