[Gradebook] Include a Keep Highest Scores Rule (instead of Ignore Lowest) in Assignment Groups

This is not a new idea (see below). However, since many of the responses to those earlier threads suggest creating a Feature Idea... I have tried to title this so that it will get picked up by multiple searches. Interestingly, when I pasted my title, I saw a message indicating there were 'no similar ideas'. My 'idea' is, obviously, to provide an option that lets instructors 'Keep Highest X Scores' when assigning rules to an Assignment Group. The 'ignore lowest' option does not function the same as 'keep highest' would in relation to maintaining an accurate running total of accumulated points in the Gradebook. Below are a few links to others who have asked about this. My personal reasoning is presented in "Student Gradebook Issues With 'Ignore Lowest Scores' in Assignment Group".

 

Tonight I watched Jason Sparks' presentation, InstCon16 | Quizzes.Next: Modern Quizzing in Canvas - YouTube
It was a great presentation. It seems lots of excellent changes are coming to Quizzes (soon?). I was clapping even when the audience wasn't. The one thing I didn't see (and was very much hoping to) is this one here. Hopefully it is included and was just not specifically addressed in the presentation. If someone happens to know whether that option will be coming with Quizzes.Next, I would love to hear from you. Thank you.

 

547 Views https://community.canvaslms.com/thread/2214

Assignment Group settings - *keep* the highest _n_ scores

Question asked by Cordah Pearce on Jun 10, 2015

 

109 Views https://community.canvaslms.com/ideas/4319

Add a "Keep Highest" Option to the Groups in Assignments

Idea created by Melissa Ritter on Jan 8, 2016

 

536 Views https://community.canvaslms.com/ideas/4321

Assignment Group Settings: Option for "Keep Highest" Scores

Idea created by Emily Hunt on Jan 8, 2016

 

987 Views https://community.canvaslms.com/thread/12913-keep-highest-scores

Keep Highest Scores ???

Question asked by George de Falussy on Sep 16, 2016

 

181 Views https://community.canvaslms.com/ideas/7443 

IF/THEN Logic on Drop Lowest Score

Idea created by Neal Legler Champion on Nov 3, 2016

 

3044 Views https://community.canvaslms.com/thread/13886-drop-lowest-grade

Drop lowest grade?

Question asked by Benjamin (Ben) Croucher on Nov 10, 2016

 

194 Views   https://community.canvaslms.com/ideas/8325-keep-best-scores-instead-of-ignore

Keep Best Scores Instead of Ignore

Idea created by Kimberley Patterson on Mar 22, 2017

 

141 Views https://community.canvaslms.com/thread/19118-student-gradebook-issues-with-ignore-lowest-scores-in-a...

Student Gradebook Issues With 'Ignore Lowest Scores' in Assignment Group

Question asked by Kimberly Smith on Sep 8, 2017

34 Comments
kimberly_smith1
Community Participant
Author

Hello Linnea,

Thank you so much for clarifying this for me!! I was feeling sad, and am very glad to hear that I misunderstood. I do understand, particularly from a triage kind of perspective, why incentivizing 'not voting for everything that sounds vaguely good, but instead voting for fewer things that matter more to you' may be necessary. I am still a bit concerned that this new process may (as more become aware) lead to increased down-votes. But if that were to happen, it sounds like it would be easily detectable. Thank you again!

kimberly_smith1
Community Participant
Author

Hello  @mfp11  

I can't add to/edit the feature idea now that people have voted, but I thought I would add more details (with examples) here near the top - for both individual users and anyone from Canvas who might be making the decision about adding this feature...

The Continued Plea for a 'Keep Highest' Assignment Group Rule

When instructors allow students to 'drop' n scores in an assignment group (by applying the 'ignore lowest n scores' rule), the running total in a student's Canvas Gradebook does not accurately reflect their grade. I (and others) have been advocating for a 'keep highest' versus 'ignore lowest' n scores rule for Assignment Groups for some time now. Most of us seem to have envisioned this as something that would be applied when creating assignments, as it seems the easiest fix. But perhaps it might also be approached from the Gradebook perspective?

A number of people have offered ‘work-around’ comments/suggestions (see more below). I very much appreciate their taking the time to offer these options, but it occurs to me that I may not have clearly articulated the complications I find associated with using the ‘ignore lowest’ option. I have endeavored to do so here.

THE PROBLEM

Confusion about grades, especially early in the semester, and in particular for those who struggle with adapting to Canvas, can cause students to become discouraged at the very beginning. In my own courses, I have four assignment groups in which students are allowed to choose a given number of the assignments to complete (e.g. "ignore lowest" n scores in each group). The number of scores dropped varies by assignment group (3 lowest quiz scores dropped, 1 lowest exam score dropped, etc.). As such, the current 'ignore lowest' option becomes especially problematic.

Here is how the progression of IGNORE LOWEST works…

In the example below, the assignment group rule is to ‘ignore lowest 3 scores’, and students are told that the lowest 3 scores will be dropped. Note that it isn’t until Week 6 that the Gradebook Total begins to reflect an accurate running total of a student’s grade in their Gradebook.

Weekly Quizzes

Quiz1: score = 8: score is used in calculation... Gradebook Total @ 8/10

Quiz2: score = 7: score is ignored in calculation... Total @ 8/10 w/Qz1 score displayed as part of total (Qz2 ignored)…

 Student: "Where are my 7 points for Qz2?"

Quiz3: score = 0: score is ignored in calculation... Total @ 8/10 w/Qz1 score displayed as part of total (Qz2 & Qz3 ignored)…

Quiz4: score = 9: score is used in calculation... Total now @ 9/10 w/Qz4 displayed as part of total (Qz1, Qz2, Qz3 ignored)…

Student: "Wait… now I'm not getting my 8 points – or those other 7 points?"

Quiz5: score = 10: score is used in calculation... Total @ 19/20 w/Qz4 & Qz5 displayed as part of total (9+10)/20 (Qz1, Qz2, Qz3 ignored)…

Quiz6: score = 9: score is used in calculation... Total @ 28/30 w/Qz4, Qz5, Qz6 displayed as part of total (9+10+9)/30 (Qz1, Qz2, Qz3 ignored)

With a rule of 'ignore lowest 3 scores', it is only after there are six scores that the running total begins to make clear sense.

Quiz7: score = 10: score is used in calculation... Total @ 38/40 w/Qz4, Qz5, Qz6, Qz7 displayed (9+10+9+10)/40 (Qz1, Qz2, Qz3 ignored)

Quiz8: score = 0: score is ignored in calculation... Total @ 46/50 w/Qz1 and Qz4 thru Qz7 displayed (8+9+10+9+10)/50 (Qz2, Qz3, Qz8 ignored)

Quiz9: score = 10: score is used in calculation... Total @ 56/60 w/Qz1 and Qz4 thru Qz9 displayed (8+9+10+9+10+10)/60 (Qz2, Qz3, Qz8 ignored)
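The 'ignore lowest' progression above can be sketched in code. This is a rough illustration of the behavior described here, not Canvas's actual algorithm (which, as discussed further down this thread, decides drops by their impact on the overall percentage; with equal point values the two coincide). It assumes every quiz is worth 10 points:

```python
def ignore_lowest_total(scores, drop_n, points_each=10):
    """Running total under an 'ignore lowest drop_n' rule: Canvas keeps
    the highest (count - drop_n) scores, but always at least one."""
    keep = max(len(scores) - drop_n, 1)
    kept = sorted(scores, reverse=True)[:keep]
    return sum(kept), keep * points_each

# Running totals as each quiz score arrives (rule: ignore lowest 3)
quiz_scores = [8, 7, 0, 9, 10, 9, 10, 0, 10]
for week in range(1, len(quiz_scores) + 1):
    earned, possible = ignore_lowest_total(quiz_scores[:week], drop_n=3)
    print(f"Quiz{week}: total {earned}/{possible}")
```

Run against the quiz scores above, this reproduces the totals in the progression: 8/10, 8/10, 8/10, 9/10, 19/20, 28/30, 38/40, 46/50, 56/60.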

 

The confusion is further perpetuated by the fact that students will understand from the Syllabus that the ‘Rule’ is to be that their lowest THREE scores in this group are dropped. So, intuitively, one would expect the first 3 scores to count, and then for scores to be dropped (one at a time) only after the 4th score is entered. If this were the case, it would actually make sense to students because 1) their Gradebook totals would be accurate at the onset and 2) they would later see their 4 scores, compare them, and understand that the lowest of those has been ‘dropped’. In a situation where the ‘rule’ is to ‘ignore lowest 3 scores’ I, myself, do not really understand why the Gradebook keeps ONE highest and drops ONE lowest score when there are two scores… keeps ONE score and drops TWO scores when there are three scores, etc. If nothing else, when the ‘ignore lowest’ is needed, might it not be better if NO scores were counted in the Gradebook Total until after the 4th score was entered? The running total in the Gradebook would still not be accurate, but I think it may be less confusing for students.

Here is a specific student example…

In a low stakes assignment group (Reading Quizzes) there are 15 quizzes across the semester; students are required to complete their choice of 12/15. Though all have different due dates (based on when we cover the material in class), all of these quizzes are available (open) at the beginning of the semester... in an effort to offer students some time management flexibility. I have set the assignment group ‘Rule’ to “ignore lowest 3 scores".

The images below are from my Gradebook on 01/21. The due date for Quiz#1 had passed, and the deadline for Quiz#2 is the next morning (1/22 @ 8am). Some students have already completed Quiz#2. This, of course, is a behavior I want to encourage (as opposed to their waiting until the last minute). A perfect score on each quiz is 10.

Student#1: 

[Image: 292309_pastedImage_16.png – Student#1's quiz scores]

So the student above has already completed both quizzes. The "x" to the right indicates that the score is currently being "ignored" (dropped). If the student hovers over the 'x', s/he will see the message: "This assignment is dropped and will not be considered in the total calculation." If we pretend for the moment that this is the only assignment group in the course, Student#1's running total in the Gradebook would be 10/10, giving credit for the 1st (higher score) quiz and ignoring the 2nd (lower score) quiz. As a student, the first reaction is usually something like "What! I don't get those 9 points?"

Here is another example…

Student#2:

[Image: 292310_pastedImage_17.png – Student#2's quiz scores]

I gave this student a +1 'fudge point' for being the very first of my 100+ students in this course to submit our first quiz. On 01/20 this student completed the second quiz (2 days early) – but what she sees in her Gradebook is that her perfect score of 10 on Quiz#2 "…is dropped and will not be considered in the total calculation." This student's running total would be 11/10. From the student's perspective, the perfect work she did on Quiz#2 is not being credited. Pedagogically speaking, this seems a bad idea.

 

A POSSIBLE SOLUTION

Here is how a KEEP HIGHEST might work…

In the example below, the assignment group rule is now to ‘keep highest n scores’, and students are told that the lowest 3 scores will be dropped. Note that the progression of scores posted would provide an accurate ‘running total’ in the Gradebook right away, and then also across the semester. The pattern displayed is also likely to make more sense to students.

 

Using the same scores as before…

Weekly Quizzes

Quiz1: score is 8: score is used in calculation w/Gradebook Total @ 8/10

Quiz2: score is 7: score is used in calculation w/Total @ 15/20, (8+7)/20 (Qz1 & Qz2 scores kept)…

Quiz3: score is 0: score is used in calculation w/Total @ 15/30, (8+7+0)/30 (Qz1, Qz2, Qz3 scores kept)…

THEN, only after there are more than three scores, would any scores be ‘dropped’…

Quiz4: score is 9: highest 3/4 used in calculation w/Total @ 24/30, (8+7+9)/30 (lowest 1/4 dropped, e.g. Qz3)

Quiz5: score is 10: highest 3/5 used in calculation w/Total @ 27/30, (8+9+10)/30 (lowest 2/5 dropped, e.g. Qz2 & Qz3) 

Quiz6: score is 9: highest 3/6 used in calculation w/Total @ 28/30, (9+10+9)/30 (lowest 3/6 dropped, e.g. Qz1, Qz2, Qz3)

Quiz7: score is 10: highest 4/7 used in calculation w/Total @ 38/40 (9+10+9+10)/40 (lowest 3/7 dropped, e.g. Qz1, Qz2, Qz3)

Quiz8: score is 0: highest 5/8 used in calculation w/Total @ 46/50 (9+10+9+10+8)/50 (lowest 3/8 dropped, e.g. Qz2, Qz3, Qz8).

and so on…

Thus… the running total is now working as it should, managing the ‘drop’ rule while also maintaining an accurate Gradebook total across the semester. Again, I think this would make much more sense to students. Their Gradebook totals would be accurate from the start and, when scores did begin to ‘drop’ from their Gradebook, students would see their 4 scores, compare them, and more easily understand that the lowest of those 4 is being dropped.
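In code terms, the proposed behavior amounts to letting drops phase in one at a time starting with the (n+1)th score, so the first n scores always count. The sketch below is an illustration of the proposed rule, not existing Canvas behavior, and again assumes 10-point quizzes:

```python
def keep_highest_total(scores, drop_n, points_each=10):
    """Proposed rule: the k-th drop only activates once there are
    drop_n + k scores, so nothing is dropped until score drop_n + 1."""
    drops = min(drop_n, max(0, len(scores) - drop_n))
    keep = len(scores) - drops
    kept = sorted(scores, reverse=True)[:keep]
    return sum(kept), keep * points_each

quiz_scores = [8, 7, 0, 9, 10, 9, 10, 0]
for week in range(1, len(quiz_scores) + 1):
    earned, possible = keep_highest_total(quiz_scores[:week], drop_n=3)
    print(f"Quiz{week}: total {earned}/{possible}")
```

This reproduces the progression above: 8/10, 15/20, 15/30, then 24/30, 27/30, 28/30, 38/40, 46/50, with the denominator accurate from the very first score.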

A bit more clarification?

I am no programmer, and it may be that creating the code for this in Canvas wouldn't work for some reason, that it is somehow too complex for the Assignment Group Rules or for the Canvas Gradebook to manage. But however the 'ignore lowest n scores' rule for an Assignment Group is implemented, it seems the Gradebook 'looks' at the scores for graded assignments in that group and decides which scores from that group to ignore. So, I wonder if a 'keep highest n scores' rule (perhaps as an additional option) could be created to instruct the system to 'look' at the scores for graded assignments from that assignment group, then decide which scores in that group to keep. Something like…

- IF {current # of scores} in the Gradebook for an assignment group <= {# of scores to keep per group},

THEN do not drop any scores

- IF {current # of scores} in the Gradebook for an assignment group > {# of scores to keep per group},

THEN keep only the highest n scores

I imagine the problem here then becomes defining 'n' before there are 'n' scores available. That is, in Excel… SUM(LARGE(B3:P3,{1,2,3,4,5,6,7,8,9,10,11,12})) will keep the highest (largest) 12/15 scores in a list of scores from Columns B-P in Row 3. However, if there are not at least 12 scores available in the row, I believe that Excel returns "#NUM!". So, perhaps Canvas would need some kind of nested 'AND' coding to deal with {not enough scores available yet}? Again, not a programmer – obviously.
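For what it's worth, the guard the Excel formula needs can be stated very simply in code: cap the number of scores to keep at the number actually available. (A hypothetical helper for illustration only, not anything Canvas or Excel provides.)

```python
def sum_largest(scores, n):
    # Keep the highest n scores, or all of them if fewer than n exist,
    # avoiding the equivalent of Excel's #NUM! error
    return sum(sorted(scores, reverse=True)[:min(n, len(scores))])

print(sum_largest([8, 7, 0, 9, 10], 12))  # only 5 scores yet, sums all 5 -> 34
```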

 

ALTERNATIVELY…

Though it would not be my preference, an alternative to how the current ‘ignore lowest’ option works might be helpful - that is, to not drop any scores until the point at which there are only n scores left in the assignment group to earn (in the example above, 3). Though this is similar to waiting until the end of the semester to ‘drop’ scores (see also below), it would at least not be a sudden disappearance of points - and students could identify changes as they progress. Plus, the accuracy of the running total in the Gradebook would be maintained.

Below are some of the work-arounds that have been suggested. I do appreciate the time others have spent in offering these suggestions, but there are a number of reasons why these do not really address some core issues:

"A 'keep highest' is the same as 'ignore lowest'." As detailed above, this is not the case, not in terms of providing an accurate running total in the Gradebook. When using 'ignore lowest n scores', Gradebook totals are not accurate until the deadlines for all of the assignments in all the assignment groups have passed – usually not until the very end of the semester.

"Replace the 'dashes' in the Gradebook with 'zero' scores for all assignments at the very beginning of the semester, so that the Gradebook interprets those zero scores as 'lowest' and drops them right away." I did actually try this, but it doesn't fix the inaccurate 'running total' issue, since adding these 'soft' zero scores to all available assignments increases the denominator in the Gradebook to the total number of points available for the entire semester. For example, say the total number of points available in a semester is 1000; adding 'zero' scores for all assignments at the beginning of the semester makes the running total in the Gradebook 0/1000 points. A student then completes two assignments, earning 45/50 points and 10/10 points. Their Gradebook total now shows 55/1000 points (or a grade of 5.5%). The running total offers no 'real time' assessment of how students are doing in the course.

"Hide the Gradebook total from student view." This is what I had been doing. Unfortunately, in addition to offering no 'real time' assessment for students as to how they are doing in the course, when Gradebook totals are hidden students cannot use the Canvas 'What-If Grades' feature in their Gradebook.

"Don't apply these rules until the end of the semester; students won't mind that some points are dropped because it will make their grade go up (since the denominator is decreased at the same time)." This may be true, but it presents some problems as well: 1) retention is important in higher education, so I would not want students to think they are doing worse in the course than they actually are (and perhaps needlessly withdraw as a result); 2) in a grading system based on points accumulated (rather than a %), students would see points that they had previously counted as part of their final course grade suddenly disappear; 3) this would make grades that advisors access across the semester inaccurate.

Canvas...

PLEASE, PLEASE, PLEASE add a "keep highest n scores" rule to assignment groups. Since code for the 'ignore lowest n scores' rule is already available in the assignment group rules, might there be some way to modify that code in order to allow instructors to choose from either option? Thank you!

RobDitto
Community Champion

Those are some wonderful examples of the far-reaching effects of the current limitations,  @kimberly_smith1 . I'm hoping that this need will be re-examined as part of the newly announced Priority: Assignments 2.0.

gdefalussy
Community Novice

Since this has reached the 100 up-votes threshold, what next?

gdefalussy
Community Novice
The examples above, though long winded, describe the problem EXACTLY!
I keep stressing the point that this does not belong in the FEATURE column, but in the "Functionality of the LMS" column.  A feature is something like " I'd like to add a special font to make my Quiz more fun". The Grading process does not work correctly in CANVAS. It should be fixed without adding as a Feature, voting on it, etc. Instructure should just fix it.
kimberly_smith1
Community Participant
Author

Hello  @scottdennis ,

I wonder if you might be able to shed a bit of light on something for me - or might direct me to someone who could?

I was scrolling through the list of feature ideas that are open for voting on the Canvas Studio site. I do this from time to time to look for ideas I may want to vote on that I haven't noticed before, and to see where my feature idea requests stand (by comparison) in terms of votes.

As I understand it, the new goal post for feature idea request considerations is no longer 100+ votes, but rather that the focus is on those reaching the 'top 10% of votes'. Is this correct? I ask because I notice that the Canvas Studio site indicates there are currently 1951 ideas open for voting, but only 157 ideas (8%) are available to view when scrolling through those ideas (e.g. pages 1-8 only, with the least # of votes ending at 205). Is the new goal post actually 200+ votes, and not the 'top 10%'?

Importantly, I am concerned that those ideas with fewer votes, like mine here (obviously near and dear to my heart), are not accessible to viewers when scrolling through that site unless the viewer actually does a 'search' for the idea... which seems to me a bit prejudicial against those ideas with fewer votes. Is it not possible to allow the Canvas Studio site to display the entire list of ideas that are open for voting via just scrolling through them? 

James
Community Champion

 @kimberly_smith1 ,

It sounds like you're looking at the top 10%. To get the full list, choose "Open for vote" instead of "Top 10% By Vote".

[Image: 299015_pastedImage_3.png – the "Open for vote" filter option]

I can get all the way to page 74 before it runs out of ideas. That's 1468 ideas that are open for voting and available for viewing -- although it takes a lot of scrolling. That's not quite 1951, but the 1951 refers to the total number of ideas, not the number of ideas open for voting. The number Open + In Development + On Beta + Completed + Archived comes out really close to the 1951.

[Image: 299017_pastedImage_5.png – idea counts by status]

There are 155 items in the top 10% list; that's a little more than 147, which would be 10% of 1468. If you throw in those in development or on beta, then you're looking at 1501, of which 10% would only be 150.

If you go to the list of those open for voting, you will see that there are some ideas with more than 200 votes that are not in the top 10%. So it's not using 200 as the cutoff, and it's showing slightly more than 10% of the ideas that are open for voting.

James
Community Champion

A bit more clarification?

I am no programmer, and it may be that creating the code for this in Canvas wouldn’t work for some reason, that it is somehow too complex for the Assignment Group Rules, or for the Canvas Gradebook to manage?

I don't disagree with the idea that you should be allowed to keep the highest grades, and I think it makes a lot of sense. I just want to explain some of the math of what would be involved.

This is more complex than it seems. At least, that's what I thought when I started writing this. As I did the analysis, I found out that it's not nearly as complicated as it originally seemed. I've put that conclusion at the end, but left the analysis in place in case someone wants to know what's involved and why it isn't as hard as it looks.

Canvas does not compute the drop rules based on the percentage received for that assignment or for the points missed on the assignment. It attempts to calculate the grade without an assignment and then drops the assignments that have the biggest impact on the grade.

For example, let's say that your gradebook looked like this:

[Image: 299023_pastedImage_6.png – example gradebook with four quizzes]

Now let's look at what happens when you drop each assignment individually.

[Image: 299022_pastedImage_5.png – overall grade when each quiz is dropped in turn]

  • The first row (yellow) was the original score before any grades were dropped and the student gets 77%.
  • The last row (gray) was what happened when you dropped the quiz with the lowest percentage and the student gets 78.89%. 
  • The middle row (green) is what Canvas drops. Even though Quiz 2 had a higher percentage (70%) than Quiz 4, dropping Quiz 2 gives the student 80%, which is better than the 78.89% from dropping Quiz 4.

Determining which grade to drop is not as simple as dropping the grade with the lowest percentage.

The astute observer will notice that the student missed more points on Quiz 2 than any other quiz. They missed 9 points there while missing 2, 8, and 4 on the others. You might be inclined to think that Canvas just drops the one where the student lost the most points. That would be the incorrect logic as well.

Let's say that the student got 8/20 (40%) on Quiz 1 so that they missed 12 points. And that they got 1/10 (10%) on Quiz 4 so that they missed 9 points. They still missed 9 points on Quiz 2 and 8 points on Quiz 3.

[Image: 299024_pastedImage_7.png – revised scores with each quiz dropped in turn]

Now Canvas drops Quiz 4, even though they only missed 9 points on it, rather than Quiz 1 where the student missed 12 points. The extra points from Quiz 1 helped dilute the overall point total more than dropping Quiz 4 would, so even though you missed more points, it was out of a greater amount and the benefit to the student was better by dropping Quiz 4 instead of Quiz 1.

Okay, now you know what's going on. Canvas checks every grade individually to look for the best benefit; with four quizzes that's only four comparisons, so it doesn't take very long.

Now let's say that you want to drop the two lowest grades.

[Image: 299025_pastedImage_8.png – overall grade for each pair of dropped quizzes]

There are six possible combinations of assignments that could be dropped. Here we find that dropping Quiz 2 and Quiz 4 gets the student 83.33% and so the best combination is to keep Quiz 1 and Quiz 3.

Most people still don't see a problem. That sounds pretty painless. Six comparisons, that's only two more than it took for dropping 1 assignment.
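Those brute-force checks can be written out directly. The point values below are reconstructed from the percentages quoted above (77% overall, 78.89% dropping Quiz 4, 80% dropping Quiz 2), so treat them as assumptions rather than values read from the screenshots:

```python
from itertools import combinations

def best_drop(scores, drop_n):
    """Brute force: try every combination of drop_n assignments and
    return the set of dropped indices giving the best overall percent.
    scores is a list of (earned, possible) pairs."""
    best_dropped, best_pct = None, -1.0
    for dropped in combinations(range(len(scores)), drop_n):
        kept = [s for i, s in enumerate(scores) if i not in dropped]
        pct = sum(e for e, p in kept) / sum(p for e, p in kept)
        if pct > best_pct:
            best_dropped, best_pct = dropped, pct
    return best_dropped, best_pct

# Reconstructed gradebook: Quiz1 18/20, Quiz2 21/30, Quiz3 32/40, Quiz4 6/10
quizzes = [(18, 20), (21, 30), (32, 40), (6, 10)]
print(best_drop(quizzes, 1))  # drops Quiz 2 (index 1): 56/70 = 80%
print(best_drop(quizzes, 2))  # drops Quiz 2 and Quiz 4: 50/60 = 83.33%
```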

The numbers 4 and 6 are actually something mathematicians call combinations: C(4,1) and C(4,2). C(n,k) represents the number of ways you can pick k items out of a group of n items without repeating yourself and without caring about the order. It is based on the factorial, and C(n,k) = n! / ( k! * (n-k)! ). Luckily, most scientific calculators have a built-in function for it. C(4,1) represents the number of ways you can pick one assignment to drop out of four assignments; C(4,2) represents the number of ways you can pick two assignments to drop out of four.

Let's make it more realistic. I have 34 reading quizzes in my Finite Mathematics course.

  • If I want to drop 1 grade, there are 34 scenarios to run to find the best case. This is C(34,1) = 34 ways.
  • If I want to drop 2 grades, then there are C(34,2) = 561 ways it could play out.
  • I actually drop the 3 lowest grades, so there are C(34,3) = 5984 comparisons to make to decide which three to drop.
  • If I wanted to drop the 4 lowest grades, there would be C(34,4) = 46,376 scenarios to check.
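Those combination counts are easy to verify; Python's standard library exposes the same C(n,k) function most scientific calculators have:

```python
from math import comb  # comb(n, k) is C(n, k)

# Ways to choose k quizzes to drop out of 34
for k in range(1, 5):
    print(f"C(34,{k}) = {comb(34, k)}")
# C(34,1) = 34, C(34,2) = 561, C(34,3) = 5984, C(34,4) = 46376
```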

That number is smaller at the beginning of the semester when there are fewer assignments, but as the semester goes along, it takes longer and longer to compute those.

Those calculations need to happen for each assignment group with a rule and each student.

When displaying the gradebook, the grades to drop or keep are not delivered from the server; they are calculated inside the gradebook.

  1. The gradebook display process starts with a list of assignment groups that contain the weights and the rules.
  2. Then it gets a list of the students, which does include the current grade, final grade, current unposted grade, final unposted grade. There are actually 8 values here because it delivers it as a score (number) and grade (when there is a grading scheme). However, this does not include the scores in each category for the students.
  3. Then it fetches the submission information for the students. This happens in batches of students so that it can quickly display the results instead of making people wait. There is nothing in this information to indicate that a grade is getting dropped according to a rule.

The browser computes the grades on the fly once all the information is delivered. Then, when you enter or change a grade, it can compute the effects without having to ask the server to send all the information back.

There are other approaches. There is a note in the source code about the inefficiency of this approach and needing to try something different when there are lots of assignments dropped:

I am not going to pretend that this code is understandable.

The naive approach to dropping the lowest grades (calculate the grades for each combination of assignments and choose the set which results in the best overall score) is obviously too slow.

This approach is based on the algorithm described in "Dropping Lowest Grades" by Daniel Kane and Jonathan Kane. Please see that paper for a full explanation of the math. (http://cseweb.ucsd.edu/~dakane/droplowest.pdf)

I haven't read and digested the full paper, so I'll start by describing a simpler approach that is faster than trying every combination but occasionally returns wrong results. Then I'll comment briefly on the effectiveness of the Canvas algorithm.

You can find the worst offender and drop it, then look for the next worst offender once the first one is dropped. 

In multiple regression from statistics, we would call this backwards elimination. You eliminate one thing at a time and repeat that process as many times as needed. If I had 34 reading quizzes and wanted to drop 3, then I would need to make 34 initial comparisons. Then I would drop that assignment and make 33 comparisons to find the next worst offender. Then I would make 32 comparisons to pick the 3rd grade to drop. That means 34+33+32 = 99 scenarios to compare, which is much faster than making 5,984 comparisons.  Remember this needs to happen for every student and every assignment group that has a drop rule.

This approach uses conditional choices. The second one eliminated is picked based on the condition that the first one eliminated was the best one to eliminate. However, sometimes this approach doesn't get the best overall solution. The tradeoff is speed.
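A sketch of that backwards-elimination idea (my own illustration of the approach described here, not Canvas's code; it assumes the number to drop is less than the number of scores):

```python
def greedy_drop(scores, drop_n):
    """Backwards elimination: repeatedly drop the single assignment whose
    removal most improves the overall percentage. Fast, but each choice is
    conditional on the earlier drops, so it is not guaranteed optimal.
    scores is a list of (earned, possible) pairs; assumes drop_n < len(scores)."""
    kept, dropped = list(scores), []
    for _ in range(drop_n):
        def pct_without(i):
            rest = kept[:i] + kept[i + 1:]
            return sum(e for e, p in rest) / sum(p for e, p in rest)
        # The "worst offender" is the one whose removal yields the best grade
        worst = max(range(len(kept)), key=pct_without)
        dropped.append(kept.pop(worst))
    return kept, dropped

# Dropping 3 of 34 this way costs 34 + 33 + 32 = 99 comparisons,
# versus C(34,3) = 5984 for the exhaustive search.
```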

Let's look at keeping the best grades instead of eliminating the worst grades.

At the beginning of the semester, I don't know how many quizzes I'm going to have, but I do know that at the end, I want to keep 10 of them. For the first 10 quizzes, there is no problem, all of them are kept.

With quiz 11, I need to drop 1. If I could tell Canvas to drop 1, then it would make 11 comparisons. However, I'm telling Canvas to check which 10 I should keep. Now, if Canvas were smart about it, it would figure out which one to drop, and that would probably make most people happy. Mathematically, that's sound, since combinations are symmetric: keeping 10 out of 11 occurs the same number of ways that dropping 1 out of 11 does, C(11,10) = C(11,1). However, if it kept to the existing logic, it would go through and make C(11,10) = 11 checks.

But then there's that programming decision that there are too many items to look at, so it would go into the forward selection approach, which is kind of the opposite of backwards elimination. There are 11 ways to pick the best assignment. Then there are 10 ways to pick the second best, 9 ways to pick the third best, and so on down to 2 ways to pick the 10th best. That's 11+10+9+8+7+6+5+4+3+2 = 65 comparisons, worse than the 11 comparisons if you just drop 1.

If there were 15 quizzes, then keeping the top 10 would be C(15,10) = 3,003 combinations to try out. If you do the forward selection, you would have 15+14+13+12+11+10+9+8+7+6 = 105 ways. If you were to tell Canvas to drop the lowest 5 grades, then it would do C(15,5) = 3,003 combinations or 15+14+13+12+11 = 65 comparisons, depending on the technique used.

If I had 34 quizzes and wanted to keep 30 of them, then I would have C(34,30) = C(34,4) = 46,376 combinations, forward selection would give 34+33+32+...+5 = 585 ways. Doing backwards elimination would give 34+33+32+31+30 = 160 ways.

I would guess that most people are going to keep more than half of the assignments in a category, so backwards elimination is going to be better than the forward selection process. The problem with both is that they miss some of the comparisons. There is another approach in multiple regression called best subsets. That's the approach I was using with combinations, and it is the better way to guarantee the best solution for the student. However, it's the more costly (calculation and time) process, and people don't like to wait for their grades to appear.

When you're keeping more than half the grades, it's faster to compute the grades not to keep than it is to calculate which grades to keep. The "drop highest" rule makes sense in that same context. Both are about dropping grades, not keeping grades. As long as you're dropping less than half the grades, it's faster to compute the ones to drop.

Canvas doesn't want to do longer and slower if they can avoid it.

Thankfully, none of what I've explained so far is the way Canvas does it. They're using a mathematical algorithm described in the paper which is way more efficient.

There's a note at the end of the Dropping Lowest Grades paper that is linked to in the Canvas source. It says that even with 1000 grades and dropping 300 of them, it never took more than 5 iterations to get the optimal set. That's pretty impressive and a more realistic application of Newton's method from calculus 1 than just finding the x-intercept of a function.

The easiest way to implement this feature idea is to determine the number of grades to be dropped by subtracting the number of kept assignments from the number of eligible assignments. That is, let the number to drop be computed on a user by user basis for each assignment group rather than just using the group rules.

Currently, line 239 of the code has this statement

const submissionsToKeep = dropAssignments(relevantSubmissionData, group.rules);

To implement this feature, change group.rules to rules, where rules is an object that starts off as group.rules. If a 'keep the highest' box is not checked, then it would act as it does currently. If that box is checked, then you could compute the number to drop based off the number that are present. Alternatively, you could modify the group rules to include keep-the-highest and then just do the subtraction within the function itself.

If you accept that when faculty say "Keep the 10 highest grades" that they'll be happy with "Drop the lowest grades so that I end up with 10 grades all together," then there isn't much needed in the way of code changes.
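In other words, the change is a tiny rule rewrite before the existing drop logic runs. The rule names below (keep_highest, drop_lowest) are hypothetical stand-ins for whatever Canvas's group.rules object actually uses:

```python
def effective_rules(group_rules, eligible_count):
    """Translate a hypothetical 'keep_highest' rule into the drop count
    the existing drop-lowest machinery already understands."""
    rules = dict(group_rules)  # don't mutate the group's own rules
    if "keep_highest" in rules:
        rules["drop_lowest"] = max(0, eligible_count - rules.pop("keep_highest"))
    return rules

# Keep the highest 10: with 15 grades that means drop 5;
# with only 8 grades so far, drop nothing.
print(effective_rules({"keep_highest": 10}, 15))  # {'drop_lowest': 5}
print(effective_rules({"keep_highest": 10}, 8))   # {'drop_lowest': 0}
```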

There would still be the added complexity of adding it to the interface and Canvas wants simple without a lot of clutter.

Just because the code changes would be small doesn't mean it would get implemented. There's still the question of whether it's worth it. Is this a fringe request that would only help a few people? Does it hit the middle 60% of users? We math folks feel that pain all the time when we go to work with quizzes and they just don't have the functionality that we need. There's also the question of whether it is going to be more confusing to the masses who think there's no difference between dropping 5 and keeping 10 when there are 15 grades.

kimberly_smith1
Community Participant
Author

Hello  @James  ,

You totally rock! Thanks so much for taking the time to lay this out, and to perhaps help others see how a 'keep highest scores' option is doable. As to this being a 'fringe request', I think the idea would have more votes if more people understood that 'keep highest' and 'drop lowest' are not the same thing in the Gradebook. Your post may help others get that. Conversely, if people are not looking for it... As to whether it might confuse the masses, I think that it might, at first. But if there were an option to select 'keep highest scores' in Canvas, then those who currently see no difference would have an opportunity to explore the idea (box checked versus unchecked) and then decide which works best for them. Thank you again!