Clearly document the values for GradingSchemeEntry


It is unclear what the values in a GradingSchemeEntry are and what is the smallest step that can be used.

app/controllers/grading_standards_api_controller.rb says:

# @model GradingSchemeEntry
#     {
#       "id": "GradingSchemeEntry",
#       "description": "",
#       "properties": {
#         "name": {
#           "description": "The name for an entry value within a GradingStandard that describes the range of the value",
#           "example": "A",
#           "type": "string"
#         },
#         "value": {
#           "description": "The value for the name of the entry within a GradingStandard.  The entry represents the lower bound of the range for the entry. This range includes the value up to the next entry in the GradingStandard, or 100 if there is no upper bound. The lowest value will have a lower bound range of 0.",
#           "example": 0.9,
#           "type": "integer"
#         }
#       }
#     }

app/jsx/grading/helpers/GradeInputHelper.js says:

import Big from 'big.js'
import {
  gradeToScoreLowerBound,
  gradeToScoreUpperBound,
  indexOfGrade,
  scoreToGrade
} from '../../gradebook/GradingSchemeHelper'
import numberHelper from '../../shared/helpers/numberHelper'

const MAX_PRECISION = 15 // the maximum precision of a score persisted to the database
const PERCENTAGES = /[%％﹪٪]/

export const GradingSchemeBounds = Object.freeze({
  LOWER: 'LOWER',
  UPPER: 'UPPER'
})

function toNumber(bigValue) {
  return parseFloat(bigValue.round(MAX_PRECISION).toString(), 10)
}

function pointsFromPercentage(percentage, pointsPossible) {
  return toNumber(new Big(percentage).div(100).times(pointsPossible))
}

function percentageFromPoints(points, pointsPossible) {
  return toNumber(new Big(points).div(pointsPossible).times(100))
}
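To make the arithmetic concrete, here is a small Python sketch of what these helpers do (my own translation, not Canvas code; it assumes Big.js-style decimal rounding and ignores rounding-mode differences at exact halfway values):

```python
from decimal import Decimal

MAX_PRECISION = 15  # decimal places, mirroring the JS constant

def to_number(value):
    # Mirror of toNumber(): round to 15 decimal places, then convert back
    # to a float, so persisted scores are floats, not integers.
    return float(round(Decimal(str(value)), MAX_PRECISION))

def points_from_percentage(percentage, points_possible):
    # e.g. 90% of 30 possible points gives 27.0 points
    return to_number(Decimal(str(percentage)) / 100 * points_possible)

def percentage_from_points(points, points_possible):
    return to_number(Decimal(str(points)) / Decimal(str(points_possible)) * 100)
```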

Finally, a comment in app/jsx/gradebook/GradingSchemeHelper.js says:

  // if the two scheme values are less than 1% apart, reduce the offset to 0.1%
  // this is the minimum granularity currently supported for grading schemes

So it is clear that the model is incorrect: the value is not an integer. toNumber() clearly produces a floating-point value rounded to 15 decimal places, yet the comment says the minimum granularity currently supported for grading schemes is 0.1%. So what is the actual minimum step size?

maguire
Community Champion

I should note that I have done an experiment with more than 500 entries and it is possible to select the "grade" from the list via the gradebook.

Stef_retired
Instructure Alumni
Status changed to: Moderating
 
Renee_Carney
Community Team
Status changed to: Archived

@maguire 

Thank you so much for helping us find a bit of a bug. We don't need to continue the conversation on it at this time, so we're going to archive it. Please watch for a future fix!

maguire
Community Champion

What is the actual minimum step size?

 

erinhmcmillan
Instructure Alumni

Hi! This behavior should be resolved in the beta environment and is planned to be in production on December 2.

Thanks,

Erin

maguire
Community Champion

There is an asymmetry between storing values and reading values. The values to be stored are divided by 100 (i.e., in the code they are multiplied by 0.01), while the values returned from reading a grading standard are the stored values, not multiplied by 100.

So one cannot take what is read and put it back (with additions) as a grading standard.

Moreover, the documentation describes neither the rounding being done nor its effect:

  def data=(new_val)
    self.version = VERSION
    # round values to the nearest 0.01 (0.0001 since e.g. 78 is stored as .78)
    # and dup the data while we're at it. (new_val.dup only dups one level, the
    # elements of new_val.dup are the same objects as the elements of new_val)
    new_val = new_val.map{ |grade_name, lower_bound| [ grade_name, lower_bound.round(4) ] }
    write_attribute(:data, new_val)
    @ordered_scheme = nil
  end
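So each lower bound is rounded to four decimal places on write, which (since e.g. 78% is stored as 0.78) means bounds closer than 0.0001, i.e. 0.01 percentage points, collapse to the same stored value. A quick Python sketch of the same rounding (my own, assuming Ruby's round(4) agrees with ordinary decimal rounding away from exact halfway cases):

```python
def store_bounds(entries):
    # Mimic the Ruby data= setter: round each lower bound (a fraction,
    # e.g. 0.78 for 78%) to 4 decimal places before persisting.
    return [(name, round(bound, 4)) for name, bound in entries]

# Bounds less than 0.0001 apart collide after rounding ...
same = store_bounds([("A", 0.90003), ("A-", 0.9)])
# ... while bounds a full 0.0001 (0.01 percentage points) apart stay distinct.
distinct = store_bounds([("A", 0.9001), ("A-", 0.9)])
```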

 

jsailor
Instructure

Hi, maguire

I would like to help move this to resolution but have a few questions. 

So one cannot take what is read and put it back (with additions) as a grading standard.


Just for clarification, what is the workflow that you're describing here? I'm trying to understand whether this is a bug in behavior or a request for a documentation change to more accurately reflect what's happening. The API page for grading standards (https://canvas.instructure.com/doc/api/grading_standards.html) doesn't make it as clear as it could be that values are stored in the format 0.xxxx, and we can make that clearer, but the page also doesn't describe any process that consumes the returned value.

Changing the API to return a format of xx.xx instead of 0.xxxx might break existing functionality for many of our users, so we would need a clear indication of how the current behavior is broken for you before we investigate whether it should be changed.

If it's a documentation change, we can add some language along the lines of: "you create a grading standard with xx.xx, but the response is always 0.xxxx, and it is stored that way."

Thanks in advance for your clarifications. 

maguire
Community Champion

I am making a "grading standard" that consists of all of the teachers in a course. In this way, I can have an assignment called "Supervisor" and record which teacher is assigned to supervise a particular student. I have previously done this with custom columns in the gradebook, but there is no API to get back previously stored values and no record of who set the value and when it was set.

My problem was that, as the set of teachers might change, I cannot simply read the existing grading standard, add new values, and write it back without rescaling the values that I got when reading the existing grading standard. This was a surprise, as was the lack of documentation about it. Also, the rounding function that is now being applied in the code, and its effect on the minimum step size in a grading standard, is not documented except by a comment in the code. The previous version of the function did not do this rounding and simply checked that the assigned values were unique.

You can see an example of a program that does the above and can add new teachers at the top of the grading scale (while the initial set of teachers appears in sorted name order). See the code in insert_teachers_grading_standard.py at https://github.com/gqmaguirejr/Canvas-tools
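In outline, the round trip looks like this (a hedged Python sketch rather than my actual script; the endpoint paths follow the grading standards API documentation, the instance URL and token are placeholders, and the ×100 rescaling is the undocumented step that surprised me):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://canvas.example.com/api/v1"  # placeholder instance
TOKEN = "..."  # placeholder API access token

def rescale_for_write(scheme):
    # A GET returns each entry's value as a fraction (e.g. 0.5 for 50%),
    # but the create endpoint expects percentages, so the values must be
    # multiplied by 100 before the scheme is written back.
    return [(entry["name"], entry["value"] * 100) for entry in scheme]

def copy_standard_with_new_entry(course_id, standard_id, new_name, new_value):
    headers = {"Authorization": f"Bearer {TOKEN}"}
    url = f"{BASE_URL}/courses/{course_id}/grading_standards"
    req = urllib.request.Request(f"{url}/{standard_id}", headers=headers)
    with urllib.request.urlopen(req) as resp:
        old = json.load(resp)
    # Put the new entry at the top of the scale, then the rescaled old entries.
    entries = [(new_name, new_value)] + rescale_for_write(old["grading_scheme"])
    form = [("title", old["title"])]
    for name, value in entries:
        form.append(("grading_scheme_entry[][name]", name))
        form.append(("grading_scheme_entry[][value]", str(value)))
    body = urllib.parse.urlencode(form).encode()
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Without the rescaling step, the rewritten standard ends up with bounds 100 times too small.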

Of course, you could add functionality to the Notes and custom column values for storing and retrieving entries and annotating them with who wrote them and when. But lacking this functionality, the best I could do was to "adapt" the idea of a grading standard.


maguire
Community Champion

To show more of the workflow:

I have now added a program that you can run to put the students into sections based on the Examiner that is set via the pull-down menu.

You simply run:

./add_students_to_examiners_section_in_course.py 22156

The program is at https://github.com/gqmaguirejr/Canvas-tools 

 

jsailor
Instructure
Instructure

Thank you for the additional context and examples. I'll review these with our engineers and get back to you. 

maguire
Community Champion

If you want more information about the use case with examples, see https://canvas.kth.se/courses/11/pages/using-canvas-in-conjunction-with-degree-projects-part-1 

Hopefully, this will inspire your engineers.