Hi,
I have an LTI 1.3 external tool that I've created and will be using to pass back grades from my app to Canvas. Are there any rate-limiting issues I should be concerned about? For example, it would be more helpful to pass grades back after students submit each question, to keep things completely synced, but this would require a much greater number of POST requests than if I just did a single grade passback upon assignment completion.
Sending the information as the students complete a question is not likely to cause a problem. It also helps the instructor see, from within Canvas, that the student is working on the assignment and lets the student know that they are not finished. It does, temporarily, lower their grade.
Of course, you can send the information once as the assignment is completed, but you'll still be rate limited. This might actually be worse. There is a possibility that a student might never finish an assignment, so you would need to send a grade message after the due date, regardless of their completion status. In theory, having the due date pass could trigger the LTI to send a bunch of results at once (in practice, most students complete the assignments before time runs out).
In a large class, passing grades back for each student as they answer each question is less likely to hit the rate limiting than trying to send the overall grade but all at once. I know that sounds counter-intuitive, but sending them as they happen means that they are spaced out.
The rate limiting on Canvas is intended to prevent abuse, and one way to trigger it is to send a bunch of simultaneous requests from the same account. This isn't specific to grade passback, but there are some requests where I can trigger the rate limiting by sending 20 requests at the same time, yet if I space them out, even 50 ms apart, I don't run afoul of the issue. The total number of requests matters as well, so if I'm sending hundreds of requests, I might need to back off a bit more, but it depends on the complexity of the request.
It's not the number of posts that affects the rate limiting; it's how those posts arrive. Sending too many in too short a period of time is problematic. The remaining limit refreshes itself, and if the requests are spread out enough, you won't run into issues.
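The advice above boils down to pacing: send each grade-passback request as it happens, with a minimum gap between sends, rather than bursting them all at once. A minimal sketch of that idea is below; `send_fn` is a stand-in for whatever performs the actual AGS score POST in your tool, and the 50 ms interval echoes the spacing mentioned above rather than any documented Canvas constant.

```python
import time

def send_spaced(payloads, send_fn, min_interval=0.05):
    """Call send_fn on each payload, keeping sends at least
    min_interval seconds apart so they never arrive as a burst."""
    results = []
    last_sent = 0.0  # monotonic timestamp of the previous send
    for payload in payloads:
        wait = min_interval - (time.monotonic() - last_sent)
        if wait > 0:
            time.sleep(wait)
        results.append(send_fn(payload))
        last_sent = time.monotonic()
    return results

# Stand-in for a real HTTP call: just collect the payloads.
sent = []
send_spaced([{"userId": "1", "score": 0.8},
             {"userId": "2", "score": 0.6}], sent.append)
```

Because each student answers questions at their own pace, per-question passback tends to produce this spacing naturally; the explicit throttle only matters when your tool flushes a backlog (for example, after a due date).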
What is the current value of the High Water Mark? Is it still 700, as when it was implemented back in 2017?
Also, is it possible for a client to buy a higher HWM?
It is still 700 (at least it was 2 days ago).
As far as purchasing more, you would need to reach out to your CSM to inquire.
Before doing that (it makes the system slower for others), I would seek alternative methods of getting the information. For example, we cache much of the information we need in a local database so we don't have to fetch it. Then I set the process to run in the middle of the night, where it can run more spread out and be ready for us to use the next day. We also use Canvas Data and Canvas Live Events to gather information.
Adding to what James wrote, all of the details for the rate limiter can be found in the API documentation. There is a response header called 'X-Rate-Limit-Remaining' that you can pay attention to; it will let you know if you are going faster than you should. As the remaining limit approaches zero, you should slow your requests.
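One way to act on that header is to compute a small delay before the next request whenever the remaining quota drops below some fraction of the bucket. This is only a sketch: the header name and the 700 high-water mark come from the thread above, while the threshold, floor, and linear scaling are arbitrary choices, not anything Canvas prescribes.

```python
HIGH_WATER_MARK = 700.0  # bucket size per the thread above

def backoff_delay(headers, floor=0.05, threshold=0.5):
    """Return seconds to sleep before the next request, based on the
    X-Rate-Limit-Remaining response header. No delay while more than
    `threshold` of the bucket remains; below that, the delay grows
    linearly as the remaining quota shrinks."""
    remaining = float(headers.get("X-Rate-Limit-Remaining", HIGH_WATER_MARK))
    fraction = remaining / HIGH_WATER_MARK
    if fraction >= threshold:
        return 0.0
    return floor + (threshold - fraction) * 2.0
```

In practice you would read the header off each API response (e.g. `resp.headers` with `requests`) and `time.sleep(backoff_delay(resp.headers))` before the next call, so a tool that is well under the limit pays no penalty at all.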
Hey, I just have a question: how do you become a Community Champion? I just need to know where, if it says. Have a great day!
Hi @KingNickName000,
You can review the What are the Community ranks and roles? - Instructure Community article for info on the different community levels. Your rank will level up over time as you visit and participate meaningfully (answer questions, provide input, etc... it's not just post count). Is there something in particular you can't access or do right now as a new user that you'd like to?
-Chris