Batching Cell History Requests
Hi, I'm a newer software dev (about a year of working in Python, and roughly two months with the Smartsheet API). One of the things I've attempted to do in our environment is build some reporting based on cell history, which involves getting the cell history object for an entire column by iterating through a for loop and making the requests one by one. Obviously I hit rate limits pretty quickly when dealing with 200-300 rows, so I was wondering if anyone has had more success working around this limitation than just "waiting the rate limit out". Can you batch requests, alternate between API keys, etc.? Thank you!
Answers
I use simple math. For instance, 300 tokens per minute / 60 = 5 tokens per second, but a cell history call costs 10 tokens, so you can make about 30 calls per minute, i.e. one every 2 seconds. Depending on your latency (the delta between the time you make a request and the time you get a response), compute wait_time = 2 seconds - latency, then call time.sleep(wait_time). That way you use your full token budget each minute without going over the quota.
That means a 300-row column will take around 10 minutes to complete.
And yes, this limitation is per key.
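A minimal sketch of that pacing logic, assuming the smartsheet Python SDK's Cells.get_cell_history call and hypothetical ACCESS_TOKEN, SHEET_ID, and COLUMN_ID values; the 300 tokens/minute and 10 tokens/call figures come from the math above and may differ for your account, so adjust the interval to match your own limits.

```python
import time
import smartsheet

# Hypothetical values -- replace with your own.
ACCESS_TOKEN = "YOUR_API_TOKEN"
SHEET_ID = 1234567890123456
COLUMN_ID = 9876543210987654

# Rate budget (assumed): ~300 tokens/minute, cell history costs ~10 tokens
# per call, so target roughly 30 calls per minute (one every 2 seconds).
CALL_INTERVAL = 60 / 30  # seconds per request

client = smartsheet.Smartsheet(ACCESS_TOKEN)
sheet = client.Sheets.get_sheet(SHEET_ID)

histories = {}
for row in sheet.rows:
    start = time.monotonic()

    # Fetch the full cell history for this row/column; check the method
    # name and signature against your SDK version.
    result = client.Cells.get_cell_history(
        SHEET_ID, row.id, COLUMN_ID, include_all=True
    )
    histories[row.id] = result.data

    # Subtract the request latency from the wait so each loop iteration
    # takes about CALL_INTERVAL seconds in total.
    elapsed = time.monotonic() - start
    wait_time = max(0, CALL_INTERVAL - elapsed)
    time.sleep(wait_time)
```

Because the sleep is reduced by the measured latency, slow responses don't compound the total runtime beyond the roughly 10 minutes estimated above for 300 rows.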