Batching Cell History Requests
Hi, I'm a newer software dev (about a year of working in Python, and roughly two months with the Smartsheet API). One of the things I've attempted to do in our environment is build some reporting based on cell history. This involves getting the cell history object for an entire column by iterating through a for loop and making the requests one by one. Obviously I hit rate limits pretty quickly when I'm dealing with 200-300 rows, so I was wondering if anyone has had more success working around this limitation than just waiting the rate limit out. Can you batch requests, alternate API keys, etc.? Thank you!
Answers
I use simple math. The limit is 300 tokens per minute (300 / 60 = 5 tokens per second), but a cell history call costs 10 tokens each, so you can make 30 calls per minute, i.e. one call every 2 seconds. Account for your latency (the delta between the time you make a request and the time you get the response): wait_time = 2 seconds minus latency, then call time.sleep(wait_time). That way you use the full per-minute token budget without going over the quota.

That means a 300-row column will take around 10 minutes to complete.

And yes, this limitation is per key.
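The pacing described above can be sketched as a simple throttled loop. This is a minimal illustration, not an official Smartsheet pattern: the token budget and per-call cost are taken from the answer, and `get_history` is a hypothetical caller-supplied function (e.g. a thin wrapper around the SDK's cell history call).

```python
import time

# Assumed budget from the answer above: 300 tokens/minute, and each
# cell-history request costs 10 tokens -> 30 calls/minute max.
TOKENS_PER_MINUTE = 300
HISTORY_COST = 10
INTERVAL = 60.0 / (TOKENS_PER_MINUTE / HISTORY_COST)  # 2.0 seconds per call

def fetch_column_history(get_history, row_ids, column_id):
    """Fetch cell history for one column across many rows, sleeping
    between calls so the token quota is never exceeded.

    `get_history(row_id, column_id)` is a hypothetical caller-supplied
    function that performs one cell-history request.
    """
    results = {}
    for row_id in row_ids:
        start = time.monotonic()
        results[row_id] = get_history(row_id, column_id)
        # Subtract the request's own latency from the wait, as the
        # answer suggests, so the full per-minute budget is used.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, INTERVAL - elapsed))
    return results
```

At 2 seconds per call, 300 rows works out to the roughly 10 minutes mentioned above; the `max(0.0, ...)` guard just keeps a slow response from producing a negative sleep.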