Batching Cell History Requests

Hi, I'm a newer software dev (only about a year or so of working in Python, and roughly two months with the Smartsheet API). One of the things I've attempted to do in our environment is build some reporting based on cell history, which involves getting the cell history object for an entire column by looping over the rows and making the requests one by one. Obviously I hit rate limits pretty quickly when I'm dealing with 200-300 rows, so I was wondering if anyone has had any experience working around this limitation with more success than just "waiting the rate limit out". Can you batch requests, rotate between API keys, etc.? Thank you!

Answers

  • Alex Argumedo
    Alex Argumedo ✭✭✭✭✭

    I use simple math: the budget is 300 tokens per minute, and a cell history request costs 10 tokens each, so you can make at most 30 calls per minute, i.e. one call every 2 seconds. Subtract your latency (the delta between the time you make a request and the time you get the response) from those 2 seconds to get wait_time, then call time.sleep(wait_time). That way you use the full token budget each minute without going over the quota.

    That means a 300-row column will take around 10 minutes to complete.

    And yes, this limitation is per key.
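    A minimal sketch of that pacing logic in Python. The budget numbers come from the answer above (300 tokens/minute, 10 tokens per history request); `throttled` is a hypothetical helper name, and how you actually fetch each row's history is up to your SDK calls:

    ```python
    import time

    TOKENS_PER_MINUTE = 300   # per-key budget quoted above
    HISTORY_COST = 10         # tokens charged per cell-history request

    # 300 / 10 = 30 calls per minute -> one call every 2 seconds.
    CALLS_PER_MINUTE = TOKENS_PER_MINUTE // HISTORY_COST
    INTERVAL = 60.0 / CALLS_PER_MINUTE

    def throttled(fetch, items, interval=INTERVAL):
        """Call fetch(item) for each item, spacing calls `interval`
        seconds apart. Each call's own latency is credited against the
        wait, so the whole minute's token budget gets used without
        exceeding the quota."""
        results = []
        for item in items:
            start = time.monotonic()
            results.append(fetch(item))
            elapsed = time.monotonic() - start
            # Sleep only for the part of the interval the request
            # itself didn't already consume.
            time.sleep(max(0.0, interval - elapsed))
        return results
    ```

    With the Smartsheet Python SDK, `fetch` would presumably wrap something like `client.Cells.get_cell_history(sheet_id, row_id, column_id)` and `items` would be your row IDs, but check the SDK docs for the exact call signature.
    
    
    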