Hello lovely community, I need some help figuring out the best mechanism for clearing a sheet on a schedule. It really should be easier than this!
I have tried the following methods and outlined their limitations and issues below. I would LOVE to get some suggestions for other methods!
Problem: I need to fully clear (delete ~8000 rows of) a sheet full of resource data, once a month on a specific date. I need to delete the rows to make room for the next 8000 to be loaded, not just clear the cells. Additionally, I need to clear another ~8000-row sheet, deleting only the rows that meet certain criteria.
Solutions tried:
1) Move Rows automation + Data Shuttle: on a schedule, a Move Rows automation shifts all rows (or filtered rows) to a separate "clearinghouse" sheet. Data Shuttle then clears the clearinghouse sheet daily by importing a blank Google Sheet and deleting all non-matching rows. This works pretty well and is quick, and it's my current solution. Drawbacks: the automation fails if the clearinghouse sheet hasn't been cleared yet, and it also can't be triggered manually with Run Now...not sure if it's a size thing, but it fails. Two sheets, automations, and Data Shuttle make for a relatively complicated setup that breaks at various points if everything isn't perfect.
2) Just Data Shuttle: Data Shuttle works well to clear an entire sheet on a schedule, by importing a blank Google Sheet and selecting Delete rows that don't match. Drawbacks: this didn't work for me because Data Shuttle scheduling tops out at weekly, and I need this to run only once a month...that isn't an option.
3) Data Mesh: cannot delete rows, and I need to delete them, not just clear them
4) Bridge: I tried to set up Bridge to do this but wasn't able to get it to work well. The main steps were: 1. Read 300 rows of the sheet. 2. Extract the row IDs for those 300 rows into an array. 3. Use the Delete Rows Smartsheet integration step, passing the array of 300 IDs as the row ID input. And this works! (Around 300 is the "limit" for the number of IDs the URL will accept.) Drawbacks: I couldn't figure out a good way to "loop back" and do the next 300 rows. When I tried to connect the bottom of the workflow back to the top, using a conditional junction to decide when the workflow should stop, it simply wouldn't work...it never looped back. I was able to use Child Workflow to make the workflow call itself again, but that produced a pile of nested loops and took a long time to run, since 8000 / 300 meant the "child" had to be called 26+ times before the whole thing finished.
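For reference, here is the batch-delete idea from the Bridge attempt expressed as a rough, untested script using the Smartsheet Python SDK, in case it helps clarify what I'm trying to accomplish. The sheet ID, the 300-row batch size, and the SMARTSHEET_ACCESS_TOKEN environment variable are all placeholders/assumptions, not values from my actual setup:

```python
# Rough sketch (untested) of the same read-IDs-then-delete-in-batches idea,
# using the smartsheet-python-sdk instead of Bridge.
import os
import smartsheet

SHEET_ID = 1234567890123456  # placeholder: the sheet to clear
BATCH_SIZE = 300             # roughly the per-request row ID limit I hit in Bridge

# Assumes SMARTSHEET_ACCESS_TOKEN is set in the environment
client = smartsheet.Smartsheet(os.environ["SMARTSHEET_ACCESS_TOKEN"])
client.errors_as_exceptions(True)

# Pull all row IDs up front (no cell data is needed just to delete rows)
sheet = client.Sheets.get_sheet(SHEET_ID, page_size=10000)
row_ids = [row.id for row in sheet.rows]

# Delete in batches so each request stays under the row ID limit
for start in range(0, len(row_ids), BATCH_SIZE):
    batch = row_ids[start:start + BATCH_SIZE]
    client.Sheets.delete_rows(SHEET_ID, batch)
    print(f"Deleted rows {start + 1}-{start + len(batch)} of {len(row_ids)}")
```

If a script like this is the sanest route, I'd also love to hear how others schedule something like it to run just once a month.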