Updating Sheets with more than 5000 rows

marc4 ✭✭✭✭

We pull data from external systems and load it into Smartsheet as reference data that can be used in VLOOKUPs, Data Mesh, etc., so that people do not have to enter data manually.

We use Data Shuttle to REPLACE the data twice a week with the contents of updated CSV files.

Many of these sheets have 10,000 - 17,000 rows.

The scheduled updates fail. Smartsheet has told us that Data Shuttle does not support sheets with more than 5,000 rows.

We can't split these sheets into smaller sheets, and Data Tables don't work since you can only update data, not REPLACE it.

Any ideas?

Thanks.

/marc

Answers

  • Paul H ✭✭✭✭✭✭

    Is there a unique identifier? I've never tried this, but the first thought that came to mind was using multiple Data Shuttle workflows on the same sheet, run on staggered schedules to space them out.

    The first workflow uses Replace to wipe the sheet, with a filter to keep the row count down.

    The second uses merge + add rows, with the opposite filter to the first, to add the remaining rows.

    A third one if needed.

    If that works, I might have just solved some of my own issues...

  • marc4 ✭✭✭✭

    It is the deletes that fail, not the adds.

    We tried a workflow that did a Replace with a CSV file containing only one line. That fails too.

    Smartsheet says that once you have more than 5,000 rows, workflows with REPLACE are not supported.

    /marc

  • marc4 ✭✭✭✭

    BTW, I do actually have a workaround, but I'm not happy with it.

    I have a script that uses the API to fetch the sheet data, including the array of all the row IDs. It then walks through the list in batches of 300 and deletes the rows. A cron job runs this 30 minutes before the Data Shuttle workflow; it only takes a couple of minutes to delete all the rows.

    Once I'm sure it works reliably for a few weeks, I'll move it to just 5 or 10 minutes before the workflow.

    I'd prefer not to have to rely on an external service, though.

    /marc

  • Paul H ✭✭✭✭✭✭

    @marc4 It's not officially supported, but you can go higher. I have one at almost 10k rows that fails about one time in ten.

    Good you found a work around for now
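The batch-delete workaround marc4 describes can be sketched against the documented Smartsheet REST API (GET a sheet to collect row IDs, then DELETE /sheets/{sheetId}/rows with a comma-separated `ids` query parameter). This is only a sketch under assumptions: the function names, the environment-style token and sheet ID placeholders, and the batch size of 300 mirror the description above rather than any published script.

```python
# Sketch of the pre-Data Shuttle cleanup: fetch every row ID from a sheet,
# then delete the rows in batches via the Smartsheet REST API.
# TOKEN and SHEET_ID are placeholders you would supply yourself.
import json
import urllib.request

API_BASE = "https://api.smartsheet.com/2.0"


def chunked(items, size):
    """Yield successive batches of at most `size` items from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def delete_all_rows(token, sheet_id, batch_size=300):
    """Delete every row in the sheet, batch_size rows per DELETE call."""
    headers = {"Authorization": f"Bearer {token}"}

    # Fetch the sheet once to collect all row IDs.
    req = urllib.request.Request(f"{API_BASE}/sheets/{sheet_id}",
                                 headers=headers)
    with urllib.request.urlopen(req) as resp:
        sheet = json.load(resp)
    row_ids = [row["id"] for row in sheet.get("rows", [])]

    # Row IDs go in the query string, so large sheets must be batched.
    for batch in chunked(row_ids, batch_size):
        ids = ",".join(str(rid) for rid in batch)
        req = urllib.request.Request(
            f"{API_BASE}/sheets/{sheet_id}/rows?ids={ids}",
            headers=headers,
            method="DELETE",
        )
        urllib.request.urlopen(req).close()


if __name__ == "__main__":
    delete_all_rows("TOKEN", "SHEET_ID")
```

Scheduled via cron shortly before the Data Shuttle run, this leaves an empty sheet so the workflow only has to add rows, sidestepping the failing deletes.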