Best Practice to read a large set of objects

Brian_Richardson Overachievers
edited 09/11/23 in Add Ons and Integrations

I'm looking for the best practice method to read a large-ish data set from an API using Bridge.

Specifically, I need to read 9000 rows from a Resource Management report and post that data to a Smartsheet sheet.

But I also have this need coming up with other systems such as Gitlab and Workday.

When I use a parent-child workflow in Bridge (the parent fetches the data, and a child workflow processes each object), Bridge stops responding and errors out after roughly 3,000 objects. It appears to process the array serially, triggering the child workflow for one object at a time, so even 3,000 rows take hours to process and the run eventually times out partway through the data.
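For anyone hitting the same wall: if you can run the transfer outside Bridge, one common pattern is to batch the rows into bulk add-row calls against the Smartsheet REST API instead of making one call (or one child workflow) per object. A minimal Python sketch of the batching side, where `SHEET_ID`, `TOKEN`, and the column ID are placeholders you'd replace with your own values:

```python
from typing import Any, Iterator


def chunked(rows: list[Any], size: int) -> Iterator[list[Any]]:
    """Yield successive batches of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


# Hypothetical payload: 9,000 report rows already fetched from the source API,
# shaped as Smartsheet row objects (columnId 123 is a placeholder).
report_rows = [{"cells": [{"columnId": 123, "value": n}]} for n in range(9000)]

# 9,000 rows in batches of 500 -> 18 API calls instead of 9,000 invocations.
batches = list(chunked(report_rows, 500))

# Posting each batch (needs a real access token and sheet ID, so shown
# commented out):
# import requests
# for batch in batches:
#     resp = requests.post(
#         f"https://api.smartsheet.com/2.0/sheets/{SHEET_ID}/rows",
#         headers={"Authorization": f"Bearer {TOKEN}"},
#         json=batch,
#     )
#     resp.raise_for_status()
```

The batch size of 500 is just a conservative choice to keep individual request payloads small; check the API's current limits before relying on a specific number.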

Is there a better way?

I'm guessing the answer is... Bridge isn't set up to process large sets. That's fine if that's the answer, but I'd love to know if anyone has solved this.

(I'm aware of Data Shuttle, but that's only helpful if the source system can automate exports, which Resource Management cannot.)

BRIAN RICHARDSON | PMO TOOLS AND RESOURCES | HE|HIM

SEATTLE WA, USA

IRON MOUNTAIN

Answers

  • Grace F. Employee Admin

    Hi Brian,

    You're right that Bridge isn't currently designed to process large datasets, though the workaround you've described is how customers usually enable this functionality at a smaller scale. I'd recommend adding this as feedback for the Bridge product using the feedback form linked here in the Community.

    In terms of the processing timing out, I'd like to encourage you to reach out to our Support team so we could investigate further and determine the cause of the timeout. It's possible that we could help optimize what you're attempting to do or confirm if something isn't set up correctly. You can create a support request here: https://help.smartsheet.com/contact

  • Brian_Richardson Overachievers

    Thanks Grace. Actually, I think the answer should be to enable Bridge integration with DataTables so it can process and store larger sets, and to let Bridge run child workflows in parallel so it can handle the load. I'll talk to Support about the existing timeout issue.

    Is there a common upper bound on the number of objects in an array that Bridge can typically handle when that array is used to spawn child workflows that store the objects in a sheet?
