Troubleshooting DataMesh Configs and Managing Duplicates in the Source Sheet

I'm running two automated DataMesh jobs between a parent Smartsheet and two child Smartsheets with the field "Duplicates in the Source Sheet" set to "Pick 1st Match". For my first 400 records, DataMesh finds the matching row and overwrites it, which is the intended behavior. For newer, recently created rows, however, my child Smartsheets are essentially being cloned every hour on the hour. How do I troubleshoot this?

I've already tried deleting the one offending row in the parent worksheet and recreating the record, but it seems to continue happening.

Best Answer

  • Marco P
    edited 08/01/22 Answer ✓

    It was 05571925.

    The issue has since been resolved. After troubleshooting it several times, we eventually discovered the root cause: the ID column in the target sheet was set to an "auto-generated" column type, which prevented the DataMesh job from maintaining the ID sequence from the source sheet. Instead, a new auto-generated ID was assigned each time a non-synced record was created, and that record would then be duplicated every time the DataMesh job ran.

    Upon changing the ID column in the target sheet to a "text/number" column, the issue was resolved and no more duplicate rows were created.

    To summarize: only the parent sheet should ever have an auto-generated ID column. All child sheets receiving information from that column should use a text/number column, not an auto-generated one. If the child's column is auto-generated, then the moment there is a mismatch (usually due to a deletion at the parent sheet level), a new record is generated on the child sheet, because the sheet logic counts +1 for every mismatched row.
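
    To illustrate the failure mode described above, here is a minimal Python sketch of how "Pick 1st Match" matching interacts with an auto-generated target ID column. This is a simulation only, not the Smartsheet API: the `sync` function and the row structure are hypothetical, invented for illustration.

    ```python
    # Hypothetical simulation of the duplicate-row failure mode described above.
    # This is NOT the Smartsheet API; function and field names are invented.

    def sync(source, target, target_id_auto_generated):
        """One DataMesh-style run with "Pick 1st Match" duplicate handling.

        Rows are dicts with an 'id' and a 'value'. If the target's ID column
        is auto-generated, a new row receives a fresh sequential ID instead
        of the source row's ID, so it can never match on a later run.
        """
        next_auto_id = max((row["id"] for row in target), default=0) + 1
        for row in source:
            # "Pick 1st Match": overwrite the first target row with the same ID.
            match = next((t for t in target if t["id"] == row["id"]), None)
            if match is not None:
                match["value"] = row["value"]
            elif target_id_auto_generated:
                # The auto-generated column ignores the incoming ID ("count +1"),
                # so this row is orphaned and gets re-added on every run.
                target.append({"id": next_auto_id, "value": row["value"]})
                next_auto_id += 1
            else:
                target.append({"id": row["id"], "value": row["value"]})

    source = [{"id": 461, "value": "new record"}]

    # Text/Number ID column: the row is added once, then matched and overwritten.
    good = []
    sync(source, good, target_id_auto_generated=False)
    sync(source, good, target_id_auto_generated=False)   # good still has 1 row

    # Auto-generated ID column: a fresh duplicate appears on every run.
    bad = []
    sync(source, bad, target_id_auto_generated=True)
    sync(source, bad, target_id_auto_generated=True)     # bad now has 2 rows
    ```

    Running the simulated job twice against a text/number ID column leaves one row; running it twice against an auto-generated ID column leaves two, and each further run adds another, matching the hourly cloning behavior described in the question.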

Answers

  • Marco P
    edited 07/13/22

    Support says this is a known bug and automatically closed out my support ticket. Meanwhile, my DataMesh destination table is being flooded with duplicate entries that I have to delete manually every day before running metrics. Any advice on how to fix this would be appreciated. It looks like a mix of issues between how the DataMesh tool's overwrite behavior works and how auto-generated IDs are created in destination tables. Note: the other 460 records are just fine; it's anything written after 460 that seems to have this issue.

  • Genevieve P.
    Employee Admin

    Hi @Marco P

    Would you be able to provide me with the support ticket number? I'd like to look into this further and see if I can clarify anything for you or replicate the issue.

    Thank you!

    Genevieve

  • Genevieve P.
    Employee Admin

    Hi @Marco P

    Thank you so much for explaining the solution! That makes sense now that you've outlined it, and I'm sure that this will help other Community members down the line.

    I appreciate your detailed response.

    Cheers,

    Genevieve