General Ledger > Transaction Distribution Syncing

Does anyone have any techniques they would like to share pertaining to how they synchronize their Transaction Distribution data with an in-house database to make sure that it stays as up to date as is reasonable?

Comments

  • Alex Wong

    @Daniel Maxwell
    My sync for transaction distribution uses an iterative approach, since syncing 4M+ records in full on any regular schedule is too time-consuming.

    I run my sync every 4 hours on business days, going back 1 or 2 days using the last_modified option (2 days on Mondays, 1 day otherwise). For the sync, I INSERT into a staging table with the same table definition, then use MERGE to update the main transaction table from the staging table. To handle transactions deleted in FE, I query the total transaction count from the API (using limit=1) and compare it to COUNT(*) on the main table. If they differ, an iterative loop steps backward until the counts match, pulls the transactions for the mismatched date range into the staging table, then DELETEs that date range from the main table and INSERTs from staging to main.
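The staging-then-MERGE step could be sketched roughly like this. This is a minimal sketch with made-up table and column names (record_id, post_date, amount), using SQLite's upsert syntax as a stand-in for a real MERGE statement; the actual setup in the post runs against a production database, and fetching the records from the API is assumed to happen elsewhere:

```python
import sqlite3

def upsert_via_staging(conn, records):
    """Load freshly pulled distribution rows into a staging table, then
    merge them into the main table (SQLite upsert standing in for MERGE)."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS main_dist "
                "(record_id INTEGER PRIMARY KEY, post_date TEXT, amount REAL)")
    cur.execute("CREATE TABLE IF NOT EXISTS staging_dist "
                "(record_id INTEGER PRIMARY KEY, post_date TEXT, amount REAL)")
    cur.execute("DELETE FROM staging_dist")  # truncate staging before each run
    cur.executemany("INSERT INTO staging_dist VALUES (?, ?, ?)", records)
    # MERGE equivalent: insert new rows, update rows that already exist.
    # (The "WHERE true" is SQLite's required disambiguation for upsert-from-SELECT.)
    cur.execute("""
        INSERT INTO main_dist (record_id, post_date, amount)
        SELECT record_id, post_date, amount FROM staging_dist
        WHERE true
        ON CONFLICT(record_id) DO UPDATE SET
            post_date = excluded.post_date,
            amount = excluded.amount
    """)
    conn.commit()
```

Because the staging table is truncated each run, only the rows modified in the last 1–2 days need to be pulled and merged, which is what keeps a 4M+ row table syncable every few hours.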

  • Thanks @Alex Wong. It looks like I'm using a similar approach for at least part of what I'm doing. We have 10M+ records, as we just migrated our system with many years of data.

    Is your staging table just the current transactions that you've just pulled in? I'm using a staging table too: I TRUNCATE it, fill it with the transactions I just pulled from the API, and then use a MERGE query to populate the main table.

    I had not considered deleted records. If you have a moment, could you clarify how you are comparing the counted records? Is the staging table holding everything as well? I guess I'm also curious how the iterative loop is structured.

    Thanks again!

  • Alex Wong

    @Daniel Maxwell

    Is your staging table just the current transactions that you've just pulled in? I'm using a staging table too: I TRUNCATE it, fill it with the transactions I just pulled from the API, and then use a MERGE query to populate the main table.

    Yes

    If you have a moment, could you clarify how you are comparing the counted records? Is the staging table holding everything as well? I guess I'm also curious how the iterative loop is structured.

    • transaction list API call with no parameter other than limit=1 (reduces bandwidth; only the total count is needed)
    • SELECT COUNT(*) on the main transaction table
    • if the count from the transaction list does not equal the count from the transaction table:
      • call a subroutine (in Power Automate, that's a child flow) to check the transaction mismatch and return the date when the mismatch started
      • using the date returned by the subroutine, retrieve the transaction list with the from_date parameter and save it into the staging table. Once all transactions have been processed into staging, DELETE from the main transaction table from that post_date onward, then INSERT from staging to main
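The delete-then-reinsert repair in the last step might look like this. Again a sketch with hypothetical SQLite tables and column names (dates stored as ISO strings so they compare correctly); the re-pulled rows are assumed to have been fetched from the API already:

```python
import sqlite3

def repair_from(conn, from_date, fresh_rows):
    """Repair a detected mismatch: load the re-pulled rows into staging,
    DELETE everything in the main table from the mismatch date onward,
    then INSERT the staging rows back. Records deleted upstream simply
    never come back, which removes them locally too."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS main_dist "
                "(record_id INTEGER PRIMARY KEY, post_date TEXT, amount REAL)")
    cur.execute("CREATE TABLE IF NOT EXISTS staging_dist "
                "(record_id INTEGER PRIMARY KEY, post_date TEXT, amount REAL)")
    cur.execute("DELETE FROM staging_dist")  # truncate staging first
    cur.executemany("INSERT INTO staging_dist VALUES (?, ?, ?)", fresh_rows)
    cur.execute("DELETE FROM main_dist WHERE post_date >= ?", (from_date,))
    cur.execute("INSERT INTO main_dist SELECT * FROM staging_dist")
    conn.commit()
```

The point of the delete-then-insert (rather than another MERGE) is that a MERGE can only update or add rows; only wiping the date range and refilling it can drop rows that no longer exist upstream.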

    For the subroutine that checks the transaction mismatch: it basically loops, reducing from_date by 30 days each pass, and checks the transaction list API (limit=1) against COUNT(*) on the main transaction table until they match.
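One way the backward walk could be structured (the exact comparison is a little underspecified in the post, and the original is a Power Automate child flow, not Python): compare the remote count for the window [from_date, today] against the local count for the same window, stepping from_date back 30 days at a time; the first window where the counts diverge bounds the date the mismatch started, so that from_date is a safe point to re-pull from. The two count callables are stand-ins for the limit=1 API call and a local COUNT(*) query:

```python
from datetime import date, timedelta

def find_mismatch_start(api_count_from, local_count_from, today,
                        step_days=30, max_steps=1200):
    """Walk from_date backward in 30-day steps. api_count_from(d) is the
    remote record count for dates >= d (the limit=1 trick);
    local_count_from(d) is COUNT(*) for the same window. The first window
    where the two counts diverge contains the mismatch."""
    from_date = today
    for _ in range(max_steps):
        from_date -= timedelta(days=step_days)
        if api_count_from(from_date) != local_count_from(from_date):
            return from_date  # re-pull everything from this date onward
    return None  # no divergence found within the horizon
```

Comparing only counts (never row contents) keeps each probe to a single cheap API call, which matters when the loop may have to walk back months.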

  • @Alex Wong Ah ha, thanks, the clarification helped things click. I re-read your first answer and, coupled with this one, it all makes sense now. I will definitely have to incorporate similar functionality.