PowerBI Gateway - RENXT

Happy New Year all! I wanted to check whether the only way to get regular refreshes between RENXT and Power BI is still via the On-premises data gateway? Having a server running this service comes at a cost, so we are trying to find the most cost-effective solution available.

Comments

  • Alex Wong

    @Mark Palfrey
    I don't use the Power BI connector for my 30+ Power BI dashboards/reports, so no gateway is needed.

    I use an Azure SQL server as a data warehouse. Every 4 hours (i.e. 5am, 9am, etc.), I have 20+ Power Automate flows that do either a full sync (e.g. fund, appeal, etc.) or an iterative sync (e.g. constituent, gift, FE transaction distributions, etc.) of all the data needed. All my Power BI reports connect to the Azure SQL data warehouse to get the data they need; the Power BI refresh is also on an every-4-hours schedule, 1 hour after Power Automate syncs the data (i.e. 6am, 10am, etc.).
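    A minimal sketch of that iterative-sync pattern in Python, assuming a SKY API list endpoint that accepts a modified-since filter (the `date_modified` parameter name, the `dw.constituent` table, and its columns are illustrative; check the SKY API reference for the exact query parameters):

    ```python
    import requests
    from datetime import datetime

    SKY_BASE = "https://api.sky.blackbaud.com"
    HEADERS = {
        "Bb-Api-Subscription-Key": "<subscription-key>",  # placeholder
        "Authorization": "Bearer <access-token>",         # placeholder
    }

    def iterative_constituent_sync(conn, last_sync: datetime):
        """Pull constituents modified since last_sync and upsert them."""
        # conn is a pyodbc connection to the Azure SQL warehouse.
        url = f"{SKY_BASE}/constituent/v1/constituents"
        params = {"date_modified": last_sync.isoformat()}  # assumed filter name
        while url:
            resp = requests.get(url, headers=HEADERS, params=params)
            resp.raise_for_status()
            page = resp.json()
            cur = conn.cursor()
            for c in page.get("value", []):
                # MERGE keeps the warehouse row in step with RE NXT.
                cur.execute(
                    """
                    MERGE dw.constituent AS t
                    USING (SELECT ? AS id, ? AS name) AS s ON t.id = s.id
                    WHEN MATCHED THEN UPDATE SET t.name = s.name
                    WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
                    """,
                    c["id"], c.get("name"),
                )
            conn.commit()
            url = page.get("next_link")  # SKY API paginates via next_link
            params = None  # next_link already carries the query string
    ```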

    There is also a cost to using Azure SQL Server, but from our perspective it is minimal compared to the benefits we get from all the automation and reporting that can be done.

    I don't know what your cost is for the gateway route, so you will need to gauge that by talking to your IT team.

  • @Alex Wong thank you for the info. That is really interesting to hear that you use an Azure SQL data warehouse. I think I recall some of the webinars at the last BB Dev conference mentioning this as well. I will have to go back to those and have a look again. Any particular drawbacks/issues you picked up when setting up this solution?

  • Alex Wong

    @Mark Palfrey
    I see no drawbacks with the data warehouse route I take compared to using the Power BI connector, only benefits.

    Pro:

    • data sync happens every 4 hours, so with the data warehouse synced and updated, no matter how many Power BI dashboards/reports I create, it will not affect our daily API limits.
      • the Power BI connector will struggle and spend more time “loading” the more reports are created, UNLESS you have ONE main Power BI dataset that loads/refreshes all the data needed by all reports once, with multiple Power BI reports referencing that dataset; even then it is still very limiting
    • there is other data that is only available via one-record-at-a-time API calls (e.g. constituents with no email field, constituents with no valid address field, appeal attributes, fund attributes, etc.), which is VERY time consuming to do using Power Automate flows and SKY API (and eats up a lot of API quota). For those, I use RE:Queue scheduled exports of data from the database view and process them with the Power Automate SFTP connector to get the CSV into the data warehouse, meaning ALL data is in one place for querying (a sketch of the CSV load step follows this list).
      • the Power BI connector would make this very difficult
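    For the RE:Queue CSV route above, the load step can be as simple as this sketch (the staging table, its columns, and the CSV headers are hypothetical, and the file is assumed to have already been pulled from SFTP by the flow):

    ```python
    import csv
    import pyodbc

    # Connection string is illustrative; point it at your Azure SQL database.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;DATABASE=dw;UID=loader;PWD=<secret>"
    )

    def load_requeue_csv(path: str):
        """Bulk-load an RE:Queue CSV export into a staging table."""
        cur = conn.cursor()
        cur.execute("TRUNCATE TABLE dw.stg_appeal_attribute;")  # full replace each run
        with open(path, newline="", encoding="utf-8-sig") as f:
            rows = [(r["AppealID"], r["Category"], r["Description"])  # hypothetical headers
                    for r in csv.DictReader(f)]
        cur.fast_executemany = True  # pyodbc's bulk-insert mode
        cur.executemany(
            "INSERT INTO dw.stg_appeal_attribute (appeal_id, category, description) "
            "VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()

    load_requeue_csv("appeal_attributes.csv")
    ```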

    Con:

    • much more technically involved to set up; the Power BI connector is much easier for a no/low-coder to use without worrying about performance, API quotas, etc. (again, as long as you are not creating a LOT of reports that use some of the same data points)
    • Time
      • a lot of issues had to be resolved because of how the API works and poor documentation, for example:
        • initially I was doing one full constituent sync (400K+ records) and then using iterative sync. However, I then realized this wasn't good enough because deleted records would not be removed from the data warehouse, so I had to set up a webhook for constituent deletes and remove them from the data warehouse (a sketch of such a handler follows this list)
        • initially I did the same for gifts (which are 4 separate SQL tables: gift, gift split, gift soft credit, and gift solicitor), but hit even more problems (e.g. if a constituent is merged/deleted, the soft credit constituent ID changes, but the gift is not considered “changed”, so it will NOT show up in the iterative sync). I spent a huge amount of time thinking through how to keep all the data accurate with minimal processing
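    A minimal sketch of that delete-webhook handler, assuming the SKY API webhook POSTs a JSON payload carrying the deleted record's ID (the `record_id` field name and the table are assumptions; inspect a real payload before relying on them):

    ```python
    from flask import Flask, request
    import pyodbc

    app = Flask(__name__)
    conn = pyodbc.connect("<your-azure-sql-connection-string>")  # placeholder

    @app.route("/webhooks/constituent-deleted", methods=["POST"])
    def constituent_deleted():
        event = request.get_json()
        constituent_id = event.get("record_id")  # assumed field name
        if constituent_id:
            cur = conn.cursor()
            # Remove the row so the warehouse doesn't keep ghost records.
            cur.execute("DELETE FROM dw.constituent WHERE id = ?", constituent_id)
            conn.commit()
        return ("", 204)

    if __name__ == "__main__":
        app.run(port=8080)
    ```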

    If you are able to spend the time and have the technical know-how (or have an IT team with that expertise), the data warehouse is much better. I know you already mentioned cost, but I will say that data warehousing of RE and FE data is also available from a few Blackbaud partners, albeit at a high premium.

  • @Alex Wong
    Hi Alex, We are in the initial stage of using an Azure server as our data lake. Do you have links or steps that we can follow, using Power Automate flows, to start with getting constituents and their giving? Eventually we are going to use Power BI to generate the reports. Thank you!

  • Alex Wong

    @Oscar Ira
    I don't have any links or written-down steps for setting up an Azure data warehouse with RE NXT data via Power Automate. I think I might have talked about it in many of the Power Automate user group sessions previously, but I don't know if that has been cataloged. @Erik Leaver may be able to help.

  • Erik Leaver (Blackbaud Employee)

    @Oscar Ira Check out the Template Showcase; you should find some flows that get you on your way:


  • @Alex Wong:

    Hi Alex,

    This is fascinating, and incredibly helpful. I currently have a semantic model that uses scheduled refreshes with Power BI via the Blackbaud connector and some files in SharePoint. I noticed that we cannot filter the API calls when using the Blackbaud connector (e.g. only pull gifts after a given date). Is each refresh pulling ALL available items from the Blackbaud connector (for example, all constituents in the constituents connector), and THEN filtering them in the applied steps in Power Query? If so, I see what you mean about API limits and quotas.

  • Alex Wong

    @Lawrence Kinkopf
    I am pretty certain this is the case: all data has to be pulled first, and then Power Query's filtering happens. You may want to look up “query folding” in Power Query, which only works for certain data sources (e.g. a data warehouse).
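    To make the difference concrete (a sketch in Python with sqlite3 standing in for the warehouse, since the idea is the same): a folded filter runs at the source and only matching rows travel, while an unfolded one pulls everything and filters locally, which is effectively what the connector-plus-Power-Query route does:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE gift (id INTEGER, amount REAL, gift_date TEXT)")
    conn.executemany(
        "INSERT INTO gift VALUES (?, ?, ?)",
        [(1, 50.0, "2023-01-15"), (2, 250.0, "2024-06-01"), (3, 75.0, "2024-09-30")],
    )

    # "Folded" filter: the predicate runs at the source, so only
    # matching rows are ever transferred.
    folded = conn.execute(
        "SELECT id, amount FROM gift WHERE gift_date >= '2024-01-01'"
    ).fetchall()

    # Unfolded filter: every row is pulled first, then filtered locally,
    # which is what happens when Power Query cannot fold the step.
    all_rows = conn.execute("SELECT id, amount, gift_date FROM gift").fetchall()
    unfolded = [(i, a) for (i, a, d) in all_rows if d >= "2024-01-01"]

    assert folded == unfolded  # same result, very different data movement
    ```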

  • Rebecca Sundquist (Blackbaud Employee)

    @Lawrence Kinkopf, you can use a Gift List or Constituent List to limit the gifts or constituents you are pulling in the Power BI Connector. Note the connector's navigation lets you select Public or Private lists. This feeds into the respective endpoint's “list_id” parameter. For example: API Reference - GET Gift List - SKY API.
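    For anyone hitting the API directly rather than through the connector, the same scoping looks roughly like this (a sketch; the list ID is a placeholder GUID, and the `list_id` parameter is the one the linked reference describes):

    ```python
    import requests

    HEADERS = {
        "Bb-Api-Subscription-Key": "<subscription-key>",  # placeholder
        "Authorization": "Bearer <access-token>",         # placeholder
    }

    # list_id scopes the pull to a saved Gift List instead of every gift.
    resp = requests.get(
        "https://api.sky.blackbaud.com/gift/v1/gifts",
        headers=HEADERS,
        params={"list_id": "00000000-0000-0000-0000-000000000000"},  # placeholder
    )
    resp.raise_for_status()
    gifts = resp.json()["value"]
    print(f"Pulled {len(gifts)} gifts from the list")
    ```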

  • @Rebecca Sundquist oh, that's a great idea! Totally forgot about that feature. Thanks so much!

  • @Alex Wong I am picking up on this project once again. When you mention "I have 20+ power automate flows", are they each pointing to queries in RENXT to pick the data you need, or are you pointing at endpoints in the SKY API and drawing it down that way? Can you point me to a template flow for this?

    Also, I read the post @Allan Delmare dropped about SKY Bridge and wondered if this might be the way forward for many to easily get data into something usable for report/dashboard development. My aim is to create a single source of truth, with a data lake to throw all our sources into before organising it, etc…

  • Alex Wong

    I gave some details in my reply to Allan's post; you can read it there: