Streamlining updates with the custom connector?

Hi everyone, I have been using the custom connector for various Power BI reports and dashboards for my team for several years now. Most of my reports refresh daily, but each one is its own separate data source. When I have a new project, I start from my template PBI file on my desktop, which already has a ton of the data modeling and ETLs I need for most reports. Because every report is a standalone copy of that template, if I delete or rename a custom field, for example, all of my refreshes fail and I have to go back into each PBI file on my desktop to update the ETLs one by one.

Is that how everyone's Power BI stuff is set up? Or am I missing something obvious that would make things so much easier for me?

A similar but possibly separate issue: we run the custom connector on a VM so that refreshes don't fail when I'm offline. We are hiring a new person soon, and I'd like them to be able to work in Power BI as well, but I'm not sure whether they would be able to fix things in Power BI that can't be fixed in the published reports. If anyone knows how to make this all work better, please let me know! I imagine what I need is some version of a data warehouse, I'm just not sure how the custom connector would fit into that (I'm not trained in data science terminology and infrastructure, I'm just making the most of what I can teach myself, haha).

Best Answer

  • Alex Wong
    Answer ✓

    This is a very good question, and one that many people will have once they start building a lot of reports out of Power BI.

    Let me first say that this part is more of an Art than it is a Science.

    So let me start with what my setup is and my thought process, and then maybe you can get some ideas on how you want to go forward with yours.

    I use an Azure SQL data warehouse (this is better than using the Blackbaud custom connector for Power BI, though not for the reason you are asking about). I pull data from the data warehouse. I do not have ONE Power BI data source, I have many; which one I use depends on the report I'm building.

    • Master Donor Report
      • This pulls in data from the constituent table, gift table, constituent custom fields, gift custom fields, and a few others. Gift modeling is done through soft credit (so not exactly right financially, but right for fundraising, if you know what I mean).
      • This Power BI dashboard powers a donor report with various filters built on our org's business logic, plus a heat map and donor categories.
      • This Power BI dataset (semantic model) also powers 12 other reports that use the same data modeling: Donor Retention Analysis, Giving Society (membership), Monthly Giver, etc.
    • Gift Analysis
      • This is like a gift report, where data is modeled on direct credit, with specific additional data pulled in for pledge details (installments, payments, write-offs).
      • The report itself is a filterable report built around the various business logic we put into how we record our gifts.
      • This report's dataset (semantic model) is used by 5 other reports that are more focused on REAL financial numbers: ScoreBoard (goal vs. actual contributions), Contribution Source Analysis, Quick Fundraising Metrics, etc.
    • Pledge Report
      • This Power BI dataset supports 2 dashboards:
        • A Pledge report that shows each pledge's commitment, payments to date (with a column for the payments in each year), write-offs for each year, the balance remaining, and how much is coming in based on installments over the next few years.
        • A Pledge Aging report that tells the story of overdue pledges (see the SQL sketch after this list).
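
    To make that last example concrete, here is a minimal sketch of the kind of warehouse query that could feed a pledge balance/aging report. The table and column names (Pledge, Installment, and the amount/date columns) are hypothetical placeholders, not Blackbaud's actual warehouse schema; adapt them to whatever your data warehouse exposes.

    ```sql
    -- Hypothetical schema (placeholder names, not a real Blackbaud schema):
    --   Pledge(PledgeId, ConstituentId, CommitmentAmount)
    --   Installment(PledgeId, DueDate, AmountDue, AmountPaid, AmountWrittenOff)
    WITH PledgeTotals AS (
        SELECT
            p.PledgeId,
            p.ConstituentId,
            p.CommitmentAmount,
            SUM(i.AmountPaid)       AS PaidToDate,
            SUM(i.AmountWrittenOff) AS WrittenOff,
            -- Oldest due date that still has an unpaid amount (NULL if fully paid)
            MIN(CASE WHEN i.AmountPaid < i.AmountDue THEN i.DueDate END)
                                    AS OldestUnpaidDueDate
        FROM Pledge AS p
        JOIN Installment AS i ON i.PledgeId = p.PledgeId
        GROUP BY p.PledgeId, p.ConstituentId, p.CommitmentAmount
    )
    SELECT
        PledgeId,
        ConstituentId,
        CommitmentAmount,
        PaidToDate,
        WrittenOff,
        CommitmentAmount - PaidToDate - WrittenOff AS BalanceRemaining,
        CASE
            WHEN OldestUnpaidDueDate IS NULL THEN 'Paid up'
            WHEN OldestUnpaidDueDate >= CAST(GETDATE() AS date) THEN 'Current'
            WHEN DATEDIFF(day, OldestUnpaidDueDate, GETDATE()) <= 90 THEN '1-90 days overdue'
            ELSE 'Over 90 days overdue'
        END AS AgingBucket
    FROM PledgeTotals;
    ```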

    Above are 3 examples of my report build-outs. When I need to create a new report for a new business need, my first question is: which of my datasets (semantic models) already has all the data I need to build this report? If one exists, I use it by connecting my new Power BI file to the Power BI service and attaching that semantic model. While I build the new report, I MAY come across additional needs (new measures to drive a visualization, a new calculated column to filter on, etc.). Most of the time, I just add these directly to the original semantic model, so every report built on it can use them too.

    My reasoning is: I "CAN" put everything into ONE dataset and run all reports through that one dataset. However, that would create a MASSIVE dataset, and the modeling can sometimes conflict (direct credit vs. soft credit). Duplicated data would also make the model harder to work in.

    The biggest benefits of the Azure SQL data warehouse over the Blackbaud custom connector are going to be filtering options and loading time. For example, the custom connector cannot filter to get only pledges that have a balance: it has to fetch ALL pledges and then filter down on the balance property. Azure SQL can just give me that directly with a SQL query. Given the "5000 limit" per call when getting data from the Blackbaud server, that is a big limitation and performance issue.
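
    For instance, a server-side filter like the sketch below returns only the rows the report actually needs, instead of paging through every pledge 5,000 records at a time. As above, Pledge and its columns are placeholder names, not a real schema:

    ```sql
    -- The warehouse does the filtering, so only pledges with an
    -- outstanding balance ever cross the wire to Power BI.
    -- (Pledge/Balance are hypothetical placeholder names.)
    SELECT PledgeId, ConstituentId, CommitmentAmount, Balance
    FROM Pledge
    WHERE Balance > 0;
    ```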

Answers

  • Thank you, Alex, this is super thorough and super helpful! I wasn't sure what options there were, but it's great to hear that you use Azure and have found success with it. I will have to look into that!