Approach for a big integration

What would be the best approach to the following? Given the rate limits per second and per day, we need to integrate and retrieve information for about 57,500 members (possibly more) from the RE API. For each member we need memberships, actions, and name formats, which works out to roughly 172,464 requests per day. The problem is that the base quota is 100,000 calls per day for just one of our customers, and at 10 requests per second we also have to slow down our querying.
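For context, here is a minimal sketch of the arithmetic above plus a client-side throttle that keeps calls under a 10-requests-per-second limit. The member count and quota values come from the post; the limiter itself is a generic pattern (assumed, not Blackbaud's SDK), and you would call it before each real HTTP request.

```python
import time


class RateLimiter:
    """Simple throttle: allows at most `rate` calls per second."""

    def __init__(self, rate: float):
        self.min_interval = 1.0 / rate
        self.last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough to keep successive calls
        # at least `min_interval` seconds apart.
        now = time.monotonic()
        elapsed = now - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()


# Rough arithmetic from the post: three endpoints per member.
members = 57_500
calls_needed = members * 3                       # 172,500 calls
daily_quota = 100_000                            # base quota per day
days_required = -(-calls_needed // daily_quota)  # ceiling division: 2 days
hours_at_full_rate = calls_needed / 10 / 3600    # ~4.8 hours of wall time
```

Even ignoring the daily quota, a full 1:1 pull at 10 req/s takes nearly five hours, which is why the commenters below suggest incremental and batch strategies.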

Thanks

Comments

  • Ben Wong (Blackbaud Employee)

    @Duvan Castelblanco unfortunately the 10 calls per second rate limit is the only option we have today. It's a protection mechanism that we keep in place to protect against spikes on shared resources. However, we are exploring options to make this limit more flexible. As a partner, I'd be happy to include you in some of our experiments to see if it helps with your performance. Feel free to follow up with me over email.

    Thanks!

  • @Duvan Castelblanco

    Some considerations:
    1-It may not be necessary to pull all data daily. For example, why not pull only the items modified that day and append/overwrite where applicable?
    2-Pulling a large number of entries doesn't always require 1:1 calls. For example, you may not need to pull 50,000 records individually; perhaps there is a batch or list view you can iterate through, reducing a 50,000-record pull to a tenth of that in total number of calls.
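    The first suggestion (a delta sync) might be sketched as follows. The `last_modified` query parameter and the record shape are assumptions for illustration; check the API reference for the actual filter name it supports.

    ```python
    from datetime import datetime, timezone


    def build_delta_query(base_url: str, since: datetime) -> str:
        """Build a 'modified since' query URL.

        The `last_modified` parameter is hypothetical; substitute
        whatever filter the real endpoint exposes.
        """
        stamp = since.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
        return f"{base_url}?last_modified={stamp}"


    def merge_delta(cache: dict, changed_records: list) -> dict:
        """Append/overwrite changed records into a local cache keyed by id."""
        for record in changed_records:
            cache[record["id"]] = record
        return cache
    ```

    With this pattern only the records changed since the last run are fetched, so daily call volume tracks churn rather than total member count.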

  • @Ben Wong Hi Ben, even when I use that approach (which I am already doing), the limit placed on lists (1,000 records) further restricts the ability to pull batch data in a simple way. It significantly increases the API call count: I used to make one call for up to 100,000 records and now need to make 100 calls for the same data, which is also ridiculously slow.

    Edit: I'm dealing with the School API