API Performance Issues - How to Optimize Them
What
A common question is what standard performance you can expect from the Pure API and web services.
You may be building an integration that pulls data out of Pure into other systems and, in the process, experiencing performance issues with the Pure API.
For example:
- It takes a long time to export the data from Pure.
- This results in time-outs. For example, if you are using batches of 10 and have around 200 publications, that is still 20 requests and takes a lot of time (see the sketch after this list).
- Due to the time-outs, the profiles are sometimes loaded incompletely.
- It does not seem possible to import a delta (see below), resulting in a full load being necessary every 24 hours.
- The API can't import all entities at once, so you have to make multiple requests per entity.
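The time-out and batching issues above can often be mitigated by requesting larger pages and adding an explicit per-request timeout with a simple retry. Below is a minimal sketch in Python using the requests library; the base URL, the research-outputs endpoint path, the api-key header, and the size/offset parameters are assumptions based on a typical Pure web service setup and should be checked against the API documentation for your Pure version.

```python
import requests

# All endpoint details below are assumptions; verify them against your
# Pure version's web service documentation before use.
BASE_URL = "https://your-pure-host/ws/api/research-outputs"  # hypothetical path
API_KEY = "your-api-key"                                     # hypothetical key
PAGE_SIZE = 100    # larger pages mean fewer round trips than batches of 10
TIMEOUT = 60       # seconds; fail fast instead of hanging on a slow response
MAX_RETRIES = 3

def fetch_page(offset):
    """Fetch one page of research outputs, retrying on time-outs."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            response = requests.get(
                BASE_URL,
                headers={"api-key": API_KEY, "Accept": "application/json"},
                params={"size": PAGE_SIZE, "offset": offset},
                timeout=TIMEOUT,
            )
            response.raise_for_status()
            return response.json()
        except requests.exceptions.Timeout:
            if attempt == MAX_RETRIES:
                raise

def harvest_all():
    """Page through the full result set."""
    offset = 0
    items = []
    while True:
        page = fetch_page(offset)
        batch = page.get("items", [])  # field name is an assumption
        if not batch:
            break
        items.extend(batch)
        offset += PAGE_SIZE
    return items
```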
Why
The API/Web services performance depends on various aspects:
- The quantity of data that you will harvest, and whether the web-service is queried in parallel.
- The hardware allocated to the Pure system
- If it is a very large amount of data, e.g. 19,000 persons in total, it might take around 5 ms per person (roughly 95 seconds for the full set), but again, this also depends on your hardware.
- Other aspects for you to review are:
- Do you have other jobs running in Pure while you are harvesting?
- Do you have other applications querying the WS?
The resources allocated to the systems hosting the Pure admin and web service are finite, and aggressive resource use from jobs or the web service will often impact the user experience inside Pure. Because of this, we generally recommend running jobs and resource-intensive use of the web service outside of normal business hours.
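If a harvest has to run while users are active, a simple way to reduce the load it puts on Pure is to query sequentially and pause between pages rather than querying in parallel. A minimal sketch, reusing the hypothetical fetch_page helper from the earlier example:

```python
import time

def harvest_gently(delay_seconds=1.0, page_size=100):
    """Sequential harvest with a pause between pages to limit load on Pure."""
    offset = 0
    items = []
    while True:
        page = fetch_page(offset)          # helper from the sketch above
        batch = page.get("items", [])      # field name is an assumption
        if not batch:
            break
        items.extend(batch)
        offset += page_size
        time.sleep(delay_seconds)          # give the server breathing room between requests
    return items
```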
NOTE: Once the information has been queried the first time, you can continue using the changes endpoint, so only updated and new data will be queried. Please refer to the Pure Client Space documentation: Harvesting content from the Pure web services (changes).
Also, when there are performance issues, the best way to optimize is to switch to the changes API and harvest deltas instead of performing a full import every day.
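A delta harvest with the changes endpoint typically starts from a date on the first run and then follows a resumption token on each subsequent run, so only new and updated records are transferred. The sketch below is an assumption-based outline: the /changes path and the items, resumptionToken and moreChanges field names are typical of Pure setups but must be verified against the "Harvesting content from the Pure web services (changes)" documentation for your version.

```python
import requests

BASE_URL = "https://your-pure-host/ws/api"  # hypothetical base path
API_KEY = "your-api-key"                    # hypothetical key

def harvest_changes(start_token):
    """Follow the changes feed from a date (first run) or a saved resumption token."""
    token = start_token
    changes = []
    while True:
        response = requests.get(
            f"{BASE_URL}/changes/{token}",
            headers={"api-key": API_KEY, "Accept": "application/json"},
            timeout=60,
        )
        response.raise_for_status()
        payload = response.json()
        changes.extend(payload.get("items", []))         # field name is an assumption
        token = payload.get("resumptionToken")           # save this for the next run
        if not payload.get("moreChanges") or not token:  # field name is an assumption
            break
    return changes, token

# First run: start from a date, e.g. harvest_changes("2024-07-01").
# Later runs: pass the saved resumption token so only deltas are fetched.
```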
*Delta: a way of storing or transmitting data in the form of differences (deltas) between sequential versions rather than complete files.
Updated at July 27, 2024