Best Practices

Start with Calls

The Calls entity contains the richest data in Aircall. Set this up first to validate your connection and get immediate value before adding other entities.

Set a sensible start date

Use the date picker to limit your import to the period you actually need. Pulling years of data on the first run slows things down and fills your destination with records you may not use.

Label your data flows clearly

Name each data flow with the entity and destination, for example "Aircall – Calls – BigQuery". This saves time when you're managing multiple connections.

Data refresh and scheduling

Schedule Calls data flows frequently

Call data changes constantly. A refresh interval of every 1–4 hours keeps your dashboards current without hammering the API.

Refresh Users and Teams less often

Team structure and user profiles change infrequently. A daily or weekly refresh is usually enough for these entities.

Performance optimization

Narrow your date range for large teams

If your team handles hundreds of calls per day, pulling several months of data at once can slow imports significantly. Start with a rolling 30–90 day window and expand as needed.

Avoid overlapping schedules

If you have multiple Aircall data flows, stagger their refresh times by 10–15 minutes. Running them simultaneously can increase the chance of hitting rate limits.
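Staggering can be worked out with simple offset arithmetic. The sketch below is a hypothetical planning helper (not a Coupler.io feature); you would apply the resulting times manually in each flow's schedule settings:

```python
from datetime import datetime, timedelta

def staggered_times(first_run: datetime, flows: list[str],
                    gap_minutes: int = 15) -> dict[str, datetime]:
    """Offset each flow's start time by gap_minutes from the previous one."""
    return {name: first_run + timedelta(minutes=i * gap_minutes)
            for i, name in enumerate(flows)}

flows = [
    "Aircall – Calls – BigQuery",
    "Aircall – Users – BigQuery",
    "Aircall – Teams – BigQuery",
]
for name, t in staggered_times(datetime(2024, 1, 1, 6, 0), flows).items():
    print(name, t.strftime("%H:%M"))  # 06:00, 06:15, 06:30
```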

Common pitfalls


Do

  • Rotate your API key periodically and update it in Coupler.io

  • Use the start date to scope each data flow to the data you actually need

  • Store exported data in a dedicated sheet or table to avoid accidental overwrites

Don't

  • Pull all historical data on every scheduled refresh — use incremental date ranges where possible

  • Use an agent-level account for the API connection — you need Admin access

  • Rely on the User availabilities entity as a historical log — it reflects the current status only

  • Run data flows for every entity at the same time on the same schedule
