# Best Practices

## Recommended setup

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Use one entity per source, then Join</strong></td><td>Configure each Apptivo entity (Customers, Contacts, Opportunities) as its own source within a data flow, then use the Join transformation to link them by Customer ID. This keeps things clean and lets you reuse sources across multiple data flows.</td></tr><tr><td><strong>Anchor your reports on Customers</strong></td><td>The Customers entity is the master record in Apptivo. Build your data model by joining Opportunities, Cases, and Contacts to Customers so every row has account-level context. This makes filtering and grouping much easier downstream.</td></tr><tr><td><strong>Use Aggregate for pipeline summaries</strong></td><td>Apply Coupler.io's Aggregate transformation on the Opportunities entity to pre-calculate total pipeline value, deal counts by stage, or win rates. This reduces the formula work needed in your destination.</td></tr></tbody></table>
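The Join-then-Aggregate pattern above can be sketched in plain Python. This is only an illustration of the logic the Coupler.io transformations apply, not their implementation; the field names `customer_id`, `stage`, and `amount` are assumptions, and your actual column headers depend on your Apptivo configuration:

```python
from collections import defaultdict

# Toy records standing in for Apptivo exports (field names are
# illustrative; real column headers vary by Apptivo configuration).
customers = [
    {"customer_id": "C-1", "customer_name": "Acme Corp"},
    {"customer_id": "C-2", "customer_name": "Globex"},
]
opportunities = [
    {"customer_id": "C-1", "stage": "Proposal", "amount": 12000},
    {"customer_id": "C-1", "stage": "Closed Won", "amount": 8000},
    {"customer_id": "C-2", "stage": "Proposal", "amount": 5000},
]

# Join: anchor each opportunity on the Customers master record
# so every row carries account-level context.
by_id = {c["customer_id"]: c for c in customers}
joined = [
    {**opp, "customer_name": by_id[opp["customer_id"]]["customer_name"]}
    for opp in opportunities
    if opp["customer_id"] in by_id
]

# Aggregate: pre-calculate total pipeline value and deal count per stage,
# so the destination needs no formula work.
pipeline = defaultdict(lambda: {"total": 0, "deals": 0})
for row in joined:
    pipeline[row["stage"]]["total"] += row["amount"]
    pipeline[row["stage"]]["deals"] += 1

print(dict(pipeline))
```

The same shape applies to Cases and Contacts: join each entity to Customers first, then group and summarize on the joined result.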

## Data refresh and scheduling

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Match refresh frequency to record change rate</strong></td><td>Opportunities and Cases change frequently — consider hourly or every-few-hours refresh. Customers and Contacts are slower-moving, so daily is usually sufficient. Setting appropriate intervals avoids unnecessary API calls.</td></tr><tr><td><strong>Run a manual test before scheduling</strong></td><td>Always complete a successful manual run first. This confirms your API key is valid, permissions are correct, and the data looks as expected before you commit to a schedule.</td></tr></tbody></table>
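The "manual run first" check can be wrapped in a small guard like the sketch below. `fetch_entity` is a hypothetical callable standing in for whatever triggers your pull; the point is simply to fail loudly on an empty or malformed result before you commit to a schedule:

```python
def validate_manual_run(fetch_entity, entity):
    """Run one manual pull and sanity-check the result before scheduling.

    `fetch_entity` is a hypothetical stand-in for your actual data pull;
    swap in the real call for your setup.
    """
    rows = fetch_entity(entity)
    if not isinstance(rows, list):
        raise TypeError(f"Expected a list of records for {entity!r}")
    if not rows:
        # An empty result usually means a bad API key or missing
        # entity permissions, not a genuinely empty entity.
        raise ValueError(
            f"Manual run for {entity!r} returned no records; "
            "check your API key and entity permissions"
        )
    return len(rows)

# Example with a stubbed fetch returning two records:
stub = lambda entity: [{"id": 1}, {"id": 2}]
print(validate_manual_run(stub, "Opportunities"))
```

Only after this kind of check passes, and the columns look as expected, is it worth enabling the schedule.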

## Performance optimization

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Avoid running all entities simultaneously</strong></td><td>If you have multiple Apptivo data flows, stagger their schedules by at least 15–30 minutes. Running them at the same time can push you toward Apptivo's API rate limits, causing timeouts or partial exports.</td></tr><tr><td><strong>Send large datasets to BigQuery, not Sheets</strong></td><td>If you're exporting thousands of cases or leads, Google Sheets can slow down or hit row limits. Use BigQuery or another database destination for high-volume entities and reserve Sheets for smaller, summary-level outputs.</td></tr></tbody></table>
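Staggering start times is easy to plan up front. The sketch below spaces any number of data flows a fixed gap apart; the 20-minute gap and the 06:00 base slot are illustrative defaults within the 15–30 minute range suggested above:

```python
from datetime import datetime, timedelta

def staggered_starts(base, n_flows, gap_minutes=20):
    """Return start times for n_flows data flows, spaced gap_minutes apart.

    Spacing flows 15-30 minutes apart keeps concurrent API usage low;
    20 minutes is just an illustrative default.
    """
    return [base + timedelta(minutes=gap_minutes * i) for i in range(n_flows)]

base = datetime(2024, 1, 1, 6, 0)  # example 06:00 base slot
for start in staggered_starts(base, 3):
    print(start.strftime("%H:%M"))
```

With three Apptivo data flows this yields 06:00, 06:20, and 06:40 slots, so no two flows hit the API at the same time.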

## Common pitfalls

{% hint style="danger" %}
Don't use a personal user's API key for production data flows. If that user is deactivated or their password changes, the key may stop working and your scheduled refreshes will fail silently. Use a dedicated service or admin account for API key generation.
{% endhint %}

{% columns %}
{% column %}
**Do**

* Use a dedicated Apptivo admin account to generate your API key
* Join Customers to other entities for full account context
* Check field labels in your account — custom field names vary by Apptivo configuration
* Use BigQuery for large-volume entities like Cases or Leads
{% endcolumn %}

{% column %}
**Don't**

* Use a personal employee's API key for scheduled data flows
* Expect deleted records to appear — they're excluded from the API
* Run all entities in parallel if you're on a lower Apptivo plan
* Assume default field names — your Apptivo customizations affect column headers
{% endcolumn %}
{% endcolumns %}
