Best Practices
Recommended setup
Use one entity per source, then Join
Configure each Apptivo entity (Customers, Contacts, Opportunities) as its own source within a data flow, then use the Join transformation to link them by Customer ID. This keeps each source focused on a single entity and lets you reuse sources across multiple data flows.
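Conceptually, the Join transformation behaves like an inner join on the shared key. A minimal sketch in plain Python, purely illustrative — field names such as customer_id and amount are hypothetical, since actual column labels depend on your Apptivo configuration:

```python
# Illustrative sketch of joining two Apptivo entities by Customer ID.
# Field names are hypothetical; your column labels may differ.
customers = [
    {"customer_id": "C-100", "name": "Acme Corp"},
    {"customer_id": "C-200", "name": "Globex"},
]
opportunities = [
    {"customer_id": "C-100", "deal": "Renewal", "amount": 12000},
    {"customer_id": "C-200", "deal": "Upsell", "amount": 5000},
]

# Index customers by the join key, then attach customer fields
# to each opportunity row (inner join: unmatched rows are dropped).
by_id = {c["customer_id"]: c for c in customers}
joined = [
    {**opp, "customer_name": by_id[opp["customer_id"]]["name"]}
    for opp in opportunities
    if opp["customer_id"] in by_id
]
```

In Coupler.io you configure the same join in the UI by selecting the two sources and Customer ID as the matching column.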
Anchor your reports on Customers
The Customers entity is the master record in Apptivo. Build your data model by joining Opportunities, Cases, and Contacts to Customers so every row has account-level context. This makes filtering and grouping much easier downstream.
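The difference from a plain inner join is that anchoring on Customers keeps every account in the model, even accounts with no related records yet. A hedged left-join sketch in plain Python (field names are illustrative, not Apptivo's actual labels):

```python
# Anchor the model on Customers: every account appears once,
# with whatever related records exist (possibly none).
customers = [
    {"customer_id": "C-100", "name": "Acme Corp"},
    {"customer_id": "C-200", "name": "Globex"},
]
opportunities = [
    {"customer_id": "C-100", "deal": "Renewal", "amount": 12000},
    {"customer_id": "C-100", "deal": "Upsell", "amount": 5000},
]

# Group related records by the account key
opps_by_customer = {}
for opp in opportunities:
    opps_by_customer.setdefault(opp["customer_id"], []).append(opp)

# Left join: customers with no opportunities still get a row
model = [
    {"customer": c, "opportunities": opps_by_customer.get(c["customer_id"], [])}
    for c in customers
]
```

Globex appears in the model with an empty opportunity list, which is exactly the account-level context that makes downstream filtering and grouping easier.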
Use Aggregate for pipeline summaries
Apply Coupler.io's Aggregate transformation on the Opportunities entity to pre-calculate total pipeline value, deal counts by stage, or win rates. This reduces the formula work needed in your destination.
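The three summaries named above reduce to a sum, a count grouped by stage, and a ratio. An illustrative Python sketch (stage names and amounts are made up):

```python
# Illustrative pre-aggregation of an Opportunities extract.
opportunities = [
    {"stage": "Won",  "amount": 12000},
    {"stage": "Open", "amount": 5000},
    {"stage": "Lost", "amount": 3000},
    {"stage": "Won",  "amount": 8000},
]

# Total pipeline value: sum of all opportunity amounts
total_pipeline = sum(o["amount"] for o in opportunities)

# Deal counts by stage
counts_by_stage = {}
for o in opportunities:
    counts_by_stage[o["stage"]] = counts_by_stage.get(o["stage"], 0) + 1

# Win rate: share of closed deals that were won
closed = [o for o in opportunities if o["stage"] in ("Won", "Lost")]
win_rate = sum(o["stage"] == "Won" for o in closed) / len(closed)
```

Shipping these pre-computed values to the destination means your spreadsheet or dashboard only has to display them, not recalculate them on every refresh.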
Data refresh and scheduling
Match refresh frequency to record change rate
Opportunities and Cases change frequently — consider hourly or every-few-hours refresh. Customers and Contacts are slower-moving, so daily is usually sufficient. Setting appropriate intervals avoids unnecessary API calls.
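Coupler.io schedules are set in the app rather than via cron, but the recommended intervals map onto something like these crontab-style entries (flow names are placeholders):

```
# Fast-moving entities: refresh hourly, offset so they don't overlap
0  * * * *   run opportunities-flow
30 * * * *   run cases-flow
# Slow-moving entities: refresh once a day
0  6 * * *   run customers-flow
15 6 * * *   run contacts-flow
```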
Run a manual test before scheduling
Always complete a successful manual run first. This confirms your API key is valid, permissions are correct, and the data looks as expected before you commit to a schedule.
Performance optimization
Avoid running all entities simultaneously
If you have multiple Apptivo data flows, stagger their schedules by at least 15–30 minutes. Running them at the same time can push you toward Apptivo's API rate limits, causing timeouts or partial exports.
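In cron terms, staggering means offsetting each flow's start minute; in Coupler.io you achieve the same effect through each data flow's own schedule settings (flow names below are placeholders):

```
# Each Apptivo data flow starts 20 minutes after the previous one
0  * * * *   run customers-flow
20 * * * *   run opportunities-flow
40 * * * *   run cases-flow
```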
Send large datasets to BigQuery, not Sheets
If you're exporting thousands of cases or leads, Google Sheets can slow down or hit its hard limits (a spreadsheet is capped at 10 million cells). Use BigQuery or another database destination for high-volume entities and reserve Sheets for smaller, summary-level outputs.
Common pitfalls
Don't use a personal user's API key for production data flows. If that user is deactivated or their password changes, the key may stop working and your scheduled refreshes will fail silently. Use a dedicated service or admin account for API key generation.
Do
Use a dedicated Apptivo admin account to generate your API key
Join Customers to other entities for full account context
Check field labels in your account — custom field names vary by Apptivo configuration
Use BigQuery for large-volume entities like Cases or Leads
Don't
Use a personal employee's API key for scheduled data flows
Expect deleted records to appear — they're excluded from the API
Run all entities in parallel if you're on a lower Apptivo plan
Assume default field names — your Apptivo customizations affect column headers