# Best Practices

## Recommended setup

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Start with Builds as your primary entity</strong></td><td>Builds contain the richest operational data — status, duration, branch, and trigger info. Get this entity working first before layering in Analytics reports or Audit logs.</td></tr><tr><td><strong>Join Builds with Pipelines</strong></td><td>Build records include a pipeline ID but not the pipeline name or project. Use a Join transformation in your data flow to enrich build data with pipeline metadata for readable, filterable reports.</td></tr><tr><td><strong>Use Append for multi-range analytics</strong></td><td>Codefresh Analytics reports only cover one date range per request. Use the Append transformation to combine reports from multiple periods into a single continuous dataset for trend analysis.</td></tr><tr><td><strong>Scope your API key to an admin account</strong></td><td>Entities like Accounts, Account settings, and Audits require admin access. Using an admin-generated API key from the start avoids permission errors when you add new entities later.</td></tr></tbody></table>
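The Join and Append transformations above can be sketched in plain Python to show what they do to your records. This is an illustrative sketch only: the field names (`pipeline_id`, `pipeline_name`, `project`) are assumptions for the example, not the connector's actual output schema.

```python
# Sketch of the Join and Append transformations. Field names here are
# illustrative assumptions, not the connector's real schema.

def join_builds_with_pipelines(builds, pipelines):
    """Enrich each build record with pipeline metadata, matched by pipeline ID."""
    by_id = {p["id"]: p for p in pipelines}
    enriched = []
    for build in builds:
        pipeline = by_id.get(build["pipeline_id"], {})
        enriched.append({
            **build,
            "pipeline_name": pipeline.get("name"),
            "project": pipeline.get("project"),
        })
    return enriched

def append_reports(*reports):
    """Stack analytics reports from several date ranges into one dataset."""
    combined = []
    for report in reports:
        combined.extend(report)
    return combined

builds = [{"id": "b1", "pipeline_id": "p1", "status": "success"}]
pipelines = [{"id": "p1", "name": "deploy-prod", "project": "web"}]
print(join_builds_with_pipelines(builds, pipelines))
# Each build now carries a readable pipeline name and project,
# so reports can filter on those instead of raw IDs.
```

The same idea applies regardless of destination: join first so downstream reports never have to work with bare pipeline IDs, then append period-by-period analytics into one continuous table.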

## Data refresh and scheduling

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Set your start date close to your reporting window</strong></td><td>Don't pull all-time build history on every sync. Set a start date that covers your actual reporting need (e.g., the last 90 days) to keep syncs fast and your destination clean.</td></tr><tr><td><strong>Match analytics granularity to your reporting cadence</strong></td><td>If you report weekly to stakeholders, set report granularity to weekly. Daily granularity on long date ranges generates a lot of rows and slows down your destination.</td></tr></tbody></table>
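The two scheduling rules above amount to simple date arithmetic: a rolling start date instead of all-time history, and bucketing dates at the granularity you actually report on. A minimal sketch, where the 90-day window and Monday-based weeks are example choices, not connector defaults:

```python
from datetime import date, timedelta

def rolling_start_date(today, window_days=90):
    """Start date covering only the reporting window, not all-time history.
    window_days=90 is an example; size it to your actual reporting need."""
    return today - timedelta(days=window_days)

def weekly_bucket(day):
    """Map a date to the Monday of its ISO week, i.e. weekly granularity."""
    return day - timedelta(days=day.weekday())

# A sync configured on 2024-04-10 with a 90-day window starts at 2024-01-11,
# and that day rolls up into the week starting Monday 2024-04-08.
print(rolling_start_date(date(2024, 4, 10)))  # 2024-01-11
print(weekly_bucket(date(2024, 4, 10)))       # 2024-04-08
```

Recomputing the start date on each scheduled run keeps the window rolling forward, so old rows age out of scope instead of accumulating in the destination.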

## Performance optimization

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Separate Builds from Audit logs into different data flows</strong></td><td>Builds update frequently; Audit logs are lower volume but may have different retention. Keeping them in separate data flows lets you set different schedules and start dates for each.</td></tr><tr><td><strong>Use BigQuery for large build histories</strong></td><td>If you're syncing months of build records from an active account, Google Sheets will hit row limits quickly. Send high-volume entities like Builds to BigQuery and use Looker Studio to visualize them.</td></tr></tbody></table>
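A quick back-of-envelope check makes the Sheets-versus-BigQuery call concrete. Google Sheets caps a spreadsheet at 10,000,000 cells (its documented limit at the time of writing; verify against current Google documentation), and the build counts below are made-up example volumes:

```python
# Rough capacity check before pointing a high-volume entity at Google Sheets.
# The 10,000,000-cell cap is Sheets' documented spreadsheet limit at the time
# of writing; the build volumes below are illustrative, not real data.
SHEETS_CELL_LIMIT = 10_000_000

def fits_in_sheets(rows, columns):
    """True if rows x columns stays under the Sheets cell limit."""
    return rows * columns <= SHEETS_CELL_LIMIT

# 500 builds/day for 180 days at ~25 columns fits comfortably...
print(fits_in_sheets(500 * 180, 25))   # True  (2,250,000 cells)
# ...but a busy account at 5,000 builds/day blows past the cap.
print(fits_in_sheets(5000 * 180, 25))  # False (22,500,000 cells)
```

If the estimate is anywhere near the cap, route the entity to BigQuery from the start rather than migrating later.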

## Common pitfalls

{% hint style="danger" %}
Don't skip the report granularity and date range settings when syncing Analytics reports. Leaving these blank will result in an empty sync with no error message — it'll look like the entity has no data.
{% endhint %}

{% columns %}
{% column %}
**Do**

* Generate your API key from an admin account
* Join Builds with Pipelines for readable reports
* Test with a short date range before expanding to full history
* Use BigQuery or Looker Studio for large-scale build analytics
{% endcolumn %}

{% column %}
**Don't**

* Use the same data flow for both frequently-updated and static entities
* Pull all-time build history on every scheduled run
* Rely on raw Build IDs in reports without joining pipeline metadata
* Assume Contexts expose secret values — they only contain metadata
{% endcolumn %}
{% endcolumns %}
