# Best Practices

## Recommended setup and configuration

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Create one flow per entity</strong></td><td>Each 100ms entity (Sessions, Rooms, Recordings, Analytics events) should have its own Coupler.io data flow. Mixing entities into one flow isn't supported — each flow exports a single entity type. Keep flows clearly named so you can identify them at a glance.</td></tr><tr><td><strong>Use Sessions as your primary entity</strong></td><td>Sessions contains the richest data, including participant information, timestamps, and room references. Start here for usage analysis, then enrich with Recordings or Analytics events as needed.</td></tr><tr><td><strong>Set the start date intentionally</strong></td><td>Use the date picker to set a start date that aligns with when your 100ms integration went live — or the beginning of the period you want to analyze. Avoid setting it too far back if you have a large session volume, as this increases sync time significantly.</td></tr><tr><td><strong>Use the Management Token, not peer tokens</strong></td><td>Always authenticate with your 100ms Management Token (found in the 100ms dashboard under Settings → Developer). App and peer tokens lack the permissions needed to access session and room data via the server-side API.</td></tr></tbody></table>
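If you want to sanity-check your Management Token before wiring up flows, a minimal Python sketch like the one below can help. It only builds an authenticated request for the server-side sessions endpoint; the exact endpoint path and query parameters are assumptions here, so verify them against the 100ms API reference before relying on them.

```python
import urllib.request

# Assumed values -- substitute your own Management Token from the
# 100ms dashboard (Settings -> Developer).
BASE_URL = "https://api.100ms.live/v2"

def build_sessions_request(token: str, limit: int = 10) -> urllib.request.Request:
    """Build an authenticated request for recent sessions.

    The server-side API expects the Management Token as a Bearer token;
    app or peer tokens will be rejected for this endpoint.
    """
    return urllib.request.Request(
        f"{BASE_URL}/sessions?limit={limit}",
        headers={"Authorization": f"Bearer {token}"},
    )

# To actually call the API (requires a valid token and network access):
# req = build_sessions_request("your-management-token")
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```

A 401/403 response here is the quickest signal that you pasted an app or peer token instead of the Management Token.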

## Data refresh and scheduling

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Schedule according to reporting cadence</strong></td><td>For daily dashboards, a once-daily refresh is sufficient for Sessions, Rooms, and Recordings. Analytics events may warrant more frequent refreshes (every few hours) if you're monitoring engagement in near real time.</td></tr><tr><td><strong>Don't over-refresh Active room peers</strong></td><td>The Active room peers entity reflects live state and changes every few minutes. Refreshing it too frequently is wasteful unless you're actively monitoring concurrency. For most use cases, skip this entity in scheduled flows and query it on demand.</td></tr></tbody></table>

## Performance optimization

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Narrow your date range for large workspaces</strong></td><td>If you have a high-volume 100ms workspace, exporting all sessions from day one can be very slow. Set your start date to the beginning of the current reporting period (e.g., the first of the current month) and run a one-time historical backfill separately if needed.</td></tr><tr><td><strong>Use BigQuery or Redshift for large datasets</strong></td><td>If you're exporting thousands of sessions or millions of analytics events, route the data to a data warehouse (BigQuery, Redshift) rather than Google Sheets. Sheets caps each spreadsheet at 10 million cells and can become slow to load with large datasets.</td></tr></tbody></table>

## Dashboard accuracy

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th></th></tr></thead><tbody><tr><td><strong>Join Sessions and Rooms by room_id</strong></td><td>Each session record includes a <code>room_id</code> field that links back to the Rooms entity. If you need to enrich session data with room names or template information, join the two exports on <code>room_id</code> in your destination or BI tool.</td></tr><tr><td><strong>Don't sum session durations without filtering</strong></td><td>When calculating total platform usage, filter out very short sessions (under 5–10 seconds), which often represent connection tests or failed joins. Including them inflates your total session time metric.</td></tr></tbody></table>
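The join and the duration filter above can be sketched together. This is an illustrative Python example, not Coupler.io functionality: the field names (<code>room_id</code>, <code>duration_seconds</code>, <code>name</code>) are assumptions based on the entities described in this guide, so match them to the actual column headers in your export before adapting the logic to SQL or your BI tool.

```python
# Hypothetical exports -- field names assumed for illustration.
rooms = [
    {"room_id": "r1", "name": "Daily standup"},
    {"room_id": "r2", "name": "Webinar"},
]
sessions = [
    {"session_id": "s1", "room_id": "r1", "duration_seconds": 1800},
    {"session_id": "s2", "room_id": "r1", "duration_seconds": 4},  # likely a failed join
    {"session_id": "s3", "room_id": "r2", "duration_seconds": 3600},
]

MIN_DURATION = 10  # drop connection tests / failed joins under 10 s

# Enrich each session with its room name, dropping very short sessions.
room_names = {r["room_id"]: r["name"] for r in rooms}
enriched = [
    {**s, "room_name": room_names.get(s["room_id"], "unknown")}
    for s in sessions
    if s["duration_seconds"] >= MIN_DURATION
]

total_usage = sum(s["duration_seconds"] for s in enriched)
print(total_usage)  # 5400 -- the 4-second session is excluded
```

The same shape translates directly to a SQL <code>JOIN ... ON room_id</code> with a <code>WHERE duration_seconds >= 10</code> clause in BigQuery or Redshift.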

## Do's and Don'ts

{% columns %}
{% column %}
**Do**

* Create separate flows for Sessions, Recordings, and Analytics events
* Set a meaningful start date aligned to your reporting period
* Use BigQuery or Redshift for high-volume analytics event exports
* Join Sessions to Rooms on `room_id` for enriched reporting
{% endcolumn %}

{% column %}
**Don't**

* Use Active room peers for historical analysis — it's live state only
* Set your start date to the beginning of time for large workspaces
* Expect real-time data — allow a few minutes processing time for recent sessions and events
* Mix entities in a single flow — each entity requires its own data flow
{% endcolumn %}
{% endcolumns %}
