Best Practices
Recommended setup
Whitelist IPs before connecting
Add Coupler.io's IPs (52.21.222.113 and 35.170.113.181) to your firewall or security group before you create a data flow. This prevents "connection refused" errors and saves troubleshooting time.
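As a sketch, on a self-hosted Linux server running ufw the rules could look like this (MySQL's default port 3306 is assumed; for a managed database such as AWS RDS, add the same two IPs as inbound rules in the instance's security group instead):

```shell
# Allow Coupler.io's IPs through the firewall on the MySQL port
# (assumes ufw and the default MySQL port 3306)
sudo ufw allow from 52.21.222.113 to any port 3306 proto tcp
sudo ufw allow from 35.170.113.181 to any port 3306 proto tcp
```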
Use a dedicated database user
Create a MySQL user specifically for Coupler.io with SELECT-only permissions. This limits the blast radius if credentials are ever compromised and makes audit logs cleaner.
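A minimal sketch of such a user (the username, password, and database name are placeholders to replace with your own):

```sql
-- Create a dedicated read-only user for Coupler.io
-- (username, password, and schema name are placeholders)
CREATE USER 'coupler_readonly'@'%' IDENTIFIED BY 'use-a-strong-password';

-- Grant SELECT only, and only on the schema Coupler.io needs
GRANT SELECT ON analytics_db.* TO 'coupler_readonly'@'%';
```

Restricting the grant to a single schema (rather than `*.*`) keeps the user from reading databases Coupler.io has no business touching.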
Test with a small table first
Before exporting your largest or most critical table, test the connection with a smaller table to confirm everything works smoothly.
Data refresh and scheduling
Filter by date for large tables
If exporting millions of rows, apply a date filter (e.g., last 30 days) instead of exporting all history. This speeds up exports and reduces timeout risk. You can append historical data in a separate flow if needed.
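For example, a 30-day filter might look like this (the table and column names are hypothetical; use your own):

```sql
-- Export only the last 30 days instead of full history
-- (table and column names are placeholders)
SELECT *
FROM orders
WHERE created_at >= NOW() - INTERVAL 30 DAY;
```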
Start with daily refreshes
Run on a daily schedule first to confirm that exports complete reliably. Once you're confident, increase the frequency to hourly if your business needs fresher data.
Monitor your first few runs
After scheduling a data flow, check your destination (Google Sheets, BigQuery, etc.) to confirm data is arriving correctly and in the expected format before moving on.
Performance optimization
Index your filtered columns
If filtering on created_at, status, or user_id, ensure those columns have database indexes. This dramatically speeds up query execution and prevents timeouts.
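For instance, if your flow filters on a date or status column, indexes could be added like this (the table name is a placeholder):

```sql
-- Index the columns used in export filters
-- (table name is a placeholder)
CREATE INDEX idx_orders_created_at ON orders (created_at);
CREATE INDEX idx_orders_status ON orders (status);
```

Without these, MySQL scans the full table on every filtered export, which is where most timeouts come from.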
Export views, not complex queries
If you need complex transformations, create a MySQL view and connect Coupler.io to the view instead of building the logic into the flow. The transformation stays in the database, where it's versioned alongside your schema, and the export becomes a simple SELECT. Note that MySQL views are not materialized: the underlying query still runs on each export, so make sure the view's base columns are indexed.
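A sketch of such a view (the table, columns, and aggregation are hypothetical examples):

```sql
-- Wrap a transformation in a view, then export the view
-- (table and column names are placeholders)
CREATE VIEW monthly_revenue AS
SELECT DATE_FORMAT(created_at, '%Y-%m') AS month,
       SUM(total) AS revenue
FROM orders
WHERE status = 'completed'
GROUP BY DATE_FORMAT(created_at, '%Y-%m');
```

In the data flow, you would then select `monthly_revenue` as the source table like any other.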
Common pitfalls
Do
Whitelist IPs in your firewall or security group before connecting
Create a dedicated MySQL user with SELECT-only permissions
Apply date filters to large tables to avoid timeouts
Test your first export with a manual run before scheduling
Use indexes on columns you filter by (created_at, status, user_id)
Don't
Use your root or admin MySQL account; create a separate, limited user instead
Export your entire database history in one flow if it's millions of rows
Skip firewall configuration if your database is in the cloud or behind a corporate network
Schedule very high frequency refreshes (hourly) for large tables without testing first
Store your MySQL password in plain text notes; use a password manager