Common Issues
Connection issues
Coupler.io can't connect to my PostgreSQL server
This usually means the host, port, or credentials are wrong — or your database server is blocking the connection.
Things to check:
Make sure the host, port, database name, username, and password are all correct
Confirm your PostgreSQL instance is accessible from the internet (or at least from Coupler.io's IPs)
If your server uses IP allowlisting or a firewall, add both Coupler.io IPs to the allowlist:
34.123.243.115
34.170.96.92
If you're connecting to Supabase, use the connection string details from Settings → Database in your Supabase dashboard, and make sure you're using the correct connection mode (Session or Transaction pooler)
Some cloud-hosted PostgreSQL providers (e.g., AWS RDS, Google Cloud SQL) require you to explicitly enable public access or configure VPC rules. Check your provider's network settings if the connection keeps failing.
Connection times out during the data flow run
A timeout during the run (not at the credential step) often indicates that the database server is under load, or that the query is taking too long on a very large dataset.
Try running the data flow again — transient timeouts often resolve themselves
If the issue persists, check your PostgreSQL server's resource usage (CPU, memory, connections)
Consider splitting large data flows into smaller sources or filtering data at the source level
Data issues
Data flow fails with a column name error
PostgreSQL enforces a 63-character limit on column names. If your source data has field names longer than 63 characters, the data flow will fail.
This is a known issue with sources like Facebook (Meta) Ads and Stripe, which sometimes produce long field names.
To fix this:
Use the Transformations step in your data flow to rename the affected columns to shorter names before they reach PostgreSQL
Be careful when editing transformations: changes save automatically without a confirmation prompt, so accidental edits can change your column structure without warning. If your data flow suddenly starts failing, review your transformation settings before retrying.
Append mode breaks after a source schema change
When using Append mode, Coupler.io writes new rows into the existing table structure. If your source adds new columns (for example, a new metric in your ad platform), PostgreSQL won't know about them and the run may fail.
To fix this, manually ALTER TABLE in PostgreSQL to add the new column(s) before the next run. After that, Coupler.io will populate them normally.
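For example, if the source added a new numeric metric, the statement might look like the following sketch (the schema, table, and column names are placeholders; substitute your own):

```sql
-- Placeholder names: replace public.ad_stats and new_metric
-- with your actual destination table and the new source field.
ALTER TABLE public.ad_stats
  ADD COLUMN IF NOT EXISTS new_metric numeric;
```

Using IF NOT EXISTS makes the statement safe to re-run if the column was already added.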
Data types are wrong in the destination table
Coupler.io detects and enforces column types automatically. However, if the table already exists with different column types from a previous run or manual creation, there can be a mismatch.
If you're using Replace mode, try deleting the existing table and letting Coupler.io recreate it cleanly on the next run
If you're using Append mode, check that the existing table's column types match what Coupler.io is trying to write
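One way to inspect the existing table's column types is to query information_schema in PostgreSQL (the schema and table names below are placeholders):

```sql
-- List column names and types for the destination table.
-- Replace 'public' and 'ad_stats' with your schema and table.
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name   = 'ad_stats'
ORDER BY ordinal_position;
```

Compare the reported types against the types Coupler.io infers for the incoming data.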
Permission errors
Permission denied when creating a schema or table
The PostgreSQL user you connected with doesn't have enough privileges. Make sure the user has:
CREATE privilege on the target database
CREATE privilege on the target schema
INSERT privilege on the target table
You can grant these with:
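A minimal sketch (the user, database, and schema names below are placeholders; substitute your own):

```sql
-- Replace coupler_user, my_database, and public with your own names.
GRANT CREATE ON DATABASE my_database TO coupler_user;
GRANT USAGE, CREATE ON SCHEMA public TO coupler_user;
GRANT INSERT ON ALL TABLES IN SCHEMA public TO coupler_user;
```

Note that USAGE on the schema is also required for the user to reference objects inside it.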
If you want Coupler.io to create new schemas and tables automatically, the user needs CREATE at both the database and schema level — not just INSERT on an existing table.