# Common Issues

## Connection issues

<details>

<summary>Coupler.io can't connect to my PostgreSQL server</summary>

This usually means the host, port, or credentials are wrong — or your database server is blocking the connection.

Things to check:

* Make sure the **host**, **port**, **database name**, **username**, and **password** are all correct
* Confirm your PostgreSQL instance is accessible from the internet (or at least from Coupler.io's IPs)
* If your server uses IP allowlisting or a firewall, add both Coupler.io IPs to the allowlist (see the `pg_hba.conf` sketch after this list):
  * `34.123.243.115`
  * `34.170.96.92`
* If you're connecting to **Supabase**, use the connection string details from **Settings → Database** in your Supabase dashboard, and make sure you're using the correct connection mode (Session or Transaction pooler)
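
If you run the PostgreSQL server yourself and it uses `pg_hba.conf` for host-based access control, the allowlist entries might look like the following sketch. The database and user names are placeholders, and the auth method should match your server's configuration (`md5` on older setups):

```
# TYPE  DATABASE        USER        ADDRESS             METHOD
host    your_database   your_user   34.123.243.115/32   scram-sha-256
host    your_database   your_user   34.170.96.92/32     scram-sha-256
```

After editing `pg_hba.conf`, reload the configuration (for example with `SELECT pg_reload_conf();`) for the change to take effect.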

{% hint style="warning" %}
Some cloud-hosted PostgreSQL providers (e.g., AWS RDS, Google Cloud SQL) require you to explicitly enable public access or configure VPC rules. Check your provider's network settings if the connection keeps failing.
{% endhint %}

</details>

<details>

<summary>Connection times out during the data flow run</summary>

A timeout during the run (rather than at the credential step) usually indicates that the database server is under load, or that the query against a very large dataset is taking too long to complete.

* Try running the data flow again — transient timeouts often resolve themselves
* If the issue persists, check your PostgreSQL server's resource usage (CPU, memory, connections); the check after this list shows current connection activity
* Consider splitting large data flows into smaller sources or filtering data at the source level
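
If you have direct access to the server, this sketch shows open connections and the longest-running sessions. It assumes a role that can read `pg_stat_activity` in full, such as a superuser or a member of `pg_read_all_stats`:

```sql
-- Compare open connections against the server's configured limit.
SELECT count(*) AS open_connections,
       current_setting('max_connections') AS max_connections
FROM pg_stat_activity;

-- List the longest-running non-idle sessions.
SELECT pid, state, now() - query_start AS running_for,
       left(query, 60) AS query_preview
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY running_for DESC NULLS LAST;
```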

</details>

## Data issues

<details>

<summary>Data flow fails with a column name error</summary>

PostgreSQL enforces a **63-character limit** on column names. If your source data has field names longer than 63 characters, the data flow will fail.

This is a known issue with sources like **Facebook (Meta) Ads** and **Stripe**, which sometimes produce long field names.
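
If an earlier run already created the destination table, a query like this (schema and table names are placeholders) lists its longest column names, which can help pinpoint the affected fields:

```sql
-- Placeholder schema/table names: find column names at or near the
-- 63-character identifier limit in an existing destination table.
SELECT column_name, length(column_name) AS name_length
FROM information_schema.columns
WHERE table_schema = 'your_schema'
  AND table_name   = 'your_table'
ORDER BY name_length DESC
LIMIT 10;
```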

To fix this:

* Use the **Transformations** step in your data flow to rename the affected columns to shorter names before they reach PostgreSQL
* Be careful when editing transformations — changes are saved automatically without a confirmation prompt, so double-check before navigating away

{% hint style="danger" %}
Transformation changes in Coupler.io save automatically. Accidental edits can change your column structure without warning. Review your transformation settings carefully if your data flow suddenly starts failing.
{% endhint %}

</details>

<details>

<summary>Append mode breaks after a source schema change</summary>

When using Append mode, Coupler.io writes new rows into the existing table structure. If your source adds new columns (for example, a new metric in your ad platform), PostgreSQL won't know about them and the run may fail.

To fix this, run `ALTER TABLE` in PostgreSQL to add the new column(s) before the next run, as in the sketch below. After that, Coupler.io will populate them normally.
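
For example, if the source added a numeric metric (the table, schema, column name, and type here are placeholders; match the type to your actual data):

```sql
-- Add the new source column to the existing destination table
-- before the next Append-mode run.
ALTER TABLE your_schema.your_table
  ADD COLUMN new_metric numeric;
```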

</details>

<details>

<summary>Data types are wrong in the destination table</summary>

Coupler.io detects and enforces column types automatically. However, if the table already exists with different column types from a previous run or manual creation, there can be a mismatch.

* If you're using **Replace** mode, try deleting the existing table and letting Coupler.io recreate it cleanly on the next run
* If you're using **Append** mode, check that the existing table's column types match what Coupler.io is trying to write (the query below shows them)
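
To inspect the existing table's column types (placeholder schema and table names), you can query `information_schema`:

```sql
-- List the existing table's columns and types so you can compare
-- them against what Coupler.io is trying to write.
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_schema = 'your_schema'
  AND table_name   = 'your_table'
ORDER BY ordinal_position;
```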

</details>

## Permission errors

<details>

<summary>Permission denied when creating a schema or table</summary>

The PostgreSQL user you connected with doesn't have enough privileges. Make sure the user has:

* **CREATE** privilege on the target database
* **USAGE** and **CREATE** privileges on the target schema (USAGE is required to reference any object inside the schema)
* **INSERT** privilege on the target table

You can grant these with:

```sql
-- Allow the user to create schemas in the database
GRANT CREATE ON DATABASE your_database TO your_user;
-- Allow the user to access the schema and create tables in it
GRANT USAGE, CREATE ON SCHEMA your_schema TO your_user;
-- Allow the user to write rows into an existing table
GRANT INSERT ON TABLE your_schema.your_table TO your_user;
```

{% hint style="info" %}
If you want Coupler.io to create new schemas and tables automatically, the user needs CREATE at both the database and schema level — not just INSERT on an existing table.
{% endhint %}

</details>
