# Common Issues

## Connection issues

<details>

<summary>"Connection refused" or "No route to host" error</summary>

Your PostgreSQL database is not accepting connections from Coupler.io's IP addresses.

**Fix:**

1. Copy the IP addresses shown in the error message or connection dialog.
2. Add these IPs to your database firewall or security group:
   * **AWS RDS:** Edit the security group's inbound rules to allow Coupler.io's IPs on port 5432 (or your custom port).
   * **Azure Database:** Add firewall rules in the Azure portal.
   * **Self-hosted/On-premise:** Update `pg_hba.conf` or your network firewall to allow Coupler.io IPs on port 5432.
3. Test the connection again in Coupler.io.

If you're unsure how to whitelist IPs in your infrastructure, contact your database administrator or cloud provider support.
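
For a self-hosted server, a `pg_hba.conf` entry typically looks like the line below. The IP is a placeholder — use the actual addresses shown in Coupler.io:

```
# TYPE  DATABASE  USER  ADDRESS           METHOD
host    all       all   203.0.113.10/32   scram-sha-256
```

Reload the configuration afterwards (`SELECT pg_reload_conf();` in psql, or `pg_ctl reload`) for the change to take effect.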

</details>

<details>

<summary>"Connection already closed" error</summary>

PostgreSQL closed the connection before the data flow completed, usually due to idle timeout or network instability.

**Fix:**

1. Check your PostgreSQL `idle_in_transaction_session_timeout` setting (the default, `0`, disables the timeout). If it's set to a low value, increase it.
2. If the query is slow, optimize it:
   * Add indexes to WHERE clause columns: `CREATE INDEX idx_name ON table(column);`
   * Filter by date range instead of pulling all historical data.
   * Use LIMIT to test: `SELECT * FROM table LIMIT 1000`.
3. Re-run the data flow.
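
As a sketch, you can inspect and raise the timeout from psql (`ALTER SYSTEM` requires superuser; the 30-minute value is an example, not a recommendation):

```sql
-- Check the current value ('0' means the timeout is disabled)
SHOW idle_in_transaction_session_timeout;

-- Raise it server-wide, then reload the configuration
ALTER SYSTEM SET idle_in_transaction_session_timeout = '30min';
SELECT pg_reload_conf();
```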

</details>

<details>

<summary>"Connection is timing out" error</summary>

The database took too long to respond (usually > 30 seconds).

**Fix:**

1. **Test the query directly** in psql or a query tool to see how long it takes.
2. **Optimize the SQL query:**
   * Add a date range filter: `WHERE created_at >= NOW() - INTERVAL '90 days'`.
   * Aggregate to reduce rows: use `GROUP BY` with aggregate functions instead of `SELECT *`.
   * Add indexes: `CREATE INDEX idx_column ON table(column);`
3. **Check database load:** Ask your DBA if the database is under heavy load during your scheduled import time.
4. **Increase timeout** (if self-hosted): Adjust PostgreSQL's `statement_timeout` or Coupler.io's connection timeout.
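
A sketch of steps 1 and 4, assuming a hypothetical `orders` table and a `coupler_user` role standing in for whatever role Coupler.io connects with:

```sql
-- Time the query and inspect the plan
EXPLAIN ANALYZE
SELECT * FROM orders WHERE created_at >= NOW() - INTERVAL '90 days';

-- Self-hosted: raise the statement timeout just for the connecting role
ALTER ROLE coupler_user SET statement_timeout = '120s';
```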

</details>

<details>

<summary>"Authentication failed" error</summary>

Your username, password, or database name is incorrect.

**Fix:**

1. Double-check your credentials in the connection dialog.
2. Test the connection manually using psql: `psql -h hostname -U username -d database_name -p 5432`.
3. If the manual connection works, re-enter credentials in Coupler.io carefully (watch for trailing spaces).
4. If you recently changed your password, update it in Coupler.io.

</details>

## Missing data

<details>

<summary>No data appears after import, or only partial data</summary>

Your query may return no results, filter rows out too strictly, or reference tables/columns your database user can't see.

**Fix:**

1. **Test your query** directly in psql: `SELECT COUNT(*) FROM table;` to confirm data exists.
2. **Check schema and table names** — PostgreSQL folds unquoted identifiers to lowercase, while quoted names are case-sensitive. Use `\dt` in psql to list tables.
3. **Verify column access** — ensure your database user has SELECT permissions: `GRANT SELECT ON table TO username;`
4. **Review filters** in your SQL query. If you're filtering by date, ensure the date range isn't too narrow.
5. **Check row limits** — if you used LIMIT in your query, increase or remove it.
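
The checks in steps 2 and 3 can be run as queries; `orders` and `coupler_user` below are placeholders for your own table and role names:

```sql
-- Locate the table regardless of schema
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_name ILIKE 'orders';

-- Confirm the connecting role can read it
SELECT has_table_privilege('coupler_user', 'public.orders', 'SELECT');
```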

</details>

<details>

<summary>Data is outdated or hasn't refreshed</summary>

The data flow hasn't run recently, or the destination wasn't updated.

**Fix:**

1. **Check the data flow status** — click on the data flow and review the run history. Did the last run succeed or fail?
2. **Check your schedule** — is the data flow scheduled to run? If not, it only runs when you manually trigger it.
3. **Run manually** — click "Run" to trigger an immediate refresh.
4. **Check the destination** — if using Google Sheets, ensure you're viewing the correct sheet and it's not cached. Refresh the browser.

</details>

## Permission errors

<details>

<summary>"Permission denied" or "role does not have SELECT privilege" error</summary>

Your database user lacks SELECT permissions on the table or schema.

**Fix:**

1. **Grant SELECT permission** (run as a superuser or table owner):

   ```sql
   GRANT SELECT ON table_name TO your_username;
   GRANT USAGE ON SCHEMA schema_name TO your_username;
   ```
2. **For all tables in a schema:**

   ```sql
   GRANT USAGE ON SCHEMA schema_name TO your_username;
   GRANT SELECT ON ALL TABLES IN SCHEMA schema_name TO your_username;
   ```
3. Re-test the connection in Coupler.io.
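
Note that the grants above don't cover tables created later. One way to handle future tables (run as the role that creates them; names are placeholders):

```sql
ALTER DEFAULT PRIVILEGES IN SCHEMA schema_name
GRANT SELECT ON TABLES TO your_username;
```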

</details>

## Data discrepancies

<details>

<summary>Fewer rows in Coupler.io than in the database</summary>

Your query may be filtering out rows unintentionally, or you may be comparing counts from different queries.

**Fix:**

1. **Run the same query in psql** to confirm the row count: `SELECT COUNT(*) FROM table WHERE ... ;`
2. **Check for NULL values or filters** in your SQL that might exclude rows.
3. **Verify date ranges** — are you filtering by timestamp and comparing against a different timezone?
4. **Check for duplicates** — does your query have a JOIN that multiplies rows? Use DISTINCT if needed.
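
To spot rows multiplied by a one-to-many JOIN (step 4), a query along these lines can help — `orders` and `order_items` are hypothetical table names:

```sql
-- List ids that appear more than once after the JOIN
SELECT o.id, COUNT(*) AS copies
FROM orders o
JOIN order_items i ON i.order_id = o.id
GROUP BY o.id
HAVING COUNT(*) > 1;
```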

</details>

<details>

<summary>"Division by zero" or other SQL errors during import</summary>

Your SQL query has a logic error (e.g., dividing by zero, or a type cast that fails on certain rows). Note that dividing by NULL returns NULL rather than raising an error.

**Fix:**

1. **Fix the SQL query** to handle edge cases:

   ```sql
   SELECT id, CASE WHEN denominator = 0 THEN 0 ELSE numerator / denominator END as ratio
   FROM table;
   ```
2. **Use COALESCE** to replace NULL values: `COALESCE(column, 0)`.
3. **Cast types explicitly** if mixing text and numbers: `CAST(column AS INTEGER)`.
4. Re-run the data flow.
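
A more compact equivalent uses `NULLIF`, which turns a zero denominator into NULL so the division yields NULL instead of erroring (wrap the expression in `COALESCE(..., 0)` if you want `0` back):

```sql
SELECT id, numerator / NULLIF(denominator, 0) AS ratio
FROM table_name;
```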

</details>

## Rate limits and performance

<details>

<summary>"Too much data" error or memory limit exceeded</summary>

Your query returns too many rows or columns for the import to process in memory.

**Fix:**

1. **Add a WHERE clause** to filter by date, status, or category: `WHERE created_at >= NOW() - INTERVAL '90 days'`.
2. **Select only needed columns** instead of SELECT \*:

   ```sql
   SELECT id, name, email FROM users WHERE active = true;
   ```
3. **Aggregate the data** to reduce row count:

   ```sql
   SELECT DATE(created_at), COUNT(*) as count FROM events GROUP BY DATE(created_at);
   ```
4. **Split into multiple data flows** — import different date ranges or tables separately, then Append them in Coupler.io.
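
For step 4, each data flow can cover a distinct, non-overlapping date range; the dates, columns, and `events` table below are examples:

```sql
-- Flow 1: first half of the year
SELECT id, created_at, amount FROM events
WHERE created_at >= '2024-01-01' AND created_at < '2024-07-01';

-- Flow 2: second half of the year
SELECT id, created_at, amount FROM events
WHERE created_at >= '2024-07-01' AND created_at < '2025-01-01';
```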

</details>

<details>

<summary>Data flow times out or runs very slowly</summary>

Your query is inefficient, the database is overloaded, or the network is slow.

**Fix:**

1. **Add indexes** to columns used in WHERE or JOIN clauses:

   ```sql
   CREATE INDEX idx_user_id ON orders(user_id);
   CREATE INDEX idx_created_at ON orders(created_at);
   ```
2. **Reduce data volume** — filter by date, use aggregation, or remove unnecessary JOINs.
3. **Schedule imports during off-peak hours** when the database is less loaded.
4. **Profile the query** using EXPLAIN ANALYZE in psql to identify slow parts.
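
A minimal profiling session for step 4 (the query and names are examples). In the output, look for `Seq Scan` nodes on large tables and the largest `actual time` values — those are the parts an index is most likely to help:

```sql
EXPLAIN (ANALYZE, BUFFERS)
SELECT user_id, SUM(amount)
FROM orders
WHERE created_at >= NOW() - INTERVAL '30 days'
GROUP BY user_id;
```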

</details>
