Data Overview

PostgreSQL data flows pull data directly from your database using either table selection or custom SQL queries. The data structure depends on what you're importing.

Import modes

| Mode | Use case | Example |
| --- | --- | --- |
| Table or view | Import all or most data from a single table or materialized view. | Pull the `customers` table as-is. |
| Custom SQL | Write a query to filter, join, aggregate, or transform data. | `SELECT customer_id, COUNT(*) AS order_count FROM orders GROUP BY customer_id` |

Available columns and data types

When you import from PostgreSQL, all columns and their data types come through as-is. Common PostgreSQL data types include:

Numeric types

| Type | Description |
| --- | --- |
| INTEGER | Whole numbers |
| BIGINT | Large whole numbers |
| DECIMAL / NUMERIC | Precise decimal numbers |
| FLOAT / DOUBLE PRECISION | Floating-point numbers |

Text types

| Type | Description |
| --- | --- |
| VARCHAR / TEXT | Variable-length text |
| CHAR | Fixed-length text |

Date and time types

| Type | Description |
| --- | --- |
| DATE | Calendar date |
| TIME | Time of day |
| TIMESTAMP / TIMESTAMPTZ | Date and time (without or with time zone, respectively) |

Other types

| Type | Description |
| --- | --- |
| BOOLEAN | True/false values |
| UUID | Unique identifiers |
| JSON / JSONB | JSON objects (imported as text) |
| ARRAY | Arrays (imported as text) |

Common query patterns

Here are useful SQL queries you can paste into the Custom SQL field:

Filter by date range:
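For example, to pull only recent rows (the `orders` table and `created_at` column here are illustrative; substitute your own names):

```sql
-- Rows from the last 30 days
SELECT *
FROM orders
WHERE created_at >= CURRENT_DATE - INTERVAL '30 days';
```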

Aggregate data:
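A sketch of a monthly rollup, assuming an `orders` table with `amount`, `status`, and `created_at` columns (names are illustrative):

```sql
-- Monthly order count and revenue per status
SELECT
  date_trunc('month', created_at) AS month,
  status,
  COUNT(*)    AS order_count,
  SUM(amount) AS total_revenue
FROM orders
GROUP BY 1, 2
ORDER BY 1;
```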

Join multiple tables:
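For instance, enriching each order with customer details, assuming `orders` and `customers` tables that share a `customer_id` key (names are illustrative):

```sql
-- One row per order, with the matching customer's name and email
SELECT
  o.order_id,
  o.amount,
  c.name,
  c.email
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id;
```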

Use cases by role

Analysts use PostgreSQL data flows to build custom datasets for analysis. Write SQL queries to join tables, aggregate metrics, and filter to specific segments. Export to BigQuery for large-scale analysis or to Google Sheets for quick exploration. Use the Join transformation to combine PostgreSQL data with data from other sources (e.g., marketing platforms).

Platform-specific notes

  • IP whitelisting required — Coupler.io connects from fixed IP addresses. You must whitelist these in your PostgreSQL firewall or security group (AWS RDS, Azure Database, etc.).

  • Read-only access — Your database user only needs SELECT permissions; no writes or schema changes.

  • Large result sets — If your query returns millions of rows, consider filtering by date, aggregating, or using LIMIT to avoid timeouts.

  • Connection pooling — PostgreSQL caps concurrent connections (globally via max_connections, and optionally per role via CONNECTION LIMIT). If you run many data flows, you may hit this cap; consider a connection pooler (e.g., PgBouncer) or increasing max_connections.

  • Custom SQL performance — Complex queries with multiple joins or aggregations may take longer. Test in psql or your query tool first.

  • JSON/ARRAY types — These are imported as text strings. Parse them in your destination (Google Sheets formulas, BigQuery JSON functions, etc.) if needed.
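As a sketch of the last point: if a JSONB column (here called `metadata`, with a `plan` key — both names are illustrative) lands in BigQuery as a text string, you can extract fields with BigQuery's JSON functions:

```sql
-- BigQuery: pull a key out of a JSON string imported from PostgreSQL
SELECT
  JSON_VALUE(metadata, '$.plan') AS plan
FROM my_dataset.customers;
```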
