PostgreSQL
PostgreSQL is a powerful, open-source relational database. With Coupler.io, you can automatically export data from any PostgreSQL table, view, or custom SQL query to Google Sheets, Excel, BigQuery, Looker Studio, or AI destinations like Claude and ChatGPT — no code required.
Why connect PostgreSQL to Coupler.io?
Automate reporting — Stop manually exporting data. Set up a data flow once and refresh on a schedule.
Query flexibility — Pull data from tables, views, or write custom SQL to get exactly what you need.
Multiple destinations — Send PostgreSQL data to spreadsheets, data warehouses, BI tools, or AI models.
Real-time insights — Keep dashboards and reports up-to-date without manual work.
Prerequisites
Before you start, you'll need:
PostgreSQL database access — Host, port, database name, username, and password.
Network access — Your PostgreSQL instance must accept connections from Coupler.io's IP addresses (we'll provide these during setup).
Read permissions — Your database user needs SELECT access to the tables or views you want to import.
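A dedicated read-only user is a safe way to meet this requirement. Here's a minimal sketch — the database name (mydb), schema (public), and user name (coupler_reader) are placeholders to replace with your own:

```sql
-- Create a dedicated read-only user for Coupler.io (names are placeholders)
CREATE USER coupler_reader WITH PASSWORD 'choose-a-strong-password';

-- Allow the user to see the schema and read every existing table in it
GRANT USAGE ON SCHEMA public TO coupler_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO coupler_reader;

-- Optionally, also cover tables created in the future
ALTER DEFAULT PRIVILEGES IN SCHEMA public
    GRANT SELECT ON TABLES TO coupler_reader;
```

If you prefer tighter scoping, grant SELECT only on the specific tables or views you plan to import instead of the whole schema.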
Quick start
If your PostgreSQL database is behind a firewall, you'll need to whitelist Coupler.io's IP addresses before connecting. We'll show you which IPs to allow during the setup process.
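If you run the server yourself, whitelisting typically means adding entries to pg_hba.conf (and opening port 5432 in your firewall). A sketch, using placeholder names and the documentation IP range 203.0.113.0/24 — substitute the actual Coupler.io IPs shown during setup:

```
# pg_hba.conf — allow Coupler.io to connect to mydb as coupler_reader
# TYPE  DATABASE  USER            ADDRESS           METHOD
host    mydb      coupler_reader  203.0.113.0/24    scram-sha-256
```

After editing pg_hba.conf, reload the configuration (for example, by running SELECT pg_reload_conf(); as a superuser). On managed services such as Amazon RDS or Cloud SQL, you'd allow the same IPs in the provider's security group or authorized-networks settings instead.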
How to connect
Create a new data flow. Click "Create" and choose PostgreSQL as your source. Enter your database connection details:
Host (e.g., db.example.com)
Port (usually 5432)
Database name
Username
Password
Coupler.io will test the connection and show you our IP addresses that need to be whitelisted in your database firewall (if applicable).
Choose your report type. Select "Table or view" to pick an existing table or view, or "Custom SQL" to write a query that pulls the exact data you need.
Pick your schema and table (if using Table or view mode). If you're using Custom SQL, enter your SQL query — for example: SELECT id, name, created_at FROM users WHERE status = 'active'.
Select your destination. Choose where your data goes: Google Sheets, Excel, BigQuery, Looker Studio, or an AI destination like Claude or ChatGPT.
Run the data flow. Click "Run" to test the connection and import your data. Once it completes successfully, your data is ready in the destination.
After a successful manual run, you can schedule the data flow to refresh automatically (daily, weekly, or on a custom schedule).
What you can import
Table or view
Select any table or view from your PostgreSQL database by schema and name.
Custom SQL
Write a SQL query to filter, join, aggregate, or transform data before import.
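For instance, a query that joins and aggregates before import might look like this — all table and column names here are illustrative:

```sql
-- Monthly revenue per customer (table and column names are illustrative)
SELECT
    c.name                          AS customer,
    date_trunc('month', o.created_at) AS month,
    SUM(o.total)                    AS revenue
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE o.status = 'paid'
GROUP BY c.name, date_trunc('month', o.created_at)
ORDER BY month, customer;
```

Shaping the data in SQL like this keeps the destination sheet or dashboard small and avoids re-aggregating in the reporting tool.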