BigQuery

BigQuery is Google's fully managed, serverless data warehouse that lets you analyze massive datasets using standard SQL. With Coupler.io, you can run custom SQL queries against your BigQuery datasets and automatically push results to Google Sheets, Excel, BigQuery itself, or AI destinations like Claude and ChatGPT.

Why connect BigQuery to Coupler.io?

  • Automate reporting — Run scheduled queries and sync results directly to Google Sheets without manual downloads

  • Custom SQL queries — Write any SQL query to extract, transform, and filter exactly the data you need

  • Real-time insights — Keep dashboards and reports fresh with automated data flows

  • Multiple destinations — Send results to Google Sheets, Excel, BigQuery, Looker Studio, or AI platforms like ChatGPT and Gemini

Prerequisites

Before connecting BigQuery, you'll need:

  • A Google Cloud Platform (GCP) project with BigQuery enabled

  • A Service Account with the following roles assigned:

    • BigQuery Data Viewer — to read data from datasets

    • BigQuery Job User — to run queries

  • A JSON key file for your Service Account (downloaded from GCP)

  • Access to at least one BigQuery dataset
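If you haven't set up a Service Account yet, the steps can be sketched with the gcloud CLI. This is a setup sketch, not part of Coupler.io itself; the project ID and account name below are placeholders to replace with your own:

```shell
# Create a Service Account (the name "coupler-reader" is a placeholder)
gcloud iam service-accounts create coupler-reader \
  --project=my-project-id \
  --display-name="Coupler.io reader"

# Grant the two roles listed above
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:coupler-reader@my-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding my-project-id \
  --member="serviceAccount:coupler-reader@my-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Download the JSON key file that you will upload to Coupler.io
gcloud iam service-accounts keys create key.json \
  --iam-account=coupler-reader@my-project-id.iam.gserviceaccount.com
```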

Quick start

How to connect

1. Create a new data flow. In Coupler.io, click "Create" and select BigQuery as your source.

2. Upload your Service Account JSON key. Click "Connect" and select your .json key file from GCP. This grants Coupler.io permission to run queries on your behalf. If you don't have a key yet, follow the GCP setup guide to create one.

3. Enter your SQL query. In the "SQL query" field, write a standard SQL query to fetch the data you need. For example: SELECT order_id, customer_name, order_date FROM your_dataset.orders WHERE order_date >= CURRENT_DATE()
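Any query that BigQuery's standard SQL dialect can run works here, including joins and aggregation. As a sketch (the dataset, table, and column names are placeholders for your own schema), a query that summarizes revenue per customer over the last 30 days might look like:

```sql
-- Placeholder dataset/table/column names: replace with your own.
SELECT
  c.customer_name,
  COUNT(o.order_id) AS orders,
  SUM(o.order_total) AS revenue
FROM your_dataset.orders AS o
JOIN your_dataset.customers AS c
  ON o.customer_id = c.customer_id
WHERE o.order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY c.customer_name
ORDER BY revenue DESC;
```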

4. Choose your destination. Select where you want the data to go — Google Sheets, Excel, another BigQuery table, or an AI destination like Claude or ChatGPT.

5. Run the data flow. Click "Run" to execute your query and send the results to your destination. A successful manual run is required before you can schedule the data flow.

Common use cases

  • Weekly reporting — Query sales data from the last 7 days and push to a Google Sheet for team review

  • Data transformation — Use SQL to aggregate metrics, join multiple tables, and clean data before exporting

  • Real-time dashboards — Feed filtered BigQuery results into Looker Studio or Google Sheets for live dashboards

  • Data backup — Export BigQuery tables to Google Sheets for archival or sharing with non-technical users

  • AI-powered analysis — Send query results to ChatGPT or Claude for automated insights and summaries
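As an illustration of the weekly-reporting case, a query like the following could feed the data flow (table and column names are placeholders for your own schema):

```sql
-- Sales from the last 7 days, grouped by day (placeholder names)
SELECT
  order_date,
  COUNT(order_id) AS orders,
  SUM(order_total) AS revenue
FROM your_dataset.orders
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY order_date
ORDER BY order_date;
```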
