Timeout: we could not import your data within 9 minutes
Your data flow failed with the following error:
Timeout: we could not import your data within 9 minutes. This means Coupler.io was unable to finish fetching or delivering your data within the allowed time. The good news is that in most cases this is easy to resolve, and often requires nothing more than running the data flow again.
Why does a timeout happen?
A timeout occurs when the source application's API takes longer than 9 minutes to return all the requested data, or when the destination application takes too long to record it. This is usually caused by a large volume of data being processed in a single run.
How to resolve this error?
Step 1: Simply run the data flow again
In most cases, simply re-running the data flow will fix the problem. Most Coupler.io sources are incremental. This means that when a data flow runs, it loads data in parts rather than all at once. If a timeout interrupts an incremental data flow, no data is lost — the progress is saved automatically.
What to do: Run the data flow again. The next run will not start from the beginning. Instead, it will pick up where it left off and continue fetching the remaining data within your selected time frame.
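To make the resume behavior concrete, here is a minimal sketch of how an incremental import with saved progress generally works. This is illustrative only: Coupler.io's internals are not public, and all names here (`run_import`, `fetch_page`, the cursor) are hypothetical.

```python
def run_import(fetch_page, saved_cursor=None, time_budget=3):
    """Fetch pages until done or the time budget is exhausted.

    Returns (rows, cursor): cursor is None when the import finished,
    otherwise it marks where the next run should resume.
    """
    rows, cursor, elapsed = [], saved_cursor, 0
    while True:
        page, cursor = fetch_page(cursor)   # fetch the next chunk of data
        rows.extend(page)
        elapsed += 1                        # pretend each page costs 1 time unit
        if cursor is None:                  # no more data: import complete
            return rows, None
        if elapsed >= time_budget:          # "timeout": save the cursor
            return rows, cursor             # the next run resumes from here

# A toy source with 5 pages of data:
def fetch_page(cursor):
    i = cursor or 0
    return [f"row-{i}"], (i + 1 if i + 1 < 5 else None)

rows1, cur = run_import(fetch_page)        # first run stops after 3 pages
rows2, cur = run_import(fetch_page, cur)   # re-run picks up where it left off
```

The key point mirrored here is that a timed-out incremental run loses no data: the saved cursor lets the next run continue instead of starting over.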
Step 2: Reduce the data set
If you've re-run the data flow several times and it keeps timing out at the same point, either the source needs to fetch all data at once and can't resume from where it stopped, or the source is non-incremental, meaning each run fetches the entire dataset from scratch. Either way, you need to reduce the amount of data processed in a single run.
There are two ways to do this:
Option A: Use source filters to reduce the dataset
Many sources have advanced settings on the Sources step that let you filter the data before it's fetched. For example, you can limit the date range, select specific entities, reduce the number of accounts, specify columns, or apply other criteria to shrink the volume of data fetched.
Note that filters applied on the Data sets step do not reduce the original dataset pulled from the source API, so they won't shorten the time the API takes to return the data before Coupler.io shows it in the preview on the Data sets step. Be sure to apply filters on the Sources step.

Example with HubSpot Source
You're importing your deals data from your account, and the import keeps timing out. You realize you only need a few columns, and only data from 2026.
Edit your data flow on the Sources step.
Under the Columns section, specify the columns you need.
Under the Date filters section, specify the needed time frame (you can apply the date filter by created and updated date).
Reduce your data further by applying filters under the Advanced filters section.
If the timeout error persists after applying filters, please try narrowing your data further. If that’s not possible, you can proceed with Option B below.
Option B: Split the data across multiple sources
If you need all the data and can't apply filters, split your data into smaller parts by creating multiple sources within the same data flow. Each source has its own separate timeout allowance, so smaller chunks will process faster.
Example with Google Sheets
Imagine you're importing data from a Google Sheet with a range of A1:Z20000, and the import keeps timing out.
Duplicate your source within the data flow by clicking the "duplicate" button next to your existing source.
Adjust the data ranges so each source handles a smaller portion of the spreadsheet:
Source 1: set the range to A1:Z10000
Source 2: set the range to A10001:Z20000
Combine the sources on Step 2 — Data Sets using the Append mode. This will merge the data from both sources back into one unified dataset.
Run the data flow. Each source will now fetch a smaller portion of data, staying within the 9-minute limit.
If two sources aren't enough (you're still hitting timeouts), split the data into three or more sources.
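If you need to split a large range into several even parts, the arithmetic is easy to get wrong by hand. Here is a small sketch that computes non-overlapping A1-style sub-ranges; the `split_range` helper is illustrative, not a Coupler.io feature.

```python
def split_range(first_row, last_row, last_col, parts):
    """Return A1-style sub-ranges covering rows first_row..last_row,
    split into roughly equal, non-overlapping chunks."""
    total = last_row - first_row + 1
    size = -(-total // parts)               # ceiling division: rows per chunk
    ranges = []
    start = first_row
    while start <= last_row:
        end = min(start + size - 1, last_row)
        ranges.append(f"A{start}:{last_col}{end}")
        start = end + 1                     # next chunk begins right after
    return ranges

print(split_range(1, 20000, "Z", 2))   # ['A1:Z10000', 'A10001:Z20000']
```

Each computed range becomes one source in the data flow, and the sources are then recombined with Append mode as described above.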
Additional tips
If the timeout is caused by a heavy spreadsheet rather than a large dataset, consider the following:
Remove unnecessary data, unused tabs, or excessive formatting from the source spreadsheet.
Replace complex formulas with static values where possible — formulas can cause the spreadsheet to hang when Coupler.io reads from it.
If the spreadsheet is still too heavy, move your data to a cleaner file and point the source there.
If the timeout happens on the destination side
In some cases, the timeout occurs not when fetching the data but when writing it to the destination. This can happen when the destination spreadsheet is large, contains many formulas, or is slow to respond.
What to do:
Clean up the destination file by removing unnecessary data, tabs, or formulas.
Try switching to a fresh, empty destination spreadsheet.
If the dataset is very large, consider splitting your data flow into multiple flows with smaller outputs, and chain them using automatic schedules or webhooks.
Still experiencing timeouts after trying these steps? Contact our support team, and we'll help you troubleshoot further.