
We are trying to replicate data from SQL Server to Snowflake. Is there a way to configure the Snowflake destination to make periodic updates to Snowflake rather than updating in real time? Running the pipeline in real time would mean high credit usage on the Snowflake warehouse.

StreamSets writes data to Snowflake by creating a staging file in the Snowflake staging location and then issuing a Snowflake COPY (for inserts only) or MERGE (for CDC) command for each batch. You could instead have your pipeline write to S3 as the destination, and then periodically run the COPY or MERGE command to load that data into Snowflake at whatever interval you like.
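For reference, here is a minimal sketch of the "write to S3, load on a schedule" approach using a Snowflake external stage and a scheduled task. All object names (ext_s3_stage, orders, load_orders_task, load_wh), the S3 path, and the file format are hypothetical placeholders; adjust them to match what your pipeline actually writes.

```sql
-- Hypothetical external stage over the S3 prefix your pipeline writes to
CREATE STAGE ext_s3_stage
  URL = 's3://my-bucket/streamsets-output/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"');

-- Scheduled task that bulk-loads new files once per hour instead of per batch
CREATE TASK load_orders_task
  WAREHOUSE = load_wh
  SCHEDULE = '60 MINUTE'
AS
  COPY INTO orders
  FROM @ext_s3_stage
  PURGE = TRUE;  -- remove files from the stage after a successful load

-- Tasks are created suspended; resume it to start the schedule
ALTER TASK load_orders_task RESUME;
```

Note that COPY only covers the insert-only case; for CDC data you would load the staged files into a transient table and run a MERGE into the target, which can be wrapped in the same kind of scheduled task.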

