Question

Simple JDBC Query Consumer to Local FS

  • 30 April 2024
  • 1 reply

Hello, 

 

I am creating the simplest possible pipeline: a query (JDBC Query Consumer) whose output is written to files with a Local FS destination. I have configured the destination to roll files at 200k records each, but this job never stops running and keeps writing file after file of what I assume is duplicate data.

I cannot for the life of me figure out what I did wrong.  Any insight would be greatly appreciated!

I have created complex pipelines in the past, so this is strange.

1 reply


@lizzie 

Can you please try the configuration below (shown in the attached snapshot)?

In the JDBC Query Consumer, uncheck Incremental Mode, and attach a Pipeline Finisher executor to the origin's event stream with the precondition ${record:eventType() == 'no-more-data'}. In full query mode the origin re-runs the query after each query interval, which is where the duplicate files come from; the Pipeline Finisher stops the pipeline as soon as the no-more-data event arrives.
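Roughly, the relevant settings would look like the sketch below. This is only an outline: the SQL query is a placeholder, and property labels may vary slightly between Data Collector versions. You will also need Produce Events enabled on the origin so the event stream exists.

JDBC Query Consumer (origin)
    SQL Query           = SELECT * FROM source_table    (placeholder query)
    Incremental Mode    = unchecked
    Produce Events      = checked

Pipeline Finisher Executor (attached to the origin's event stream)
    Precondition        = ${record:eventType() == 'no-more-data'}

Local FS (destination)
    Max Records in File = 200000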

 

Please let me know if this helps.

 
