
Our requirement is to collect all records and do some processing. We have done that using a JavaScript array. We want to write that array to the output after all batches have completed, but the sdc.output.write() function does not work in the destroy block. Is there any way to achieve this requirement, i.e., to hold back the output and write it only once all batches are finished?

hi @Anjalee 

I don't know the details of your use case, but Data Collector works on data in batches, and each batch is not aware of the previous or next batch. There are ways to keep track of all the records, but they would be workarounds. If possible, use Transformer if you want to perform actions on the whole dataset rather than on batches of data.

The destroy code block in the JavaScript Evaluator fires just before the pipeline stops. But by then the last batch of data has already been written to the destination, and the connections are possibly closed too. As a workaround, if you want to save this array with data from all the batches somewhere, you can open a database connection in the destroy block, write the data to a table, and close the connection. As I said, it is a workaround :-)
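As a rough illustration of the pattern being discussed, here is a minimal sketch of accumulating records across batches in the evaluator's state instead of writing them downstream, then flushing once at destroy time. The `sdc` object below is only a hypothetical stand-in so the pattern can run outside Data Collector; in a real pipeline, `state` is provided by the JavaScript Evaluator and persists across batches, and the destroy script would flush over its own connection (JDBC, file, etc.) rather than return the array.

```javascript
// Hypothetical stand-in for the Data Collector scripting context,
// so the accumulation pattern can run and be inspected standalone.
var sdc = {
  state: {},                       // persists across batches in a real pipeline
  records: [],                     // current batch (refilled per batch below)
  output: {
    written: [],
    write: function (record) { this.written.push(record); }
  }
};

// --- Main script (runs once per batch) ---
// Accumulate the batch's records in state instead of writing them out.
function processBatch() {
  if (!sdc.state.collected) {
    sdc.state.collected = [];
  }
  for (var i = 0; i < sdc.records.length; i++) {
    sdc.state.collected.push(sdc.records[i]);
  }
  // Deliberately no sdc.output.write() here: nothing goes downstream per batch.
}

// --- Destroy script (runs once, at pipeline stop) ---
// By this point the destinations may already be closed, so instead of
// sdc.output.write() you would open your own connection here and
// flush sdc.state.collected yourself (e.g. insert into a table).
function destroy() {
  return sdc.state.collected || [];
}

// Simulate three batches of two records each.
for (var b = 0; b < 3; b++) {
  sdc.records = [{ value: b * 2 }, { value: b * 2 + 1 }];
  processBatch();
}
var all = destroy();
console.log(all.length);                // prints 6: every record is available at destroy time
console.log(sdc.output.written.length); // prints 0: nothing was sent downstream
```

Note that this keeps the whole dataset in memory, so it only suits pipelines whose total record count comfortably fits in the JVM heap.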

 

I hope this helps.

 


Thank you, @saleempothiwala. It helped.


@Anjalee 

Excellent! Please mark it as the answer so someone else looking for a solution will see it too.

