Question

How to get the exact failure reason for a failed job/pipeline


Hi team,

I defined a pipeline with a processor called "Field Remover". The job failed, and all input rows are shown as error rows. However, when I check the log (log severity set to ALL), I cannot see any error or error reason; the only thing I see is a warning: "Exception thrown while trying to remove temporary directory /opt/streamsets-datacollector/tmp/sdc1627444042246270257". How can I get the exact failure reason?

Some context on the pipeline: it loads data from an Oracle origin into Snowflake internal tables. It ran well for several days, but the same workflow failed today. If I remove the Field Remover processor, the pipeline runs successfully. Any suggestions for troubleshooting this?
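In case it helps, this is roughly how I'm checking the log outside the UI; the log path is just the default location for my install and the keywords are my guesses, and the temp-directory warning is still the only thing that shows up:

```python
# Rough log scan; the log path is an assumption (default install location)
# and the keywords are just guesses at what a stage error might contain.
from pathlib import Path

LOG_FILE = Path("/opt/streamsets-datacollector/log/sdc.log")   # assumed path
KEYWORDS = ("ERROR", "StageException", "Field Remover", "FieldRemover")

with LOG_FILE.open(errors="replace") as log:
    for line_number, line in enumerate(log, start=1):
        if any(keyword in line for keyword in KEYWORDS):
            print(f"{line_number}: {line.rstrip()}")
```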


2 replies


@Dolphin ,

By any chance, can you run a preview on that pipeline with the Field Remover in place?

That is one of the ways to see the error.
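If you'd rather script it than click Run Preview in the UI, something along the lines below may also work. The REST endpoint paths, port, credentials, preview parameters, and response keys are assumptions for a typical Data Collector install, so compare them against Help > RESTful API in your own SDC before relying on them:

```python
# Rough sketch, not a verified recipe: endpoints, port, credentials, and
# response keys are assumptions -- check your SDC's RESTful API reference.
import time
import requests

SDC_URL = "http://localhost:18630"                  # assumed default SDC port
AUTH = ("admin", "admin")                           # assumed default credentials
PIPELINE_ID = "OracleToSnowflakeWithFieldRemover"   # hypothetical pipeline ID
HEADERS = {"X-Requested-By": "sdc"}                 # SDC expects this header on POSTs

# Start a preview (reads a small batch; nothing is written to the destination).
start = requests.post(
    f"{SDC_URL}/rest/v1/pipeline/{PIPELINE_ID}/preview",
    params={"batchSize": 10, "batches": 1},
    json=[],                                        # assumed: empty stage-output overrides
    auth=AUTH,
    headers=HEADERS,
)
start.raise_for_status()
previewer_id = start.json()["previewerId"]          # assumed response key

# Poll until the preview stops running.
while True:
    status = requests.get(
        f"{SDC_URL}/rest/v1/pipeline/{PIPELINE_ID}/preview/{previewer_id}/status",
        auth=AUTH,
    ).json().get("status")
    if status not in ("CREATED", "STARTING", "RUNNING", "VALIDATING"):
        break
    time.sleep(1)

# Dump whatever status/error information the preview reports.
result = requests.get(
    f"{SDC_URL}/rest/v1/pipeline/{PIPELINE_ID}/preview/{previewer_id}",
    auth=AUTH,
).json()
print("preview status:", result.get("status"))
print("issues:", result.get("issues"))
```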


@Dolphin 

The Field Remover will not show a detailed error in the log if something in its configuration is missing or wrong.

The failure of your job is most likely caused by an issue in the Field Remover processor. Please review its configuration: check exactly which fields you want to remove (or keep) and make sure the field paths are correct.

The Field Remover only operates on record fields, not on directories, so the temporary-directory warning in the log is unrelated. Kindly correct the processor and rerun the pipeline; I hope that resolves it for you.
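To make the "fields only" point concrete, here is a small conceptual sketch in plain Python (not SDC's actual implementation; the record and field names are made up) of what removing listed field paths from a record means:

```python
# Conceptual illustration only -- NOT StreamSets' implementation, just what
# "remove the listed fields from each record" amounts to.
def remove_fields(record: dict, field_paths: list[str]) -> dict:
    """Return a copy of the record with the listed top-level fields removed."""
    to_remove = {path.lstrip("/") for path in field_paths}
    return {name: value for name, value in record.items() if name not in to_remove}

# Hypothetical record and field path, named only for illustration.
record = {"ID": 1, "NAME": "Alice", "SSN": "123-45-6789"}
print(remove_fields(record, ["/SSN"]))   # {'ID': 1, 'NAME': 'Alice'}
```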

 

 
