Solved

Transformer

  • 27 April 2023
  • 6 replies
  • 124 views

In Transformer, I am sending data from a File origin (directory) to a File destination (local FS),

but the destination fails with the following error:

JOBRUNNER_63 - Pipeline status: 'RUN_ERROR', message: 'Operator File_3 failed due to org.apache.spark.sql.AnalysisException, check Driver Logs for further information'


 

How can I clear this error? Any help resolving it is appreciated.

Thanks 

Tamilarasu

 


Best answer by Bikram 30 April 2023, 21:21


6 replies


@tamilarasup

Generally this error comes up when the input fields are not mapped correctly.

 

I would recommend running Preview to see where the issue is.

@saleempothiwala Preview is working fine; the pipeline runs up to just before the destination. If I use a Trash destination, it works,

but when I use a File destination, it does not.

 


@tamilarasup 

Kindly check the driver logs; they will tell you the reason for the failure.

Can you please attach the error logs so we can check the reason for the failure?

 

Thanks & Regards

Bikram_


@tamilarasup 

Please check the fields in preview just before the destination.


Thank you @Bikram and @saleempothiwala .

 

Now it's working.

In the same scenario, the destination is now Delta Lake, with storage on AWS or ADLS Gen2. In the cluster configuration I am using Databricks, and I got this error:

The Transformer Spark application in the Spark cluster might not be able to reach Transformer at ‘http://661f1312d1e9:19630’.

How do I connect, or how can I solve this?
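This message usually means the Spark driver running on the Databricks cluster cannot resolve or reach the callback URL that Transformer advertises (here it is advertising a Docker container hostname, 661f1312d1e9, which is meaningless outside the container). A common fix is to configure Transformer to advertise a URL that the cluster nodes can actually reach. A minimal sketch of the relevant setting in Transformer's transformer.properties file, assuming your Transformer host is reachable from Databricks as transformer.example.com (a placeholder; substitute your real host or IP and protocol):

```properties
# transformer.properties (in the Transformer installation's etc directory)
#
# Advertise a base URL that the Spark cluster nodes can resolve and reach,
# instead of the default container/host name.
# "transformer.example.com" is a placeholder - use your reachable host or IP.
transformer.base.http.url=http://transformer.example.com:19630
```

Restart Transformer after changing the property, then rerun the pipeline. For Databricks the host must be reachable from the cloud-side cluster (e.g. via a public address, VPN, or peered VPC/VNet), not just from your local network.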
