Solved

Reading SDC Error File from S3 Bucket


I have a pipeline that reads Kafka records and writes any errors to an S3 bucket. Most of the errors are related to tombstone records. Once my original pipeline runs, I plan on having another pipeline process the error files on S3. So far so good. I can get the error code from the SDC file via record:errorCode(). I also need the Kafka key from the S3 file. What is the highlighted record/field called, and how do I obtain its value with an Expression Evaluator or other processor? Once I convert the record to a map, that value is gone.
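
To illustrate what I'm after, here is a rough Jython Evaluator sketch that just logs the header attributes of each record read back from the SDC error file and copies the one I think might hold the Kafka key into a regular field. The attribute name kafkaMessageKey is only a guess on my part, and I haven't run this:

# Jython Evaluator sketch (untested): log every header attribute on the
# records read back from the SDC error file, and copy the attribute that
# is assumed to hold the Kafka key into a field so it survives a later
# map conversion. 'kafkaMessageKey' is an assumed attribute name.
for record in records:
    try:
        # record.attributes holds the record header attributes
        log.info('header attributes: {}', record.attributes)

        key = record.attributes.get('kafkaMessageKey')
        if key is not None:
            record.value['kafkaKey'] = key

        output.write(record)
    except Exception as e:
        error.write(record, str(e))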

 


Best answer by pkandra 8 March 2023, 21:46


7 replies


@pkandra 

Can I get the error file with no sensitive data in it, so I can try it from my end?

@Bikram attached is a sample S3 error file in SDC format with 50 records.


@pkandra 

Kindly check if the below code snippet works for you. The expression will only work for this specific case, not for others, because it relies on the str:length() function with some hard-coded values.

We can handle it more generally by using one of the scripting processors (Jython, Groovy, etc.).

Please let me know if it helps.

Result = ${str:substring(record:value('/text'), str:length(record:value('/text')) - 63, str:length(record:value('/text')) - 27)}
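
To spell out what that expression does, here is a rough Python equivalent. The sample line below is made up, and the 36-character length is just what the hard-coded 63 and 27 offsets imply, so treat it as an illustration only:

# Rough Python equivalent of the EL expression above (illustration only).
# It keeps the 36 characters that sit 27 characters before the end of the
# /text line; 36 happens to be the length of a UUID-style key.
line = 'x' * 100 + '0e1c3f9a-1111-2222-3333-444455556666' + 'y' * 27

start = len(line) - 63   # 63 = 36 (assumed key length) + 27 (trailing characters)
end = len(line) - 27     # drop the trailing 27 characters

print(line[start:end])   # prints the 36-character key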


Thanks & Regards

Bikram_

Hi @Bikram. That doesn't work for me. I get the attached error. My error file is in SDC format, but yours looks like a map.



@pkandra 

 

I just kept the error file in the S3 bucket, fetched all the data from S3 with the data format set to Text, and used an Expression Evaluator to get the above result.

Can you please share your pipeline with no sensitive data in it, so I can check it and try to help you?

 

Thanks & Regards

Bikram_

No worries @Bikram. I was able to get what I needed using an Expression Evaluator and the below.



@pkandra 

Nice to know that you managed to fix the issue. Please mark it as resolved.

 

Thanks & Regards

Bikram_
