
Hi All

 

I am reading a CSV file created by an instrument. The file has header information, then a collection of fields with values for each element tested for. I want to write each element value set to JDBC as a single row.

Example:

machine_id, date, time, run Number, element, element_value, element_error, element, element_value, element_error, … (50 of these sets)

I need to write machine_id, date, time, run Number, element, element_value, element_error to the database for each set of element, value, and error. I can't figure out how to loop through the record.

So from the one CSV record I need to write 50 JDBC records. Is this possible in StreamSets?
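
For example, with made-up values, a single input record like

MACH01, 2020-01-15, 09:30, 42, Fe, 1.23, 0.01, Cu, 0.45, 0.02, ...

would need to become one JDBC row per element:

MACH01, 2020-01-15, 09:30, 42, Fe, 1.23, 0.01
MACH01, 2020-01-15, 09:30, 42, Cu, 0.45, 0.02
...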

Thanks

Hi Gavin,

 

What version of SDC are you running? As a foreword, it doesn't matter which version, as this should work on them all; it's just that I have a demo pipeline that I think will do what you are after, and if I know the version I'll make sure it's rendered in an appropriate version for you.

 

My suggestion is to import the data into StreamSets without its header. We can then use the field mapping function to process the groups of three columns into a nested record, and pivot that out into fields.
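
If it helps to see the reshaping outside of StreamSets first, here is a minimal plain-Python sketch of the same idea, with a made-up column layout (four fixed columns followed by repeating element/value/error triples) and a made-up file name; the pipeline does the equivalent with processor stages rather than a script:

```python
import csv

# Assumed layout: four fixed columns followed by repeating
# (element, element_value, element_error) triples. Any instrument
# header lines are assumed to have been skipped already.
FIXED_COLS = 4  # machine_id, date, time, run Number

def pivot_row(row):
    """Yield one output row per element/value/error triple."""
    fixed = row[:FIXED_COLS]
    triples = row[FIXED_COLS:]
    for i in range(0, len(triples) - 2, 3):
        element, value, error = triples[i:i + 3]
        yield fixed + [element, value, error]

with open("instrument.csv", newline="") as src:  # hypothetical file name
    for row in csv.reader(src):
        for out_row in pivot_row(row):
            # In the pipeline, each of these becomes one JDBC insert.
            print(out_row)
```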

Let me know and I'll upload a pipeline that should give you a nudge in the right direction.

 

Kind Regards

 

Anth McMullen
Solutions Engineer
Streamsets


Hi, thanks for that. We are using StreamSets Data Collector 3.11.0. If you could provide an example, that would be great.

 

Gavin


Please find attached.
I have included the test data I invented for this too, as that should allow it to be tested out of the box; then you can adjust it for your use case.

 


Thanks for that.

I'll give it a try. Your help is greatly appreciated.

 

Gavin

