How easy would it be to create a custom Scala or PySpark stage that can output an array of Spark DataFrames (like it can receive as input) rather than just one?

@collid 

In this case, a custom function (UDF) will help you solve the issue.

Write the custom function in Scala, package it as a JAR, and deploy the JAR in StreamSets Transformer. Then, in the Scala processor, import the UDF to use it.
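A minimal sketch of that approach, assuming a hypothetical `MyUdfs` object and illustrative column names (the JAR containing the object would need to be added to Transformer's external libraries first; none of these names are Transformer APIs):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, udf}

// Built into its own JAR and deployed to Transformer.
// `MyUdfs` and the column names below are illustrative.
object MyUdfs {
  // A simple null-safe UDF that upper-cases a string column.
  val toUpper = udf((s: String) => Option(s).map(_.toUpperCase).orNull)
}

// Then, inside the Scala processor, import the UDF and apply it
// to the incoming DataFrame, e.g.:
//
//   import MyUdfs.toUpper
//   val result: DataFrame = input.withColumn("name_upper", toUpper(col("name")))
```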


Thanks for the response!

 

Could you possibly share an example? I'm a little unsure what you mean exactly.

 

Thanks
