How easy would it be to create a custom Scala or PySpark stage that can output an array of Spark DataFrames (like it can receive as input) rather than just one?
In this case a custom function will help you address this.
Write the custom function in Scala, package it as a JAR, and deploy the JAR in StreamSets Transformer. Then, in the Scala processor, import the UDF so you can use the function.
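To give a rough idea of what that looks like, here is a minimal sketch. The package and names below (`com.example.udfs`, `MyUdfs`, `normalizeName`) are made up for illustration, and the `inputs`/`output` variables are an assumption about the Scala processor's bindings, so check the Transformer documentation for your version.

```scala
// MyUdfs.scala -- compiled into a JAR that you deploy with Transformer.
// All names here are illustrative, not part of any StreamSets API.
package com.example.udfs

import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

object MyUdfs {
  // A plain Spark UDF that trims and lower-cases a string column.
  val normalizeName: UserDefinedFunction = udf((s: String) =>
    if (s == null) null else s.trim.toLowerCase
  )
}
```

```scala
// Inside the Transformer Scala processor (sketch): `inputs` and `output`
// are assumed to be the processor's bindings for the incoming DataFrames
// and the outgoing one -- verify against your Transformer version's docs.
import com.example.udfs.MyUdfs

val df = inputs(0) // first incoming DataFrame
output = df.withColumn("name", MyUdfs.normalizeName(df("name")))
```

How the JAR ends up on the pipeline's classpath depends on your setup (Transformer's external library mechanism or Spark's extra JAR settings), so treat the deployment step as described in the reply above rather than this sketch.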
Thanks for the response!
Could you possibly share an example, as I'm a little unsure what you mean exactly?
Thanks