
How to parameterize your Databricks Spark cluster configuration through a runtime resource?

  • December 22, 2021

AkshayJadhav
StreamSets Employee

Question:

How do you parameterize your Databricks Spark cluster configuration at runtime?

Cluster Manager Type: Databricks

Answer: 

We can leverage the runtime:loadResource EL function to load the cluster configuration from a runtime resource file.

Step 1: Create a resource file containing the cluster configuration JSON in the Data Collector resources directory ($SDC_RESOURCES):

$ cat test
{
    "num_workers": 6,
    "spark_version": "5.3.x-scala2.11",
    "node_type_id": "i3.xlarge"
}
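Before wiring the file into the pipeline, it is worth checking that it is valid JSON, since a malformed resource will only surface as an error at pipeline start. A minimal sketch, assuming a Linux shell with python3 available and an illustrative resources path (adjust $SDC_RESOURCES for your installation):

```shell
# Illustrative location; $SDC_RESOURCES normally points at the
# Data Collector's "resources" directory.
SDC_RESOURCES="${SDC_RESOURCES:-$(mktemp -d)}"

# Write the cluster configuration resource file named "test".
cat > "$SDC_RESOURCES/test" <<'EOF'
{
    "num_workers": 6,
    "spark_version": "5.3.x-scala2.11",
    "node_type_id": "i3.xlarge"
}
EOF

# Fail fast if the file is not valid JSON.
python3 -m json.tool "$SDC_RESOURCES/test" > /dev/null && echo "valid JSON"
```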

Step 2: In the pipeline, under the Cluster Config section, call the resource with the following EL:

${runtime:loadResource('test', false)}
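The second argument to runtime:loadResource controls whether the file must have restricted permissions: with false, any readable file in the resources directory works; with true, the file must be accessible only by its owner. If you choose the restricted mode, tighten the permissions first; a sketch, using an illustrative path:

```shell
# Illustrative resources directory; adjust for your installation.
SDC_RESOURCES="${SDC_RESOURCES:-$(mktemp -d)}"
printf '{"num_workers": 6}\n' > "$SDC_RESOURCES/test"

# Owner-only read/write (600), required when loadResource's
# second argument is true.
chmod 600 "$SDC_RESOURCES/test"
stat -c '%a' "$SDC_RESOURCES/test"
```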

