
Hi there,

We are migrating to the DataOps Platform.

We run a Self-Managed Environment.

We use connection objects for our pipelines, and our credential store is AWS.

Our structure is such that ONLY the Operations team can deploy pipelines to the PROD deployment / engines.

 

We are new to the DataOps Platform, and we noticed that we could NOT change a connection to point to PROD sources at the time of creating a job instance from an already tested pipeline.

 

Our Operations team had to create a new version of the pipeline pointing to the PROD connections, and only then could they create a job instance and deploy it. Note - the PROD connections are accessible ONLY to our Operations team.

 

It would have been nice if we had the option to choose the values for the connection objects at the point of job instance creation.

 

Please advise if there are any other alternatives.

 

P.S. I understand I can always switch to runtime parameters to meet my requirements, but then I would have to completely ignore the concept of connection objects.

 

Thanks,

 

 

Hi @Srinivasan Sankar:

 

You can use the same connection in multiple jobs. This can be achieved simply by using parameters.

While creating a connection, use parameters that are defined in the pipelines.

You can then override those parameters while creating the jobs. Hope it helps.
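
To make this concrete, here is a rough sketch of the job-side override using the StreamSets SDK for Python. It assumes the pipeline's stage connection field references a runtime parameter such as ${JDBC_CONNECTION_ID}; the pipeline name, job name, parameter name, and connection ID below are placeholders, and the exact SDK calls may vary by SDK version:

```python
# Rough sketch only - assumes the StreamSets SDK for Python and a pipeline whose
# stage connection field is set to a runtime parameter such as ${JDBC_CONNECTION_ID}.
# All names and IDs below are hypothetical placeholders.
from streamsets.sdk import ControlHub

sch = ControlHub(credential_id='<credential id>', token='<token>')

# The pipeline was built and tested against non-PROD connections.
pipeline = sch.pipelines.get(name='orders_ingest')

# Build a job instance from the already tested pipeline.
job_builder = sch.get_job_builder()
job = job_builder.build('orders_ingest - PROD', pipeline=pipeline)
sch.add_job(job)

# Override the runtime parameter so this job instance resolves the connection
# parameter to the PROD connection ID (visible only to the Operations team).
job.runtime_parameters = {'JDBC_CONNECTION_ID': '<PROD connection id>'}
sch.update_job(job)
```

The same override can also be made in the job's runtime parameters when creating the job in the UI, without any code.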


@Srinivasan Sankar, were @wilsonshamim's suggestions helpful?


@wilsonshamim, @Drew Kreiger - the suggestion really helped.

 

I’ve tested one pipeline with a parameterized connection object, and it worked as expected. I have yet to fully test our security model. This will definitely reduce the effort for our Ops Support team. Thanks.


Great to hear, @Srinivasan Sankar

