Question

Metadata driven ingestion framework

  • 10 October 2023
  • 5 replies
  • 84 views

Hi

I'm currently working on implementing a metadata-driven pipeline. One of its key components is a job responsible for transferring data from Teradata to Snowflake. To make this process dynamic and reusable, I've introduced a job that loads the data into Snowflake.

Furthermore, I've established a separate pipeline that fetches metadata from Teradata. This metadata contains important information, including the values for ${dbname} and ${Tname}. The intention is to pass these parameter values to the "startjobs" process. However, I'm currently unsure how these parameter values will actually be used within the job I've created for the data transfer from Teradata to Snowflake. I would greatly appreciate any guidance or assistance in this regard.
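
To make the question concrete, I imagine the job's pipeline would reference them roughly along these lines (purely illustrative, assuming dbname and Tname are declared as pipeline parameters and the job uses a JDBC origin for Teradata):

Teradata JDBC origin, SQL query:
    SELECT * FROM ${dbname}.${Tname}

Snowflake destination, table:
    ${Tname}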

5 replies

Userlevel 5
Badge +1

@uday8770 

Please attempt using the provided pipeline. If it doesn't address your specific use case, please share details about test pipelines that you've designed so that I can set up those pipelines in my system and investigate the issue further.

The data source retrieves information about database names and table names from a metadata table. You can use the stream selector to filter the data if needed. After that, the data is passed to a JDBC producer, which inserts the data into the database tables specified in the metadata table.
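
As a rough illustration of how those stages might be configured (the field names /databasename and /Tablename are just placeholders for whatever columns your metadata table exposes, and 'SALES_DB' is only an example value):

Stream Selector condition (optional filter):
    ${record:value('/databasename') == 'SALES_DB'}

JDBC Producer:
    Schema Name: ${record:value('/databasename')}
    Table Name:  ${record:value('/Tablename')}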

Hi,

The pipeline below fetches the metadata and passes the values to Start Jobs.

This pipeline uses the metadata to fetch data from Teradata and load it into Snowflake.

Userlevel 5
Badge +1

@uday8770 

The pipeline appears to be in good order. Could you please let me know the specific error you're encountering? I'll attempt to reproduce the issue and provide you with more detailed information.

Regarding the scenario we've been working on, it seems like we can manage it using a single pipeline, as I mentioned earlier. I'll go ahead and propose a solution for this.

Thanks & Regards

Bikram_

Userlevel 5
Badge +1

@uday8770 

Can you please try configuring the job parameters as below and check if it helps?

{
  "dbname": "${record:value('/databasename')}",
  "Tname": "${record:value('/Tablename')}"
}

Userlevel 4
Badge

@uday8770 As Bikram suggested, you need to quote both the key and the value in the parameter JSON payload. You can alternate ' and " so that the inner quotes don't terminate the outer ones, which means you could use

{
  "dbname": "${record:value('/databasename')}",
  "Tname": "${record:value('/Tablename')}"
}

or

{
  "dbname": '${record:value("/databasename")}',
  "Tname": '${record:value("/Tablename")}'
}
