Unable to fetch data from Kafka when running a Kafka-consumer-to-local-FS pipeline; I am getting the following error:
iorg2.org] [runner:] [thread:ProductionPipelineRunnable-KAFKATOL__494a67a7-94ef-49e1-a41b-385c79551625__apiorg2.org-KAFKA_TO_LOCALFS] [stage:] INFO AbstractRunner - Scheduling retry in '239996' milliseconds
2023-05-11 23:21:31,730 user:*admin@apiorg2.org] pipeline:KAFKA_TO_LOCALFS/KAFKATOL__494a67a7-94ef-49e1-a41b-385c79551625__apiorg2.org] runner:] thread:ProductionPipelineRunnable-KAFKATOL__494a67a7-94ef-49e1-a41b-385c79551625__apiorg2.org-KAFKA_TO_LOCALFS] stage:] ERROR ProductionPipelineRunnable - An exception occurred while running the pipeline, com.streamsets.datacollector.runner.PipelineRuntimeException: CONTAINER_0800 - Can't start pipeline due 1 validation error(s). First one: KAFKA_41 - Could not get partition count for topic 'tst2' : com.streamsets.pipeline.api.StageException: KAFKA_41 - Could not get partition count for topic 'tst2' : org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
com.streamsets.datacollector.runner.PipelineRuntimeException: CONTAINER_0800 - Can't start pipeline due 1 validation error(s). First one: KAFKA_41 - Could not get partition count for topic 'tst2' : com.streamsets.pipeline.api.StageException: KAFKA_41 - Could not get partition count for topic 'tst2' : org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
at com.streamsets.datacollector.execution.runner.common.ProductionPipeline.run(ProductionPipeline.java:130) ~[streamsets-datacollector-container-5.3.0.jar:?]
at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunnable.run(ProductionPipelineRunnable.java:63) ~[streamsets-datacollector-container-5.3.0.jar:?]
at com.streamsets.datacollector.execution.runner.standalone.StandaloneRunner.startInternal(StandaloneRunner.java:757) ~[streamsets-datacollector-container-5.3.0.jar:?]
at com.streamsets.datacollector.execution.runner.standalone.StandaloneRunner.start(StandaloneRunner.java:750) ~[streamsets-datacollector-container-5.3.0.jar:?]
at co
I have given the consumer a random consumer group name, and the topic is the one into which I fetched data from the DB in another pipeline.
So the flow is like this:
DB_TO_KAFKA - data pushed into the test1 topic, and it is successful
KAFKA_TO_LOCALFS - same topic test1, with a random consumer group, writing to local FS
I have checked that the host and port are correct.
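As a side note for anyone debugging the same KAFKA_41 / TimeoutException: "Timeout expired while fetching topic metadata" usually means the consumer cannot complete a connection to the broker (firewall, wrong bootstrap address, or the broker's advertised.listeners pointing at a hostname the consumer cannot resolve). A minimal sketch, assuming a broker on localhost:9092 (replace with your actual bootstrap host and port), that verifies plain TCP reachability from the machine running the pipeline:

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to the broker address can be opened.

    This only proves network reachability; a metadata timeout can still
    occur if advertised.listeners returns an address this host cannot
    resolve or reach.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, DNS failures, timeouts
        return False

# Hypothetical address -- use the bootstrap servers configured in the
# Kafka consumer stage.
print(broker_reachable("localhost", 9092))
```

If this returns False from the Data Collector host, the problem is network-level rather than anything in the pipeline configuration.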
Any help will be greatly appreciated.