Solved

Why does Kafka Producer not handle timeout exceptions?

  • October 26, 2021
  • 1 reply
  • 9241 views

Dash
Headliner
  • Senior Technical Evangelist and Developer Advocate at Snowflake

When writing to a Kafka Producer destination, the stage does not always handle timeout exceptions (as shown below), and the pipeline does not honor the On Record Error » Send to Error setting on the Kafka Producer destination. How can this be resolved?

Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
    at org.apache.kafka.clients.producer.KafkaProducer$FutureFailure.<init>(KafkaProducer.java:1186)
    at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:880)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803)
    at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:690)
    at com.streamsets.pipeline.kafka.impl.BaseKafkaProducer09.enqueueMessage(BaseKafkaProducer09.java:64)
    at com.streamsets.pipeline.stage.destination.kafka.KafkaTarget.writeOneMessagePerRecord(KafkaTarget.java:242)
    ... 30 more
Caused by: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

 

Best answer by Dash

According to Confluent, the timeout exception can be resolved by setting or updating the following configuration properties on the Kafka Producer destination:

{
  "key": "connections.max.idle.ms",
  "value": "60000"
},
{
  "key": "metadata.max.idle.ms",
  "value": "60000"
},
{
  "key": "metadata.max.age.ms",
  "value": "60000"
}
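Outside of StreamSets, the same three overrides are ordinary Kafka producer properties. A minimal sketch in plain Java, assuming you build the producer configuration yourself (the class and method names here are illustrative, not part of any StreamSets or Kafka API):

```java
import java.util.Properties;

public class ProducerTimeoutConfig {

    // Assembles the three overrides from the answer above as standard
    // Kafka producer properties. Values are milliseconds.
    public static Properties overrides() {
        Properties props = new Properties();
        props.setProperty("connections.max.idle.ms", "60000");
        props.setProperty("metadata.max.idle.ms", "60000");
        props.setProperty("metadata.max.age.ms", "60000");
        return props;
    }

    public static void main(String[] args) {
        // Print one property to confirm the overrides were applied.
        System.out.println(overrides().getProperty("metadata.max.age.ms"));
    }
}
```

These properties would be merged into the same Properties object that carries `bootstrap.servers` and the serializer settings before constructing the KafkaProducer.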

Cheers,

Dash
