The community has since transitioned to the IBM Community.
Environment: Java 8, MySQL JDBC driver. Issue: Java updates disable TLSv1 and TLSv1.1 by default, so SSL/TLS connections to MySQL via the JDBC driver fail with: Caused by: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabl
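A common workaround for this class of handshake failure (an assumption based on the disabled-protocol message, not a resolution confirmed by this article) is to point the connection at a protocol that is still enabled, for example TLSv1.2 via MySQL Connector/J's enabledTLSProtocols URL parameter, or to remove TLSv1/TLSv1.1 from the JDK's disabled-algorithms list. Host, port, and schema below are placeholders:

```
# JDBC URL sketch: force a still-enabled protocol
jdbc:mysql://myhost:3306/mydb?enabledTLSProtocols=TLSv1.2

# Alternative: in $JAVA_HOME/jre/lib/security/java.security, remove TLSv1
# and TLSv1.1 from jdk.tls.disabledAlgorithms (illustrative; the exact list
# varies by JDK update):
jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024
```

Re-enabling old protocols weakens security; preferring TLSv1.2 on the URL is the safer of the two options.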
Problem: Kafka stage performance seems to decrease, and the following exception appears in the logs: ERROR Encountered error in multi kafka thread 0 during read org.apache.kafka.clients.consumer.CommitFailedException: Commit cannot be completed
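A CommitFailedException with this wording usually means the consumer spent longer between polls than max.poll.interval.ms, so the group rebalanced before the offset commit could land. A common mitigation (an assumption here, not this article's confirmed fix) is to raise that interval or shrink the batch pulled per poll in the consumer configuration:

```
# Illustrative Kafka consumer properties; values are examples only
max.poll.interval.ms=600000   # allow more time between polls (default 300000)
max.poll.records=250          # fetch fewer records per poll (default 500)
```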
Issue: A pipeline with a Kafka destination stops processing data with the following exception: ERROR ProductionPipelineRunner - Pipeline execution failed com.streamsets.pipeline.api.StageException: KAFKA_50 - Error writing data to the Kafka broker: java.ut
"Failed at step EXEC - Permission denied" when starting SDC as a service on systems with SELinux. Problem: When starting StreamSets Data Collector as a service under systemd, the service fails immediately on startup. The following error is shown when
Problem Description: When running an SDC pipeline that processes records with Strings containing UTF-8 special characters, these special characters are replaced by question marks (? or �) in various parts of the pipeline. Example input record: {"p
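The plain "?" substitution typically points at some component encoding or decoding with a charset that cannot represent the characters (the usual remedy in SDC deployments, stated here as an assumption rather than this article's confirmed fix, is forcing -Dfile.encoding=UTF-8 in the JVM options). This small illustration, not SDC code, shows the mechanism with a hypothetical input string:

```python
# Illustration: how routing text through a non-UTF-8 charset turns
# special characters into question marks.
text = "Größe"  # hypothetical input containing UTF-8 special characters

# Encoding with a charset that cannot represent the characters replaces
# them with "?", which is what a JVM running without -Dfile.encoding=UTF-8
# can do to record data.
mangled = text.encode("ascii", errors="replace").decode("ascii")
print(mangled)  # -> Gr??e
```

The "�" variant is the reverse direction: bytes that are not valid UTF-8 being decoded as UTF-8.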
When writing to the Kafka Producer destination, timeout exceptions (as shown below) are not always handled, and the pipeline does not honor the On Record Error » Send to Error setting on the Kafka Producer destination. How can this be resolved? Caused by: jav
I'm trying to enable Kerberos for my SDC RPM installation, but when I start SDC I get the following exception: java.lang.RuntimeException: Could not get Kerberos credentials: javax.security.auth.login.LoginEx Caused by: javax.security.auth.login.Login
Trying to read XML in the following format, where element names contain ":", throws the error Can't parse XML element names containing colon ':'. <sh:root> <sh:book> </sh:book> <sh:genre> </sh:genre> <sh:id> </sh:id> <sh:book
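The colon marks an XML namespace prefix, and conforming parsers reject a prefix that is never bound to a namespace URI. A minimal sketch of the well-formed shape, using Python's standard-library parser and a hypothetical namespace URI and element content:

```python
import xml.etree.ElementTree as ET

# The "sh:" prefix must be declared with xmlns:sh before a parser will
# accept it; the URI and element text below are illustrative only.
xml_doc = """
<sh:root xmlns:sh="http://example.com/sh">
  <sh:book>XML Developer's Guide</sh:book>
  <sh:genre>Computer</sh:genre>
  <sh:id>bk101</sh:id>
</sh:root>
"""

root = ET.fromstring(xml_doc)
ns = {"sh": "http://example.com/sh"}
print(root.find("sh:genre", ns).text)  # -> Computer
```

Without the xmlns:sh declaration, ET.fromstring raises a ParseError ("unbound prefix"), which is the same class of failure the stage is reporting.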
Issue: While trying to execute a pipeline that uses a Kafka origin or destination, the pipeline fails. The following error message is seen in the SDC logs: ERROR SafeScheduledExecutorService - Uncaught throwable from com.streamsets.pipeline.lib
Issue: A pipeline fails because a Kafka stage with Kerberos authentication enabled throws the stage exception KAFKA_29 - Error fetching data from Kafka: org.apache.kafka.common.errors.GroupAuthorizationException: Not authorized to access group: xxx
While running a Python program that uses the StreamSets Python SDK, we get this error: File "/var/lib/jenkins/.local/lib/python3.8/site-packages/streamsets/sdk/exceptions.py", line 5, in <module> from requests.exceptions import HTTPError, JSONDecodeErro
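This import failure suggests the installed requests package predates requests.exceptions.JSONDecodeError, which was added in requests 2.27.0, so upgrading requests is the usual remedy (stated as an assumption from the traceback, not this thread's confirmed answer). A small compatibility probe:

```python
def requests_supports_json_decode_error():
    """Return True if requests exposes JSONDecodeError (requests >= 2.27.0),
    False if requests is missing or too old for the StreamSets SDK import."""
    try:
        from requests.exceptions import JSONDecodeError  # noqa: F401
        return True
    except ImportError:
        return False

if not requests_supports_json_decode_error():
    print("upgrade with: pip install --upgrade 'requests>=2.27.0'")
```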
Getting this error while creating a JDBC connection for a Greenplum database: JDBC_06 - Failed to initialize connection pool: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: [Pivotal][Greenplum JDBC Driver][Greenplu
Issue: A Kafka stage fails to connect to the broker with the following stack trace: INFO ConsumerCoordinator - (Re-)joining group INFO NetworkClient - Disconnecting from node xxx due to socket connection setup timeout. The timeout value is 8424 ms
My JDBC Producer stage is giving the error java.lang.NoClassDefFoundError: org/apache/commons/lang3/StringUtils. Has anyone had a similar issue before?
I’m running the command below on my Windows machine to get an auth token, and I’m getting the error message below. curl -s -X POST -d '{"userName":"userid", "password": "password"}' https://cloud.streamsets.com/security/public-rest/v1/authentication/login -H "Content-T
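On Windows, cmd.exe does not treat single quotes as quoting characters, so the JSON body in a curl command written for a Unix shell often reaches the server mangled. One quoting-independent alternative (a sketch with the placeholder credentials from the post, sending nothing until the last line is uncommented) is to build the request in Python and let the json module produce the body:

```python
import json
from urllib.request import Request, urlopen

# Placeholder credentials copied from the post; substitute real values.
body = json.dumps({"userName": "userid", "password": "password"}).encode()

req = Request(
    "https://cloud.streamsets.com/security/public-rest/v1/authentication/login",
    data=body,
    headers={"Content-Type": "application/json"},
)
# response = urlopen(req)  # uncomment to actually send the request
```

If you prefer to stay with curl on cmd.exe, the equivalent fix is wrapping the body in double quotes and escaping the inner quotes with backslashes.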
Question: How can the JDBC Consumer stage connect to MS SQL Server via AAD (Active Directory)? Answer: Our product teams are currently evaluating whether this could be added natively; it is also on the future product road map for nativ
We receive German characters in one of the attributes of an API response, and when joining that response with another API we run into issues because of the non-English characters. Can anyone suggest how to handle this in the pipeline?
Hello, does somebody have experience with this issue? I have a pipeline with an HTTP Server origin and a Kafka Producer destination. The pipeline restarts more or less regularly with this error message: 2022-09-07 13:24:26,245 [user:*admin] [p
We are ingesting data from Oracle, where the timestamp format is mm/dd/yyyy hhss AM/PM, but when ingesting through StreamSets it is taken as yyyy-mm-dd hhss. How can we change the timestamp format back to the source format, mm/dd/yyyy hhss AM/PM, in StreamSets?
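The conversion itself is a parse-then-reformat step; in a pipeline this is typically done in an expression or scripting processor. A sketch of the reformatting logic, with the assumption that the post's "hhss" shorthand means hours, minutes, and seconds (the sample value is invented):

```python
from datetime import datetime

ingested = "2023-01-15 14:30:05"  # hypothetical value as the pipeline shows it

# Parse the ISO-style form, then render the Oracle source format:
# mm/dd/yyyy with 12-hour time and an AM/PM marker.
parsed = datetime.strptime(ingested, "%Y-%m-%d %H:%M:%S")
source_format = parsed.strftime("%m/%d/%Y %I:%M:%S %p")
print(source_format)  # -> 01/15/2023 02:30:05 PM
```

The same two format masks carry over to whichever processor does the conversion; only the function names differ.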