Solved

Conversion error

  • September 3, 2021
  • 6 replies
  • 361 views

matfior
Fan

The pipeline reads from the transaction log (binary log) of an AWS MySQL instance and writes to Google BigQuery.

The error is:

java.lang.IllegalArgumentException: Cannot convert byte[] field '[B@5178e7eb' to Integer
 

I can’t figure out what is causing the error.

Can you please help?
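One note on reading the message: the `[B@5178e7eb` token is not a value from the table; it is Java’s default `toString()` for a `byte[]`. In other words, the origin received raw bytes for a column it expected to map to an Integer. A minimal sketch of what that looks like (hypothetical values, not from this pipeline):

```java
import java.nio.ByteBuffer;

public class ByteArrayVsInteger {
    public static void main(String[] args) {
        // Hypothetical column value as raw bytes, e.g. from a binlog event.
        byte[] raw = {0, 0, 0, 42};

        // Printing a byte[] falls back to Object.toString(), which produces
        // the "[B@..." form that appears in the error message.
        System.out.println(raw);                    // e.g. [B@5178e7eb

        // The bytes only become an integer after an explicit decode.
        int value = ByteBuffer.wrap(raw).getInt();  // big-endian read
        System.out.println(value);                  // 42
    }
}
```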

Best answer by matfior

I managed to restart the pipeline by using the command `Reset Origin & Start`; however, I would like to understand why this happened in the first place.


6 replies

Dash
Headliner
  • Senior Technical Evangelist and Developer Advocate at Snowflake
  • 67 replies
  • September 3, 2021

Hi @matfior,

Can you please post the entire stack trace?

 


matfior
Fan
  • Author
  • 4 replies
  • September 3, 2021
java.lang.IllegalArgumentException: Cannot convert byte[] field '[B@5178e7eb' to Integer
	at com.streamsets.pipeline.api.impl.IntegerTypeSupport.convert(IntegerTypeSupport.java:48)
	at com.streamsets.pipeline.api.impl.IntegerTypeSupport.convert(IntegerTypeSupport.java:20)
	at com.streamsets.pipeline.api.Field$Type.convert(Field.java:113)
	at com.streamsets.pipeline.api.Field$Type.access$100(Field.java:83)
	at com.streamsets.pipeline.api.Field.create(Field.java:434)
	at com.streamsets.pipeline.api.Field.create(Field.java:418)
	at com.streamsets.pipeline.stage.origin.mysql.schema.MysqlType$15.toField(MysqlType.java:118)
	at com.streamsets.pipeline.stage.origin.mysql.RecordConverter.toMap(RecordConverter.java:206)
	at com.streamsets.pipeline.stage.origin.mysql.RecordConverter.toRecords(RecordConverter.java:113)
	at com.streamsets.pipeline.stage.origin.mysql.RecordConverter.toRecords(RecordConverter.java:74)
	at com.streamsets.pipeline.stage.origin.mysql.MysqlSource.produce(MysqlSource.java:241)
	at com.streamsets.datacollector.runner.StageRuntime.lambda$execute$2(StageRuntime.java:296)
	at com.streamsets.pipeline.api.impl.CreateByRef.call(CreateByRef.java:40)
	at com.streamsets.datacollector.runner.StageRuntime.execute(StageRuntime.java:244)
	at com.streamsets.datacollector.runner.StageRuntime.execute(StageRuntime.java:311)
	at com.streamsets.datacollector.runner.StagePipe.process(StagePipe.java:221)
	at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunner.processPipe(ProductionPipelineRunner.java:855)
	at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunner.runPollSource(ProductionPipelineRunner.java:585)
	at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunner.run(ProductionPipelineRunner.java:391)
	at com.streamsets.datacollector.runner.Pipeline.run(Pipeline.java:520)
	at com.streamsets.datacollector.execution.runner.common.ProductionPipeline.run(ProductionPipeline.java:112)
	at com.streamsets.datacollector.execution.runner.common.ProductionPipelineRunnable.run(ProductionPipelineRunnable.java:75)
	at com.streamsets.datacollector.execution.runner.standalone.StandaloneRunner.start(StandaloneRunner.java:726)
	at com.streamsets.datacollector.execution.runner.common.AsyncRunner.lambda$start$3(AsyncRunner.java:151)
	at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.lambda$call$0(SafeScheduledExecutorService.java:226)
	at com.streamsets.datacollector.security.GroupsInScope.execute(GroupsInScope.java:34)
	at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.call(SafeScheduledExecutorService.java:222)
	at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.lambda$call$0(SafeScheduledExecutorService.java:226)
	at com.streamsets.datacollector.security.GroupsInScope.execute(GroupsInScope.java:34)
	at com.streamsets.pipeline.lib.executor.SafeScheduledExecutorService$SafeCallable.call(SafeScheduledExecutorService.java:222)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at com.streamsets.datacollector.metrics.MetricSafeScheduledExecutorService$MetricsTask.run(MetricSafeScheduledExecutorService.java:100)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
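Reading the trace in call order (bottom frames first): `MysqlSource.produce` hands the binlog event to `RecordConverter`, which calls `MysqlType$15.toField` and then `Field.create` for an Integer field; `IntegerTypeSupport.convert` then rejects the `byte[]` value. So the failure happens inside the MySQL Binary Log origin’s column-to-field type mapping, before the record ever reaches the BigQuery destination.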

 


matfior
Fan
  • Author
  • 4 replies
  • Answer
  • September 3, 2021

I managed to restart the pipeline by using the command `Reset Origin & Start`; however, I would like to understand why this happened in the first place.
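For anyone landing here later: in StreamSets Data Collector, `Reset Origin` discards the origin’s stored offset, here the saved binlog position, so the pipeline starts reading fresh instead of retrying the event it failed on. That unblocks the pipeline, but it presumably also skipped past the binlog entry carrying the problematic value, which would explain why the error did not recur.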


Dash
Headliner
  • Senior Technical Evangelist and Developer Advocate at Snowflake
  • 67 replies
  • September 3, 2021

Ah, so the pipeline is running just fine after resetting the origin and restarting it? If that’s the case, I am not 100% sure what caused the datatype mismatch. Do you recall any prior pipeline runs, or any previewing of data, that caused the pipeline to stop abruptly?
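One plausible cause, an assumption rather than anything confirmed in this thread, is a column whose binlog representation is raw bytes (for example a `BIT` or `BINARY` column, or a column whose type changed between runs) being mapped to an Integer field. A consumer that wanted to tolerate such values would need an explicit decode along these lines; this is an illustrative sketch, not StreamSets internals, and it assumes non-negative values that fit in four big-endian bytes:

```java
import java.nio.ByteBuffer;

public class DefensiveIntegerDecode {

    // Normalizes either an Integer or a short big-endian byte[] to an
    // Integer; anything else is rejected, mirroring the error raised by
    // IntegerTypeSupport in the stack trace above.
    static Integer decode(Object v) {
        if (v instanceof Integer) {
            return (Integer) v;
        }
        if (v instanceof byte[] && ((byte[]) v).length <= Integer.BYTES) {
            byte[] b = (byte[]) v;
            ByteBuffer buf = ByteBuffer.allocate(Integer.BYTES); // zero-filled
            buf.position(Integer.BYTES - b.length);              // left-pad with zeros
            buf.put(b);
            buf.rewind();
            return buf.getInt();                                 // big-endian read
        }
        throw new IllegalArgumentException(
            "Cannot convert " + v + " to Integer");
    }

    public static void main(String[] args) {
        System.out.println(decode(7));                           // 7
        System.out.println(decode(new byte[] {1}));              // 1
        System.out.println(decode(new byte[] {0, 0, 1, 0}));     // 256
    }
}
```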


Drew Kreiger
Rock star
  • Senior Community Builder at StreamSets
  • 95 replies
  • September 17, 2021

Hi @matfior, welcome to our new Community platform! I wanted to double-check that you saw @Dash’s latest message.


matfior
Fan
  • Author
  • 4 replies
  • September 17, 2021
Drew Kreiger wrote:

Hi @matfior, welcome to our new Community platform! I wanted to double-check that you saw @Dash’s latest message.

Sure, I saw it! Thanks a lot!

