
Hi,

I'm facing an error in a Data Collector pipeline (Oracle CDC to Snowflake) while processing a table with a CLOB column. Inserting values of length >= 4000 works fine, but values shorter than that fail with the error below:
SNOWFLAKE_28 - Snowflake MERGE for 'sdc-df26c7ce-fd00-4b5d-a3b1-817b9809e0a3.csv.gz' processed '1' CDC records out of '3'

Oracle:
create table TEST_LOB_VAL(lobid number(10) primary key, lob_value clob );

INSERT INTO TEST_LOB_VAL VALUES (1,to_clob(RPAD('AAAAAAAAAAAAAAAAAA',4000,'A')));
INSERT INTO TEST_LOB_VAL VALUES (2,to_clob(RPAD('AAAAAAAAAAAAAAAAAA',2000,'A')));
INSERT INTO TEST_LOB_VAL VALUES (3,'AAAAAAAAAA');
COMMIT;
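
As a sanity check on the Oracle side, the stored lengths of the three test rows can be verified with a quick query (RPAD pads to the given total length, so rows 1 and 2 should report 4000 and 2000, and row 3 reports 10):

```sql
-- Report the CLOB length for each test row.
SELECT lobid,
       DBMS_LOB.GETLENGTH(lob_value) AS lob_len
FROM   TEST_LOB_VAL
ORDER  BY lobid;
```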

Hello @Nilesh - Thank you for reaching out to the StreamSets Community. Per the Snowflake documentation, CLOB is not a supported data type, so the column has to be created as VARCHAR in Snowflake.

Link: https://docs.snowflake.com/en/sql-reference/data-types-unsupported

Could you please try the VARCHAR data type in Snowflake and see if it resolves the issue?
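
A minimal sketch of what the Snowflake target table could look like, assuming the same column names as the Oracle source (a VARCHAR declared without a length defaults to Snowflake's maximum of 16,777,216 characters, which comfortably covers the CLOB values in this test):

```sql
-- Hypothetical Snowflake target table mirroring the Oracle source;
-- VARCHAR without an explicit length defaults to the 16 MB maximum.
CREATE TABLE TEST_LOB_VAL (
    LOBID     NUMBER(10) PRIMARY KEY,
    LOB_VALUE VARCHAR
);
```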

Thank you - AkshayJ

 


Hi @AkshayJadhav, thanks for the reply. Yes, I created the Snowflake table with VARCHAR only. As I mentioned in the post, I'm able to load values with length >= 4000, but when the data length is under roughly 2000, I get this error:

SNOWFLAKE_28 - Snowflake MERGE for 'sdc-df26c7ce-fd00-4b5d-a3b1-817b9809e0a3.csv.gz' processed '1' CDC records out of '3'


 

Thanks.

#Nilesh


Hello @Nilesh - I was a bit busy with other work, hence could not get back to you sooner on this topic. I ran a couple of tests and was able to push the records to Snowflake without any issues.

I believe the records are getting discarded due to the limit on LOB size. My test was performed on Data Collector v5.2, where I enabled the following configuration:

 

Pipeline screenshots:

 

What Data Collector version are you using here?

 

Thank you - AkshayJ


Hi @AkshayJadhav, thanks for the reply.

I'm using Data Collector version 5.3.0. I have the same configuration, but I'm still getting the same error. I also tried JDBC Producer as the destination; in that case the pipeline shows no error, but the records are still not processed, so the issue remains as it is.

 

Thanks,

Nilesh

