Solved

Connect SQL Server


realmatcha
Opening Band

Hi, I am new to StreamSets; I started using it today.

I have a task to back up some data from SQL Server to Hadoop and Hive. I changed the JDBC connection string to SQL Server, but it still doesn't work. Please help me.


4 replies

Bikram
Headliner
  • 486 replies
  • July 28, 2023

@realmatcha 

 

If you are connecting to SQL Server, the JDBC string should look like the one below.

 

jdbc:sqlserver://203.102.174.155:3254;databaseName=testing 

 

If you are still having issues, please make sure the port is open for connecting to SQL Server from StreamSets.
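
As a quick sanity check outside StreamSets, you can try the same string from a small standalone program. A minimal Java sketch, assuming the Microsoft JDBC driver (mssql-jdbc) is on the classpath and using hypothetical credentials:

import java.sql.Connection;
import java.sql.DriverManager;

public class SqlServerConnectTest {
    public static void main(String[] args) throws Exception {
        // Same format as above: host:port, then databaseName after a semicolon.
        String url = "jdbc:sqlserver://203.102.174.155:3254;databaseName=testing";

        // Hypothetical credentials -- replace with your own.
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected to: "
                    + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}

If this fails with a timeout rather than a login error, the problem is likely the network or firewall rather than the connection string.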

 

 


realmatcha
Opening Band
  • Author
  • Opening Band
  • 13 replies
  • July 29, 2023

 

It still can't connect. Is it because I'm not using a SQL Server processor?


Bikram
Headliner
  • 486 replies
  • Answer
  • July 29, 2023

@realmatcha 

The error says that the IP and port are not reachable. Please check whether the firewall has been opened to allow the connection from StreamSets.

Once the firewall is open, try to ping the IP from the command prompt to check whether it is reachable; then you can try to connect to it in StreamSets.
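
Note that ping only tests ICMP; to confirm the actual SQL Server port is open through the firewall, a plain TCP connect is closer to what StreamSets does. A minimal Java sketch, using the example host and port from this thread:

import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        String host = "203.102.174.155"; // example host from this thread
        int port = 3254;                 // example port from this thread
        try (Socket socket = new Socket()) {
            // Fails fast if a firewall blocks the port or the host is unreachable.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("Port " + port + " on " + host + " is reachable.");
        } catch (Exception e) {
            System.out.println("Not reachable: " + e.getMessage());
        }
    }
}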

 


realmatcha
Opening Band
  • Author
  • Opening Band
  • 13 replies
  • August 8, 2023

I back up data once a day. For example: on August 7, 2023 there were 23894 rows in total; on August 8, 2023 there were 532 new rows, for a total of 23894 + 532 = 24426 rows. But when I run StreamSets the data is duplicated and the total becomes 48320 rows (23894 + 23894 + 532). How do I avoid duplicating the data that has already been backed up?
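
A common way to avoid this is to read incrementally against an offset column instead of re-reading the whole table on every run (the StreamSets JDBC Query Consumer origin has an incremental mode built around this idea). A minimal plain-JDBC sketch of the concept, assuming a hypothetical table my_table with an auto-incrementing id column and placeholder credentials:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class IncrementalRead {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://203.102.174.155:3254;databaseName=testing";
        long lastOffset = 23894L; // highest id already backed up, persisted between runs

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     // Select only rows added since the previous run, so
                     // re-running the job cannot duplicate old rows.
                     "SELECT * FROM my_table WHERE id > ? ORDER BY id")) {
            ps.setLong(1, lastOffset);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    lastOffset = rs.getLong("id"); // advance the offset as rows are read
                    // ... write the row to Hadoop/Hive here ...
                }
            }
        }
        // Persist lastOffset somewhere durable for the next run.
    }
}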

