I want to stream data from PostgreSQL to HDFS, but all of my tables have multiple timestamp columns. I tried the plain JDBC source connector and the HDFS sink connector offered by Confluent to read data from Postgres and write it to HDFS, but all of my date columns are converted to BigInt instead of a date format. I don't know which format this BigInt is in, but it's not a Unix timestamp. Is there any way to find out which format it is? I am also considering transforming the date columns to Unix timestamps in my JDBC source and then converting them back to dates in my sink using TimestampConverter, but I don't know how to implement that in my current setup. The issue is reported here.
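For what it's worth, Kafka Connect's Timestamp logical type encodes `TIMESTAMP` columns as milliseconds since the Unix epoch, so when the logical-type annotation is lost downstream the value usually surfaces as a plain long of epoch *milliseconds* (which looks "not like a Unix timestamp" if you try to parse it as seconds). A quick way to check, assuming epoch millis:

```python
from datetime import datetime, timezone

def millis_to_utc(value):
    """Interpret a BigInt column value as epoch milliseconds."""
    return datetime.fromtimestamp(value / 1000, tz=timezone.utc)

# If this prints a plausible date for your data, the BigInt is
# epoch milliseconds; if it's thousands of years off, try seconds.
print(millis_to_utc(1538550000000))
```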

- You need to provide some code along with example messages. – Giorgos Myrianthous Oct 03 '18 at 08:28
- What type of code, and by "example messages" do you mean the data? I am using the plain JDBC connector to get the data into the topic. – jarry jafery Oct 03 '18 at 08:31
- You should be able to add a CAST to the `query` option for the connector to get Unix time, no? Have you tried using Debezium? – OneCricketeer Oct 03 '18 at 13:57
- @cricket_007 I used the CAST function and it worked without Debezium. – jarry jafery Oct 04 '18 at 07:40
- Feel free to post your final solution below – OneCricketeer Oct 04 '18 at 13:30
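Based on the comment thread (CAST in the source query, TimestampConverter in the sink), a sketch of what the configuration might look like. Table, column, topic, and connection details (`my_table`, `created_at`, `id`, `jdbc:postgresql://localhost:5432/mydb`) are hypothetical placeholders, not taken from the question:

```json
{
  "name": "postgres-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "query": "SELECT id, (EXTRACT(EPOCH FROM created_at) * 1000)::bigint AS created_at FROM my_table",
    "topic.prefix": "my_table_topic"
  }
}
```

On the HDFS sink side, the built-in `TimestampConverter` single message transform can turn that epoch-millis long back into a formatted date (here a string; `target.type` can also be `Timestamp`):

```json
{
  "transforms": "ts",
  "transforms.ts.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
  "transforms.ts.field": "created_at",
  "transforms.ts.target.type": "string",
  "transforms.ts.format": "yyyy-MM-dd HH:mm:ss"
}
```

Note that `TimestampConverter` expects Unix time in milliseconds, hence the `* 1000` on the epoch seconds returned by Postgres's `EXTRACT(EPOCH FROM …)`.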