1

I am looking for a solution to stream data from Oracle 11g to Kafka. I was hoping to use GoldenGate, but that only seems to be available for Oracle 12c. Is the Confluent platform the best way to go?

Thanks!

miguno
cicit
  • Does this answer your question? [How to integrate Oracle and Kafka](https://stackoverflow.com/questions/29929205/how-to-integrate-oracle-and-kafka) – Adam Leszczyński May 25 '20 at 18:54

4 Answers

2

First, the general answer would be: The best way to connect Oracle (databases) to Kafka is indeed to use Confluent Platform with Kafka's Connect API in combination with a ready-to-use connector for GoldenGate. See the GoldenGate/Oracle entry in section "Certified Connectors" at https://www.confluent.io/product/connectors/. The listed Kafka connector for GoldenGate is maintained by Oracle.

Is the Confluent platform the best way to go?

Hence, in general, the answer to the above question is: "Yes, it is."

However, regarding your specific question about Oracle versions: Oracle unfortunately provides the following information in the README of their GoldenGate connector:

Supported Versions

The Oracle GoldenGate Kafka Connect Handler/Formatter is coded and tested with the following product versions.

  • Oracle GoldenGate for Big Data 12.2.0.1.1
  • Confluent IO Kafka/Kafka Connect 0.9.0.1-cp1

Porting may be required for Oracle GoldenGate Kafka Connect Handler/Formatter to work with other versions of Oracle GoldenGate for Big Data and/or Confluent IO Kafka/Kafka Connect

This means that the connector does not work with Oracle 11g, at least as far as I can tell.

Sorry if that doesn't answer your specific question. At least I wanted to give you some feedback on the general approach. If I do come across a more specific answer, I'll update this text.

Update Mar 15, 2017: The best option you have at the moment is to use Confluent's JDBC connector. That connector can't give you quite the same feature set as Oracle's native GoldenGate connector, though.
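For illustration, a minimal configuration for Confluent's JDBC source connector could look roughly like the following. This is only a sketch: the connection URL, credentials, table, and column names (ORDERS, ORDER_ID, LAST_UPDATED) are placeholders, not anything from your environment.

    # Hypothetical connector config for Confluent's JDBC source connector
    # (standalone-worker .properties style); all names below are placeholders.
    name=oracle-11g-jdbc-source
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    tasks.max=1

    # Oracle 11g connection details (placeholder host/service/credentials)
    connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCL
    connection.user=kafka_connect
    connection.password=********

    # Copy only the listed table(s); the resulting topic will be oracle-ORDERS
    table.whitelist=ORDERS
    topic.prefix=oracle-

    # Incremental mode: detect new/changed rows via an ID column plus a timestamp column
    mode=timestamp+incrementing
    incrementing.column.name=ORDER_ID
    timestamp.column.name=LAST_UPDATED

    # How often (ms) to poll Oracle for new data
    poll.interval.ms=5000

As noted in the comments below, these incrementing/timestamp modes are query-based rather than log-based, so rows committed out of order can be missed; that is part of the feature-set gap compared to GoldenGate mentioned above.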

miguno
  • Updated my answer with Confluent's JDBC connector as the best option at the moment. – miguno Mar 15 '17 at 18:04
  • 1
    Thank you, it does answer my question. I have started going the direction of the Confluent JDBC Kafka Connect product. – cicit Mar 22 '17 at 15:03
  • Confluent's JDBC connector can't get every change. All modes will result in skipping rows that are committed in a different order than inserted/updated, as it uses a naive incrementing column approach. – Patrick Szalapski Apr 22 '21 at 21:07
0

Oracle GoldenGate and Confluent Platform are not comparable.

Confluent Platform is a complete streaming platform: a collection of multiple pieces of software that can be used for streaming your data, whereas GoldenGate is replication and data-integration software.

Also, GoldenGate is highly reliable for DB replication since it maintains transactional integrity; the same cannot be said for Kafka MirrorMaker or Confluent's Replicator at this time.

zer0Id0l
  • 1
    I wasn't trying to imply they are comparable, but since I can't use Golden Gate then what is the next best thing...even if that thing needs custom solutions to solve some of those issues. – cicit Mar 15 '17 at 11:22
0

If you want just pure transactions, please also consider using OpenLogReplicator. It supports Oracle databases starting from version 11.2.0.1.

It can produce transactions to Kafka in two formats:

  • Classic format: every transaction is one Kafka message (multiple DMLs per Kafka message)

  • Debezium-style format: transactions are split up, so every DML is one Kafka message

There is already a working version. You can try it.

Adam Leszczyński
-2

Right now I am using ojdbc6 to connect to Oracle 11g. It is good enough but not perfect, especially when using polling mode to check whether there are new updates in the source tables.

I also tried reading all tables matching a certain pattern, but that did not work well.

The best mode for connecting an Oracle DB to Kafka (especially when the tables are very wide, column-wise) is to use queries in the connectors. This way, you ensure that you pick the right fields and can do some casting for numbers if you are using Avro; see the sketch below.
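As a rough, hypothetical sketch of that approach (all table, column, and topic names are made up for illustration), a query-based configuration for Confluent's JDBC source connector might look like this:

    # Hypothetical query-based config for Confluent's JDBC source connector;
    # every identifier below is a placeholder.
    name=oracle-wide-table-query-source
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    tasks.max=1

    connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCL
    connection.user=kafka_connect
    connection.password=********

    # Pick only the fields you need from a very wide table and cast the
    # NUMBER columns explicitly so the generated Avro schema stays predictable.
    # Leave out the WHERE clause; the connector appends its own incremental
    # criteria in timestamp mode.
    query=SELECT ORDER_ID, CUSTOMER_ID, CAST(AMOUNT AS NUMBER(10,2)) AS AMOUNT, LAST_UPDATED FROM ORDERS
    mode=timestamp
    timestamp.column.name=LAST_UPDATED

    # In query mode the connector writes to a single topic named by topic.prefix
    topic.prefix=oracle-orders

    poll.interval.ms=5000

Compared to whitelisting whole tables, the query approach trades automatic per-table topics for full control over which columns are projected and how their types come out.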