
I need to clone (make a 1-to-1 copy of) several tables in our Oracle DB into another Oracle DB. Both DBs are running Oracle version 11.2.0.3.

Problems are:

  1. The tables (together) are quite large (> 20 GB)
  2. The copy must be a true snapshot; the source DB may change during the cloning process
  3. The process must, of course, be fast

I came across DB links, which seem feasible here. But my question is: how can I make sure that ALL tables are consistent after the cloning process? I mean this scenario:

  1. Copy Table A
  2. Copy Table B
  3. Source Table A and Table C changes
  4. Copy Table C

Then my copy of Table C contains data that is NOT present in my copy of Table A, which might (logically) be a constraint violation. How can I avoid this? How do I make a true snapshot of 40 tables? Is there something like a "revision" of the whole DB? And what would the DB-link query look like?
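One approach that matches this "revision" idea is Oracle flashback query: capture a single SCN on the source, then read every table over the DB link as of that same SCN. A minimal sketch, assuming a DB link named `src_db` and illustrative table names; it only works if your UNDO retention on the source covers the whole copy duration:

```sql
-- Capture one SCN on the source; every read below is as of this point in time.
VARIABLE scn NUMBER
BEGIN
  SELECT dbms_flashback.get_system_change_number@src_db
    INTO :scn
    FROM dual;
END;
/

-- Copy each table as of the same SCN, so the copies are mutually consistent.
INSERT INTO table_a SELECT * FROM table_a@src_db AS OF SCN :scn;
INSERT INTO table_b SELECT * FROM table_b@src_db AS OF SCN :scn;
INSERT INTO table_c SELECT * FROM table_c@src_db AS OF SCN :scn;
COMMIT;
```

The flashback clause is evaluated on the remote side, so all 40 copies see the source as it was at the moment the SCN was captured, regardless of how long the copies take.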

Sauer
    When asking something about a specific product it is better to always include as many details as possible. E.g. "I am running Oracle 11G on SunOS 9.56.7" – p.marino May 13 '14 at 07:58
  • Also, it is usually a good idea to explain "and I have already tried X and it didn't satisfy my requirements due to Y" - in your case you should have mentioned that you used Oracle Export but it is too slow. – p.marino May 13 '14 at 10:01

1 Answer


I suggest you investigate Oracle's own export utility. You can require that the dump of the set of tables you are interested in is transactionally consistent, too.
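For reference, a Data Pump export can be pinned to a single consistent point in time with the `FLASHBACK_TIME` (or `FLASHBACK_SCN`) parameter. A sketch with placeholder credentials, directory, and table names:

```shell
# Export a set of tables as of one consistent point in time.
# Credentials, directory object, and table names are placeholders.
expdp system/password@srcdb \
  DIRECTORY=dump_dir \
  DUMPFILE=clone.dmp \
  LOGFILE=clone.log \
  TABLES=app.table_a,app.table_b,app.table_c \
  FLASHBACK_TIME=SYSTIMESTAMP
```

With the flashback parameter set, every table in the dump reflects the source as of that one timestamp, even if the DB changes while the export runs.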

More details here.

Since you are using 11g, I assume that by "exp tool" you mean Data Pump Export and not the legacy exp/imp utility?

See the differences here.

If that is not the case, try switching to Data Pump, which was designed specifically to cope with large datasets and can be tuned further to squeeze out some extra performance.
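The usual Data Pump tuning knob for large exports is `PARALLEL`, combined with the `%U` wildcard so each worker writes its own dump file. A sketch; the parallel degree and names below are illustrative, not recommendations:

```shell
# Parallel Data Pump export: PARALLEL workers, %U expands to per-file numbers.
# Tune PARALLEL to your CPU and I/O capacity; names are placeholders.
expdp system/password@srcdb \
  DIRECTORY=dump_dir \
  DUMPFILE=clone_%U.dmp \
  PARALLEL=4 \
  TABLES=app.table_a,app.table_b \
  FLASHBACK_TIME=SYSTIMESTAMP
```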

p.marino
  • Thanks for the post. But the exp tool is very slow. I already tested it before: 100 MB takes over 4 minutes. That's way too slow. – Sauer May 13 '14 at 08:53
  • The exp tool is pretty optimized already, so I sincerely doubt you can do better with an applicative solution. *Especially* if you want to maintain data consistency. – p.marino May 13 '14 at 09:57
  • Though I haven't tested it so far, I accepted your answer, because the first link you've posted helped me a lot. – Sauer May 14 '14 at 11:49