
I was wondering what the approach would be to get rid of a lot of records from an Oracle database in order to create a lighter database for developers' laptops.

We aim to reduce the exports from different production environments NOT by excluding entities, but by reducing the number of records in each table while maintaining referential integrity.

Is there a tool/script around?

I was also wondering if transforming all the FKs on a replica DB to "on delete cascade" and deleting a subset of records from the entities at the top of the relational hierarchy would do the job.
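As a sketch of that idea (table, column, and constraint names here are hypothetical): Oracle cannot change a constraint's delete rule in place, so each FK would have to be dropped and re-created with ON DELETE CASCADE.

```sql
-- Hypothetical names: CHILD_T references PARENT_T via FK_CHILD_PARENT.
ALTER TABLE child_t DROP CONSTRAINT fk_child_parent;

ALTER TABLE child_t
  ADD CONSTRAINT fk_child_parent
      FOREIGN KEY (parent_id)
      REFERENCES parent_t (id)
      ON DELETE CASCADE;

-- The drop/re-create pairs for a whole schema could be generated from
-- the data dictionary rather than written by hand, e.g.:
SELECT 'ALTER TABLE ' || table_name
       || ' DROP CONSTRAINT ' || constraint_name
FROM   user_constraints
WHERE  constraint_type = 'R';   -- 'R' = referential (foreign key)

-- Once the FKs cascade, deleting from the top of the hierarchy removes
-- the dependent rows as well (the WHERE criterion is hypothetical):
DELETE FROM parent_t WHERE created_at < DATE '2017-01-01';
```

Re-creating the constraints needs the column lists as well (from `user_cons_columns`), so this is only a starting point, not a complete script.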

Any suggestion?

Max
  • Can you import minimal data, or data from specific tables? – Ravi Jan 19 '18 at 18:01
  • Maybe... what about referential integrity? Are you suggesting writing a script for every single table? Could you elaborate further? – Max Jan 19 '18 at 18:13
  • 1
    I think they might be suggesting doing `expdp` with `CONTENT=METADATA_ONLY`. You could also follow it up with a DATA_ONLY export with an INCLUDE clause to only include your reference/validation table data. – kfinity Jan 19 '18 at 18:20
  • @Max I asked that question to understand what you wanted. Please be specific, that will help to answer. – Ravi Jan 20 '18 at 04:39
  • Generate all the CREATE statements for the tables, procedures, functions, or other objects that you need from your database, then just run them in your development database. For the records that you want to import to the development database, you can export them from your database as INSERT scripts. You can use SQL Developer to generate the scripts. – eifla001 Jan 20 '18 at 09:13
  • Fair enough, there is only something I'm still missing: in order to guarantee referential integrity, am I required to set the criteria (WHERE clause) table by table (on hundreds of tables), or is there a way to simplify this task? – Max Jan 20 '18 at 10:31
  • @Ravi: technically, I can do anything. I'm looking for a solution that won't take too long to be implemented because we are talking about hundreds of tables – Max Jan 20 '18 at 13:13
  • @Max We would appreciate it if you shared the solution if you found one, or consider accepting the answer :-) or start a bounty to get more attention. – Ravi Jan 21 '18 at 10:49
  • The suggestions received so far have been useful, but won't suit my needs. I think I'll try to elaborate a solution with my team, and I'll be glad to share it with you whether we succeed or fail. – Max Jan 21 '18 at 16:45

2 Answers


With Jailer you can export data to an SQL script which can traverse foreign key constraints to include all data needed to maintain referential integrity.

http://jailer.sourceforge.net

Guest

If you want to export/import a limited set of objects from/to the database, then you can EXCLUDE the objects that you don't want to be part of your dump.

You can exclude any specific table from being exported/imported by specifying the object type and object name.

EXCLUDE=TABLE:"='<TABLE_NAME>'"
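For example, a full `expdp` command might look like this (the connection string, directory object, dump file, and table names are placeholders):

```
expdp scott/tiger@orcl DIRECTORY=dp_dir DUMPFILE=light.dmp \
  EXCLUDE=TABLE:"IN ('AUDIT_LOG','ERROR_LOG')"
```

Because the shell tends to mangle the quotes in the EXCLUDE clause, it is usually easier to put these parameters in a parameter file and pass it with `PARFILE=`.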

==Update==

AFAIK, Oracle does not provide such flexibility to export a subset of the data, but it does have an option to export partitioned data via the TABLES parameter:

TABLES=[schema_name.]table_name[:partition_name] [, ...]
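For instance, to export only one partition of one table (the schema, table, and partition names here are hypothetical):

```
expdp hr/hr@orcl DIRECTORY=dp_dir DUMPFILE=sales_q1.dmp TABLES=hr.sales:sales_q1
```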
Ravi
  • Now I got what you meant. Well, actually I want almost all the objects... what I aim to get is ALL the tables with just a subset of data, so it can be way lighter. But of course the data must be consistent through the FKs. – Max Jan 20 '18 at 13:24
  • @Max You mean, all object, but limited data from that tables ? – Ravi Jan 20 '18 at 13:25
  • Exactly. Developers need all the entities for development and testing. That's why I was thinking about turning FKs to "on delete cascade" and deleting records from entities at the top of the relational hierarchy. – Max Jan 20 '18 at 13:26
  • @Max I'm not aware of such a thing, where you can limit the data on import/export. But if you are fine with getting **only** the metadata of all tables, then we do have an option. – Ravi Jan 20 '18 at 13:27