Background:
I've been tasked with importing a large amount of data from a production database into a test database (Oracle 12c Release 2 running on RHEL), and I'm using Data Pump.
The first time I ran the import, the tables were created and the data was loaded as planned, but, due to an issue in my Data Pump parameter file, the constraints were not imported.
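For reference, my parameter file looked roughly like this (the directory, dump file, and schema/table names here are placeholders rather than my real ones, and I've left out the line that caused the constraint problem):

    DIRECTORY=DPUMP_DIR
    DUMPFILE=prod_tables.dmp
    LOGFILE=prod_tables_imp.log
    TABLES=APP_SCHEMA.ORDERS,APP_SCHEMA.CUSTOMERS
    REMAP_SCHEMA=APP_SCHEMA:TEST_SCHEMA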
My subsequent attempts did not go as well, however: Data Pump began to freeze partway through, and the STATUS command showed that no bytes were being processed.
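To check on a job, I attached to it from a second session and ran STATUS there; the job name below is just an example of the auto-generated ones (the real names show up in DBA_DATAPUMP_JOBS):

    impdp system/****** ATTACH=SYS_IMPORT_TABLE_01
    Import> STATUS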
My Solution Attempts:
I tried using TABLE_EXISTS_ACTION=REPLACE and dropping the tables directly after each attempt. I also dropped the master tables of any Data Pump jobs I was unable to terminate from the utility.
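To find the leftover jobs and their master tables, I ran something along these lines as a DBA user (the job name in the DROP is an example; the master table is created in the job owner's schema with the same name as the job):

    SELECT owner_name, job_name, state, attached_sessions
      FROM dba_datapump_jobs
     WHERE state = 'NOT RUNNING';

    DROP TABLE system.sys_import_table_01;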
Still, the import seemed to hang earlier and earlier in the process with each retry. df -h reported 100% disk usage every time it hung.
The dump file itself is on a separate drive, so it's not what's taking up the room. I've been trying to clear out space, but the disk fills back up whenever I run a job, and I can't tell what's consuming it. Flashback is disabled, and I made sure to purge the Oracle recycle bin.
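This is roughly how I've been hunting for the space; the path is an example (my Oracle files live under /u01), and I haven't confirmed that the fast recovery area is the culprit, it's just one of the places I checked:

    # largest directories, staying on this one filesystem
    du -xh /u01 | sort -h | tail -20

And from SQL*Plus:

    -- how much of the FRA each file type is using
    SELECT file_type, percent_space_used, percent_space_reclaimable
      FROM v$recovery_area_usage;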
tl;dr:
Running impdp jobs seems to use up disk space beyond the imported tables and the job master tables. Where is this space going, and what can I do to reclaim it for a successful import?