I have 30k+ space-delimited text files, each with a different number of columns, and the task is to load their data, column-wise, into one big Oracle table. Details and the approaches I have tried are below:
Text File 1-
Date Time SiteName CompName ipAdd
jkl mno pqr stu vwx
Text File 2-
Date CompName ipAdd
abc def ghi
Text File 3-
Date SiteName
mno pqr
Expected Oracle Table-
Date  Time  SiteName  CompName  ipAdd
jkl   mno   pqr       stu       vwx
abc   NULL  NULL      def       ghi
mno   NULL  pqr       NULL      NULL
(NULL marks columns that a given file does not have.)
Now, as you can see, the same columns are spread across the text files in different subsets, and I have to load all of this data into one big Oracle table.
PLEASE NOTE:
In all text files, the first column is always 'Date', and the columns always appear in the same relative order (e.g., Date then Time, or Date then CompName); each file simply contains some subset of the full column list, starting from Date.
The problem is that I could not find a way to make SQL*Loader tell apart, say, a file with 2 columns from a file with 15 columns, even though the column names are in the first line of each text file, and then load them all into the big table.
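The closest thing I can imagine is generating a separate control file for each text file from its header line and running sqlldr once per file, but I do not know if that is a sensible direction for 30k+ files. Here is only a rough Python sketch of that idea (BIG_TABLE, the connect string, and the assumption that the table's column names match the header tokens are all placeholders; a column literally named Date would also need a non-reserved name in the real table):

```python
# Rough sketch: build one SQL*Loader control file per text file from its header,
# mapping only the columns that file actually has, then call sqlldr for it.
import glob
import subprocess

CONN = "user/password@db"  # placeholder connect string

for path in glob.glob("*.txt"):
    with open(path) as f:
        header = f.readline().split()  # e.g. ['Date', 'CompName', 'ipAdd']

    ctl = path + ".ctl"
    with open(ctl, "w") as c:
        c.write("OPTIONS (SKIP=1)\n")          # skip the header line during the load
        c.write("LOAD DATA\n")
        c.write(f"INFILE '{path}'\n")
        c.write("APPEND\n")
        c.write("INTO TABLE BIG_TABLE\n")      # placeholder table name
        c.write("FIELDS TERMINATED BY WHITESPACE\n")
        c.write("TRAILING NULLCOLS\n")
        c.write("(" + ", ".join(col.upper() for col in header) + ")\n")

    # one sqlldr run per file, using the generated control file
    subprocess.run(["sqlldr", f"userid={CONN}", f"control={ctl}"], check=True)
```

Even if this works, it means 30k+ separate sqlldr runs, which is part of why I am asking whether there is a better way.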
Approaches:
- Combine all the text files into one big text file and load that - FAILED
- Pad each row with delimiters up to the maximum number of columns that occurs across all the text files (25 columns), then load - COULD NOT get it to work (a sketch of this normalization idea follows this list)
- Load the 35k+ text files into 35k+ Oracle tables, and then sit and combine those 35k+ tables into one big table - RECOMMENDED BY PEOPLE AROUND ME, but it would take too long and seems space-, time-, and effort-inefficient, so I am looking for a real alternative
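For concreteness, the kind of normalization I was aiming at with the first two approaches is sketched below: while combining the files, pad every row out to the full, fixed column list, so that a single control file (or an external table) could load the combined file and the missing fields would simply become NULLs. This is only a sketch; combined.csv is a placeholder name and MASTER_COLS would need to be the real 25-column list:

```python
# Sketch: normalize every file to the same fixed column order while combining,
# writing empty fields for the columns a file does not have.
import csv
import glob

# Placeholder: the full, ordered column list of the big table (25 columns in reality).
MASTER_COLS = ["Date", "Time", "SiteName", "CompName", "ipAdd"]

with open("combined.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(MASTER_COLS)
    for path in glob.glob("*.txt"):
        with open(path) as f:
            header = f.readline().split()  # the subset of columns this file has
            for line in f:
                row = dict(zip(header, line.split()))
                # columns missing from this file become empty fields (NULL on load)
                writer.writerow([row.get(col, "") for col in MASTER_COLS])
```

Is something along these lines workable, or is there a better, more standard way to do this with SQL*Loader or external tables?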
Please give your best approach. :-(