I have a complex SQL script. All tables are on database (DB) 'A'; when I run the script there I get the results after 50 minutes. I have an ETL tool which runs the script on DB 'A' and fills a table on DB 'B' with the results. The ETL tool is being phased out, so in the future I need to execute the script from DB 'B' over a DB link to DB 'A' in order to still have the results table on DB 'B'.
The DB link works fine. I have added @MY_DB_LINK to around 80 table references and 1 function call. However, after 2.5 hours of running the script this way I still have no results. Is there a way to pass the whole script over the DB link so the SQL is executed on DB 'A' and the results are written to a table on DB 'B'? Can anyone think of a different solution?
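To illustrate the kind of rewrite I did (table and column names here are made up, not from the real script):

```sql
-- Original query, as run directly on DB 'A' by the ETL tool:
--   SELECT o.id, c.name
--   FROM   orders o
--   JOIN   customers c ON c.id = o.customer_id;
--
-- Rewritten to run from DB 'B', with @MY_DB_LINK appended
-- to every remote table reference:
INSERT INTO results_on_b          -- local table on DB 'B'
SELECT o.id, c.name
FROM   orders@MY_DB_LINK o
JOIN   customers@MY_DB_LINK c ON c.id = o.customer_id;
```

The real script has around 80 such references plus one remote function call, all suffixed the same way.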
Extra info which might clarify the situation: DB 'A' is a weekly copy of a source system; the old copy is deleted and a new copy is created each week. I cannot create anything on DB 'A', as I only have read rights there. I can do anything I want on DB 'B'.