
We currently use Oracle 19c external table functionality on-prem: CSV files are dropped into a specific directory on the database server, and they are automatically exposed through an Oracle external table. The file location is specified as part of the table DDL.

We have a requirement to migrate to Azure managed PostgreSQL. According to the PostgreSQL documentation, functionality similar to Oracle external tables can be achieved on standalone PostgreSQL using "foreign tables" with the file_fdw extension. But on the Azure managed service we cannot use this, since we have no access to the database server's file system.
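For reference, on a self-managed PostgreSQL server the file_fdw equivalent of the Oracle external table looks roughly like the sketch below (the database, schema, column names, and file path are all illustrative). This is precisely the part the managed service blocks, because `filename` must point at the server's own file system:

```shell
#!/usr/bin/env bash
# Illustrative file_fdw DDL for a SELF-MANAGED PostgreSQL server. This does
# not work on Azure managed PostgreSQL, because '/data/inbox/...' would have
# to exist on the managed server's (inaccessible) file system.
DDL=$(cat <<'SQL'
CREATE EXTENSION IF NOT EXISTS file_fdw;
CREATE SERVER csv_files FOREIGN DATA WRAPPER file_fdw;
CREATE FOREIGN TABLE staging.daily_feed_ext (
    id      integer,
    payload text
) SERVER csv_files
OPTIONS (filename '/data/inbox/daily_feed.csv', format 'csv', header 'true');
SQL
)

# On a server where you control the file system, you would apply it with e.g.:
#   psql -d mydb -c "$DDL"
printf '%s\n' "$DDL"
```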

One option I came across was Azure Data Factory, but that looks expensive. The expected volume is about 1 million record inserts per day.

Could anyone advise on possible alternatives? One option I was considering is a scheduled shell script on an Azure VM that loads the files into PostgreSQL using psql commands such as \copy. Would that be a good option for this volume?
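A minimal sketch of such a loader script, assuming psql is installed on the VM; every concrete name below (directories, connection string, target table) is a placeholder, not something from the question. Unlike server-side `COPY ... FROM 'file'`, psql's `\copy` streams the file through the client connection, so the managed server never needs file system access:

```shell
#!/usr/bin/env bash
# Hypothetical CSV loader for Azure managed PostgreSQL; all paths, the
# connection string, and the table name are assumptions for illustration.
# CSV filenames are assumed to contain no spaces.
set -euo pipefail

INBOX="${INBOX:-/data/inbox}"        # directory the CSV files arrive in
ARCHIVE="${ARCHIVE:-/data/archive}"  # processed files are moved here
PGCONN="${PGCONN:-host=myserver.postgres.database.azure.com dbname=mydb user=loader}"
TARGET_TABLE="${TARGET_TABLE:-staging.daily_feed}"

# Build the client-side \copy meta-command for one file.
copy_cmd() {
  printf '\\copy %s FROM %s WITH (FORMAT csv, HEADER true)' "$TARGET_TABLE" "$1"
}

# Load every pending CSV; archive each file only after a successful load,
# so a failed file stays in the inbox for the next run.
load_all() {
  shopt -s nullglob
  for f in "$INBOX"/*.csv; do
    psql "$PGCONN" -v ON_ERROR_STOP=1 -c "$(copy_cmd "$f")"
    mv "$f" "$ARCHIVE"/
  done
}

load_all
```

Scheduled from cron (e.g. every few minutes), a single `\copy` per file should comfortably handle ~1 million rows per day, since `\copy` batches the whole file in one COPY protocol stream rather than row-by-row INSERTs.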

Regards
Jacob

    This is a sad story. Database hosting is great for some use cases, not so much for others. Unfortunately, it is fashionable, so managers outdo each other to see who is faster in handing over all their data to cloud providers. At some point, people will wise up, only to learn that you need to face massive down time if you want to get your data back. Meanwhile, people have to come up with complicated workarounds to get some of the old functionality back. Small as the chances are, try to talk sense into the heads of the decision makers. – Laurenz Albe May 16 '22 at 07:11

1 Answer


One last option that could be simple to implement for the migration is EnterpriseDB (EDB), which avoids vendor lock-in and is free of cost.

See the video below for the migration procedure steps.

https://www.youtube.com/watch?v=V_AQs8Qelfc

Sairam Tadepalli