
I have deep learning models (TensorFlow, saved in HDF5 format) that I want to store in a PostgreSQL database. A single model may be up to 500 MB, and the models need to be updated and uploaded to / downloaded from the database frequently (e.g., once every 30 minutes).

I'm a beginner with PostgreSQL, so any information/recipes to get started would be appreciated.

2 Answers


Storing large binary data in a relational DB is possible but unwise. You will also have to copy the model from the DB to disk eventually for TensorFlow to load it anyway, so why not store the model directly on disk?
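For illustration, a minimal sketch of the disk-only workflow this answer suggests; the path and the toy model below are hypothetical stand-ins for your own setup.

```python
import tensorflow as tf

# Hypothetical path on a local or shared disk; adjust to your environment.
MODEL_PATH = "/models/current_model.h5"

# A stand-in model; in practice this is your trained network.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Save straight to disk in HDF5 format -- no database round trip needed.
model.save(MODEL_PATH)

# Later (or from another process that can see the same disk), load it back.
restored = tf.keras.models.load_model(MODEL_PATH)
```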

– wtz
  • The data the model is trained on is split across two different servers. There is no direct connection between the two servers, but both can access the database, so each end will load the model from the database, train (fine-tune) it on its local data, and then re-upload it to the database. Sorry if this is a bit confusing, but the data is highly private and cannot be shared outside the servers. – Mr. Gulliver Feb 27 '23 at 15:16
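If the database really is the only shared channel between the two servers, one option implied by the comment is storing the serialized HDF5 file in a bytea column and writing it to a temp file before loading. This is only a rough sketch under that assumption; the connection string, table schema, and helper names are hypothetical, and for files approaching PostgreSQL's 1 GB bytea limit, large objects may be a better fit.

```python
import psycopg2

# Hypothetical connection settings; adjust to your environment.
# Assumed schema:
#   CREATE TABLE models (name text PRIMARY KEY, payload bytea, updated_at timestamptz);
conn = psycopg2.connect("dbname=ml host=dbhost user=trainer password=secret")

def upload_model(name, path):
    """Read the HDF5 file from disk and upsert it as a bytea payload."""
    with open(path, "rb") as f:
        payload = f.read()
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO models (name, payload, updated_at)
            VALUES (%s, %s, now())
            ON CONFLICT (name) DO UPDATE
              SET payload = EXCLUDED.payload, updated_at = now()
            """,
            (name, psycopg2.Binary(payload)),
        )

def download_model(name, path):
    """Fetch the bytea payload and write it to disk so TensorFlow can load it."""
    with conn, conn.cursor() as cur:
        cur.execute("SELECT payload FROM models WHERE name = %s", (name,))
        row = cur.fetchone()
    with open(path, "wb") as f:
        f.write(row[0])

# Example round trip on either server:
# download_model("my_model", "/tmp/my_model.h5")
# ... fine-tune with TensorFlow and save back to /tmp/my_model.h5 ...
# upload_model("my_model", "/tmp/my_model.h5")
```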

A relational DB is the wrong tool for this. Store a link to an S3 object, for example, or to your own private cloud storage if the model is private.
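As a rough sketch of that suggestion: upload the HDF5 file to object storage and keep only its key in PostgreSQL. The bucket, table, and function names below are hypothetical, and boto3 credentials/configuration are assumed to be set up separately.

```python
import boto3
import psycopg2

# Hypothetical bucket and connection settings; adjust to your environment.
# Assumed schema:
#   CREATE TABLE model_locations (name text PRIMARY KEY, s3_key text, updated_at timestamptz);
BUCKET = "my-model-bucket"
s3 = boto3.client("s3")
conn = psycopg2.connect("dbname=ml host=dbhost user=trainer password=secret")

def publish_model(name, path):
    """Upload the HDF5 file to S3 and record only its location in PostgreSQL."""
    key = f"models/{name}.h5"
    s3.upload_file(path, BUCKET, key)
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO model_locations (name, s3_key, updated_at)
            VALUES (%s, %s, now())
            ON CONFLICT (name) DO UPDATE
              SET s3_key = EXCLUDED.s3_key, updated_at = now()
            """,
            (name, key),
        )

def fetch_model(name, path):
    """Look up the S3 key in PostgreSQL and download the file to disk."""
    with conn, conn.cursor() as cur:
        cur.execute("SELECT s3_key FROM model_locations WHERE name = %s", (name,))
        (key,) = cur.fetchone()
    s3.download_file(BUCKET, key, path)
```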