
I have two PCs that need to share TensorFlow models (HDF5 format) in a federated learning manner via a PostgreSQL database.

The models will be trained locally on both machines and then transferred to the database along with their training history. This transfer will be repeated over multiple cycles on a specific schedule.

I searched online for ways to transfer the files via a PostgreSQL database, but all the solutions I found cover tabular data transfer (e.g. CSV data), not arbitrary file formats like HDF5.
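
To make it concrete, this is roughly the kind of write I have in mind on each cycle, using psycopg2. The table and column names are placeholders I made up, and I don't know whether storing the file in a `bytea` column is the right approach:

```python
# Sketch only -- assumes a table along these lines (names are placeholders):
#   CREATE TABLE model_checkpoints (
#       id          SERIAL PRIMARY KEY,
#       machine_id  TEXT        NOT NULL,
#       cycle       INTEGER     NOT NULL,
#       model_file  BYTEA       NOT NULL,   -- raw bytes of the .h5 file
#       history     JSONB,                  -- per-epoch loss/metrics
#       created_at  TIMESTAMPTZ DEFAULT now()
#   );
import psycopg2
from psycopg2.extras import Json

def upload_model(conn_params, machine_id, cycle, h5_path, history_dict):
    """Store one cycle's trained model file and its training history."""
    with open(h5_path, "rb") as f:
        model_bytes = f.read()  # whole HDF5 file as raw bytes
    conn = psycopg2.connect(**conn_params)
    try:
        with conn, conn.cursor() as cur:  # `with conn` commits on clean exit
            cur.execute(
                "INSERT INTO model_checkpoints (machine_id, cycle, model_file, history) "
                "VALUES (%s, %s, %s, %s)",
                (machine_id, cycle, psycopg2.Binary(model_bytes), Json(history_dict)),
            )
    finally:
        conn.close()

# e.g. after model.save("cycle_3.h5"):
# upload_model(conn_params, "pc-1", 3, "cycle_3.h5", history.history)
```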

Can anyone help me, even with just a roadmap toward a solution? Suggestions for tutorials or examples covering similar scenarios would also be appreciated.
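
For completeness, on the receiving machine I imagine the reverse direction would look something like this (same made-up schema as above; again only a guess at the approach):

```python
# Reverse-direction sketch: fetch the stored .h5 bytes and load them with Keras.
import os
import tempfile
import psycopg2
import tensorflow as tf

def download_model(conn_params, machine_id, cycle):
    """Fetch the latest stored model for a cycle and return it as a Keras model."""
    conn = psycopg2.connect(**conn_params)
    try:
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT model_file FROM model_checkpoints "
                "WHERE machine_id = %s AND cycle = %s "
                "ORDER BY created_at DESC LIMIT 1",
                (machine_id, cycle),
            )
            row = cur.fetchone()
    finally:
        conn.close()
    if row is None:
        return None
    model_bytes = bytes(row[0])  # bytea values come back as memoryview
    # Keras' HDF5 loader expects a path on disk, so round-trip through a temp file.
    fd, tmp_path = tempfile.mkstemp(suffix=".h5")
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(model_bytes)
        return tf.keras.models.load_model(tmp_path)
    finally:
        os.remove(tmp_path)
```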

Thanks for your help in advance!

  • While it's possible to store large binary data blobs in Postgres, this doesn't sound like it's the best solution. You say you already have an ssh connection between the PCs? Then just use that to copy the file. – Bergi Jan 23 '23 at 04:04
  • @Bergi Thanks for your quick answer! SSH would work, but in this project, I need the PostgreSQL database to host the whole data transfer between the machines rather than relying on the SSH connection. – Mr. Gulliver Jan 23 '23 at 08:12

0 Answers