I'm working on federating a UNet semantic segmentation workflow using Flower and PyTorch. Right now I can load the data and run centralized training, but once I try to federate it, the model parameters are not being loaded properly on the clients. To keep the question short, I've linked a Google Colab notebook with the code and the log output:
https://colab.research.google.com/drive/1dmlH4QTX_ZwicbSfwVeCw55BXRnV6PY4?usp=sharing
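In case it helps anyone hitting the same symptom: a common cause of parameters "not loading" in a Flower + PyTorch setup is round-tripping weights through `model.parameters()` instead of `model.state_dict()` (which drops buffers such as BatchNorm running stats), or loading with `strict=False` so mismatches fail silently. Below is a minimal sketch of the usual client pattern; `SegmentationClient` and `train` are placeholder names for illustration, not taken from the notebook.

```python
from collections import OrderedDict

import flwr as fl
import torch


class SegmentationClient(fl.client.NumPyClient):
    """Hypothetical Flower client wrapping a PyTorch UNet."""

    def __init__(self, model, trainloader, device="cpu"):
        self.model = model
        self.trainloader = trainloader
        self.device = device

    def get_parameters(self, config):
        # Export via state_dict(), not model.parameters(): state_dict()
        # also carries non-trainable buffers (e.g. BatchNorm running
        # stats), and its key order is what set_parameters relies on.
        return [v.cpu().numpy() for v in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        # Zip the incoming arrays back onto the state_dict keys in the
        # same order they were exported; strict=True makes any shape or
        # key mismatch fail loudly instead of being skipped silently.
        keys = self.model.state_dict().keys()
        state_dict = OrderedDict(
            (k, torch.tensor(v)) for k, v in zip(keys, parameters)
        )
        self.model.load_state_dict(state_dict, strict=True)

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        # train(...) stands in for the centralized training loop that
        # already works, run here for one local epoch.
        train(self.model, self.trainloader, epochs=1, device=self.device)
        return self.get_parameters(config), len(self.trainloader.dataset), {}
```

With `strict=True`, a wrong key order or a missing buffer raises an error immediately, which is usually enough to pinpoint where the federated path diverges from the centralized one.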
I'm leaving this up here in case someone is trying to implement a similar workflow. Feel free to reach out.