
I'm currently using scp, but it seems to get slower every day. I'm up to 30 servers, and I need to shuffle around 3-5k files per day, with an average size of 200-400 MB.

I tried rsync before as well, with even more pitiful results.

Each server should be able to transfer files to any other server in the pool (I'm adding about 7 servers a month now), so key management is pretty crucial too.
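For the key-management side, one common pattern (a sketch only, not something from the question; the key path is illustrative) is a single pool-wide deploy keypair whose public half is appended to `authorized_keys` on every server, so any node can reach any other:

```shell
# Hypothetical sketch: one shared deploy key for the whole pool.
# Generate a passphrase-less ed25519 keypair (path is illustrative).
ssh-keygen -t ed25519 -N '' -f /tmp/pool_deploy_key -q

# The public half would then be appended to ~/.ssh/authorized_keys on
# each server as it joins the pool, e.g.:
#   ssh newhost 'cat >> ~/.ssh/authorized_keys' < /tmp/pool_deploy_key.pub
cat /tmp/pool_deploy_key.pub
```

A shared key keeps onboarding a new server to one step, at the cost of a wider blast radius if the private key leaks; per-server keys plus a central distribution script is the more conservative variant.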

  • I agree with Shane Madden that without more information it's difficult to provide a reasonable answer. Are these physical or virtual servers? What kind of storage? To me this almost seems like an issue that could be addressed on the storage level (replication etc.) but that would only work if your infrastructure is supporting it. – Reality Extractor Apr 26 '11 at 07:03

1 Answer


Rsync is pretty fast; under normal circumstances it should be faster than scp.

If that's not good enough, where's the bottleneck? A common source device for all of the files?

Without more information, I'll throw BitTorrent out as an option. For instance, Twitter built a tool for mass code deployment using BitTorrent: Murder.

Shane Madden