Some code of mine uses a queue to upload files to an FTP server. When files get queued, a connection attempt is made and, if successful, the files are uploaded. Once the queue is empty, it disconnects from the server. Pretty straightforward.
The queue can, and will, be accessed by multiple threads. Sometimes the queue drains, so the code disconnects, but new files get queued immediately afterwards and a new connect/upload cycle starts. In extreme cases only a single file is uploaded between connection and disconnection, and the process repeats for several minutes or even hours.
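To make the pattern concrete, here's a rough sketch of what the code effectively does (Python/ftplib is used purely for illustration; the host, credentials and the `upload_queue` name are placeholders, not my actual code):

```python
import os
import queue
import threading
from ftplib import FTP

# Placeholder queue of local file paths; multiple threads put() into it.
upload_queue: "queue.Queue[str]" = queue.Queue()

def worker() -> None:
    while True:
        path = upload_queue.get()            # block until a file is queued
        ftp = FTP("ftp.example.com")         # connect as soon as work arrives
        ftp.login("user", "password")
        try:
            while True:
                with open(path, "rb") as f:
                    ftp.storbinary(f"STOR {os.path.basename(path)}", f)
                upload_queue.task_done()
                path = upload_queue.get_nowait()  # more work? reuse the session
        except queue.Empty:
            pass
        finally:
            ftp.quit()                       # queue drained: disconnect immediately
        # If another file is queued a moment later, the whole
        # connect/login/upload/quit cycle repeats for that single file.

threading.Thread(target=worker, daemon=True).start()
```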
I find this unacceptable and will change the code accordingly, but I wonder:
Question
Say we ignore the obvious waste of time and resources (all those connections and disconnections are unnecessary, to say the least).
Does the FTP server mind? Is such behavior frowned upon; is it rude? Could it even be seen as borderline flooding or hammering of some sort?