I have written a plugin in Lua for a game that sends player information in a 512-byte UDP packet to a remote server. The server reads the data from each packet and aggregates all player information into an XML file, which can then be viewed on the web so all players can see each other's current state.
I programmed the server in Java using a DatagramSocket to handle the incoming packets, but I have noticed some strange behavior. After a certain period of time, the DatagramSocket appears to temporarily stop receiving packets for about 10-12 seconds, then resumes normal behavior (no exceptions are thrown that I can see). There is definitely a relationship between how often the clients send packets and how quickly this happens: if I increase the clients' update frequency, the DatagramSocket "fails" sooner.
It may be worth mentioning that each packet received spawns a thread which handles the data in that packet. I am running the server on Linux, if it makes a difference!
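For reference, the receive loop looks roughly like this (a simplified, self-contained sketch of the pattern rather than my exact code; the class name, payload, and the loopback self-send are just there to make it runnable on its own):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class UdpServerSketch {
    public static void main(String[] args) throws Exception {
        // Bind on an ephemeral port; the real server uses a fixed port.
        DatagramSocket socket = new DatagramSocket(0);

        // Self-test only: send one packet to our own socket over loopback,
        // standing in for a game client's 512-byte update.
        byte[] payload = "player-state".getBytes(StandardCharsets.UTF_8);
        socket.send(new DatagramPacket(payload, payload.length,
                InetAddress.getLoopbackAddress(), socket.getLocalPort()));

        // Receive one packet, then hand the data off to a new thread,
        // as described above (the real server loops forever here).
        byte[] buf = new byte[512];
        DatagramPacket in = new DatagramPacket(buf, buf.length);
        socket.receive(in); // blocks until a packet arrives

        // Copy out the payload before the buffer is reused.
        byte[] data = Arrays.copyOf(in.getData(), in.getLength());
        Thread handler = new Thread(() -> {
            // The real handler parses the player data and merges it
            // into the aggregate XML file; here we just print it.
            System.out.println(new String(data, StandardCharsets.UTF_8));
        });
        handler.start();
        handler.join();
        socket.close();
    }
}
```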
Does anyone know what could be causing this sort of behavior to occur?
Andrew