
BACKGROUND:

I'm implementing a client that communicates with a server over a WebSocket. The core of the client application is built on top of LibGDX, and it is deployed solely in browser form using GWT (HTML5/JavaScript). This is for a side project - a fast-paced (high network traffic), online, browser-based game.

Without providing any source - the client code is essentially an empty LibGDX application with a correct implementation of gwt-websockets (a commonly recommended GWT wrapper for JavaScript WebSockets).
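For concreteness, a bare-bones client along the lines described above might look roughly like the sketch below. This is illustrative only, not the project's actual source: it assumes the sksamuel gwt-websockets Websocket/WebsocketListener API, and the class name and server URL are hypothetical.

    import com.sksamuel.gwt.websockets.Websocket;
    import com.sksamuel.gwt.websockets.WebsocketListener;

    // Sketch of a minimal network wrapper (hypothetical class name and URL).
    public class NetworkClient {
        private final Websocket socket = new Websocket("ws://example.com:8080/game");

        public void connect() {
            socket.addListener(new WebsocketListener() {
                @Override public void onOpen() { /* connection established */ }
                @Override public void onClose() { /* connection dropped */ }
                @Override public void onMessage(String msg) { /* handler intentionally left empty */ }
            });
            socket.open();
        }

        public void send(String packet) {
            socket.send(packet);
        }
    }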

SYMPTOMS OF THE PROBLEM:

(pre-note: This appears to be a purely client-side issue) I start the application and connect to the server. On 1000 ms intervals the client application sends a packet to the server and receives a packet back - no problems. There are also no problems when I send and/or receive two, three, or even five packets per second.

Speeding up that process, however, is where the trouble starts: sending more than roughly 10 packets per second (100 ms intervals) from the client to the server, OR receiving more than 10 packets per second (100 ms intervals), OR both sending and receiving at that rate causes the client's FPS to drop slowly while the packets are pouring in, out, or both. The more network communication, the lower the FPS floor becomes (it slowly drains from 60...55...50...45...all the way down to 1 if you keep sending packets). Meanwhile, the server is completely healthy.
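To make that packet rate concrete: the sending side of the test is essentially a fixed-interval timer. A minimal sketch using LibGDX's Timer is below (illustrative only; PacketPump and the "ping" payload are hypothetical stand-ins for my actual network code, and NetworkClient is the sketch from the background section).

    import com.badlogic.gdx.utils.Timer;

    // Sketch: fire a small packet every 100 ms (~10 packets/second), the rate
    // at which the FPS drain starts to appear.
    public class PacketPump {
        public static void start(final NetworkClient network) {
            Timer.schedule(new Timer.Task() {
                @Override
                public void run() {
                    network.send("ping"); // arbitrary payload
                }
            }, 0f, 0.1f); // no initial delay, then every 0.1 seconds
        }
    }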

Here's the weird thing that makes me suspect some sort of buffer overflow on the client - after a packet has been "handled" (note: my Websocket.onHandle() method is empty), the FPS jumps back up to ~60 as if nothing ever happened. From that point, if a packet is sent or received, the FPS drops right back down to the floor value (except a bit worse each time this occurs).

FURTHER DEBUGGING:

  • I've looked into the possibility of memory leaks on my end, but after a 15-hour debugging session I doubt my code is at fault here (not to mention it is essentially empty). My network class that communicates via WebSocket is literally a bare-bones implementation, and the symptoms only occur upon network activity (a frame-rate logging sketch follows this list).

  • I've read about garbage collection potentially causing undesirable performance hits in a GWT deployment. I don't know much about this, other than that I cannot rule it out.

  • I doubt this matters, but my server uses Java-WebSocket to communicate with the client's gwt-websockets.
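As a rough illustration of how the FPS drop can be correlated with traffic (this is a sketch, not code from the project; the class and counter names are hypothetical, and the real message handler stays empty):

    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;

    public class DebugGame extends ApplicationAdapter {
        // Incremented from the WebSocket onMessage callback; read and reset here once per second.
        public static int messagesThisSecond = 0;
        private float accumulator = 0f;

        @Override
        public void render() {
            accumulator += Gdx.graphics.getDeltaTime();
            if (accumulator >= 1f) {
                Gdx.app.log("net-debug", "fps=" + Gdx.graphics.getFramesPerSecond()
                        + " msgs/s=" + messagesThisSecond);
                messagesThisSecond = 0;
                accumulator = 0f;
            }
            // ... normal (essentially empty) rendering ...
        }
    }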

MY BEST GUESS:

I'm really stumped here. My leading suspicion is that there is some sort of bug in gwt-websockets around buffering and the handling of frequent network traffic, or possibly a memory leak.

  • I am not sure what your app is doing exactly, but check the render method in your libgdx project. If you are creating objects (e.g. your packets) there in the loop, that's what's causing the drop in FPS, since the GC has to intervene very often, and as a whole this results in massive memory overhead – george Jul 16 '15 at 10:45
  • Thanks for the thought, however that isn't the case in the code. Also, the FPS-dropping behavior persists when the client is idling (after connecting to the server) and the server is sending frequent packets. Once the server stops sending packets to the client, the client FPS jumps back up to the expected value (~60 FPS). – VILLAIN bryan Jul 16 '15 at 11:33
