
I am developing an application with WebSockets on Wildfly. I do not know how Wildfly manages threads for WebSockets. I guess there is some thread pool, but I noticed that one WebSocket endpoint has only one thread. Wildfly uses the Undertow implementation of WebSockets. I am not sure whether it is a good idea to create an additional thread per WebSocket endpoint. My requirements are a little unusual. I use the JSON-RPC protocol. The server is both a JSON-RPC client and a JSON-RPC server, and the client is likewise both a JSON-RPC client and a JSON-RPC server. This means the server can send JSON-RPC requests to the client, and the client can also send JSON-RPC requests to the server. One way to implement this is to open two WebSocket connections: the first connection would carry client requests and server responses, the second would carry server requests and client responses. I am not sure opening two WebSocket connections is a good idea, because when I googled the idea it was rather discouraged; one WebSocket connection is enough, it just has to be handled properly.
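For clarity, this is roughly what the traffic looks like when both directions share one connection. The payloads and method names below are made up; only the JSON-RPC 2.0 structure matters, i.e. requests carry "method", responses carry "result" or "error" plus the id of the request they answer:

// Illustrative JSON-RPC 2.0 payloads only; method names are invented.
public final class JsonRpcExamples
{
    // client -> server request, answered by a server -> client response
    static final String CLIENT_REQUEST  = "{\"jsonrpc\":\"2.0\",\"id\":1,\"method\":\"getStatus\"}";
    static final String SERVER_RESPONSE = "{\"jsonrpc\":\"2.0\",\"id\":1,\"result\":\"OK\"}";

    // server -> client request, answered by a client -> server response
    static final String SERVER_REQUEST  = "{\"jsonrpc\":\"2.0\",\"id\":\"s-7\",\"method\":\"refreshConfig\"}";
    static final String CLIENT_RESPONSE = "{\"jsonrpc\":\"2.0\",\"id\":\"s-7\",\"result\":true}";
}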

I noticed that a WebSocket endpoint has one thread. I mean that the @OnMessage method is executed on a single thread: if two messages are sent to the WebSocket endpoint, the second message is processed only after processing of the first one has finished. My @OnMessage method can receive both requests and responses, and I see a risk that they can delay each other. For example, the client sends a JSON-RPC request to the server, and the server's @OnMessage method is busy processing that request and preparing the JSON-RPC response. In the meantime, the server sends an unrelated JSON-RPC request to the client from another thread. Let's assume this request is processed very quickly and the client sends the JSON-RPC response back to the server. The server cannot receive that response quickly, because the @OnMessage method is still busy.

import java.io.IOException;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint( value = "/endpoint" )
public class MyWebSocketEndpoint
{

    private Session session;

    @OnOpen
    public void onOpen( Session aSession )
    {
        this.session = aSession;
    }

    @OnMessage
    public void onMessage( String aMessage )
    {
        if( isRequest( aMessage ) )
        {
            // process request and prepare response
            String response = "JsonRpcResponse....";
            try
            {
                session.getBasicRemote()
                    .sendText( response );
            }
            catch( IOException aE )
            {
                aE.printStackTrace();
            }
        }
        else if( isResponse( aMessage ) )
        {
            // process response
        }

    }
    ...  
}
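The isRequest / isResponse helpers are omitted above. For illustration only, this is one way they could be implemented with the JSON-P API that ships with Java EE, based on the JSON-RPC 2.0 members described earlier; it is a sketch, not my production code:

import java.io.StringReader;

import javax.json.Json;
import javax.json.JsonObject;

public final class JsonRpcMessages
{
    private JsonRpcMessages()
    {
    }

    // JSON-RPC 2.0 requests carry a "method" member.
    public static boolean isRequest( String aMessage )
    {
        JsonObject json = Json.createReader( new StringReader( aMessage ) ).readObject();
        return json.containsKey( "method" );
    }

    // JSON-RPC 2.0 responses carry "result" or "error".
    public static boolean isResponse( String aMessage )
    {
        JsonObject json = Json.createReader( new StringReader( aMessage ) ).readObject();
        return json.containsKey( "result" ) || json.containsKey( "error" );
    }
}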

Response processing is rather fast in my case. Processing a request can take some time, so I think requests should be processed on a separate thread, but I do not want a single WebSocket endpoint to process two requests simultaneously. A single endpoint should process only one request at a time; other requests should be queued. I mean something like this:

import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint( value = "/endpoint" )
public class MyWebSocketEndpoint
{

    private Session session;
    final ExecutorService executorService = Executors.newSingleThreadExecutor();

    @OnOpen
    public void onOpen( Session aSession )
    {

        this.session = aSession;
    }

    @OnMessage
    public void onMessage( String aMessage )
    {
        if( isRequest( aMessage ) )
        {
            executorService.execute( () -> {// process request and prepare response
                String response = "JsonRpcResponse....";
                try
                {
                    session.getBasicRemote()
                        .sendText( response );
                }
                catch( IOException aE )
                {
                    aE.printStackTrace();
                }
            } );
        }
        else if( isResponse( aMessage ) )
        {
            // process response
        }

    }
}
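To keep the isResponse branch cheap no matter what the executor is doing, I was also thinking of correlating responses with pending server-initiated requests by their JSON-RPC id. The class below is only a placeholder sketch; PendingRequests and its methods are made up for illustration, and the id extraction from the JSON is left out:

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

import javax.websocket.Session;

public class PendingRequests
{
    // Server-initiated requests waiting for a response, keyed by JSON-RPC id.
    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    // Called from whichever thread wants to send a request to the client.
    public CompletableFuture<String> send( Session aSession, String aId, String aRequestJson )
        throws IOException
    {
        CompletableFuture<String> future = new CompletableFuture<>();
        pending.put( aId, future );
        aSession.getBasicRemote().sendText( aRequestJson );
        return future;
    }

    // Called from the isResponse branch of @OnMessage; it only completes
    // the matching future, so the WebSocket thread is released quickly.
    public void complete( String aId, String aResponseJson )
    {
        CompletableFuture<String> future = pending.remove( aId );
        if( future != null )
        {
            future.complete( aResponseJson );
        }
    }
}

With something like this, @OnMessage would only complete a future when a response arrives, and whichever thread sent the original request would receive the result through the returned CompletableFuture.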

I am afraid the ExecutorService approach is not a good idea because:

  1. An application in a Java EE container should not create its own threads. The Java EE container manages threads, and the application should only use managed threads (see the sketch after this list).
  2. I am afraid it can be overkill. Each WebSocket endpoint would use an additional thread, and the number of threads could grow too much.
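Regarding point 1, the only container-friendly variant I can think of is sketched below: a ManagedExecutorService injected with @Resource, with the requests of each connection chained so they stay serialized even though the pool is shared by all endpoints. I have not verified that this injection actually works in a Wildfly @ServerEndpoint, so treat it as an assumption:

import java.io.IOException;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicReference;

import javax.annotation.Resource;
import javax.enterprise.concurrent.ManagedExecutorService;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint( value = "/endpoint" )
public class MyWebSocketEndpoint
{

    // Container-managed pool (JSR 236), shared by all endpoints,
    // so no dedicated thread per connection. Assumed to be injectable here.
    @Resource
    private ManagedExecutorService managedExecutor;

    private Session session;

    // Tail of this connection's task chain; chaining serializes the
    // requests of one connection even though the pool is shared.
    private final AtomicReference<CompletableFuture<Void>> tail =
        new AtomicReference<>( CompletableFuture.<Void> completedFuture( null ) );

    @OnOpen
    public void onOpen( Session aSession )
    {
        this.session = aSession;
    }

    @OnMessage
    public void onMessage( String aMessage )
    {
        if( JsonRpcMessages.isRequest( aMessage ) )
        {
            // Each request runs after the previous one of this connection.
            tail.updateAndGet( prev -> prev.thenRunAsync( () -> handleRequest( aMessage ), managedExecutor ) );
        }
        else if( JsonRpcMessages.isResponse( aMessage ) )
        {
            // process response (fast) directly on the WebSocket thread
        }
    }

    private void handleRequest( String aMessage )
    {
        // Exceptions are caught here so the chain never completes
        // exceptionally, otherwise later requests would be skipped.
        try
        {
            String response = "JsonRpcResponse....";
            session.getBasicRemote()
                .sendText( response );
        }
        catch( Exception aE )
        {
            aE.printStackTrace();
        }
    }
}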

How do I handle my case properly?

Mariusz
