
The code below is a Servlet 3.1 non-blocking IO demo:

UploadServlet:

@WebServlet(name = "UploadServlet", urlPatterns = {"/UploadServlet"}, asyncSupported=true)
public class UploadServlet extends HttpServlet {
    protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        AsyncContext context = request.startAsync();
        // set up async listener
        context.addListener(new AsyncListener() {
            public void onComplete(AsyncEvent event) throws IOException {
                event.getSuppliedResponse().getOutputStream().print("Complete");

            }

            public void onError(AsyncEvent event) {
                System.out.println(event.getThrowable());
            }

            public void onStartAsync(AsyncEvent event) {
            }

            public void onTimeout(AsyncEvent event) {
                System.out.println("my asyncListener.onTimeout");
            }
        });
        ServletInputStream input = request.getInputStream();
        ReadListener readListener = new ReadListenerImpl(input, response, context);
        input.setReadListener(readListener);
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {

    }
}

ReadListenerImpl:

public class ReadListenerImpl implements ReadListener{
    private ServletInputStream input = null;
    private HttpServletResponse res = null;
    private AsyncContext ac = null;
    private Queue<String> queue = new LinkedBlockingQueue<>();
    ReadListenerImpl(ServletInputStream in, HttpServletResponse r, AsyncContext c) {
        input = in;
        res = r;
        ac = c;
    }
    public void onDataAvailable() throws IOException {
        System.out.println("Data is available");

        StringBuilder sb = new StringBuilder();
        int len = -1;
        byte b[] = new byte[1024];
        while (input.isReady() && (len = input.read(b)) != -1) {
            String data = new String(b, 0, len);
            sb.append(data);
        }
        queue.add(sb.toString());
    }
    public void onAllDataRead() throws IOException {
        System.out.println("Data is all read");

        // now all data are read, set up a WriteListener to write
        ServletOutputStream output = res.getOutputStream();
        WriteListener writeListener = new WriteListenerImpl(output, queue, ac);
        output.setWriteListener(writeListener);
    }
    public void onError(final Throwable t) {
        ac.complete();
        t.printStackTrace();
    }
}

WriteListenerImpl:

public class WriteListenerImpl implements WriteListener{
    private ServletOutputStream output = null;
    private Queue<String> queue = null;
    private AsyncContext context = null;

    WriteListenerImpl(ServletOutputStream sos, Queue<String> q, AsyncContext c) {
        output = sos;
        queue = q;
        context = c;
    }

    public void onWritePossible() throws IOException {
        // write while there is data and the output is ready; never block
        while (queue.peek() != null && output.isReady()) {
            String data = queue.poll();
            output.print(data);
        }
        // all queued data has been written: complete the async request
        if (queue.peek() == null) {
            context.complete();
        }
    }

    public void onError(final Throwable t) {
        context.complete();
        t.printStackTrace();
    }
}

The code above works fine. I want to know how it works, and what the differences are compared with a blocking IO servlet.

Tony

1 Answer


Reading input data:

In the blocking scenario, when you read data from the input stream, each read blocks until data is available. For a remote client sending large data this can take a long time, which means the thread is held for a long time.

For example, consider inbound data received over 2 minutes at regular intervals in 13 chunks. With a blocking read you read the first chunk, hold the thread for ~10 seconds, read the next chunk, hold the thread for ~10 seconds, and so on. In this case the thread might spend less than a second actually processing data and almost 120 seconds blocked waiting for it. With a server that has 10 threads, the throughput is then 10 clients every 2 minutes.
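To make the blocking behaviour concrete, here is a hypothetical stand-alone sketch (plain Java, no servlet container): a PipedOutputStream plays the slow client, and each blocking read() pins the reading thread until the next chunk arrives.

```java
import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

public class BlockingReadDemo {
    public static void main(String[] args) throws Exception {
        PipedOutputStream client = new PipedOutputStream();
        PipedInputStream serverInput = new PipedInputStream(client);

        // "Client" thread: sends 3 chunks with a pause between them.
        Thread clientThread = new Thread(() -> {
            try {
                for (int i = 0; i < 3; i++) {
                    client.write(("chunk-" + i).getBytes());
                    Thread.sleep(500); // simulate a slow network
                }
                client.close();
            } catch (IOException | InterruptedException e) {
                e.printStackTrace();
            }
        });
        clientThread.start();

        // "Server" (this thread): read() blocks until data is available,
        // so the thread is held for the whole transfer.
        long start = System.currentTimeMillis();
        byte[] buf = new byte[1024];
        int len;
        while ((len = serverInput.read(buf)) != -1) {
            System.out.println("read " + len + " bytes after "
                    + (System.currentTimeMillis() - start) + " ms");
        }
        System.out.println("thread held for ~"
                + (System.currentTimeMillis() - start) + " ms in total");
        clientThread.join();
    }
}
```

With the ~500 ms gaps the reading thread is held for roughly 1.5 seconds while doing almost no work, which is the blocking problem scaled down.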

In the non-blocking scenario the ReadListener reads data while isReady() returns true (it must check isReady() before each read), but when isReady() returns false the ReadListener returns and the thread is relinquished. When more data arrives, onDataAvailable() is called and the ReadListener reads again until isReady() returns false.

In the same example, this time the thread reads the data and returns, is woken up 10 seconds later, reads the next chunk and returns, is woken up 10 seconds later, and so on. While it has still taken 2 minutes to read the data, the thread(s) needed to do it were only active for less than a second and were available for other work. So while the specific request still takes 2 minutes, a server with 10 threads can now process many more requests every 2 minutes.
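The throughput argument above can be sketched as a back-of-the-envelope calculation. Note the 120 s request time and the ~1 s of actual thread work are the assumed numbers from this example, not measurements.

```java
public class ThroughputSketch {
    public static void main(String[] args) {
        int threads = 10;
        double windowSeconds = 120.0;          // the 2-minute window
        double threadSecondsBlocking = 120.0;  // thread held for the whole request
        double threadSecondsNonBlocking = 1.0; // thread held only while data is ready

        // requests per window = available thread-seconds / thread-seconds per request
        int blocking = (int) (threads * windowSeconds / threadSecondsBlocking);
        int nonBlocking = (int) (threads * windowSeconds / threadSecondsNonBlocking);

        System.out.println("blocking:     ~" + blocking + " requests per 2 minutes");
        System.out.println("non-blocking: ~" + nonBlocking + " requests per 2 minutes");
    }
}
```

This prints ~10 requests per 2 minutes for blocking versus ~1200 for non-blocking: the same 10 threads go much further when they are not parked waiting on slow clients.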

Sending response data:

The scenario is similar for sending data and is useful for large responses. For example, sending a large response in 13 chunks may take 2 minutes in the blocking scenario because the client takes 10 seconds to acknowledge receipt of each chunk and the thread is held while waiting. In the non-blocking scenario the thread is held only while sending the data, not while waiting to be able to send again. So, again, for the particular client the response is not sent any more quickly, but the thread is held for a fraction of the time and the throughput of the server can increase significantly.
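The write-side contract can be modelled without a container. In this hypothetical sketch, FakeOutput stands in for ServletOutputStream, its credits for the free space in the socket buffer, and refill() for the container draining that buffer before invoking onWritePossible() again:

```java
import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

public class WriteContractSketch {
    static class FakeOutput {
        int credits = 2;                        // writes that fit in the buffer
        boolean isReady() { return credits > 0; }
        void print(String s) { credits--; System.out.println("wrote: " + s); }
        void refill() { credits = 2; }          // buffer drained by the "client"
    }

    public static void main(String[] args) {
        Queue<String> queue = new ArrayDeque<>(List.of("a", "b", "c", "d", "e"));
        FakeOutput out = new FakeOutput();

        // Same loop shape as WriteListenerImpl.onWritePossible(): write while
        // the output is ready AND data remains, then return without blocking.
        Runnable onWritePossible = () -> {
            while (queue.peek() != null && out.isReady()) {
                out.print(queue.poll());
            }
        };

        onWritePossible.run();          // writes a, b; output no longer ready
        while (!queue.isEmpty()) {
            out.refill();               // "client" acknowledged the chunks
            onWritePossible.run();      // container re-invokes the listener
        }
        System.out.println("complete"); // here the servlet would call context.complete()
    }
}
```

The key point the sketch shows: when isReady() returns false the listener simply returns, and the queued data is picked up on the next onWritePossible() call instead of the thread blocking on a slow client.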

The examples here are contrived, but they illustrate the point: non-blocking I/O does not make a single request any faster than blocking I/O. It increases server throughput when the application can read input data faster than the client can send it, and/or send response data faster than the client can receive it.

mmulholl