
I want to read from a (Tomcat servlet) InputStream and copy its (large) content to a file asynchronously using AsynchronousFileChannel. I can do this with a regular FileChannel, and I have read about the missing transferTo in NIO.2. But if I use the Java 7 AsynchronousFileChannel, I always get a BufferOverflowException.

    try (AsynchronousFileChannel output = AsynchronousFileChannel.open(path,
                 StandardOpenOption.CREATE, StandardOpenOption.WRITE);
         ReadableByteChannel input = Channels.newChannel(inputStream)) { // servlet InputStream
        output.lock(); // need to lock; this is one key reason to use a channel

        ByteBuffer buf = ByteBuffer.allocate(4096);
        int position = 0;
        int count;
        Future<Integer> lastWrite = null;
        while ((count = input.read(buf)) >= 0 || buf.position() > 0) {
            logger.info("read {} bytes", count);
            buf.flip();
            output.write(buf, position);
            if (count > 0) position += count;
            buf.compact();
        }
        if (lastWrite != null) lastWrite.get(10, TimeUnit.SECONDS);
    }

Then, when running it, I get:

14:12:30.597 [http-bio-9090-exec-3] INFO  c.b.p.c.BlobUploadServlet - read 4096 bytes
14:12:30.597 [http-bio-9090-exec-3] INFO  c.b.p.c.BlobUploadServlet - read 0 bytes
... many more with 0 bytes read ...
14:12:30.597 [http-bio-9090-exec-3] INFO  c.b.p.c.BlobUploadServlet - read 3253 bytes
14:12:30.605 [http-bio-9090-exec-3] ERROR c.b.p.c.BlobUploadServlet - null
java.nio.BufferOverflowException: null
at java.nio.HeapByteBuffer.put(HeapByteBuffer.java:183) ~[na:1.7.0_17]
at java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:393) ~[na:1.7.0_17]

How can I fix the BufferOverflow? Also what's the proper way to suspend the loop and wait when 0 bytes are read?

  • Still looking for an answer for this or did you solve it? – Jose Martinez Feb 10 '15 at 18:41
  • Where are you getting `inputStream` from? The error is caused by `HeapByteBuffer.put` getting called with a byte array that is too large for it to fit, but `Channels.ReadableByteChannel.read` appears to be correct, unless `inputStream.read` returns a larger size than the maximum passed to it. (That would be a broken implementation of `InputStream`, but the source code of `HeapByteBuffer` and `ReadableByteChannel` seem to be correct.) – Cel Skeggs Mar 29 '15 at 01:21

1 Answer


Too late for the original poster but anyway.

I've tried to reproduce your issue, but with a slightly different sample: I duplicated a large file with the aid of channels:

public static void main(String[] args) throws IOException, InterruptedException, ExecutionException {
    final InputStream inputStream = new FileInputStream("/home/me/Store/largefile");
    final ReadableByteChannel inputChannel = Channels.newChannel(inputStream);
    final AsynchronousFileChannel outputChannel = AsynchronousFileChannel.open(
                    FileSystems.getDefault().getPath("/home/me/Store/output"),
                    StandardOpenOption.CREATE, StandardOpenOption.WRITE);
    outputChannel.lock();

    final ByteBuffer buffer = ByteBuffer.allocate(4096);
    int position = 0;
    int receivedBytes = 0;
    Future<Integer> lastWrite = null;

    while ((receivedBytes = inputChannel.read(buffer)) >= 0
            || buffer.position() != 0) {
        System.out.println("Received bytes: " + receivedBytes);
        System.out.println("Buffer position: " + buffer.position());
        buffer.flip();
        lastWrite = outputChannel.write(buffer, position);
        // do extra work while the asynchronous channel is writing bytes to disk;
        // in the ideal case much more work can be done than these simple calculations
        if (receivedBytes > 0) position += receivedBytes;
        // extra work is done; we must wait, because we reuse a single buffer
        // which may still be busy with the pending write
        if (lastWrite != null) lastWrite.get();
        buffer.compact();
    }

    outputChannel.close();
    inputChannel.close();
    inputStream.close();
}

In each iteration of the loop we read a chunk of data from the input stream and then 'push' that chunk to the output channel. The current thread doesn't wait for the write to complete; it proceeds, so we can do extra work. But before starting a new iteration we must wait for the write to complete. Try to comment out

if (lastWrite != null) lastWrite.get();

and you'll get

java.nio.BufferOverflowException.
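The mechanism can be shown deterministically with a plain ByteBuffer (a hedged sketch of my own; in the real code the overflow comes from racing the channel's background writer thread). If compact() runs before the pending write has consumed any bytes, the buffer stays full, so the next read has nowhere to put data:

```java
import java.nio.BufferOverflowException;
import java.nio.ByteBuffer;

public class OverflowDemo {
    // Simulates one loop iteration in which the asynchronous write
    // has not yet consumed any bytes when compact() is called.
    static boolean overflows() {
        ByteBuffer buf = ByteBuffer.allocate(8);
        buf.put(new byte[8]);   // the read side fills the buffer completely
        buf.flip();             // handed to the writer: position 0, limit 8
        // the pending write has consumed nothing, so position is still 0
        buf.compact();          // all 8 bytes kept as "unwritten": buffer is full again
        try {
            buf.put((byte) 1);  // the next read tries to add more data
            return false;
        } catch (BufferOverflowException e) {
            return true;        // no room left: BufferOverflowException
        }
    }

    public static void main(String[] args) {
        System.out.println(overflows()); // prints "true"
    }
}
```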

Your code gave me the tip to use a Future for handling the write operation. But in your snippet the Future is never assigned inside the loop, so you never actually wait for a pending write.

Also I've omitted some of the additional tuning from your snippet (just for simplicity; it isn't needed when working with plain files).
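For completeness, the pattern can be packaged as a small copy routine (a sketch under my own assumptions; the class and method names are made up, and the servlet stream is replaced by an in-memory one so it runs standalone). Advancing the position by the count the Future returns, rather than by the count read, also stays correct if the channel ever performs a partial write:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.ByteBuffer;
import java.nio.channels.AsynchronousFileChannel;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.Future;

public class AsyncCopy {
    // Copies everything from in to the file at target, waiting on each
    // asynchronous write before reusing the single shared buffer.
    static long copy(InputStream in, Path target) throws Exception {
        long position = 0;
        try (AsynchronousFileChannel out = AsynchronousFileChannel.open(
                     target, StandardOpenOption.CREATE, StandardOpenOption.WRITE);
             ReadableByteChannel src = Channels.newChannel(in)) {
            ByteBuffer buf = ByteBuffer.allocate(4096);
            int read;
            while ((read = src.read(buf)) >= 0 || buf.position() > 0) {
                buf.flip();
                Future<Integer> write = out.write(buf, position);
                position += write.get(); // wait; the buffer is reused next iteration
                buf.compact();           // keep any bytes a partial write left behind
            }
        }
        return position;
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[10_000]; // larger than one buffer
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        Path tmp = Files.createTempFile("copy", ".bin");
        long copied = copy(new ByteArrayInputStream(data), tmp);
        System.out.println(copied == data.length
                && java.util.Arrays.equals(data, Files.readAllBytes(tmp)));
        Files.delete(tmp);
    }
}
```

I've left out the file lock here; add `out.lock()` back if, as in the question, exclusive access to the file matters.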
