
I'm trying to run Handbrake through a Java app I'm writing, and am having trouble waiting for Handbrake to finish.

When I try this:

        ProcessBuilder builder = new ProcessBuilder(
                "cmd.exe", "/c", command);
        Process p = builder.start();

        BufferedReader inputreader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line = null;
        while((line = inputreader.readLine()) != null)
        {
            System.out.println(line);
        }

The output I get is:

Encoding: task 1 of 1, 0.00 %

Over and over, and the file never gets converted.

When I change it to the following:

        BufferedReader inputreader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        BufferedReader errorreader = new BufferedReader(new InputStreamReader(p.getErrorStream()));
        String line = null;
        String line2 = null;
        while((line = inputreader.readLine()) != null && (line2 = errorreader.readLine()) != null)
        {
            System.out.println(line);
            System.out.println(line2);
        }

It works on my test files; however, it gets hung up when the errorreader runs out of lines to read and readLine() blocks the thread indefinitely. On full-length files the file gets converted, but this portion of code stays blocked, so the application never continues.

Any suggestions?

Justin.Bell
  • I have the general feeling that the process you are waiting for is not being given a chance to execute, and hence you are hung up on the same output coming from it. – Tim Biegeleisen Jun 12 '15 at 01:56
  • @TimBiegeleisen - That was my original thought, but when the second one worked I was confused at what would cause that to happen. – Justin.Bell Jun 12 '15 at 02:16
  • @fftk4323 - The output of which piece? – Justin.Bell Jun 12 '15 at 02:17
  • The second piece I think. – Tim Biegeleisen Jun 12 '15 at 02:17
  • Is there any chance that the process is not advancing for some reason? In this case, the output you are seeing is accurate and expected. – Tim Biegeleisen Jun 12 '15 at 02:17
  • My guess is this is because the external process is writing to the same line... Like wget writes download progress on the same line every time... – Codebender Jun 12 '15 at 02:18
  • That's the *only* explanation: the child process keeps producing the same data. Reading lines alternately from `stdout` and `stderr` can't possibly work unless the child process produces equal numbers of lines on both, which is extremely improbable. Process output should be consumed in separate threads, or by merging the streams. – user207421 Jun 12 '15 at 02:29
  • I commented below, any ideas from you guys? @AbishekManoharan - That is a correct assumption of it writing to the same line. – Justin.Bell Jun 12 '15 at 02:30
  • @TimBiegeleisen - I'm struggling to figure out if it's advancing or not, any tips on figuring that out? – Justin.Bell Jun 12 '15 at 02:30
  • @JB4times4 Easy answer: Just run the process standalone from a Command Prompt. Monitor the output. – Tim Biegeleisen Jun 12 '15 at 02:33
  • @EJP - I realized after implementing the second one that when the file is small enough, by the time all the lines have been read the process has finished, so readLine() hits EOF and the loop can end. I'm still confused at why the solution suggested below allows it to work though. – Justin.Bell Jun 12 '15 at 02:33
  • @TimBiegeleisen - So running it outside of my app it does run successfully, and inside the app it runs successfully with the below suggestion (any file size) or my second attempt above if the file is small enough. Now that I've found a solution, I'm hoping to figure out the technical reason for that fixing it. – Justin.Bell Jun 12 '15 at 02:35

1 Answer


Call `builder.redirectErrorStream(true);` before starting the process (this will merge the input and the error streams into one: the input stream), and only read from the `InputStream`.

That should solve the problem of the error stream running out of data before the input stream.
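
A rough sketch of the merged-stream approach, with a placeholder HandBrakeCLI command string standing in for whatever invocation you actually build:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class MergedStreamExample {
        public static void main(String[] args) throws Exception {
            // Placeholder command; substitute your real HandBrakeCLI invocation.
            String command = "HandBrakeCLI -i input.mp4 -o output.mp4";

            ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c", command);
            builder.redirectErrorStream(true); // merge stderr into stdout

            Process p = builder.start();

            // Drain the single merged stream until the process closes it.
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }

            int exitCode = p.waitFor(); // wait for HandBrake to exit
            System.out.println("HandBrake exited with code " + exitCode);
        }
    }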

If you do want to keep them separate, then you can start two threads, one to read from the input stream and one to read from the error stream.
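
A sketch of that two-thread alternative, again with a placeholder command; each stream is drained on its own thread so neither pipe buffer can fill up and block HandBrake:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    public class SeparateStreamExample {
        public static void main(String[] args) throws Exception {
            // Placeholder command; substitute your real HandBrakeCLI invocation.
            ProcessBuilder builder = new ProcessBuilder(
                    "cmd.exe", "/c", "HandBrakeCLI -i input.mp4 -o output.mp4");
            Process p = builder.start();

            // One thread per stream, so neither pipe buffer can fill up and
            // block the child process.
            Thread outReader = new Thread(() -> drain(p.getInputStream(), "[out] "));
            Thread errReader = new Thread(() -> drain(p.getErrorStream(), "[err] "));
            outReader.start();
            errReader.start();

            int exitCode = p.waitFor(); // wait for HandBrake to exit
            outReader.join();
            errReader.join();
            System.out.println("HandBrake exited with code " + exitCode);
        }

        // Reads a stream to EOF, echoing each line with a prefix.
        private static void drain(InputStream in, String prefix) {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(prefix + line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }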

Erwin Bolwidt
  • Huh, that seemed to solve both issues. Any idea why doing that would cause the first to work? – Justin.Bell Jun 12 '15 at 02:29
  • Not a definitive reason, but: these streams are pipes with a limited capacity. If you're not reading from the error stream (in the first example), it ends up full and that means that the process cannot continue. My *guess*: handbrake is likely multi-threaded, and the thread that does the conversion writes to the error stream. This thread gets blocked because the error stream pipe is full. A different thread sends out data to the input stream (the stdout output stream from the perspective of handbrake), and it keeps sending "0%" because the conversion thread is blocked. – Erwin Bolwidt Jun 12 '15 at 02:54
  • That's an interesting thought I'll have to try digging a little deeper and see if I can validate something like that causing the issue. Thanks for your input! – Justin.Bell Jun 12 '15 at 18:08