tl;dr
I'm streaming content with Dancer2's `delayed`, `flush`, `content`, and `done` keywords, and the server is blocked while streaming until I call `done`. How can I avoid that?
More detailed
I wrote a Dancer2 application that calls an external command (a shell/Perl script) when the user hits a specific button. I want the output of that command to appear in the browser in a `<pre>` section, and I want that section to refresh as new lines arrive, similar to a `tail -f` in the browser. The external command runs for several seconds, possibly up to 10 minutes, so I run it asynchronously.
My first approach was to completely detach the program from the web server using double-fork, `setsid`, closing/reopening the program's STDIN/STDOUT/STDERR, and `exec`, so that the command's output goes to a temporary logfile. Then I set up AJAX calls (one per second) so that my Dancer2 application reads the new lines from the logfile and returns them to the client until the PID of the external command disappears. This worked like a charm until the moment my "external command" issued `ssh` commands to contact some other server and return that output as well: `ssh` doesn't work properly when run without a terminal and produced no output at all. Think of `ssh $host "ls -l"`, which gave no output.
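For reference, the detach step of that first approach looked roughly like this (a simplified sketch; `run_detached` is just an illustrative name):

```perl
use strict;
use warnings;
use POSIX qw(setsid);

# Launch $cmd fully detached from the web server, with all of its
# output going to $logfile.
sub run_detached {
    my ($cmd, $logfile) = @_;

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid) {                      # parent: reap the intermediate child
        waitpid($pid, 0);
        return;
    }

    # first child: start a new session, losing the controlling terminal
    setsid() or die "setsid failed: $!";

    my $pid2 = fork();
    die "fork failed: $!" unless defined $pid2;
    exit 0 if $pid2;                 # first child exits; grandchild is orphaned

    # grandchild: detach stdio, redirect output to the logfile, and exec
    open STDIN,  '<',  '/dev/null' or die "reopen STDIN: $!";
    open STDOUT, '>',  $logfile    or die "reopen STDOUT: $!";
    open STDERR, '>&', \*STDOUT    or die "reopen STDERR: $!";
    exec $cmd or die "exec failed: $!";
}
```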
So I switched to Dancer2's `delayed` mechanism, as shown in the code below; I took the CSV example from the Dancer2 docs as a template. Again, it works like a charm, but while the command is running and new lines appear in the browser, the server is blocked: when I click some other link on the web page, I only see an hourglass until the command is over. It looks like the server is single-process and single-threaded.
index.tt
<script>
  function start_command( event ) {
      $('#out_win').empty();
      var last_response_len = false;
      $.ajax({
          url: '/ajax/start',
          xhrFields: {
              onprogress: function(evt) {
                  /* make "this_response" contain only the new lines: */
                  var this_response;
                  var response = evt.currentTarget.response;
                  if ( last_response_len === false ) {
                      this_response = response;
                  } else {
                      this_response = response.substring(last_response_len);
                  }
                  last_response_len = response.length;
                  /* append the new lines to the <pre> and scroll down */
                  var pre = $('#out_win');
                  pre.append(this_response);
                  pre.scrollTop(pre.prop('scrollHeight'));
              }
          },
          success: function( result, textStatus, jqXHR ) {
              alert("Done streaming, result=" + result);
          },
          error: function( jqXHR, textStatus, errorThrown ) {
              alert("error; status=" + textStatus);
          }
      });
      if ( event ) event.preventDefault(); /* no event object when called via the href */
  }
</script>
<div class="container">
  <div> <%# Links %>
    <a href="javascript:start_command();">Start external command</a><br>
    <a href="/other">Show some other page</a>
  </div>
  <div> <%# output window %>
    <pre class="pre-scrollable" id="out_win"></pre>
  </div>
</div>
Streaming.pm
package Streaming;
use Dancer2;

################################################################
#
################################################################
get '/' => sub {
    template 'index';
};

################################################################
#
################################################################
get '/ajax/start' => sub {
    delayed {
        flush;    # streaming content
        # "Stream" something. Actually I start an external program
        # here with open( ..., "$pgm |" ) and stream its output
        # line by line.
        foreach my $line ( 1 .. 10 ) {
            content "This is line $line\n";
            sleep(1);
        }
        done;     # close the client connection
    }
    on_error => sub {
        my ($error) = @_;
        warning 'Failed to stream to user: ' . request->remote_address;
    };
};

true;
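In the real handler, the `foreach` demo loop is replaced by reading from a pipe. A self-contained sketch of that loop (the `echo` command here is just a stand-in for my long-running script):

```perl
use strict;
use warnings;

# Stand-in for the real external program; in the app, $pgm is the
# long-running shell/Perl script.
my $pgm = 'echo line1; echo line2; echo line3';

open( my $out, '-|', $pgm ) or die "Cannot start '$pgm': $!";
while ( my $line = <$out> ) {
    # inside the delayed block this line would be:  content $line;
    print $line;
}
close $out;
```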
I'm using
- Dancer2 0.204002, as installed via `apt install libdancer2-perl` on Ubuntu 17.04
- no further web server, i.e. the server that ships with Dancer2, started with `plackup -a bin/app.psgi`
- jQuery 2.2.4
- Bootstrap 3.3.7
- Perl 5.18+
My questions are:
- Am I doing something completely wrong with the `delayed` keyword?
- Or am I using the wrong web server for this task? I'd love to stick with the web server that comes with Dancer2 because it's so simple to use, and I don't need throughput like e.g. Google's. We will have 1-3 users a day, mostly not at the same time, but a blocking web application would be a no-go.