
I am developing an application that downloads files from the Internet; the files are mostly on FTP servers, and I am using LWP::Simple and its getstore function to retrieve them. I would like to limit the download speed, the way wget can.

Have you seen anything similar to wget's --limit-rate implemented in LWP or LWP::Simple?

Thanks

Juan

3 Answers


There is no such option.

If you use a callback to grab the response body, you could intentionally slow the callback down (using sleep()) whenever the transfer gets ahead of the desired rate.
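A minimal sketch of that idea (not from the original answer): the URL, file name and 10 KB/s limit are placeholders; it uses LWP::UserAgent's :content_cb hook and Time::HiRes::sleep to pause whenever the download runs ahead of schedule.

#!/usr/bin/env perl
use strict;
use warnings;

use LWP::UserAgent;
use Time::HiRes qw(time sleep);

# Hypothetical URL, file name and rate limit for illustration.
my $url   = 'http://www.example.com/big-file.tar.gz';
my $file  = 'big-file.tar.gz';
my $limit = 10 * 1024;    # target rate in bytes per second (10 KB/s)

open my $fh, '>', $file or die "Cannot open $file: $!";
binmode $fh;

my $ua    = LWP::UserAgent->new;
my $start = time;
my $bytes = 0;

my $res = $ua->get($url, ':content_cb' => sub {
    my ($chunk) = @_;
    print {$fh} $chunk;
    $bytes += length $chunk;

    # How long the download *should* have taken so far vs. how long it did.
    my $expected = $bytes / $limit;
    my $elapsed  = time - $start;
    sleep($expected - $elapsed) if $expected > $elapsed;
});

close $fh;
die $res->status_line unless $res->is_success;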

Of course, you could also avoid LWP entirely. For example, you could shell out to wget, curl or a similar tool (see the sketch below), or use another module such as WWW::Curl.

Update: Added last paragraph.
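For the shell-out route, a minimal sketch (the URL and file name are placeholders; wget's --limit-rate option does the throttling):

#!/usr/bin/env perl
use strict;
use warnings;

# Hypothetical URL and output name; wget caps the transfer at 10 KB/s.
my $url  = 'http://www.example.com/big-file.tar.gz';
my $file = 'big-file.tar.gz';

system('wget', '--limit-rate=10k', '-O', $file, $url) == 0
    or die "wget exited with status $?";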

ikegami

If your main concern is FTP servers, have a look at Net::FTP::Throttle. You can set the maximum rate in its constructor.
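A rough sketch of how that might look; the host, credentials and paths are made up, and the MegabitsPerSecond constructor option is my reading of the module, so check the Net::FTP::Throttle docs:

#!/usr/bin/env perl
use strict;
use warnings;

use Net::FTP::Throttle;

# Hypothetical server and file; the constructor option caps the transfer rate.
my $ftp = Net::FTP::Throttle->new('ftp.example.com', MegabitsPerSecond => 2)
    or die "Cannot connect: $@";

$ftp->login('anonymous', 'anonymous@example.com') or die $ftp->message;
$ftp->binary;
$ftp->get('/pub/big-file.tar.gz', 'big-file.tar.gz') or die $ftp->message;
$ftp->quit;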

bvr

...Or you could use LWP over libcurl :)

#!/usr/bin/env perl
use common::sense;

use LWP::Protocol::Net::Curl MAX_RECV_SPEED_LARGE => 10240; # 10 KB/s
use LWP::Simple;

getstore 'http://www.cpan.org/src/5.0/perl-5.16.2.tar.gz' => 'perl.tar.gz';
creaktive