
I am using Net::FTP::Recursive to download all of the files on my server to localhost on a daily basis. Is there a way to check for file existence on localhost before I download everything? I'd like to skip the files I have already downloaded.

    use Net::FTP::Recursive;

    my $host     = "host";
    my $user     = "u";
    my $password = "p";

    my $f = Net::FTP::Recursive->new($host, Debug => 0, Passive => 0);
    $f->login($user, $password) or die "Login failed: ", $f->message;
    $f->rget();
    $f->quit;
mu is too short
user_78361084

2 Answers


The rget method in Net::FTP::Recursive takes a ParseSub => sub { ... } argument that decides which files on the remote server should be retrieved.

Take a look at the parse_files method in the source code -- that is the default function that rget uses -- and see how you can use a modified version of it to exclude files that already exist on your local host.

mob
  • thanks. None of them seem to have something that checks if the file exists on localhost...it's just performs regex on the remotehost – user_78361084 Mar 31 '11 at 19:18
  • Right, you have to implement your own `my_parse_files` routine that checks your local host for the files. – mob Mar 31 '11 at 19:55
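A sketch of such a `my_parse_files` routine, building on the answer above. This is hedged: it reuses the module's stock `parse_files` function and the `filename`/`is_directory` accessors of the returned file objects; check both against the source of your installed Net::FTP::Recursive version, since the parse sub's calling convention is not part of a formal API.

```perl
use File::Spec;
use Net::FTP::Recursive;

# Reuse the stock parser, then drop plain files that already exist
# in the corresponding local directory. rget() changes the local
# directory as it recurses, so a relative check against '.' lines
# up with the remote directory currently being listed.
sub my_parse_files {
    my @files = Net::FTP::Recursive::parse_files(@_);
    return grep {
        $_->is_directory    # keep directories so rget still recurses
            or not -e File::Spec->catfile('.', $_->filename)
    } @files;
}

$f->rget(ParseSub => \&my_parse_files);
```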

What you could do is, before you initialize your FTP connection, get a list of files that are on your local machine. Then bodge these together into a regex, and use the OmitFiles property of rget to omit the files on your local machine.
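A sketch of that approach, assuming the transfer lands in the current directory (OmitFiles takes a regex, so each local name is anchored and escaped to avoid accidental partial matches):

```perl
use File::Find;
use Net::FTP::Recursive;

# Collect the names of plain files already present locally.
my @local;
find(sub { push @local, $_ if -f }, '.');

# One alternation of anchored, quotemeta'd names, so "log.1"
# doesn't also omit "log_1".
my $omit = join '|', map { '^' . quotemeta($_) . '$' } @local;

# Skip OmitFiles entirely when nothing exists locally yet; an empty
# pattern would match (and omit) everything.
$f->rget( @local ? (OmitFiles => $omit) : () );
```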

Of course, this doesn't handle the case where a file has been modified on the remote machine and you don't have that modification. Or the case where your transfer got interrupted and you only have half a file.

You should probably think about using rsync instead, whether that's using the command-line program or the File::Rsync module.
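The command-line route can be a one-liner; `u`, `host`, and both paths below are placeholders:

```shell
# Mirror the remote tree over SSH, skipping files whose size and
# mtime already match locally. -a preserves permissions and times,
# -z compresses on the wire, --partial keeps interrupted transfers
# so they can resume.
rsync -az --partial u@host:/remote/dir/ /local/dir/
```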

CanSpice