
I'm using psexec to launch a Perl program on remote Windows machines. The program makes a system call to xcopy. This works fine when run directly (locally) on the machines, but when run remotely via psexec, the xcopy fails with the following message:

File creation error - Incorrect function.

(Depending on the user, the message may instead be "Access denied.")

Note that $! gives the following diagnostic:

Bad file descriptor at syscall.pl. perl exited on REMOTE with error code 9.

It does not seem to make a difference whether xcopy is invoked via system() or backticks.

I should point out that the "from" folder is a ClearCase dynamic view (M drive).

Oddly enough, the xcopy seems to work correctly when called directly from psexec.

Here are some other oddities:

  1. The xcopy doesn't always fail. Certain files just seem to be "cursed". The read-only attribute appears not to be a factor.

  2. Once a copy succeeds (e.g., via Windows Explorer), the curse is lifted, and that particular file will no longer cause xcopy errors.

  3. The problem does not seem to be with the destination folder. Once the curse is lifted, the file can be xcopy'd to a new destination.

Following is a portion of the test Perl script I used to narrow down the problem (folder names have been genericized). Note that, for each "my $cmd" tested, I commented out the previous one, and added a status comment.

# ClearCase directory M:\STUFF\ABC contains ABC.tst, ABC.zip and several nonempty subfolders

# Directory copy, D drive to D drive
#my $cmd = "xcopy D:\\temp\\src D:\\temp\\dest /e /i /y";
# works

# Directory copy, M drive to D drive
#my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest /e /i /k /y";
# fails with "File creation error - Incorrect function" or "Access denied"

# File copy (.tst), M drive to D drive (trailing backslash)
#my $cmd = "xcopy M:\\STUFF\\ABC\\ABC.tst D:\\temp\\dest\\";
# works!

# Directory copy, M drive to D drive (trailing backslash)
#my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest\\ /e /i /k /y";
# copies the .tst file, but fails on the .zip (yes, the .tst file is now getting copied)

# Directory copy, M drive to D drive (same as above but without trailing backslash)
#my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest /e /i /k /y";
# copies the .tst file, but fails on the .zip

# File copy (.zip), M drive to D drive
#my $cmd = "xcopy M:\\STUFF\\ABC\\ABC.zip D:\\temp\\dest";
# fails 

# File copy (.zip), M drive to D drive (trailing backslash)
#my $cmd = "xcopy M:\\STUFF\\ABC\\ABC.zip D:\\temp\\dest\\";
# fails

# After manually (Windows Explorer) copying the .zip file to the dest folder and deleting it  
# Directory copy, M drive to D drive with /c (continue after failure)
#my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest /c /i /e";
# copies the .tst and .zip file (!), but fails on all other files (folders were successfully created)

# After manually copying the Folder1 folder to the dest folder and then deleting it  
#my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest /c /i /e";
# copies the .tst and .zip file and the contents of Folder1(!), but fails on all other files

# Different dest:
my $cmd = "xcopy M:\\STUFF\\ABC D:\\temp\\dest1 /c /i /e";
# Same results as immediately above

print "Executing system command: $cmd ...\n";
system ($cmd);
#print(`$cmd 2>&1`); #same
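Not part of the original script, but a sketch that may help narrow this down: checking system's return value and decoding $?, along the lines suggested in the comments below. The run_checked helper name is made up, and the xcopy paths are the hypothetical ones from above.

```perl
use strict;
use warnings;

# Hypothetical helper: run a command via the shell and decode system()'s result.
sub run_checked {
    my ($cmd) = @_;
    my $rc = system($cmd);
    if ($rc == -1) {
        # system() could not launch the command at all; $! is only meaningful here
        warn "failed to launch '$cmd': $!\n";
        return -1;
    }
    my $exit = $? >> 8;    # the child's exit code lives in the high bits of $?
    warn "'$cmd' exited with status $exit\n" if $exit;
    return $exit;
}

# Example call (hypothetical paths, as in the question):
run_checked('xcopy D:\temp\src D:\temp\dest /e /i /y');
```

This makes the distinction visible between "xcopy never started" (system returns -1 and $! is set) and "xcopy started but failed" (nonzero exit code, $! meaningless).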
UhClem
  • I'm guessing that ClearCase dynamic views are the real culprit. I'd try the same code against a regular filesystem to rule that out. – msw May 10 '12 at 01:50
  • Incidentally: using single quotes saves a lot of backslashes here. – reinierpost May 10 '12 at 09:16
  • try `system ($cmd) or die "'$cmd' failed: $!";` this should tell you what the error message is, if it exists (I think that it will be more reliable than your print command). – Barton Chittenden May 10 '12 at 13:54
  • @BartonChittenden That will only print `$!` if `system` succeeds. Instead, use `system ($cmd) and die "'$cmd' failed: $!"`. See [perldoc -f system](http://perldoc.perl.org/functions/system.html) for details. Also, use `LIST` form as I suggested. – Sinan Ünür May 10 '12 at 19:06
  • @SinanÜnür -- oh, you're exactly right, I reversed the logic on the system command. I was focused on the use of `$!` and overlooked that. – Barton Chittenden May 10 '12 at 22:36
  • @msw, ClearCase dynamic views may indeed be a factor. As you can see in my example, my local D drive test worked fine. Unfortunately, we're stuck with copying from the dynamic view. Oddly, the contents can be seen, but in some seemingly random cases, they just can't be copied via psexec -> Perl system call -> xcopy, at least until the "curse" is lifted (typically via a manual copy). – UhClem May 11 '12 at 09:10
  • @reinierpost, you're right about single quotes. The original source has parts of the path expanded with $variables, thus the double quotes. – UhClem May 11 '12 at 09:13
  • @BartonChittenden, unfortunately $! doesn't provide any additional information. Though I should point out that the files I coerced into xcopying yesterday are failing again tonight. – UhClem May 11 '12 at 09:40
  • @BartonChittenden, I take it back. I tried it without the /c, and got the following diagnostic: Bad file descriptor at syscall.pl line 59. perl exited on REMOTE with error code 9. – UhClem May 11 '12 at 15:47

5 Answers


I suggest that instead of using the xcopy command, you do the copying in Perl itself. There's a module, File::Copy::Recursive, that's pretty simple to use. It's not part of the standard Perl distribution, so you have to install it from CPAN.

If you don't want to use non-native modules, you can try using File::Find to find the files in a directory, then combine that with File::Copy.

I found two examples on Perl Monks: one using that combination, and the other using File::Copy::Recursive.
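For reference, the File::Find/File::Copy combination can be sketched with core modules only. The copy_tree name is made up (not taken from either Perl Monks example):

```perl
use strict;
use warnings;
use File::Find;
use File::Copy qw(copy);
use File::Path qw(make_path);
use File::Spec;

# Recursively copy a directory tree using only core modules.
sub copy_tree {
    my ($src, $dst) = @_;
    find({
        no_chdir => 1,    # keep $File::Find::name usable as a full path
        wanted   => sub {
            my $rel    = File::Spec->abs2rel($File::Find::name, $src);
            my $target = File::Spec->catfile($dst, $rel);
            if (-d $File::Find::name) {
                make_path($target) unless -d $target;
            }
            else {
                copy($File::Find::name, $target)
                    or warn "copy of $File::Find::name failed: $!\n";
            }
        },
    }, $src);
}
```

With the module installed, `File::Copy::Recursive::dircopy($src, $dst) or die $!;` is the one-call equivalent.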

Yes, this isn't answering your question directly, but you should try to avoid shelling out to system commands when possible. When you interact with the system shell and command processor (especially when ClearCase is hacking on the file system), you can end up with a lot of unintentional interactions that cause something to work in some situations, but not others.

To figure out the issue you're having with a system call, you have to assume the error could be in ClearCase, the cmd.exe shell, the xcopy command, or Perl. By not using the system command, you simplify your problem, and many times actually speed up the process.

David W.
  • We may have to use the File::Find/File::Copy combo until the go-ahead is given to deploy the File::Copy::Recursive module at our company. But I hope we can get around this peculiar xcopy problem and not have to resort to that. – UhClem May 11 '12 at 09:01
  • @UhClem: It shouldn't be *that* hard to write a wrapper subroutine around File::Find/File::Copy. At that point, all you have to do is convert all of your system calls to use your recursive copy routine. At the very least, try using File::Copy to see if it works where xcopy fails. Perhaps you can get some information from that. Another advantage of File::Copy is that you can use forward slashes for path delimiters, thus removing the need for the escaped backslashes. That alone would be a big enough win in my book. – Barton Chittenden May 11 '12 at 11:57

Try redirecting INPUT from the null device to see if your xcopy works. I don't know why this helps; I encountered this problem a long time ago and somehow (possibly via a web search) figured this out.

It would look something like this (the whole command run via backticks in Perl):

`xcopy /args $source $target <nul:`;
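Applied to the script in the question, the same trick amounts to appending the redirect to the command string before handing it to system(). A sketch, with a made-up with_null_stdin helper (the null device is nul on Windows, /dev/null elsewhere):

```perl
use strict;
use warnings;

# Hypothetical helper: redirect a command's STDIN from the null device
# before the shell runs it.
sub with_null_stdin {
    my ($cmd) = @_;
    my $null = $^O eq 'MSWin32' ? 'nul' : '/dev/null';
    return "$cmd < $null";
}

my $cmd = with_null_stdin('xcopy M:\STUFF\ABC D:\temp\dest /e /i /k /y');
print "Executing system command: $cmd ...\n";
system($cmd);
```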

JKE

John Elion
  • Thanks, you're a lifesaver. I had the same problem with UniVerse `EXECUTE "DOS /C XCOPY ..."`; it was working from a subroutine, but failed without any output when called from a phantom process. Adding the redirect to the end solved it. – DarthJDG Jan 27 '14 at 11:47

Have you tried system with a list of arguments?

my @args= qw(M:\STUFF\ABC D:\temp\dest1 /c /i /e);
warn "Executing 'xcopy @args' ...\n";
system xcopy => @args;
Sinan Ünür
  • Thought of trying that, and also open(), but since backticks and system give the same result, and the problem "fixes itself" consistently after a manual copy, I figure the _type_ of system call is not the culprit. – UhClem May 11 '12 at 08:54

File::Copy::Recursive may solve your problem.

This way you have a more controlled way of copying files. You could wrap the call in an eval block and apply some kind of retry logic there.

Maybe a "dummy update" on the problematic file would help? Change the file's attributes/timestamp, or rename it to something else and rename it back...

user1126070
  • Yes, I put a comment in my code 6 months ago that File::Copy::Recursive would be a better way to go. The system call to xcopy is a temporary solution until the folks at my company are persuaded to allow that module. (Long story.) – UhClem May 11 '12 at 08:51

If you look at how ClearCase manages read/write access for Vobs and views, you will see that there is a difference between:

  • reading the content of a directory (need r-x)
  • reading a file within a directory (need just '--x' on the directory + "r--" on the file)

So depending on which user is behind the process launching the xcopy, you can have trouble reading or accessing certain directories and files.
You also need to make sure that user has set CLEARCASE_PRIMARY_GROUP to the right value (i.e. a group listed as the primary group of the Vob, or one of its secondary groups) in order to access the Vob (though if its protections are lax enough, everyone can access it anyway).

All of that applies to a dynamic view; for a snapshot view it would apply only during the update (once the update is complete, the files are copied locally and can be read by any process).

VonC