
I have a program that creates a bunch of movie files. It runs as a cron job, and every time it runs, the movies from the previous iteration are moved to a 'previous' folder so that there is always a previous version to look at.

These movie files are accessed across a network by various users and that's where I'm running into a problem.

When the script runs and tries to move the files, it throws a 'resource busy' error because the files are open by various users. Is there a way in Python to force-close these files before I attempt to move them?

Further clarification:

JMax is correct when he mentions that it is a server-level problem. I can access our Windows server through Administrative Tools > Computer Management > Shared Folders > Open Files and manually close the files there, but I am wondering whether there is a Python equivalent that will achieve the same result.

Something like this:

    import shutil

    try:
        shutil.move(src, dst)
    except OSError:
        # Close src file on all machines that are currently accessing it and try again.
        pass
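
A stopgap that stays in pure Python would be to simply retry the move a few times before giving up. A rough sketch of that is below; the retry count and delay are arbitrary placeholders, and nothing actually gets force-closed.

    import shutil
    import time

    def move_with_retries(src, dst, attempts=5, delay=60):
        # Stopgap sketch only: retries the move instead of force-closing anything.
        # The attempt count and delay are arbitrary placeholders.
        for _ in range(attempts):
            try:
                shutil.move(src, dst)
                return True
            except OSError:
                time.sleep(delay)  # wait and hope the remote readers finish
        return False

That still isn't what I'm really after, though: I'd like to actively close the open handles, the way the Open Files view lets me do by hand.
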
  • On *nix systems, you should be able to *move* a file while it's open. You won't be able to *delete* it. This question would benefit from a small section of code showing *how* you are trying to move the file before the error is raised. Details of your operating system would also be useful. – johnsyweb Sep 27 '11 at 09:41
  • @Johnsyweb: On *nix systems, you should also be able to delete a file while it's open - that'll just unlink (hence the name of the system call) it from the file system. It'll be actually marked free in the filesystem only when the descriptors to it are closed. (This can, afaicr, also be used to resurrect an unlinked file as long as it's not closed.) – AKX Sep 27 '11 at 10:39
  • @AKX: You're right, of course. – johnsyweb Sep 27 '11 at 10:45
  • Why do you think you should be able to close a file that some other user is innocently in the middle of reading from? – Karl Knechtel Sep 27 '11 at 11:37

3 Answers


This question has nothing to do with Python, and everything to do with the particular operating system and file system you're using. Could you please provide these details?

At least on Windows, you can use Sysinternals Handle to force a particular handle to a file to be closed. However, especially as this file is opened by another user over a network, this operation is extremely destabilising and will probably render the network connection useless afterwards. You're looking for the "-c" command-line argument, for which the documentation reads:

Closes the specified handle (interpreted as a hexadecimal number). You must specify the process by its PID.

WARNING: Closing handles can cause application or system instability.
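
For illustration, driving handle.exe from Python might look roughly like the sketch below. It assumes handle.exe is on the PATH, that its textual output keeps its usual "pid: ... <hex>: <path>" shape, and it uses a made-up file path, so treat it as a starting point rather than a confirmed recipe.

    import re
    import subprocess

    # Hypothetical file that refuses to move because remote users hold it open.
    target = r'C:\movies\current\clip1.avi'

    # List handles whose name contains the target path.
    listing = subprocess.run(['handle.exe', '-accepteula', target],
                             capture_output=True, text=True).stdout

    # Lines look roughly like:
    #   someproc.exe   pid: 1234   type: File   3C: C:\movies\current\clip1.avi
    line_pattern = re.compile(r'pid:\s*(\d+).*\s([0-9A-Fa-f]+):\s+[A-Za-z]:\\')

    for line in listing.splitlines():
        match = line_pattern.search(line)
        if match:
            pid, handle_hex = match.group(1), match.group(2)
            # -c closes the handle, -p names the owning process, -y skips the prompt.
            subprocess.run(['handle.exe', '-c', handle_hex, '-p', pid, '-y'])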

As for force-closing a file shared over Samba on Linux: speaking from experience, this is an excruciating exercise in futility. However, others have tried with mixed success; see Force a Samba process to close a file.
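
If you want to experiment on the Samba side anyway, one blunt approach is to drive smbstatus and smbcontrol from Python. This is only a sketch: it drops every client connection to the share rather than a single file, the share name is a placeholder, and it needs root on the Samba host.

    import subprocess

    SHARE = 'movies'  # hypothetical share name

    # smbstatus -L lists current file locks, handy for checking who holds what.
    print(subprocess.run(['smbstatus', '-L'], capture_output=True, text=True).stdout)

    # Ask all smbd processes to drop client connections to the named share.
    subprocess.run(['smbcontrol', 'smbd', 'close-share', SHARE], check=True)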

Asim Ihsan
    I think I have my answer here. It seems that it isn't easily possible and is fairly inadvisable anyway. I shall change my approach and only generate the movies overnight when everybody is logged off to avoid the problem. – Daniel Lloyd-Wood Sep 27 '11 at 10:46
  • The best answer usually is to avoid the problem! Good on you Daniel. – Asim Ihsan Sep 27 '11 at 11:38

As far as I know, you have to end the processes that access the file, at least on Windows.

rocksportrocker

Doesn't the .close() method work on your file object?

See Dive Into Python for more information on file objects.
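
For what it's worth, closing a file object you opened yourself looks like this (a trivial sketch; the filename is a placeholder):

    # 'movie.avi' is a placeholder name.
    f = open('movie.avi', 'rb')
    data = f.read()
    f.close()

    # Or, more idiomatically, let a with-block close it automatically:
    with open('movie.avi', 'rb') as f:
        data = f.read()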

[EDIT] I've re-read your question. Your problem is that users open the same file over the network and you want them to close it? But do you have access to their OS?

[EDIT2] The problem is better handled at the server level, by disconnecting the user that is accessing the file. See this example for Windows servers.
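
If the server is Windows and the pywin32 package is available, one possible server-side approach is sketched below: the programmatic counterpart of the Computer Management > Shared Folders > Open Files view. This is not the linked example, it is untested, and the server name, helper name and matching logic are all placeholders; closing handles out from under clients can still corrupt their view of the file.

    import win32net

    SERVER = r'\\FILESERVER'  # hypothetical file server name

    def close_open_files(server, path_fragment):
        # NetFileEnum at information level 3 lists files opened over SMB,
        # much like the Open Files view in Computer Management.
        entries, _total, _resume = win32net.NetFileEnum(server, None, None, 3)
        for entry in entries:
            if path_fragment.lower() in entry['path_name'].lower():
                win32net.NetFileClose(server, entry['id'])

    close_open_files(SERVER, r'movies\current')  # placeholder path fragment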

JMax