
My server setup looks like this:

Main backend processes:

  1. Processes a huge list of files and records them in MySQL.

  2. After every 500 files are processed, it writes a "Progress Report" to a separate file, /var/run/progress.log, like "200/5000 files done" (see the sketch after this list).

  3. It is multi-processed with 4 children, each guaranteed to work on a separate file.
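
For reference, a minimal sketch of what such a writer could look like (hypothetical; the question doesn't show the actual code). Writing to a temp file in the same directory and renaming it over the log makes each update atomic, so a reader never sees a half-written report:

    import os
    import tempfile

    PROGRESS_PATH = "/var/run/progress.log"

    def report_progress(done, total):
        # Create the temp file in the same directory so rename() stays
        # on one filesystem and is therefore atomic.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(PROGRESS_PATH))
        try:
            os.write(fd, ("%d/%d files done" % (done, total)).encode("ascii"))
        finally:
            os.close(fd)  # release the descriptor immediately
        os.rename(tmp, PROGRESS_PATH)  # atomically replace the old report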

Web server process:

  1. Reads /var/run/progress.log every 10 seconds via Ajax and updates a progress bar (see the sketch below).
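
And a hypothetical sketch of the server-side read behind that Ajax call, again not the original code; the point is to open, read, and close on every poll:

    def read_progress(path="/var/run/progress.log"):
        # Open, read, close on every poll; never hold the file open.
        try:
            with open(path) as f:
                return f.read().strip() or "0/0 files done"
        except IOError:  # not created yet, or momentarily unavailable
            return "0/0 files done"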

When processing a very large list of files (e.g. an archive over 3 GB), the processes lock up after about 2 hours.

I can't figure out what is going on. Does that mean /var/run/progress.log is causing an I/O deadlock?

– Phyo Arkar Lwin
  • Do you fill /var/run/progress.log yourself, or do you use a standard logging module? – Roman Bodnarchuk May 01 '11 at 12:11
  • By myself, but the file is always closed immediately after each write. Writes happen every 3-5 seconds, depending on file size. – Phyo Arkar Lwin May 01 '11 at 12:39
  • Unless you use file locks, there's no such thing as I/O deadlocking; it's a bug in your program somewhere. Run `strace -p` on the process to see what it's trying to do. – nos May 01 '11 at 13:02
  • Let's say a process opens a file and is writing data into it, and then another process tries to read it (while it's still open). Won't that cause an I/O slowdown / lockup? – Phyo Arkar Lwin May 01 '11 at 18:06
  • No, it won't deadlock. However, your reading application could read a partial update. If your parsing code isn't written to expect invalid data, you could have an exception thrown from an unexpected place. Depending on your code structure, that could easily create the appearance of a 'deadlock'. – Rakis May 02 '11 at 12:38

2 Answers


In Python on Linux this should not block; however, try opening the file with the os module:

    fd = os.open(path, os.O_NONBLOCK | os.O_RDONLY)

and make sure you close the descriptor with os.close(fd), as Python is a bit lazy about cleaning up files.

http://docs.python.org/library/os.html
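
For example, a runnable sketch along those lines (note that O_NONBLOCK has little effect on regular files on Linux; the explicit os.close is the part that matters):

    import os

    fd = os.open("/var/run/progress.log", os.O_NONBLOCK | os.O_RDONLY)
    try:
        data = os.read(fd, 4096)  # the progress line is short; 4 KiB is plenty
    finally:
        # os.open returns a raw file descriptor; close it with os.close
        # rather than waiting for garbage collection.
        os.close(fd)

    print(data)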


Quick advice: make sure (like, super sure) that you actually close your file.

So ALWAYS use a try/finally (or try/except/finally) block for this.

Remember that the contents of a finally block will ALWAYS be executed; that will save you a lot of headaches :)
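
A minimal sketch of that pattern, together with the equivalent with statement:

    f = open("/var/run/progress.log", "w")
    try:
        f.write("200/5000 files done")
    finally:
        f.close()  # the finally clause runs even if an exception is raised

    # Equivalent and shorter: a with block closes the file automatically.
    with open("/var/run/progress.log", "w") as f:
        f.write("200/5000 files done")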

– Juan Antonio Gomez Moriano