
On my new server (dedicated, CentOS 5), two or three .myapp files (like "51b9dc4cc246f.myapp", ~500 MB each) are created every minute under /tmp, filling my hard disk at an alarming speed.

There is nothing in crontab -e.

Any ideas where this could be coming from?

Thanks a lot

Nicolas Reynolds

3 Answers


I've no idea what would create the files, but you could try running fuser on one that is being generated to find out which process is creating it, and work back from the PID.

fuser -v /tmp/51b9dc4cc246f.myapp

will hopefully give you some information you can use, e.g.

fuser -v  /var/run/crond.pid
                     USER       PID  ACCESS COMMAND
/var/run/crond.pid:  root       1698 F....  crond

which shows us that the user root is running crond with PID 1698, which has the /var/run/crond.pid file open for writing (F).
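Once fuser gives you a PID, a couple of follow-up commands will tell you what the process is and where it lives (1698 here is just the example PID from the crond output above; substitute the one fuser reports):

    # Show the full command line of the process
    ps -fp 1698

    # On Linux, /proc exposes the executable path and working directory
    ls -l /proc/1698/exe /proc/1698/cwd
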

user9517
  • No response from the first command, and the second one doesn't help either: `USER PID ACCESS COMMAND /var/run/crond.pid: root 2601 F.... crond` – Nicolas Reynolds Jun 13 '13 at 16:28
  • @NicolasReynolds: they are only example commands – you should use fuser on a .myapp file that is being actively written to. – user9517 Jun 13 '13 at 16:31
  • After searching, it appears that it was the XHProf extension that created the files, /tmp being the default folder. Debugging XHProf solved the problem. – Nicolas Reynolds Jun 14 '13 at 15:01

Check which process has a .myapp file open, as root:

lsof | grep -i myapp

This may give you a clue.

If not, then investigate a little more; check what type of file it is:

file 51b9dc4cc246f.myapp

If it's text, just open it with less. If not, use strings to see if it has anything readable in it:

strings 51b9dc4cc246f.myapp | less

If that doesn't give you any clue, I'd leave a loop running with the lsof command, preferably in a screen session, checking every 30 seconds or so:

while true; do lsof | grep -i myapp; sleep 30; done
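Since a quick glance at the terminal can miss short-lived processes, a variant of that loop can record every hit with a timestamp for later review (the log path /root/myapp-watch.log is just an assumption; put it anywhere outside /tmp):

    # Poll lsof every 30 seconds; append any matches, timestamped, to a log
    while true; do
        hits=$(lsof 2>/dev/null | grep -i myapp)
        if [ -n "$hits" ]; then
            date >> /root/myapp-watch.log            # timestamp for this sample
            printf '%s\n' "$hits" >> /root/myapp-watch.log
        fi
        sleep 30
    done
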

Good luck!

Kus

The other answers here recommend fuser or lsof, and those are technically better solutions. However, those tools can be tedious, so here is an alternative that might be quicker.

Creating that many files will consume a significant amount of processing time.

Run top. Sort by CPU usage. Watch for a few minutes. The process which is writing that many files will likely be in the top 3-4 consumers of CPU. To verify, try stopping each process one at a time. Does the bad behavior go away?
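If watching top interactively is inconvenient, a one-shot ps listing sorted by CPU usage shows the same picture (the --sort option is in the procps ps on most Linux systems, though older builds may vary):

    # Top 10 processes by CPU usage, highest first (plus header line)
    ps -eo pid,pcpu,comm --sort=-pcpu | head -n 11
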

Stefan Lasiewski