
I have a fairly large directory structure with thousands of files. I want to figure out if any have changed since a particular time. Now, I can use

find <dir> -mmin 30 -type f

...to find any files that changed in the last 30 minutes. However, this takes a few seconds to run through, and I'm really not interested in (1) finding all the files that changed, or even (2) finding which files have changed. I'm only looking for a yes/no answer to "did any files change?".

I can make (1) better by using -print -quit to stop after the first file is found. However, in the case where no files have changed, the full search still takes a while.

I was wondering if there was a quicker way to check this? Directory time stamps, maybe? I'm using ext4, if it matters.

Rao
Stan
  • What about using `stat` against the directory itself? My local tests make me think that this gets updated when a new file is created. Is this enough, or do you also need to check if a file is updated? – fedorqui Dec 22 '15 at 17:05
  • @fedorqui I need to check if a file is updated, too. – Stan Dec 22 '15 at 17:07
  • OK then my approach wouldn't help. By the way, I think you mean `mmin -30`, that is `-30` instead of `30`. – fedorqui Dec 22 '15 at 17:08
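As the comments suggest, a directory's mtime is updated when entries are created or deleted inside it, but not when an existing file's contents are modified in place, which is why the `stat` approach alone is not enough here. A minimal sketch to confirm this behavior (uses GNU `stat`; on BSD/macOS the equivalent is `stat -f %m`):

```shell
#!/bin/sh
# Sketch: a directory's mtime changes when files are created or removed,
# but NOT when an existing file is modified in place.
dir=$(mktemp -d)

touch "$dir/a"                 # creating a file updates the directory mtime
before=$(stat -c %Y "$dir")    # GNU stat: mtime as seconds since epoch

sleep 1
echo hello > "$dir/a"          # rewriting the file's contents leaves the dir alone
after=$(stat -c %Y "$dir")

[ "$before" -eq "$after" ] && echo "dir mtime unchanged by file edit"
rm -r "$dir"
```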

1 Answer


With GNU find you can use the -quit option to stop searching after the first match.

So if you want to find out whether at least one file changed in the past 30 minutes, you can run:

find . -mmin -30 -type f -print -quit

That will print the name of the first matching file and quit.
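Since only a yes/no answer is needed, that output can be folded into an exit status. A small wrapper sketch around the command above (the directory argument is illustrative):

```shell
#!/bin/sh
# Yes/no check: prints "changed" if any regular file under $1 was
# modified in the last 30 minutes, "unchanged" otherwise.
dir=${1:-.}

# -print -quit makes find stop at the first match, so the command
# substitution is non-empty exactly when at least one file matched.
if [ -n "$(find "$dir" -mmin -30 -type f -print -quit)" ]; then
    echo "changed"
else
    echo "unchanged"
fi
```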

Also, if you have control over the software that uses this bunch of files, and performance is not an issue, you could touch a timestamp file every time any file is changed or added, and then check only that timestamp file's stats.
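The stamp-file idea can be sketched as follows. The writer touches a single stamp file after every change; the checker then stats only that file instead of walking the tree. The paths are illustrative, and the `-nt` test is a common shell extension (supported by bash and dash), not strict POSIX:

```shell
#!/bin/sh
# Sketch of the stamp-file approach: one touch per change on the
# writer side, one file comparison on the checker side.
stamp=/tmp/mydata.stamp        # illustrative path, touched by the writer
marker=/tmp/mydata.lastcheck   # illustrative path, touched by the checker

# Writer side, run after modifying any file in the tree:
touch "$stamp"

# Checker side: has anything changed since the last check?
if [ "$stamp" -nt "$marker" ]; then
    echo "something changed since last check"
fi
touch "$marker"                # reset the reference point for the next check
```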

vrs