
I have a scenario where I need to select all files named aliencoders.<numeric-digits>, like aliencoders.1206, and the find command should search all subdirectories too. If there is no such file, it should not do anything.

I wrote:

find /home/jassi/ -name "aliencoders.[0-9]+" | xargs ls -lrt | awk print '$9'

But it says "no such file or directory" if there is no file starting with aliencoders.xx...

How can I bypass this error? I have to run it for several such directories, and it should give output only for those directories in which such a file pattern exists; otherwise there should be no warning, and it should not run xargs etc.

Currently, if there is no such file, it ends up operating on the current directory instead of /home/jassi.

  • which shell are you using? bash? tcsh? – Levon Jun 18 '12 at 17:24
  • also .. you want `awk '{print $9}'` no? – Levon Jun 18 '12 at 17:28
  • I'm using tcsh, and I need to get all such files in sorted order, hence xargs ls -lRt; then I need to print only the last column, i.e. the file name with full path, hence piping to awk – Jassi Jun 18 '12 at 17:33
  • If all the directories are in /home/jassi and not in deeper levels, you can replace `find` with `for i in /home/jassi/aliencoders.[0-9]+ ; do ls -lrt $i | awk '{print $9}' ; done`. – ott-- Jun 18 '12 at 17:46
  • No, it has some deeper levels too, and I think I either have to use the find command with xargs or the File::Find module with a regex – Jassi Jun 19 '12 at 05:49

2 Answers


If you don't want xargs to execute when the input is empty, you can use -r or --no-run-if-empty, which are GNU extensions, as pointed out in the man page. So if you have that support, you can try:
find /home/jassi/ -name "aliencoders.[0-9]+" | xargs -r ls -lrt | awk '{print $9}'
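If you want to check the -r behaviour in isolation, here is a minimal sketch (assuming GNU xargs) that feeds it empty input from /dev/null:
xargs ls -lrt < /dev/null
xargs -r ls -lrt < /dev/null
The first line still runs ls once with no arguments, so it lists the current directory (which matches the surprising behaviour described in the question); the second line runs nothing and prints nothing.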
Alternatively, you can make use of find's -exec option to achieve this, something along these lines (with {} +, ls is invoked only on the files that find actually matched, so nothing should run when there are no matches):
find /home/jassi/ -name "aliencoders.[0-9]+" -exec ls -lrt {} + | awk '{print $9}'
Hope this helps!

another.anon.coward
  • Actually, the find /home/jassi/ -name "aliencoders.[0-9]+" command was fetching everything from the current directory too, so I changed it to find /home/jassi/ -type f -name "aliencoders.[0-9]+" and then it works fine; otherwise it was giving unusual results. Also, the -r option of xargs doesn't seem to be working; I am in the tcsh shell. I will update again. – Jassi Jun 19 '12 at 05:46

bash:

Try this command (under the bash shell since most people use it, and no shell was specified):

find /home/jassi/ -name "aliencoders.[0-9]+" 2>&1 | xargs ls -lrt | awk '{print $9}'

With 2>&1 you are redirecting the error messages from stderr to stdout. Once you have this single output stream, you can process it with your pipes etc.

Without this redirection, your error messages on stderr would have continued to go to the console, creating clutter in the output, and only stdout would have been processed by the pipes.
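
To see what 2>&1 does on its own, here is a minimal sketch in bash, using a hypothetical path /no/such/path that does not exist:

ls /no/such/path | wc -l
ls /no/such/path 2>&1 | wc -l

In the first command the error message goes straight to the terminal and wc sees nothing; in the second it travels down the pipe, so wc counts it.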

All about redirection will give you more details about, and more control over, redirection.

UPDATE:

tcsh:

Based on your use of tcsh (sometimes I think I'm the only one using it), it is possible to redirect stderr to stdout with the following construct:

command |& ...

so

find /home/jassi/ -name "aliencoders.[0-9]+" |& xargs ls -lrt | awk '{print $9}'

should help. Before, your error messages were bypassing your pipes and going directly to the console, and only stdout was getting processed by your pipes. With this, all of your output goes through the pipes and you can filter out what you want.
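
The same kind of check works in tcsh with the |& form (again with a hypothetical non-existent path):

ls /no/such/path | wc -l
ls /no/such/path |& wc -l

In the first case the error message still lands on the terminal; in the second it goes through the pipe to wc.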

Also, note the use of the awk command at the end of the command:

awk '{print $9}'

is different from what you posted originally.

Levon
  • Thanks guys, it actually solved the issue. I used -r with xargs, which seems a little more efficient than the exec approach, and just added -type f to the find command too. It worked the way we were thinking. Thanks again – Jassi Jun 19 '12 at 07:43
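
Putting the pieces from the comments together, the command the asker ended up with would presumably be something like:

find /home/jassi/ -type f -name "aliencoders.[0-9]+" | xargs -r ls -lrt | awk '{print $9}'

Here -type f restricts the matches to regular files, and -r keeps xargs from running ls at all when find prints nothing.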