
I'm using the following command (from this answer) to convert all CPP files in the current directory into a single file named code.pdf, and it works well:

find . -name "*.cpp" -print0 | xargs -0 enscript -Ecpp -MLetter -fCourier8 -o - | ps2pdf - code.pdf
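For reference, here is the same command broken down with comments (functionally unchanged; the annotations are mine):

  find . -name "*.cpp" -print0 |                        # list every *.cpp under the current tree, NUL-separated
    xargs -0 enscript -Ecpp -MLetter -fCourier8 -o - |  # typeset them as PostScript on stdout
    ps2pdf - code.pdf                                   # convert that PostScript stream into code.pdf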

I would like to improve this script to:

  1. Make it a .sh file that can take an argument specifying the extension instead of having it hardcoded to CPP;

  2. Have it run recursively, visiting all subdirectories of the current directory;

  3. For each subdirectory encountered, convert all files of the specified extension to a single PDF that is named $NameOfDirectory$.PDF and is placed in that subdirectory;

binarez
  • `find` will retrieve all files in all sub-directories that end with the extension (`cpp` in your example). If this is what you wanted, your recursive bullet makes no sense. If you wanted all files in the current directory only, there is no reason to use find. – kabanus Feb 11 '19 at 13:52
  • The line I'm using is the base I'm working from and was taken from another answer on StackOverflow (see link). My bullet points are the improvements I want to add. – binarez Feb 11 '19 at 14:12

2 Answers


First, if I understand correctly, your premise is off: find already retrieves the files from all sub-directories, not just the current one. To work recursively, with each invocation handling only the files of its own directory, you can use a script like this (I named it do.bash):

#!/bin/bash

ext=$1
# Only run enscript if the current directory actually contains *.$ext files
if ls *."$ext" &> /dev/null; then
    enscript -Ecpp -MLetter -fCourier8 -o - *."$ext" | ps2pdf - "$(basename "$(pwd)")".pdf
fi
# Recurse into every sub-directory and do the same there
for subdir in */; do
    if [ "$subdir" == "*/" ]; then break; fi   # no sub-directories: the glob did not expand
    cd "$subdir" || continue
    /path/to/do.bash "$ext"
    cd ..
done

The checks make sure that files with the extension, or sub-directories, actually exist before acting on them. The script operates on the current directory and calls itself recursively on each sub-directory; if you do not want to hardcode the full path, put the script somewhere in your PATH, though a full path works fine.
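For example (assuming the script is saved as /path/to/do.bash and marked executable; the project path below is hypothetical), a run could look like:

  chmod +x /path/to/do.bash
  cd ~/myproject          # the directory tree you want to convert
  /path/to/do.bash cpp    # writes <directory name>.pdf into every directory that holds *.cpp files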

kabanus
  • Thanks for your answer but I'll go with the solution that doesn't require the script to be in the user's path, for simplicity of usage. – binarez Feb 11 '19 at 15:11

First, if I understand it correctly, this requirement:

For each subdirectory encountered, convert all files of the specified extension to a single PDF that is named $NameOfDirectory$.PDF

is unwise. If that means, say, a/b/c/*.cpp gets enscripted to ./c.pdf, then you're screwed if you also have a/d/x/c/*.cpp, since both directories' contents map to the same PDF. It also means that *.cpp (i.e. CPP files in the current dir) gets enscripted to a file named ./..pdf.
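A quick illustration of the collision, using hypothetical paths and the commented-out naming scheme from the script below:

  # a/b/c/main.cpp     -> ./c.pdf
  # a/d/x/c/other.cpp  -> ./c.pdf   (silently overwrites the first)
  # ./main.cpp         -> ./..pdf   (basename of "." is ".", so the name becomes "." + ".pdf")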

Something like this, which names the PDF according to the desired extension and places it in each subdirectory alongside its source files, doesn't have those problems:

#!/usr/bin/env bash
# USAGE: ext2pdf [<ext> [<root_dir>]]
# DEFAULTS: <ext> = cpp
#           <root_dir> = .
ext="${1:-cpp}"
rootdir="${2:-.}"

shopt -s nullglob

find "$rootdir" -type d | while read d; do

  # With "nullglob", this loop only runs if any $d/*.$ext files exist
  for f in "$d"/*.${ext}; do

    out="$d/$ext".pdf
    # NOTE: Uncomment the following line instead if you want to risk name collisions
    #out="${rootdir}/$(basename "$d")".pdf

    enscript -Ecpp -MLetter -fCourier8 -o - "$d"/*.${ext} | ps2pdf - "$out"

    break   # We only want this to run once

  done

done
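Usage could look like this (assuming the script is saved as ext2pdf and marked executable; both arguments are optional, and the src directory is just an example):

  chmod +x ext2pdf
  ./ext2pdf             # defaults: *.cpp files, starting from the current directory
  ./ext2pdf h src       # *.h files under ./src, one h.pdf per directory that contains any

Note that the enscript highlighting language (-Ecpp) stays hardcoded regardless of the extension argument; adjust it if you pass a different language's extension.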

Adrian
  • Yes, I forgot to specify that I WANT the PDF next to the source files so there's one PDF per directory containing CPP files. I'm testing your approach and will comment back. – binarez Feb 11 '19 at 14:56
  • It doesn't work. I printf-debugged and it seems to be going thru all directories, but the for loop doesn't work (for f in "$d"/*.${e}; do). I am positive that there are CPP files within those sub-directories. – binarez Feb 11 '19 at 15:05
  • @binarez Whoops, sorry. I was renaming `$e` to `$ext` for clarity, but didn't change every instance. Fixed now. – Adrian Feb 11 '19 at 15:10