
I'm the IT person at my small firm and, despite my dire warnings, everyone puts files on the server with awful names, including leading and trailing spaces and bad characters (including `\ ; / + . < > -` etc.!)

They do this by accessing the (FreeBSD/FreeNAS) server via AFP on Macs, so no part of the system complains.

Is there a script I can use to go through an entire directory tree and fix bad filenames?

Basically replace all spaces & bad ASCII with _ ... and if a file already exists, just slap a _2 or something on the end.

I don't suppose there's a way to get the system to enforce good filenaming conventions, is there?

voretaq7
Dan
  • You can't tell netatalk or whatnot to do this? – Ignacio Vazquez-Abrams Aug 20 '12 at 01:54
  • Not that I know of ... how ? – Dan Aug 20 '12 at 01:55
  • Actually, an even better question. Why do you believe that these characters are a problem? – Ignacio Vazquez-Abrams Aug 20 '12 at 01:57
  • Um, good question. I guess I've been "raised" to believe that those things can lead to issues/problems on a *nix-based server. But if that's not the case, all the better. I can stop harassing my colleagues. – Dan Aug 20 '12 at 01:59
  • The only things *nix doesn't like are `/` and `\0`. For everything else there's quotes. – Ignacio Vazquez-Abrams Aug 20 '12 at 02:04
  • You're tackling this from the wrong angle. Any filename that is accepted by the OS and its file system is an acceptable name. What you need to do is provide external systems that are compatible, rather than trying to force fit one into the other. – John Gardeniers Aug 20 '12 at 05:23
  • @IgnacioVazquez-Abrams You are right in theory, but the OS is not the only problem. There are (sadly) many scripts and programs that behave badly with spaces in file names. Of course these are bugs, but having simple names can be useful to avoid problems. – Matteo Aug 20 '12 at 05:31
  • Lucky you: you don't have any user that names files "*". Remember that script removing old logs, by the way? :) –  Aug 20 '12 at 22:11
  • Please don't cross-post questions to multiple Stack Exchange sites. – user9517 Aug 21 '12 at 07:39

2 Answers


You can get a list of them by using `find / -name` with a pattern that matches what you don't want to see (consider a character class like `[^[:alnum:]._-]`; note that `-name` takes a glob, not a regular expression). This list you can then manipulate (e.g. with `sed`) into a shell script that will use `mv` to rename the files.

Acceptable filenames are determined by the filesystem, so there may or may not be a way to enforce it without a cronjob.
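As a sketch of that find-and-`mv` approach (my own illustration, not from the answer above): the character class, the `tr` sanitizer, and the `_2` collision suffix are assumptions taken from the question, and the demo runs in a throwaway `mktemp` directory so nothing real gets renamed.

```shell
#!/bin/sh
# Demo tree: one bad name, plus a file that forces a collision after sanitizing.
root=$(mktemp -d)
: > "$root/bad name;here.txt"
: > "$root/bad_name_here.txt"

# Find files whose names contain anything outside 0-9A-Za-z._- and rename them,
# replacing each bad character with _ and appending _2, _3, ... on collision.
find "$root" -type f -name '*[!0-9A-Za-z._-]*' -exec sh -c '
  for f do
    dir=${f%/*}
    base=${f##*/}
    new=$(printf %s "$base" | tr -c "0-9A-Za-z._-" "_")
    target="$dir/$new"
    n=2
    while [ -e "$target" ]; do      # name taken: slap a _2 (or _3...) on the end
      target="$dir/${new}_$n"
      n=$((n + 1))
    done
    mv -- "$f" "$target"
  done
' sh {} +

ls "$root"
```

Running it on the demo tree renames `bad name;here.txt` to `bad_name_here.txt_2`, since the clean name was already taken. Remove the `mktemp` demo lines and point `find` at the real share to use it in anger (after testing on a copy).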

Falcon Momot

@terdon has provided an elegant 1-line solution over at superuser:

SAVEIFS=$IFS; IFS=$(echo -en "\n\b"); \
for i in $(find . -type f -name "*[\:\;\>\<\@\$\#\&\(\)\?\\\%\ ]*"); \
do mv "$i" "${i//[\:\;><\@\$\#\&\(\)\?\\\%\ ]/_}"; done; IFS=$SAVEIFS

(Line breaks added for clarity; it's all one line.)

Alternatively, you can get more aggressive with something like

for i in $(find . -type f  -name "*[^0-9A-Za-z\_\.\-]*"); 

Full details here: https://superuser.com/questions/463742/shell-script-to-fix-bad-filenames
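One caveat, as the "*" comment above hints: `for i in $(find ...)` still glob-expands each result, and the `IFS` dance is fragile. A more robust loop shape, assuming bash is available on the server, pipes `find -print0` into `read -d ''`; the demo directory and character class below are my own illustration, not part of terdon's answer.

```shell
#!/bin/bash
# Safer loop shape than for-in-$(find ...): -print0 / read -d '' copes with
# spaces, globs, and even newlines in names, with no IFS juggling needed.
root=$(mktemp -d)
: > "$root/  spacey name  .txt"   # leading/trailing spaces, like the asker's files

find "$root" -type f -name '*[!0-9A-Za-z._-]*' -print0 |
while IFS= read -r -d '' f; do
    base=${f##*/}
    # Replace every character outside the allowed class with _ (no collision handling here).
    mv -- "$f" "${f%/*}/${base//[!0-9A-Za-z._-]/_}"
done

ls "$root"    # the renamed file: __spacey_name__.txt
```

Note this variant has no `_2` collision handling; it only demonstrates the loop shape.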

Dan