
Here are the details of my problem:

  • I manage a PHP website with a very large number of files (>100,000) on a Linux server (Ubuntu 20.04) that I want to back up daily. I want to create a shell script (save.sh) that I can add to the crontab so the copy runs automatically (see the sketch below).
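
For context, this is the kind of crontab entry I have in mind; the path to save.sh, the schedule and the log file are just placeholders:

```bash
# Placeholder crontab line: run the backup script every night at 02:00
# (the path to save.sh and the log file are not decided yet)
0 2 * * * /usr/local/bin/save.sh >> /var/log/save.log 2>&1
```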

About the webserver:

  • It's a physical server running Ubuntu 20.04 on a 256 GB SATA HDD.
  • The files are located locally in /var/www/mySite/ and its many subfolders.
  • I already use crontab to schedule PHP maintenance scripts.

About the NAS:

  • It's connected over the LAN, not by USB, and the shared filesystem uses AFP.
  • It's reachable at an AFP address like afp://MYNAS.local/SaveDirectory/
  • The NAS shared folder appears in Nautilus as a mountable network drive, and I can open it and navigate through the files.

I planned to use the rsync command rather than cp to copy the files, but rsync needs a source and a destination location. The source is not a problem: I can use /var/www/mySite. For the destination, however, I ran into trouble, because rsync does not seem to recognize, or cannot reach, the AFP address of the NAS: afp://MYNAS.local/SaveDirectory/
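
Roughly, this is what I tried; the rsync options are only an example, the problem is the destination:

```bash
# What I tried (roughly): rsync cannot use the afp:// URL as a destination,
# so this fails
rsync -av --delete /var/www/mySite/ afp://MYNAS.local/SaveDirectory/
```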

I tried to find another way to access the NAS by using a mount command, in order to create a local directory that I could copy my files into:

mount -t afp [SERVER IP]/SaveDirectory /mnt/myNas

But I get an error: mount /mnt/myNas: afp unknown file system
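
To show what I'm aiming at, here is a sketch of the save.sh I would like to end up with, assuming the share could somehow be mounted at /mnt/myNas (which is exactly the part I cannot get working); the mount check is just a precaution I would add:

```bash
#!/bin/bash
# Sketch of the intended save.sh, assuming the NAS share were
# mounted at /mnt/myNas (the part that does not work yet).
SRC="/var/www/mySite/"
DST="/mnt/myNas/"

# Abort if the NAS is not actually mounted, so rsync does not
# silently copy onto the local disk instead.
mountpoint -q /mnt/myNas || { echo "NAS not mounted" >&2; exit 1; }

rsync -a --delete "$SRC" "$DST"
```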

I looked into the Ubuntu Wiki to try to find another way to copy my files using rsync, but all the answers are dead ends and it's driving me nuts.

Has anybody already managed to back up files to an AFP NAS this way? How can I use my NAS with this protocol to back up my files with a shell script run from crontab?

Thanks for your help if you have any ideas.

Kroan
