I want to migrate a very large website (15 GB) built on WordPress. I have followed this whitepaper, and everything works as it should for a new or small website installation/migration. I succeeded in restoring the database using MySQL Workbench, but I have failed to restore my files (the uploads directory is 12 GB by itself). I tried the "Duplicator Pro" plugin, but it does not work. I uploaded the files directly to the EC2 instance using SFTP, but I ran into a lot of file permission issues. I also uploaded the files to an S3 bucket with CloudFront enabled, but that didn't work either. I am lost and not sure how to complete the job; please help me if there is a recommended method to upload the files.
- Do it manually, it's not difficult. If you can't work out SFTP then you have a bit of learning to do. – Tim Oct 10 '17 at 18:44
2 Answers
EB: My recommendation with Beanstalk is to use an out-of-the-box configuration with the EB CLI. This GitHub repo will get you through the steps. It will upload your whole WordPress installation.
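For reference, a minimal sketch of that EB CLI flow; the directory and environment names are placeholders, not from the repo mentioned above:

    # install the EB CLI, then initialize and deploy from the site root
    pip install awsebcli
    cd my-wordpress-site        # placeholder path to your WordPress files
    eb init                     # pick a region and the PHP platform
    eb create wordpress-env     # placeholder environment name
    eb deploy                   # zips the current directory and uploads it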
Uploads folder size: The problem is actually your file size. Beanstalk allows a 512 MB direct upload; if you deploy through CodePipeline via S3, you get 2 GB. In your case, I would transfer the uploads and DB with UpdraftPlus. It first uploads the backup files to remote storage, then downloads them on the target site. You might need to raise your PHP memory limit and execution time first to handle such a load (see the snippet below). If Updraft can't unpack the zips because of file permissions, SSH into the instance.
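As an illustration, these are the kinds of php.ini directives the above refers to; the values are guesses to tune for your site, and the file location varies by platform (php --ini shows it):

    ; illustrative values only
    memory_limit = 512M
    max_execution_time = 300
    upload_max_filesize = 2G
    post_max_size = 2G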
Update:
So instead, what you could do is upload everything that you want (the uploads folder) to an S3 bucket as a zip or gzip file and set it to public. Then SSH into your server and run sudo wget on that file's URL to download it, and unzip it with sudo unzip filename.zip. I had to do it like that a few times, because Updraft split the folders. After that, run sudo chown -R webapp:webapp on your unzipped folders to be sure they have the right permissions (the full flow is sketched below).
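Put together, the flow looks roughly like this; the bucket name, file name, and the /var/app/current path are assumptions to adjust for your setup:

    # download the archive from S3, unpack it, and fix ownership
    cd /var/app/current/wp-content                             # assumed Beanstalk app directory
    sudo wget https://s3.amazonaws.com/my-bucket/uploads.zip   # placeholder URL
    sudo unzip uploads.zip                                     # assumes the zip contains an uploads/ folder
    sudo rm uploads.zip
    sudo chown -R webapp:webapp uploads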
Change the file ownership with sudo chown -R webapp:webapp, or target a specific file. I use webapp because, in this configuration, that is the user/group running the application.
DB: Alternatively, for the DB, install phpMyAdmin and connect it to your RDS instance or local MySQL, then import your exported SQL file.
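If you would rather skip phpMyAdmin, the same import can be done from the instance with the mysql client; the hostname, user, and database name below are placeholders:

    # stream the dump straight into the RDS database
    mysql -h mydb.xxxxxx.us-east-1.rds.amazonaws.com -u admin -p wordpress < backup.sql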
Conclusion: I don't know if you have solved it yet, and I would be curious to find out how you did it. Until now, my sites have always been around 3-4 GB.

- Thank you for the detailed answer. Actually, I haven't resolved this issue yet; I have chosen to stay on a dedicated server until I find a good solution. I'm not sure that Updraft can help solve it. One of my solutions was to upload the files directly to the instance using an SFTP client. I did that, but I got errors because of ownership assignment; my mistake was not giving the files to **webapp:webapp**. I remember it was www-data, and that is what caused problems for me. I will give it another try when I have time, and I will be sure to share the results with you – Mujahed Altahleh Nov 19 '17 at 18:50
- Yeah, with SFTP there might be some issues with file permissions; it's best to use SSH, which logs you in as the ec2-user. That user has sudo permissions, so you can then reassign the right permissions. See above! Cheers – Hendrik Vlaanderen Nov 23 '17 at 00:09
- Updraft is slow, as it runs within PHP in 30-second chunks. SFTP runs as a service under SSH and is the best way to transfer information. Why do you want this within EB? There are AWS templates for WordPress, but I'm not sure whether they deal with migration. – Tim Nov 23 '17 at 00:14
The Duplicator plugins are designed to produce two files...an archive and an installer. Those are uploaded, and then Duplicator self-installs given the correct credentials on the new server. You don't just use it to upload files and folders...it's a one-stop solution that does the database too.
I would suggest that you contact the Duplicator plugin developers via their forum for help...they are pretty good guys.
The Amazon whitepaper seems an incredibly complex process...I wonder how much of it is relevant. I would keep it simple...it's just files in a public html directory and a database (the rest is fluff).
You are not changing WordPress URLs...read the WordPress Codex on how to move WordPress (keeping the same URL).
The Amazon part is specific to Amazon...not WordPress. Just get the thing working simply first and worry about the luxury complexities later...if Amazon is anything like Google Cloud, you can easily upgrade your performance requirements later.
Finally, check which user and group own the files after upload versus which user and group own the directory on the Amazon instance. Almost certainly you have the wrong user and group ownership of the files and folders. You will need to chown user:group after upload.
Ensure the correct user:group owns the public html directory and everything in it on the Amazon instance. It won't work properly anyway if root owns anything in that directory. The user and group on my Google Cloud servers are usually either my account user or www-data (I use Debian or Ubuntu), as in the sketch below.
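A minimal ownership fix, assuming a Debian/Ubuntu-style web root and the www-data user (both are assumptions; substitute your own path and user):

    # hand the whole web root to the web server user
    sudo chown -R www-data:www-data /var/www/html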
Are you using a control panel? This will make it easier...I use either ISPConfig, VestaCP, or Virtualmin (GPL). Virtualmin is the most powerful of the three and my choice, although its dashboard interface is a bit of a maze compared with the other two.
Permissions should be 755 for directories and 644 for files; WordPress always works with those for me (see the commands below). With a website this size, I suspect your max file upload size and various php.ini settings will need changing quite dramatically, as the defaults on a new LAMP server won't be anywhere near enough for a big website like this installed using Duplicator. Fortunately, Duplicator usually tells you what changes are required when it is creating the initial package on the original server, prior to the move.
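One common way to apply that 755/644 scheme (the web root path is a placeholder):

    # directories 755, files 644
    sudo find /var/www/html -type d -exec chmod 755 {} \;
    sudo find /var/www/html -type f -exec chmod 644 {} \;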
