
I have a high-traffic PHP-based website and two web servers that sit behind a load balancer. I only have FTP access to change the files on the two servers when I wish to make changes to the site. This will never change.

At present I upload only the files I have changed to each server. Sometimes this involves many files, and it is very confusing having two servers and many different files in many folders. It is likely human error will creep in here and I will forget to upload a file, introducing discrepancies between the servers. All my changes are made locally and are version controlled under SVN. I make the changes in Eclipse (as I also use this for Java development).

I also wish to minimise downtime during publishing. Can anyone recommend best practices for automating this process?

Mark W

2 Answers


Automate in small steps. First, I would suggest that you write a simple script that does what you currently do by hand when you publish changes to the site. One script per server is fine. Then adapt and improve the script in small steps, as in the sketch below.
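As a starting point, here is a minimal sketch of such a script in Python, using the standard ftplib module. The hostnames, credentials, document root, and file list are all placeholders for your setup. Uploading each file under a temporary name and renaming it keeps the window where a half-written file is live as small as FTP allows, which helps with your downtime concern.

    import ftplib
    import os

    SERVERS = ["web1.example.com", "web2.example.com"]  # placeholder hostnames
    USER, PASSWORD = "deploy", "secret"                 # placeholder credentials
    REMOTE_ROOT = "/htdocs"                             # placeholder document root

    def deploy(local_root, changed_files):
        """Upload each changed file to every server, preserving relative paths.

        Remote directories are assumed to already exist. Each file is
        uploaded under a temporary name and then renamed into place, so
        visitors never see a half-written file.
        """
        for host in SERVERS:
            with ftplib.FTP(host, USER, PASSWORD) as ftp:
                for rel_path in changed_files:
                    remote_path = f"{REMOTE_ROOT}/{rel_path}"
                    with open(os.path.join(local_root, rel_path), "rb") as f:
                        ftp.storbinary(f"STOR {remote_path}.tmp", f)
                    ftp.rename(remote_path + ".tmp", remote_path)
                    print(f"{host}: deployed {rel_path}")

    if __name__ == "__main__":
        # The list of changed files could come from `svn diff --summarize`
        # between the last deployed revision and HEAD, or from a manifest.
        deploy("/path/to/working/copy", ["index.php", "lib/helpers.php"])

Since your changes are already in SVN, feeding the script a file list generated from the repository (rather than typing it by hand) removes the "forgot to upload a file" failure mode entirely.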

pkhamre

A very long time ago, I wrote a Perl script for syncing files over FTP: pushsite (the linked page also lists other software available at that time which did a similar job).
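The core idea is small enough to sketch. This hypothetical Python version (pushsite itself is Perl) uploads whatever has changed locally since the last recorded push; the host, credentials, and paths are placeholders, and remote directories are assumed to already exist.

    import ftplib
    import os
    import time

    STAMP_FILE = ".last_push"  # hypothetical marker recording the last push time

    def changed_since_last_push(local_root):
        """Return paths (relative to local_root) modified after the last push."""
        last_push = os.path.getmtime(STAMP_FILE) if os.path.exists(STAMP_FILE) else 0
        changed = []
        for dirpath, _dirs, files in os.walk(local_root):
            for name in files:
                full = os.path.join(dirpath, name)
                if os.path.getmtime(full) > last_push:
                    changed.append(os.path.relpath(full, local_root))
        return changed

    def push(host, user, password, local_root):
        """Upload every changed file, then record the push time."""
        with ftplib.FTP(host, user, password) as ftp:
            for rel in changed_since_last_push(local_root):
                remote = rel.replace(os.sep, "/")  # FTP paths use forward slashes
                with open(os.path.join(local_root, rel), "rb") as f:
                    ftp.storbinary(f"STOR {remote}", f)
        with open(STAMP_FILE, "w") as f:
            f.write(str(time.time()))

Run it once per server and both machines stay in step without you having to remember which files you touched.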

There are web-based file managers available, but I'm not aware of any which handle publishing to multiple servers. Even if this were feasible, you'd need to be very careful about how you implement a mechanism for automating the transfer of source files over HTTP.

symcbean