I need to do some MySQL replication. Some information:
- I have two database instances on shared hosting, so I can't use built-in MySQL replication (I don't have access to the configuration files).
- It's for a non-profit (educational) project, so we can't afford our own servers.
- If the main server is down for a few minutes it's generally not that bad, but there are specific days when we REALLY need a backup solution that is synchronized with the main server (time-limited events on the website).
Right now the system uses a revision number on every row of every table, and we periodically check these numbers for modifications (and update the corresponding rows). It's quite slow.
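For context, the current check looks roughly like this (the table and column names here are made up for illustration; they are not from the actual schema):

```sql
-- Hypothetical: every table carries a `revision` counter.
-- The backup server remembers the highest revision it has already
-- copied and periodically pulls only the rows changed since then.
SELECT id, data, revision
FROM some_table
WHERE revision > @last_synced_revision;
```

This has to be run against every table, which is why it scales badly.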
What I'm thinking of is that every SELECT/INSERT/UPDATE query is logged in a dedicated table, and the "slave server" periodically asks the "master server" for the contents of this table and applies the corresponding queries.
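A minimal sketch of such a log table (names are illustrative, not an existing schema):

```sql
-- Hypothetical query-log table on the master.
-- Each row stores one statement to be replayed on the slave.
CREATE TABLE query_log (
    id          BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
    query_text  TEXT NOT NULL,
    created_at  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (id)
) ENGINE=InnoDB;

-- The slave keeps the last id it applied and periodically pulls:
SELECT id, query_text
FROM query_log
WHERE id > @last_applied_id
ORDER BY id;
```

Replaying in `id` order keeps the statements in the order the master executed them.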
What is your opinion of this idea?
I know it's not perfect; a server might go down before all the queries are propagated. But I want to minimize the possible problems, with as few lines of code as possible.
What would be the best way to implement it?
- In the PHP code: on every SELECT/INSERT/UPDATE, I could do another insert into a dedicated table (I would simply insert the query text)
- With a trigger?
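For the trigger option, a sketch might look like this (assuming a hypothetical `query_log(query_text)` table and a `users` table; none of these names come from the actual project). Note that a trigger only sees the new row values, not the original SQL text, so the statement has to be reconstructed, and triggers never fire for SELECTs:

```sql
-- Hypothetical: record every insert into `users` as a replayable
-- statement in query_log. QUOTE() escapes the string value safely.
DELIMITER //
CREATE TRIGGER users_after_insert
AFTER INSERT ON users
FOR EACH ROW
BEGIN
    INSERT INTO query_log (query_text)
    VALUES (CONCAT('INSERT INTO users (id, name) VALUES (',
                   NEW.id, ', ', QUOTE(NEW.name), ')'));
END//
DELIMITER ;
```

One caveat: a trigger fires once per affected row, so a single multi-row UPDATE on the master becomes several logged statements, whereas logging from PHP captures the statement as written.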