I have a weird scenario and I can't seem to find the best way to make it work.

I have an inventory app hosted on a Linode server. The app serves multiple companies, and each company has its own database.

Each company has multiple stores in different locations. All stores need to use the same app, and their data has to stay in sync.

I need to replicate the data, but every store/app needs to be able to read, write, and replicate at the same time. The problem is that most stores have no Internet connection for hours at a time; they are completely cut off from the outside world (LAN only).

Conventional MySQL replication is not going to work, because it needs Internet connectivity to stay operational.

What do I do???

Is writing my own software that replicates the data at a higher level a good idea? If so, are there any best practices I should follow?

I also can't use MySQL's auto_increment step and offset for ID generation, because some clients keep opening more and more stores. Do I need to generate my own GUID for each entity, prefixed with the store's unique ID (STOREID-UNIQUEID), to make sure IDs don't clash?

Jordan Dobrev

1 Answer

MySQL replication should be able to handle network downtime, as long as the replica has enough time, bandwidth, and disk space to download the logs while the connection is up.

I'm not sure how auto-reconnect handles extended downtime, but you should be able to work around reconnection issues with a scheduled job that restarts replication.
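
In case it helps, here is a minimal sketch of such a watchdog job, run from cron on each store's replica. The host, credentials, the cron scheduling and the pymysql driver are all my assumptions, not anything from the question; it just checks SHOW SLAVE STATUS and kicks the replication threads if they stopped.

    import pymysql

    def restart_replication_if_stopped():
        # Placeholders: point this at the store's local replica.
        conn = pymysql.connect(host="localhost", user="repl_admin",
                               password="secret",
                               cursorclass=pymysql.cursors.DictCursor)
        try:
            with conn.cursor() as cur:
                cur.execute("SHOW SLAVE STATUS")
                status = cur.fetchone()
                if status is None:
                    return  # replication is not configured on this server
                io_ok = status["Slave_IO_Running"] == "Yes"
                sql_ok = status["Slave_SQL_Running"] == "Yes"
                if not (io_ok and sql_ok):
                    # The store was offline (or the link dropped); restart the
                    # replication threads so they retry against the master.
                    cur.execute("STOP SLAVE")
                    cur.execute("START SLAVE")
        finally:
            conn.close()

    if __name__ == "__main__":
        restart_replication_if_stopped()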

GUIDs are a good option for multi-site key generation. The other option is to use a site (store) identifier together with a local auto-increment value as a composite primary key.
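
To illustrate the two options, a small sketch; the store code and the function names are made up for the example:

    import uuid

    STORE_ID = "S042"  # placeholder; issued centrally when the store is provisioned

    # Option 1: a random GUID/UUID per row -- collision-safe with no
    # coordination between stores.
    def new_guid_key():
        return str(uuid.uuid4())

    # Option 2: store identifier plus the store's own auto-increment value,
    # either as a composite primary key (store_id, local_id) or flattened
    # into the STOREID-UNIQUEID form from the question.
    def new_prefixed_key(local_autoincrement_value):
        return f"{STORE_ID}-{local_autoincrement_value}"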

Vatev
  • How do you handle duplicate IDs, concurrent overrides, conflicts, and int overflow? They can mess up your DB and you will end up with dirty data. – Jordan Dobrev Jan 04 '16 at 16:02
  • You handle duplicate IDs by not generating duplicates (e.g. GUIDs). I don't know what concurrent overrides means; I was assuming your clients work independently of each other. Int overflow can be handled with bigger ints (or string/binary keys). – Vatev Jan 04 '16 at 16:10
  • If you have 1000 stores per customer, you will have massive gaps. Concurrent overrides means two stores trying to update the same data. The problem comes when both stores are offline and make different changes: which change do you treat as the true one? – Jordan Dobrev Jan 05 '16 at 09:26
  • There is no way to magically merge offline changes from two clients, no matter what database (or other tool) you use. If you need to do it, you will have to design and implement the merge rules yourself. I don't understand why gaps in autogenerated numbers are a problem (GUIDs have gaps by definition). – Vatev Jan 05 '16 at 09:37
  • It's a problem because each store can generate millions of records per day in a single table. Replication probably won't do the job for me, for the same merge-related reasons you mentioned, so doing extra development work might be better. Pushing new data to the main server using queues sounds better and would solve most of the problems (see the sketch below). – Jordan Dobrev Jan 05 '16 at 14:58
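
For what it's worth, a rough sketch of that queue/outbox idea: the store app appends each change to a local outbox table, and a background job drains it to the central server whenever the link is up. Every table and column name here (outbox, central_changes, ...) is made up for illustration, and the conflict/merge rules still have to be designed by the application.

    def drain_outbox(local_conn, central_conn, batch_size=500):
        # local_conn / central_conn are DB-API connections (e.g. pymysql).
        # Read a batch of pending changes that the store app queued locally.
        with local_conn.cursor() as cur:
            cur.execute(
                "SELECT id, store_id, entity, local_id, payload "
                "FROM outbox ORDER BY id LIMIT %s", (batch_size,))
            rows = cur.fetchall()

        for outbox_id, store_id, entity, local_id, payload in rows:
            # Push each change to a staging table on the central server; the
            # central side decides how to merge it into the company database.
            with central_conn.cursor() as cur:
                cur.execute(
                    "INSERT INTO central_changes (store_id, entity, local_id, payload) "
                    "VALUES (%s, %s, %s, %s)",
                    (store_id, entity, local_id, payload))
            central_conn.commit()

            # Remove the outbox row only after the central server accepted it.
            with local_conn.cursor() as cur:
                cur.execute("DELETE FROM outbox WHERE id = %s", (outbox_id,))
            local_conn.commit()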