
We have a customer that connects to our servers through a satellite link. However, they have a major concern: the facilities in which they want to run our applications often have connection issues. So they want to run our applications locally (on specific PCs). Data from the host system would be fed to the local machines continuously. If the connection is lost, the PC would still have enough data to conduct its business until the connection is restored. At that point, the data changes from the PC would be reflected back to the host system, and vice versa.

I guess we would be considering some type of replication (this is all new to me). This raises many questions, but here are the main ones.

  1. If we replicate, then they need a copy of SQL Server on each PC. We are talking about 60 sites, which would be very expensive due to licensing, plus other support costs.

  2. Is it better to run replication continuously, or only when the connection is lost?

  3. How does the local system get in sync with the hosted system?

Just looking for a better/less expensive solution.

Abdulqadir_WDDN

1 Answer


The way I see it, there are two ways to go about it (depending on your requirements).

If you think the problem will not persist, you can use the circuit breaker pattern: https://learn.microsoft.com/en-us/azure/architecture/patterns/circuit-breaker

Handle faults that might take a variable amount of time to recover from, when connecting to a remote service or resource. This can improve the stability and resiliency of an application.
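
For illustration, here is a minimal C# sketch of that pattern, assuming the Polly NuGet package (the library choice, class name and endpoint are my placeholders, not something from the question): after a few consecutive failures the circuit opens, and calls are short-circuited for a cool-down period instead of hammering the dead link.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Polly;
using Polly.CircuitBreaker;

public class RemotePushClient
{
    private static readonly HttpClient Http = new HttpClient();

    // Open the circuit after 3 consecutive failures; keep it open for 1 minute.
    private static readonly AsyncCircuitBreakerPolicy Breaker = Policy
        .Handle<HttpRequestException>()
        .Or<TaskCanceledException>()          // timeouts count as failures too
        .CircuitBreakerAsync(3, TimeSpan.FromMinutes(1));

    public async Task<bool> TryPushAsync(string url, HttpContent payload)
    {
        try
        {
            var response = await Breaker.ExecuteAsync(() => Http.PostAsync(url, payload));
            return response.IsSuccessStatusCode;
        }
        catch (BrokenCircuitException)
        {
            // Circuit is open: the link is considered down, so we don't even try.
            // The caller should buffer the payload locally and retry later.
            return false;
        }
        catch (Exception ex) when (ex is HttpRequestException || ex is TaskCanceledException)
        {
            // Individual call failed; the breaker counts it toward opening the circuit.
            return false;
        }
    }
}
```

While the circuit is open, calls fail fast instead of tying up the slow link; the caller can buffer the data locally and try again later.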

If you need to retry indefinitely and you can't afford to lose data then you will need a custom solution.

In a totally local environment you could go with either a local database like SQLite, where you can store items and retry them if a call is not successful, or store the calls in Microsoft Message Queuing (MSMQ). Then you can build a service that reads the database or the queue and retries.
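
Here is a rough sketch of the local-database option, assuming SQLite through the Microsoft.Data.Sqlite package (the table, column and endpoint names are placeholders I made up): nothing is sent directly; every call is first written to a local Pending table, and a scheduled job drains it, deleting rows the remote Web API accepted and bumping an attempt counter for the rest.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Data.Sqlite;

public class Outbox
{
    private const string ConnectionString = "Data Source=outbox.db";
    private static readonly HttpClient Http = new HttpClient();

    // Create the local buffer table once at startup.
    public void Initialize()
    {
        using var conn = new SqliteConnection(ConnectionString);
        conn.Open();
        using var cmd = conn.CreateCommand();
        cmd.CommandText =
            @"CREATE TABLE IF NOT EXISTS Pending (
                  Id       INTEGER PRIMARY KEY AUTOINCREMENT,
                  Payload  TEXT    NOT NULL,
                  Attempts INTEGER NOT NULL DEFAULT 0)";
        cmd.ExecuteNonQuery();
    }

    // The application calls this instead of calling the remote API directly.
    public void Enqueue(string jsonPayload)
    {
        using var conn = new SqliteConnection(ConnectionString);
        conn.Open();
        using var cmd = conn.CreateCommand();
        cmd.CommandText = "INSERT INTO Pending (Payload) VALUES ($payload)";
        cmd.Parameters.AddWithValue("$payload", jsonPayload);
        cmd.ExecuteNonQuery();
    }

    // The retry service calls this on a schedule: push oldest rows first,
    // delete on success, bump the attempt counter on failure.
    public async Task DrainAsync(string remoteUrl)
    {
        using var conn = new SqliteConnection(ConnectionString);
        conn.Open();

        var pending = new List<(long Id, string Payload)>();
        using (var select = conn.CreateCommand())
        {
            // Small batches keep each run cheap on a low-bandwidth link.
            select.CommandText = "SELECT Id, Payload FROM Pending ORDER BY Id LIMIT 50";
            using var reader = select.ExecuteReader();
            while (reader.Read())
                pending.Add((reader.GetInt64(0), reader.GetString(1)));
        }

        foreach (var (id, payload) in pending)
        {
            bool sent = false;
            try
            {
                var response = await Http.PostAsync(
                    remoteUrl, new StringContent(payload, Encoding.UTF8, "application/json"));
                sent = response.IsSuccessStatusCode;
            }
            catch (HttpRequestException)
            {
                // Link is down: keep the row for the next scheduled run.
            }

            using var update = conn.CreateCommand();
            update.CommandText = sent
                ? "DELETE FROM Pending WHERE Id = $id"
                : "UPDATE Pending SET Attempts = Attempts + 1 WHERE Id = $id";
            update.Parameters.AddWithValue("$id", id);
            update.ExecuteNonQuery();
        }
    }
}
```

The same structure works if you swap the table for a queue; the attempt counter also gives you a simple failure log to report on.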

Athanasios Kataras
  • Thanks for the reply, but currently I am using only an ASP.NET Web API in an *offline* approach, *without internet*; that is the constraint for me. Within this constraint I need a recurring call to a Web API to push the data to the targeted server, and I need to track the success and failure of the Web API calls. The second constraint is that we have very low bandwidth for data transmission to the targeted server. – Abdulqadir_WDDN Dec 03 '19 at 06:18
  • @Abdulqadir_WDDN I would suggest: 1. Push for better infrastructure. 2. Push for a more reliable connection. If you don't know _why_ the connection fails, all you do is band-aid around a timebomb. 3. The patterns suggested in the answer also work in an intranet. The details may vary but the basic principle is the same: you need a buffer. – Fildor Dec 03 '19 at 07:48
  • Indeed! There is no reason this wouldn't work in an intranet! Cheers – Athanasios Kataras Dec 03 '19 at 07:50
  • @Abdulqadir_WDDN The problem with buffering, though, is if you have a brittle and slow connection, you are very likely to run into buffer overflows. I.e. the pile of buffered data will keep building up, until it is unreasonable to think it will ever go back to empty. That's why your requirement can only be _one part_ of a solution. – Fildor Dec 03 '19 at 07:50
  • I think the OP just needs to make sure he won't lose data on the off chance the connection fails. – Athanasios Kataras Dec 03 '19 at 07:52
  • I don't know that. Could be. But anyway, not knowing the cause of the brittle connection is a timebomb. Retrying indefinitely on a fatally broken connection doesn't make sense. You'll need at least some escalation mechanism, for example emailing an admin after N failures. And the OP also stated the connection is known to be slow, so he'll have to do some math: if he buffers, how much buffering is reasonable? You cannot drain an ocean through a 3/4" pipe. But as I said: just regarding the question, I think your answer is absolutely valid. – Fildor Dec 03 '19 at 07:56
  • @Fildor Here there is no continuous retry for data pushes via the Web API; we have a business workflow in which we can schedule the retry occurrence as per our requirements. And we have to deal with satellite, so we cannot change the infrastructure easily; all data transmission will go through the satellite. That is why I need to establish a mechanism in which I can pass the data to the different remote servers and, if the connection fails, retrieve the failure details and push the data until it succeeds, using predefined constraints for pushing data through the Web API. – Abdulqadir_WDDN Dec 03 '19 at 08:47
  • So for that I require failure-management functionality in the Web API, in which I can keep a log of failures, schedule a retry, save those details to the database, and show them on a dashboard. If anyone can guide me with a step-by-step procedure and sample code, that would be great. Thanks. – Abdulqadir_WDDN Dec 03 '19 at 08:49
  • The satellite part would have been great to see in the question. – Fildor Dec 03 '19 at 09:00