
We have a Windows Server 2008 machine that needs very high uptime. We want failover so we can take it offline when necessary, e.g., for a hardware or software update.

It runs mostly FTP (and a few other less critical services).

It doesn't need to be automatic; failing over manually is fine, but the process needs to be smooth enough that human error can't break it.

We have a spare Windows Server we're considering using as a gateway, with NAT simply forwarding the FTP port to a different server for failover. Is this the best solution given our circumstances?
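
To make the idea concrete, here is roughly what I have in mind, using the built-in netsh portproxy (the addresses below are made up). As far as I can tell it only forwards individual TCP ports, so the passive range would need the same treatment:

    :: failover-to-standby.cmd (hypothetical) -- point the gateway at the standby box
    netsh interface portproxy add v4tov4 listenport=21 listenaddress=0.0.0.0 connectport=21 connectaddress=192.0.2.20
    :: the passive data ports (say 50000-50100) have to be forwarded one by one
    for /L %%P in (50000,1,50100) do netsh interface portproxy add v4tov4 listenport=%%P connectport=%%P connectaddress=192.0.2.20

Failing over would then just mean re-adding these entries with the other server's address.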

PS: I know that FTP is tricky; e.g., we need to ensure the external IP is set in the FTP software, deal with passive ports, mirror the files, etc.
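
For reference, on IIS 7.5 FTP (just an example; our FTP software may differ) those two settings would be made roughly like this, with the site name and addresses as placeholders:

    :: pin the passive data channel to a known range
    %windir%\system32\inetsrv\appcmd set config -section:system.ftpServer/firewallSupport /lowDataChannelPort:50000 /highDataChannelPort:50100 /commit:apphost
    :: advertise the gateway's public address in PASV replies
    %windir%\system32\inetsrv\appcmd set config -section:system.applicationHost/sites "/[name='My FTP Site'].ftpServer.firewallSupport.externalIp4Address:203.0.113.10" /commit:apphost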

SilentSteel

1 Answer


Using a Windows Server machine to provide NAT services seems like overkill, and creates a maintenance need (Windows updates) that will, invariably, cause service outages with the FTP site (since you'll have to reboot the NAT "gateway" machine regularly). I think you'd be better off using an embedded device that supports either a layer 3 solution (like NAT) or a layer 7 solution (like a TCP or FTP proxy).

Perhaps you're simply taking FTP uploads and don't care about remote users' ability to download files. In that case, you can probably get away with something like what you're talking about, so long as you have a way to merge any files received during a failover back into the production file corpus. (That's probably just an XCOPY or some such and not a big deal, but not knowing your back-end systems it's hard to say.)
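
As a minimal sketch of that merge, assuming plain file shares and entirely hypothetical paths and hostnames:

    :: pull anything received on the failover box back into production after failing back
    :: /XO skips files whose copy on the production side already has a newer timestamp
    robocopy D:\FtpRoot \\PROD01\FtpRoot /E /XO /R:2 /W:5 /LOG:C:\Logs\failback-merge.log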

If you're expecting downloads from remote users during a failover then a bigger issue than network-level or application protocol-level access is going to be continuity of access to the files hosted by the FTP server. Unless you've got a way to mitigate the single point of failure of the back-end file storage you can do anything you want at the network or application layer and still be dead in the water.

You may be able to use something simple like DFS replication (or even just replication scripts with tools like ROBOCOPY or rsync) to keep the production and failover FTP servers in sync. Your SLA windows are going to dictate how close to realtime your replication consistency needs to be.
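
A bare-bones version of the script approach (every name below is hypothetical) could be a ROBOCOPY mirror on a schedule:

    :: ftp-sync.cmd -- mirror the production FTP root to the standby server
    :: /MIR deletes files on the standby that no longer exist in production
    robocopy D:\FtpRoot \\STANDBY01\FtpRoot /MIR /FFT /R:1 /W:1 /NP /LOG+:C:\Logs\ftp-sync.log
    :: run it every 5 minutes; tighten the interval to match your SLA
    schtasks /Create /SC MINUTE /MO 5 /TN "FTP Sync" /TR "C:\Scripts\ftp-sync.cmd" /RU SYSTEM

Note that /MIR propagates deletions, which is what you want for a standby mirror, but not for the failback merge described above.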

Evan Anderson
  • What about just putting Linux on the gateway? Or, is there a piece of hardware that you'd recommend? We need to handle 1 Gbps speeds and about 100 simultaneous connections (downloads/uploads) during most of the day. Not sure what device/manufacturer would be best for that, at a reasonable price. – SilentSteel Feb 06 '13 at 16:19
  • You could certainly get away w/ a commodity piece of hardware running Linux (or some flavor of BSD) with a minimal software load (to keep the need for software updates low). I agree that you'll spend a pretty penny for a name-brand device to do this, though you may like the support and warranty entitlement that a name-brand device would give you. – Evan Anderson Feb 06 '13 at 17:31
  • Thanks. If we did go with a name-brand device, what manufacturer would you say could take the load and be modestly priced? – SilentSteel Feb 07 '13 at 14:37