
I'm trying to find a solution to a DFS problem. Our servers have their own file upload methods, but we've also added FTP for managing large files and folders. How do I get DFS to play nicely with our software?

There is no way to delay synchronization times. I can control the bandwidth, but the files are small and numerous, so that doesn't help. Turning the service on and off throughout the day doesn't seem practical. What happens is that one server receives a package with our command from the command-and-control server. Simultaneously, the second server receives the same command and package from the command-and-control server. Then DFS synchronization kicks in and server 1 corrupts the file package on server 2, or vice versa.

Anyone encounter this and find a solution?

Joe Chin
  • Is it possible to modify your C&C server to only target one of the servers in the DFS? Or, add a 3rd target to the DFS so you can configure replication as hub & spoke. – Clayton Sep 26 '14 at 17:08
  • I've asked the developers and they said they would look into it. But for the moment I'm looking for a practical solution on our production server. Kicking it to the developers means it is weeks or months before it gets back to us. – Joe Chin Sep 26 '14 at 17:14
  • Create an exclusion for that specific file in DFSR, and configure your C&C server, or scheduled task on one server to ROBOCOPY it to the other server at the desired time? Relocate the file on a non-DFS path? – Clayton Sep 26 '14 at 18:27
  • Thanks, Clayton. We ended up creating two folders: one managed by the web application pool and another managed by DFS. Similar to how WordPress manages its template resources internally but lets you add additional files into a wp_public folder, for example. Now I have to find and replace the video file links to point to this replicated directory. – Joe Chin Sep 30 '14 at 13:43
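The exclusion-plus-scheduled-copy approach suggested in the comments could be sketched like this. All names here (replication group `WebFarm`, folder `Packages`, file `package.zip`, server `SERVER2`, paths) are placeholders, not values from the question; `Set-DfsReplicatedFolder` is from the DFSR PowerShell module available on Server 2012 and later:

```powershell
# Sketch only -- WebFarm, Packages, package.zip, SERVER2, and the paths are
# hypothetical stand-ins for this environment.

# 1. Tell DFSR to skip the package file so replication never touches it.
#    Note: -FileFilter replaces the entire filter list, so re-include the
#    defaults (~*, *.bak, *.tmp) alongside the new exclusion.
Set-DfsReplicatedFolder -GroupName "WebFarm" -FolderName "Packages" `
    -FileFilter "~*, *.bak, *.tmp, package.zip"

# 2. Copy the excluded file to the peer on your own schedule instead.
#    /XO skips files that are older on the destination; /R and /W limit retries.
robocopy "D:\DfsRoot\Packages" "\\SERVER2\D$\DfsRoot\Packages" package.zip /XO /R:2 /W:5

# 3. Optionally run that copy nightly via Task Scheduler:
schtasks /Create /TN "SyncPackage" /SC DAILY /ST 02:00 `
    /TR "robocopy D:\DfsRoot\Packages \\SERVER2\D$\DfsRoot\Packages package.zip /XO /R:2 /W:5"
```

Relocating the file to a non-DFS path, as also suggested above, avoids maintaining the filter list entirely.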

0 Answers