I have a large folder that I am replicating with DFS, and I want to check that all files have been replicated correctly.

Currently I am running the following script at both ends.

cd e:\data\shared\
dir /a:-h /b /s > e:\data\shared\result.txt

and then using a text editor to tidy the file before using a diff tool to compare them.

Does anyone know a better way of doing this? Failing that, does anyone know how to adapt my script to ignore all the files in the DfsrPrivate folders?
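The listing-plus-manual-diff step can be automated end to end. Below is an illustrative cross-platform Python sketch (not from the question; only the DfsrPrivate folder name is taken from it) that builds a file listing for a replica root while skipping DfsrPrivate, and reports the differences between two replicas:

```python
import os

SKIP_DIRS = {"DfsrPrivate"}  # DFS-R's private working folder; exclude it from the comparison


def listing(root):
    """Return the set of file paths under root, relative to root, skipping SKIP_DIRS."""
    files = set()
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune excluded folders in place so os.walk never descends into them
        dirnames[:] = [d for d in dirnames if d not in SKIP_DIRS]
        for name in filenames:
            files.add(os.path.relpath(os.path.join(dirpath, name), root))
    return files


def compare(src, dst):
    """Return (files only in src, files only in dst), each sorted."""
    a, b = listing(src), listing(dst)
    return sorted(a - b), sorted(b - a)
```

Run `listing()` on each member and diff the results, or `compare()` directly if both replica roots are reachable from one host (e.g. via UNC paths).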

Simon Foster
  • It is a shame, given the vast recent improvements, that the reporting and troubleshooting options really suck. It makes it hard to justify using DFS Replication for anything that needs to be reliable. – JamesRyan Jul 05 '12 at 16:06

2 Answers

Why not use the built-in DFS diagnostic reporting?

In the past I've set up scheduled email reports using a script similar to the one here.

Edit: Since Brent mentioned the PowerShell angle, there is also this script out there. I haven't used it, and I don't have DFS in my test environment to see what it does, but it may be worth looking at.

Rex
  • +1 for the built-in reports. There are also some command line and PS scripts you can run as well. – Brent Pabst Jul 05 '12 at 14:32
  • I use the built-in reports, but they don't tell me which files have not replicated, only how many are backlogged. – Simon Foster Jul 05 '12 at 14:34
  • You should be able to create a report with the option to count the replicated files and their sizes on each member, which will list the actual files that are in the backlog. Oftentimes you will have locked files and/or files with the temp attribute set that are not replicated by DFS. – Rex Jul 05 '12 at 14:43
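The temporary-attribute case Rex mentions can be checked mechanically. A minimal Python sketch (the function name is hypothetical; on Windows the attribute flags come from `os.stat(path).st_file_attributes`) that flags files DFS-R will skip:

```python
import stat


def dfsr_skipped(entries):
    """Given (path, attribute_flags) pairs, return the paths that DFS-R
    will not replicate because the temporary attribute is set."""
    return [path for path, attrs in entries
            if attrs & stat.FILE_ATTRIBUTE_TEMPORARY]
```

Feed it the paths and attribute flags gathered from a replica to see which files are being silently skipped rather than backlogged.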

Is there actually a concern that the files are not replicating?

When we set up DFS to replicate about 30 GB trans-Atlantic, we didn't go through and check each file; instead we sampled the content through our tests to ensure the system behaved the way we wanted it to.

Essentially, we took a single folder with around 1 GB of content (mostly PDFs) and replicated it. Once we confirmed that the smaller subset of data replicated correctly, we were comfortable replicating the rest, which went through without any major issue.

It would have taken forever to diff the contents, let alone to run the listing command in the first place, against close to 5 million objects.
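The sampling approach described above can be sketched in Python (illustrative only; the helper name, sample size, and seeding are assumptions, not anything from the answer): hash a random sample of files on the source and compare against the same relative paths on the destination.

```python
import hashlib
import os
import random


def sample_check(src, dst, n=100, seed=None):
    """Spot-check replication: hash a random sample of up to n files under
    src and compare each against the same relative path under dst.
    Returns the relative paths that are missing or differ on dst."""
    rng = random.Random(seed)
    candidates = []
    for dirpath, dirnames, filenames in os.walk(src):
        # Skip DFS-R's private working folder
        dirnames[:] = [d for d in dirnames if d != "DfsrPrivate"]
        for name in filenames:
            candidates.append(os.path.relpath(os.path.join(dirpath, name), src))

    mismatches = []
    for rel in rng.sample(candidates, min(n, len(candidates))):
        with open(os.path.join(src, rel), "rb") as f:
            src_hash = hashlib.sha256(f.read()).hexdigest()
        target = os.path.join(dst, rel)
        if not os.path.exists(target):
            mismatches.append(rel)  # missing on the destination replica
            continue
        with open(target, "rb") as f:
            if hashlib.sha256(f.read()).hexdigest() != src_hash:
                mismatches.append(rel)  # present but content differs
    return mismatches
```

An empty result from a reasonably sized sample gives statistical, not absolute, confidence, which was the trade-off this answer is describing.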

Brent Pabst
  • I know files are not replicating because users are complaining that files are missing. I want to know which files haven't replicated so I can look into manually copying them. – Simon Foster Jul 05 '12 at 14:36
  • If they aren't replicating, manually copying them is not going to fix the underlying problem, just quiet the current user complaints. DFS-R is pushy, so check the event log for any errors or warnings. Get actual examples from your users and troubleshoot the problem, especially since you have a backlog. – Brent Pabst Jul 05 '12 at 14:40