4

My company is in dire need of a redesign around how we handle user account administration, and I've been tasked with automating the process. The end goal is to have the whole works triggered by the business, with IT only stepping in when an error is reported. The interim phase is going to be semi-manual: a level 2 tech inputs the user's info and supervises the process. The current hurdle I'm facing is user profile archiving.

Our security team requires us to archive the profile directories of any terminated user for 60 days in case the legal team needs access to their files. Our AD is as much a mess as everything else, so some users have home directories and some have profiles. Anyone who has a profile dir in AD also has a good deal of their profile redirected to our file servers over DFS. To complete the process manually, you find the user in AD, disable them, find their home/profile dir, go there and take ownership, create an archive folder, move all their files over, then delete the old dir. Some users have many, many gigs of nonsense, so this can take quite some time. Even automated, the process would not be a quick one.
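Roughly, the AD half of that checklist is only a couple of cmdlets once the RSAT ActiveDirectory module is available; a sketch along these lines (the username is just a placeholder):

```powershell
# Sketch of the AD side of the manual process (assumes the RSAT
# ActiveDirectory module; the username is a placeholder).
Import-Module ActiveDirectory

$user = Get-ADUser "jdoe" -Properties HomeDirectory, ProfilePath

# Disable the account first so nothing changes underneath us.
Disable-ADAccount -Identity $user

# A user may have a home directory, a profile path, or both.
$pathsToArchive = @($user.HomeDirectory, $user.ProfilePath) |
    Where-Object { $_ }   # drop empty attributes

$pathsToArchive
```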

I'm thinking I need a client-side C# GUI for the quick stuff and some server-side batch script or console app to offload the long-running archive work. I have a batch script that works decently using takeown and robocopy, but I wonder if a C# console app would do a better job.
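The batch version really just boils down to takeown followed by robocopy; driven from PowerShell (which is where I'm leaning), the long-running step would look something like this, with the paths as placeholders and the flags only a starting point:

```powershell
# Sketch of the long-running archive step, calling takeown.exe and
# robocopy.exe from PowerShell. Paths are placeholders.
$archiveRoot = "\\fileserver\TerminatedUserArchive"   # placeholder share
$sourceDir   = "\\fileserver\home\jdoe"               # placeholder
$destDir     = Join-Path $archiveRoot (Split-Path $sourceDir -Leaf)

# Take ownership recursively so the move doesn't die on ACL'd folders.
takeown /F $sourceDir /R /D Y | Out-Null

# /E copies subdirectories, /MOVE deletes source files after a good copy.
robocopy $sourceDir $destDir /E /MOVE /R:1 /W:1 /NP /LOG+:"$archiveRoot\archive.log"

# Robocopy exit codes 8 and above indicate failures.
if ($LASTEXITCODE -ge 8) { Write-Error "Robocopy failed for $sourceDir" }
```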

So, my question at long last is: what do you think is the best way to handle this? I can't imagine this is a unique problem; how do other admins get this done? The last place I worked was easily 10x larger than the place I'm in now. If we had been doing this manual crap there, they'd have needed a team of at least 30 full-time workers to keep up.

I have decent skills in C#/.NET and batch scripting, but I'm a quick study and have used most every language once or twice.

Thank you for reading this and I look forward to seeing what imaginative solutions you all can come up with.

tpederson
  • It seems likely that the slowness you see is completely related to the takeown/move operations. I doubt building something in C# would give any significant improvement. – Zoredache Jan 30 '11 at 06:30
  • Agreed, I'd suggest looking at some powershell scripting to do what you need here. – Sam Cogan Jan 30 '11 at 22:59
  • I was planning on using powershell 2.0 as it has the remote execution functionality. I no longer work there, but I never understood why they wouldn't just archive the TSM backups for the DFS home shares. The AD user administration is easily handled by PS or the ADquery/mod/etc CLI programs. In any case, that company seemed all too happy to pay us a lot of money to do those sort of redundant manual processes, and were entirely opposed to automation... Ohh well, on to bigger and better things; thanks all for the suggestions hope it helps someone else. -Tyler – tpederson Feb 08 '11 at 11:48

3 Answers

1

Something that may work better is to run a script that automatically copies data to a backup server, using robocopy or a similar program, at a specific time during the night. That way, users work directly on their own machines, but the data is archived daily while they are not at their desks (say, 8pm to 5am).
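As a rough sketch (share paths, log location, and the start time are placeholders), the nightly job can be as simple as a robocopy run registered with schtasks:

```powershell
# Nightly copy of the home shares to a backup server, registered as a
# scheduled task. All paths and the task name are placeholders.
$source = "\\fileserver\home"
$dest   = "\\backupserver\home-archive"

# /E copies subdirectories; /Z and /FFT help when copying between servers.
# Quote the paths inside $cmd if yours contain spaces.
$cmd = "robocopy $source $dest /E /Z /FFT /R:1 /W:1 /NP /LOG:C:\Logs\nightly-home.log"

schtasks /Create /TN "Nightly home share archive" /TR "cmd /c $cmd" /SC DAILY /ST 20:00 /RU SYSTEM /F
```

In practice you'd probably drop the robocopy line into a small .cmd file and point the task at that, which avoids quoting headaches. Also note that /MIR (instead of /E) would delete files on the backup server that were removed at the source, which may or may not suit the 60-day retention requirement.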

  • This is the best answer here, we were using TSM, so that's just really fancy robocopy right? :) The other answers were good, and IT Guru has the best answer in a perfect world. However, this is reality, you get the check. Congratulations. – tpederson Feb 08 '11 at 11:53
0

I built a separate server which has all user files on the hard disk via DFS shares. I call this system the REPO.

This system has its own unique backup policy which keeps all files from the last 2 years available to be retrieved at a moment's notice (I say moment; usually 24 hours!).

I also have similar abilities with email, using GFI Mailarchiver to hold the same amount of email for all users as well. This even allows me to create read-only profiles for "study" by management.

For me, having these servers with massive disk storage on a separate part of the network has provided a tool that gives management the overview they require when it comes to following up any legal issues that arise. There may be a cost involved, but I prefer set-it-and-forget-it solutions over ones I have to keep an eye on.

Mister IT Guru
0

For general purposes, I would create an environment that uses central authentication for access to the users' shares, and mount both a home directory and a Windows profile, each segmented onto its own share.

This allows more specific configurations for different kinds of users, and archiving all of the data can be performed easily using your DFS/RFS Windows solutions.
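In AD terms that mostly means standardising the homeDirectory and profilePath attributes so every user lands on the segmented shares. A rough sketch, with the OU and DFS paths as placeholders:

```powershell
# Point each user's home directory and roaming profile at dedicated shares.
# The OU and DFS paths are placeholders for whatever your layout is.
Import-Module ActiveDirectory

Get-ADUser -Filter * -SearchBase "OU=Staff,DC=example,DC=com" |
    ForEach-Object {
        Set-ADUser $_ `
            -HomeDirectory "\\dfs\home\$($_.SamAccountName)" `
            -HomeDrive "H:" `
            -ProfilePath "\\dfs\profiles\$($_.SamAccountName)"
    }
```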

Seems kind of like an opinion to me though...

Jason B Shrout