
This is the scenario.

I work in a lab where we have a few machines collecting data, and they are not part of the office IT-approved network (meaning they are off-the-shelf boxes with no admin restrictions and they do not comply with company policies). It is now getting to the point where the amount of data produced is significant, and transferring everything to the network via USB drives is no longer a good option. We would like to solve this in a couple of steps.

Step 1: Automatically copy data files maybe once a day or once a week to a drive accessible within our intranet.

Step 2: Change the way we collect data from individual files to a relational DB so we can query reports from it. [Long term: the topology used to solve Step 1 should also work to solve Step 2.]

The IT support we have is very limited, which is why I need to present them with an option that is almost fully "cooked" to solve our problem.

I was thinking that I would connect all the lab computers to a router, and then connect the router to an IT-approved server via a separate network card. That way we could keep the "lab network" separate from the office network.

I appreciate any advice you could provide me with.

-Cristian

Huasillo

1 Answer


Based on what you've said:

  • Set up a scheduled task that copies the contents of the folder to the network share every 30 minutes. You may need a small VBScript to map the network drives; that way you can give each client different credentials, or you can simply map the drives once and leave them mapped. The actual copying can be done with robocopy (a free Microsoft command-line tool), which fits nicely into a batch script run by the Task Scheduler (a sketch follows this list).

  • Run a custom-written piece of software, on whatever machine you choose, to copy the CSV files from the public server share into your database. If you want it running in the background it will need to be written as a Windows service. This software isn't hard at all; in C# it could be done in a few hours. There's no off-the-shelf solution because your dataset will be unique, but .NET provides all the Lego pieces to put a solution together (a rough importer sketch also follows this list).

  • For remote updates: install OpenVPN on each laptop and configure it to connect to your office location, so that all traffic follows the normal route except traffic to the server, which goes through the VPN. I checked, and it is easy to configure, although you'll need the VPN ports opened on your network, which can be a serious hassle at a large company. The user can run OpenVPN when they are out of the office. You will also need a server for the clients to connect to, but I think Windows Server comes bundled with this, so you can use one of those (a split-tunnel client config sketch also follows this list).
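
For the batch script in the first bullet, here is a minimal sketch. The share \\labserver\labdata, the account, the drive letter and the local folder are all placeholders to replace with your own; the robocopy switches shown (/E for subfolders, /XO to skip files the share already has, /R and /W for limited retries) are standard, but test them against your own folders first.

    @echo off
    rem copyjob.bat - sketch only; replace the share, account and local path with your own
    rem Optional: map a drive letter first (a VBScript or stored credentials also work)
    rem net use S: \\labserver\labdata /user:LABDOMAIN\labcopy

    rem Copy new and changed files only, and append to a log
    robocopy "C:\LabData" "\\labserver\labdata\Instrument01" /E /XO /R:2 /W:5 /LOG+:"C:\LabData\copy.log"

The script can then be registered once with the Task Scheduler, either through the GUI or with something like schtasks /Create /TN LabDataCopy /TR "C:\LabData\copyjob.bat" /SC MINUTE /MO 30.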
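
For the importer in the second bullet, a rough C# console sketch, assuming a hypothetical Readings table (timestamp, sensor, value) and comma-separated files laid out the same way; the connection string, folder paths, table and column names are all assumptions to adapt. A console program run by the Task Scheduler is the simplest starting point and can be turned into a Windows service later.

    using System;
    using System.Data.SqlClient;
    using System.IO;

    class CsvImporter
    {
        // Sketch only: connection string, paths, table and column names are placeholders.
        const string ConnectionString = @"Server=LABSQL01;Database=LabData;Integrated Security=true";
        const string IncomingFolder = @"\\labserver\labdata\Instrument01";
        const string ProcessedFolder = @"\\labserver\labdata\Instrument01\processed";

        static void Main()
        {
            using (var connection = new SqlConnection(ConnectionString))
            {
                connection.Open();
                foreach (var file in Directory.GetFiles(IncomingFolder, "*.csv"))
                {
                    foreach (var line in File.ReadLines(file))
                    {
                        var parts = line.Split(',');    // assumed layout: timestamp,sensor,value
                        if (parts.Length < 3) continue; // skip headers and blank lines

                        using (var cmd = new SqlCommand(
                            "INSERT INTO Readings (ReadAt, Sensor, Value) VALUES (@t, @s, @v)",
                            connection))
                        {
                            cmd.Parameters.AddWithValue("@t", DateTime.Parse(parts[0]));
                            cmd.Parameters.AddWithValue("@s", parts[1]);
                            cmd.Parameters.AddWithValue("@v", double.Parse(parts[2]));
                            cmd.ExecuteNonQuery();
                        }
                    }
                    // Move the file aside so it is not imported twice on the next run
                    File.Move(file, Path.Combine(ProcessedFolder, Path.GetFileName(file)));
                }
            }
        }
    }

If the volume grows, the row-by-row inserts can be swapped for SqlBulkCopy without changing the overall shape.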
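
For the OpenVPN part, a sketch of the client-side split-tunnel configuration, assuming a hypothetical office endpoint vpn.example.com and a data server at 10.0.0.5; hostnames, addresses and certificate file names are placeholders. route-nopull ignores the routes pushed by the server, so only the explicitly routed server address goes through the tunnel and everything else follows the normal route.

    client
    dev tun
    proto udp
    # Placeholder office VPN endpoint and client certificates
    remote vpn.example.com 1194
    ca ca.crt
    cert laptop01.crt
    key laptop01.key

    # Ignore routes pushed by the server, then send only the data server through the tunnel
    route-nopull
    route 10.0.0.5 255.255.255.255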

As far as I can see this is all pretty straightforward for someone with the right skills. Hell, I'm pretty sure this could be done in less than a day by someone who knows what they are doing. You can even do it bit by bit, checking it works as you go: start with the scheduled tasks, move on to the SQL program, then set up the VPN.

Christopher Vickers
  • Thank you Christopher. I should clarify that this data does not need to be accessed outside the building; if that is ever needed, it would be after "Step 2". [We would use a VPN once the data is on the network.] – Huasillo Aug 13 '18 at 12:49
  • Is this on Windows, Mac or Linux? If Windows: go with the scheduled tasks. Create a batch script that runs every 30 minutes. Get it to map a drive if one isn't already mapped, then run a robocopy command kept in a folder on the laptop. I haven't done one of these in years, but if you spend the time researching it you could have it done in 30 minutes. It's possible that you might need the Windows Task Scheduler to run a Visual Basic script to map the drives. I'm a C# guy myself, but a Visual Basic guy could get it done in 30 minutes. – Christopher Vickers Aug 13 '18 at 13:00
  • Can I ask why you need to copy the stuff into a database? – Christopher Vickers Aug 13 '18 at 13:03
  • Also, if you are going down the route of external use, you could use OpenVPN to have an always-on VPN connection that only allows traffic to your server to go through the VPN. The above solution would then work externally too. – Christopher Vickers Aug 13 '18 at 13:07
  • This system would be Windows based. I thought about batch scripting as a way to get the data onto the "middle man" computer/server, and then another script to copy the files from the "middle" to a network drive. For Step 1: the data is just csv/xls files with sensor data values in each column. For Step 2: I am thinking I might have to install SQL or something like that on the middle computer and relay it to another SQL DB on the main server. – Huasillo Aug 13 '18 at 13:33