
I have multiple web servers and one central file server inside my data center, and all my web servers store the user-uploaded files on the central internal file server.

I would like to know the best way to pass the files from the web servers to the file server in this case.

As suggested, I'll try to add more details to the question.

The solution I came up with: after receiving files from the user at the web server, just do an HTTP POST to the file server. But I think something is wrong with this, because it causes large files to be loaded entirely into memory twice (once at the web server and once at the file server).

SHM
  • I guess, because this is way too broad. – Julian Jun 17 '17 at 22:01
  • https://stackoverflow.com/help/how-to-ask – Julian Jun 17 '17 at 22:16
  • @Julian OK, I fully disagree, but thanks anyway. – SHM Jun 17 '17 at 22:19
  • If you review the timeline for the question, you can get an idea of why the question is being received so poorly: https://stackoverflow.com/posts/44574165/timeline – Nkosi Jun 17 '17 at 22:25
  • First, it's unclear what kind of answer you're expecting - just a mention of a technique? Your constraints are unclear, what you have tried is unclear, and last but not least, what is "the best way"? The shortest time? The least amount of server memory? Works on IE6? The fewest lines of code? Please think of others who would like to answer your question. – Julian Jun 17 '17 at 22:25
  • @Julian Aw, c'mon. By an efficient solution in software, one means the least possible memory/processor consumption, and also higher performance. – SHM Jun 17 '17 at 22:40
  • If you do an HTTP POST, no file should be entirely loaded into memory anywhere. The client does not send the whole file in one chunk - it sends the HTTP request body in a streamed manner. – Evk Jun 18 '17 at 10:23
  • You could clarify the storing part a bit, IMHO. Do you store the files in order to retrieve and serve them later? Are they stored in a folder structure, and if so, how? I would personally write a handler or some code to handle the file upload on the file server. By the look of it, you can run into some nasty bugs if two of your front-end servers try to upload a file with the same name at the same time, etc. – Etienne Jun 23 '17 at 04:24
  • @SHM I suggest you pay attention to the comments. The question is way too broad; there are no details. What do you even *mean* by "file server"? If it *is* a file server, why don't you just *copy* the files from the web servers to it, using file commands? What does an HTTP POST have to do with a *file* server? If that machine listens to HTTP, it's a web server with limited functionality, not a file server. – Panagiotis Kanavos Jun 23 '17 at 15:40
  • @SHM: do you have Windows Server in your datacenter? I implemented such a solution a while ago; if your question is still open I can provide you some details. – deblocker Jun 24 '17 at 10:04
  • @deblocker, yes - in fact, all my machines in the data center are running Windows Server. – SHM Jun 24 '17 at 14:20
  • Open your server, copy the files to a drive and paste them elsewhere :-) – Harry Jun 24 '17 at 18:40

6 Answers


Is your file server just another Windows/Linux server, or is it a NAS device? I can suggest a number of approaches based on your requirements. The question is why you would want to use the HTTP protocol when you have much better ways to transfer files between servers.

The HTTP protocol is best when you send text data, as HTTP itself is text-based. From the client side to the server side, HTTP is used because it is the only option your browser gives you. But between your servers, I feel you should use the SMB protocol (I'm assuming you are on Windows, as the question is tagged IIS) to move the data. It will be considerably faster, as it is much more efficient to transfer the same data over SMB than over HTTP.

And with the SMB protocol, you do not have to write any code or complex scripts to do this. As one of the other answers points out, you can just issue a simple copy command and it will happen for you.

So, just to summarize the options for you (in order of my preference):

  1. Let the files get uploaded to some location on each IIS web server, e.g. C:\temp\UploadedFiles. You can write a simple 2-3 line PowerShell script which copies the files from C:\temp\UploadedFiles to \\FileServer\Files\UserID\uploaded.file. The same PowerShell script can delete the files once they have been moved to the other server successfully.

E.g. the script can be this simple, and it is easy to set up as a Windows scheduled task:

$Source = "C:\temp\UploadedFiles"          # wherever your web app saves uploads
$Destination = "\\FileServer\Files\UserID\<FILEGUID>\"
New-Item -ItemType directory -Path $Destination -Force
Copy-Item -Path $Source\*.* -Destination $Destination -Force

This script can be modified to suit your needs, e.g. to delete the files once the copy is done :)

  2. In the ASP.NET application, you can save the file directly to the network location, i.e. pass the network path itself to the SaveAs call. You have to make sure this network share is accessible to the IIS worker process and that the process has write permission. Also, in my understanding, ASP.NET saves the file to a temporary location first (you have no control over this if you are using HttpPostedFileBase or FormCollection). More details here

You can even run this asynchronously so that your requests will not be blocked:

if (FileUpload1.HasFile)
{
    // Call to save the file directly to the network share.
    FileUpload1.SaveAs(@"\\networkshare\filename");
}

https://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.saveas(v=vs.110).aspx

  3. Save the file the current way to a local directory and then do an HTTP POST to the file server. This is the worst design possible: you first read the contents, then transfer them chunked to the other server, where you have to set up another web service which receives the file, reads it from the request stream, and saves it to its final location. I am not sure you need to do this.

Let me know if you need more details on any of the listed methods.

Rohith
  • In that case, the first option is best suited for you. You can use RoboCopy or copy the files using normal copy calls. Any particular reason you would like to use HTTP, e.g. a firewall? – Rohith Jun 20 '17 at 15:18
  • There is some other file metadata (e.g. user data). – SHM Jun 21 '17 at 07:09
  • I suppose the user data will be in the database, is that true? When you store the files in a folder, you can give the folder a unique ID and keep this as a reference for your metadata, or you can make the file name itself contain the ID. Without more information, I can't suggest anything. – Rohith Jun 21 '17 at 09:52

Or you could just write the files to a folder on the web servers and create a scheduled task that moves them to the file server every x minutes (e.g. via robocopy). This also makes sure your web servers are not reliant on your file server.
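The scheduled-task approach can be a pair of one-liners. A sketch using robocopy (paths and the task name are placeholders, not values from the question):

```
:: Move (not copy) uploaded files to the file server. /MOV deletes the
:: local copy after a successful transfer; /R:3 /W:5 caps retries so a
:: down file server does not hang the task.
robocopy C:\temp\UploadedFiles \\FileServer\Files /MOV /R:3 /W:5

:: Register it to run every 5 minutes via Task Scheduler:
schtasks /Create /SC MINUTE /MO 5 /TN "MoveUploads" ^
  /TR "robocopy C:\temp\UploadedFiles \\FileServer\Files /MOV /R:3 /W:5"
```

robocopy also retries partial transfers and copies only what changed, which is exactly what you want for a periodic move job.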

brijber

Assuming that you have an HttpPostedFileBase then the best way is just to call the .SaveAs() method.

You need the UNC path to the file server and that is it. The simplest version would look something like this:

public void SaveFile(HttpPostedFileBase inputFile) {
  var saveDirectory = @"\\fileshare\application\directory";
  var savePath = Path.Combine(saveDirectory, inputFile.FileName);
  inputFile.SaveAs(savePath);
}

However, this is simplistic in the extreme. Take a look at the OWASP Guidance on Unrestricted File Uploads. File uploads can be the source of many vulnerabilities in your application.

You also need to make sure that the web application has access to the file share. Take a look at this answer

Creating a file on network location in asp.net

for more info. Generally the best solution is to run the application pool with a special identity which is only used to access the folder.

ste-fu
  • This is synchronous; isn't that going to be a problem later? – SHM Jun 23 '17 at 10:54
  • Possibly - but I don't think so. I think you still need a thread on your web server to write the data from memory / the local disk buffer to the destination no matter what. It's not like waiting for a response from a database. – ste-fu Jun 23 '17 at 11:06

"The solution I came up with: after receiving files from the user at the web server, just do an HTTP POST to the file server. But I think something is wrong with this, because it causes large files to be loaded entirely into memory twice (once at the web server and once at the file server)."

I would suggest not posting the file all at once - then it sits fully in memory, which is not needed.

You could post the file in chunks using Ajax. When a chunk arrives at your server, just append it to the file.

With the File Reader API, you can read the file in chunks in JavaScript.

Something like this:

/** upload file in chunks */
function upload(file) {

    var chunkSize = 8000;
    var start = 0;

    while (start < file.size) {
        var end = start + chunkSize;
        var chunk = file.slice(start, end);
        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
            // Check whether all chunks have arrived; send the file name
            // either here, or in the first/last request.
        };
        xhr.open("POST", "/FileUpload", true);
        xhr.send(chunk);

        start = end;
    }
}
Julian
  • I'm really looking for a solution for server-to-server file transfer; client-to-server isn't much of a problem. – SHM Jun 18 '17 at 17:22
  • There is no need to store it on both servers. On upload, just write the file on the file server. – Julian Jun 18 '17 at 17:51
  • Yes - but how can I write it to the file server? Doing an HTTP POST to the file server from the web server does not seem very scalable. What do you think about it? – SHM Jun 18 '17 at 18:02
  • Julian's answer is correct. This is really how you balance the load off your web server. In your system, the web "server" is by definition a client to your file server :-) – Geert Jan Jun 19 '17 at 03:51

It can be implemented in different ways. If you are storing files on the file server as plain files in the file system, and all of your servers are inside the same virtual network, then it is best to create a shared folder on your file server; once you receive files at the web server, save them into this shared folder, directly on the file server.

Here are the instructions for creating shared folders: https://technet.microsoft.com/en-us/library/cc770880(v=ws.11).aspx

Victor Leontyev

Just map a drive

I take it you have a means of saving the uploaded file on the web server's local filesystem. The question pertains to moving the file from the web server (which is probably one of many load-balanced nodes) to a central file system that all web servers can access.

The solution to this is remarkably simple.

Let's say you are currently saving the files to some folder, say c:\uploadedfiles, and the path to uploadedfiles is stored in your web.config.

Take the following steps:

  1. Sign on as the service account under which your web site executes

  2. Map a persistent network drive to the desired location, e.g. from command line:

    NET USE f: \\MyFileServer\MyFileShare /user:SomeUserName password
    
  3. Modify your web.config and change c:\uploadedfiles to f:\

Ta-da, all done.

Just make sure the drive mapping is persistent, and make sure you use a user with adequate permissions.

John Wu