
I'm developing a web service using ASP.NET MVC 3. One of the methods of the HTTP API receives an image in the body of a POST request and needs to store it on disk for further processing. The beginning of the method looks like this:

Stopwatch stopwatch = new Stopwatch();
stopwatch.Start();

Interlocked.Increment(ref _generateCount);

byte[] fileBuffer;
using (Stream inputStream = Request.InputStream)
{
    if (inputStream.Length == 0)
    {
        Trace.WriteLine("Submitted file is empty", "Warning");
        inputStream.Close();
        Interlocked.Decrement(ref _generateCount);
        return new HttpStatusCodeResult(400, "Content is empty");
    }

    fileBuffer = new byte[inputStream.Length];
    // Stream.Read may return fewer bytes than requested, so loop until the buffer is full
    int offset = 0;
    while (offset < fileBuffer.Length)
    {
        int bytesRead = inputStream.Read(fileBuffer, offset, fileBuffer.Length - offset);
        if (bytesRead == 0) break;
        offset += bytesRead;
    }
    inputStream.Close();
}
stopwatch.Stop();
... (storing the fileBuffer on disk & quite CPU-intensive processing on the file)

I'm hosting the service in a small Azure instance.

Now for the strange behavior I see when I issue parallel requests to the service. Let's say I issue a first request, then a second one 5 seconds later. As you can see, I'm using a Stopwatch to monitor performance. For the first request, the elapsed time is very small (less than a second), but for the second one it usually reads approximately 14 seconds!

Note that the average processing time for a single request is approximately 25 seconds (rising to 40+ when multiple requests are being processed), so the second request's input stream only gets read 14 seconds after it arrives, even though the first request hasn't finished yet.

How can I explain this delay?

Thanks

ThomasWeiss

1 Answer


If your final goal is to store the file on disk, why are you loading it into memory? You could stream the request body directly to a file:

public ActionResult SomeAction()
{
    var stopwatch = Stopwatch.StartNew();
    if (Request.InputStream.Length == 0)
    {
        return new HttpStatusCodeResult(400, "Content is empty");
    }

    var filename = Server.MapPath("~/app_data/foo.dat");
    using (var output = System.IO.File.Create(filename))
    {
        Request.InputStream.CopyTo(output);
    }
    stopwatch.Stop();

    ...
}

Also note that if you issue 2 concurrent requests that attempt to write to the same file on your server, you might get corrupted data or errors, as you cannot write to the same file at the same time.
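One way to avoid that collision (a minimal sketch, reusing the `app_data` folder and `.dat` extension from the example above as placeholders) is to give each request its own file name, e.g. with `Guid.NewGuid()`:

```csharp
// Sketch: a unique name per request, so concurrent uploads never target the same file
var filename = Server.MapPath("~/app_data/" + Guid.NewGuid().ToString("N") + ".dat");
using (var output = System.IO.File.Create(filename))
{
    Request.InputStream.CopyTo(output);
}
```

You would then pass `filename` on to whatever does the subsequent processing.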

Darin Dimitrov
  • Hi, I was loading the stream in memory for debugging purposes, but flushing the stream directly to disk does indeed seem to solve the issue. How would you explain the difference? Or might it come from my inputStream.Close() call, which would wait for some sync from IIS? – ThomasWeiss Mar 26 '12 at 08:09
  • If I am using log4net logging for debugging purposes and writing the log statements to a particular file, will concurrent HTTP requests overwrite that log file? – teenup Aug 29 '13 at 08:48
  • @teenup, no, the file is not overwritten; messages are appended to it. – Darin Dimitrov Aug 29 '13 at 14:42
  • I do see overwriting behavior; I confirmed this by replacing the logging with writes to the Windows application event log. – teenup Aug 29 '13 at 14:47
  • @teenup, but are you using log4net or the code shown in my answer? Because the code in my answer obviously creates a new file every time. – Darin Dimitrov Aug 29 '13 at 14:50