I use the following code to scale and crop all images in a folder.
string fileNameWithoutExtension = Path.GetFileNameWithoutExtension(file);
string fileExtension = Path.GetExtension(file);
string filePath = Path.GetDirectoryName(file);
string newFileName = string.Empty;
long fileSize = new FileInfo(file).Length;

if (fileSize > fileSizeLimit)
{
    // Work from a temp copy of the original image
    string tempFile = System.IO.Path.GetTempFileName();
    File.Copy(file, tempFile, true);

    Bitmap sourceImage = (Bitmap)System.Drawing.Image.FromFile(tempFile);
    System.Drawing.Image imgPhoto = ScaleCrop(sourceImage, sourceImage.Width / 4, sourceImage.Height / 4, AnchorPosition.Top);
    Bitmap bitImage = new Bitmap(imgPhoto);

    // Delete the original, then save the scaled/cropped copy under a timestamped name
    File.Delete(file);
    newFileName = filePath + "\\" + fileNameWithoutExtension + "_" + DateTime.Now.ToString("yyyyMMddHHmmss") + "_" + CoilWarehouseProcessed + fileExtension;
    bitImage.Save(newFileName, System.Drawing.Imaging.ImageFormat.Jpeg);

    imgPhoto.Dispose();
    bitImage.Dispose();
}
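One possibly relevant GDI+ behaviour (an assumption on my part, not something confirmed in the post): `Image.FromFile` keeps the file open until the image is disposed, so any file loaded that way stays locked. A minimal sketch of loading through a stream copy instead, so no lock outlives the call (`LoadBitmapWithoutLock` is a hypothetical helper name):

```csharp
using System.Drawing;
using System.IO;

static class BitmapLoader
{
    // Hypothetical helper: load a Bitmap without keeping a lock on the
    // source file. Image.FromFile holds the file open until Dispose is
    // called, which can make later deletes fail on a busy web server.
    public static Bitmap LoadBitmapWithoutLock(string path)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        using (var temp = new Bitmap(fs))
        {
            // Copy into a fresh Bitmap so the returned image no longer
            // references the underlying stream or file.
            return new Bitmap(temp);
        }
    }
}
```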
If I run the application locally (in debug mode in VS2010) and point it at a network drive, every image is processed every time.
If I run it from our local web server, the app may process no images, it may process five, it may process one; it never processes all of the images in a given folder, only ever some of them... then it hangs in the client's browser.
There are no events in the event log, and the application does not crash or error in any way. The fact that it will process some images shows it's not a permissions issue.
Any ideas why this is happening?
EDIT: Thanks to wazdev, but I ended up testing a less intrusive solution (I also don't like depending on third-party software), and it all seems good so far. Basically, I changed the code so that where it copies the stream to produce the new image ('System.Drawing.Image imgPhoto = ...'), a using statement ensures that the 'temp' image is disposed. I also moved the deletion of the original (uncropped/unscaled) file so that it is the last operation. In tests it has worked fine; only time will tell once more users come online and concurrency is exercised:
string tempFile = System.IO.Path.GetTempFileName();
File.Copy(file, tempFile, true);

Bitmap sourceImage = (Bitmap)System.Drawing.Image.FromFile(tempFile);
System.Drawing.Image imgPhoto = ScaleCrop(sourceImage, sourceImage.Width / 4, sourceImage.Height / 4, AnchorPosition.Top);

// Copy via a temporary Bitmap inside a using block so the intermediate
// copy is disposed deterministically
Bitmap bitImage;
using (var bmpTemp = new Bitmap(imgPhoto))
{
    bitImage = new Bitmap(bmpTemp);
}

newFileName = filePath + "\\" + fileNameWithoutExtension + "_" + DateTime.Now.ToString("yyyyMMddHHmmss") + "_" + CoilWarehouseProcessed + fileExtension;
bitImage.Save(newFileName, System.Drawing.Imaging.ImageFormat.Jpeg);

imgPhoto.Dispose();
bitImage.Dispose();

// Deleting the original is now the last operation
File.Delete(file);
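For what it's worth, the code above still never disposes sourceImage, and imgPhoto/bitImage leak if Save throws. A sketch of the same pipeline with every GDI+ object in a using block (this is my assumption of equivalent behaviour, not the author's code; ScaleCrop is passed in as a delegate, with the AnchorPosition argument omitted, so the sketch is self-contained):

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

static class ImageShrinker
{
    // Sketch only: the post's pipeline with deterministic disposal.
    // scaleCrop stands in for the post's ScaleCrop helper.
    public static string ShrinkAndReplace(string file, string suffix,
                                          Func<Bitmap, int, int, Image> scaleCrop)
    {
        string dir = Path.GetDirectoryName(file);
        string name = Path.GetFileNameWithoutExtension(file);
        string ext = Path.GetExtension(file);

        string tempFile = Path.GetTempFileName();
        File.Copy(file, tempFile, true);

        string newFileName;
        using (var sourceImage = (Bitmap)Image.FromFile(tempFile))
        using (var imgPhoto = scaleCrop(sourceImage, sourceImage.Width / 4,
                                        sourceImage.Height / 4))
        using (var bitImage = new Bitmap(imgPhoto))
        {
            newFileName = Path.Combine(dir,
                name + "_" + DateTime.Now.ToString("yyyyMMddHHmmss") + "_" + suffix + ext);
            bitImage.Save(newFileName, ImageFormat.Jpeg);
        }

        File.Delete(file);      // delete the original only after the save succeeded
        File.Delete(tempFile);  // temp copy is unlocked once sourceImage is disposed
        return newFileName;
    }
}
```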
EDIT2: It's been live now for a few days, I've tested it every day, and it is working well. Here's all that I did:
Basically, inside the ScaleCrop() call there was a GC.Collect() and a GC.WaitForPendingFinalizers() call. I removed the WaitForPendingFinalizers() call and moved the GC.Collect() to after the File.Delete().
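Since the original ScaleCrop() body isn't shown, here is only an illustration of the call-order change described above (DoWork and the delegate names are placeholders, not the real code). GC.WaitForPendingFinalizers() blocks the calling thread until the finalizer queue drains, which on an ASP.NET request thread is a plausible cause of the observed hangs:

```csharp
using System;

static class SafeProcessor
{
    // Illustration only of the reordering described above.
    // doWork stands in for the scale/crop/save pipeline (no GC calls
    // inside it any more); deleteOriginal stands in for File.Delete(file).
    public static void ProcessSafely(Action doWork, Action deleteOriginal)
    {
        doWork();
        deleteOriginal();
        GC.Collect();   // moved here from inside ScaleCrop;
                        // GC.WaitForPendingFinalizers() removed entirely,
                        // since it can block the request thread
    }
}
```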