
I have a program that accesses a database and downloads images. I was using a BlockingCollection for that purpose. However, to access some UI elements I decided to use a combination of BackgroundWorker and BlockingCollection. This reduced the processing speed considerably compared to when only the BlockingCollection was used. What could be the reason? Or is the slowdown simply because I am now accessing UI elements?

Here is the code I am working on:

private void button_Start_Click(object sender, System.EventArgs e)
{
    BackgroundWorker bgWorker = new BackgroundWorker();
    bgWorker.DoWork += bw_DoWork;
    bgWorker.RunWorkerCompleted += bw_RunWorkerCompleted;
    bgWorker.ProgressChanged += bw_ProgressChanged;

    bgWorker.WorkerSupportsCancellation = true;
    bgWorker.WorkerReportsProgress = true;

    Button btnSender = (Button)sender;
    btnSender.Enabled = false;

    bgWorker.RunWorkerAsync();
}

and the DoWork handler (bw_DoWork) is as follows:

private void bw_DoWork(object sender, DoWorkEventArgs e)
{
    HttpWebRequest request = null;
    using (BlockingCollection<ImageFileName> bc = new BlockingCollection<ImageFileName>(30))
    {
        // Producer: download each image and add it to the collection.
        using (Task task1 = Task.Factory.StartNew(() =>
        {
            foreach (var fileName in fileNames)
            {
                string baseUrl = "http://some url";
                string url = string.Format(baseUrl, fileName);
                request = (HttpWebRequest)WebRequest.Create(url);
                request.Method = "GET";
                request.ContentType = "application/x-www-form-urlencoded";
                var response = (HttpWebResponse)request.GetResponse();
                Stream stream = response.GetResponseStream();
                img = Image.FromStream(stream);
                FileNameImage = new ImageFileName(fileName.ToString(), img);
                bc.Add(FileNameImage);
                Thread.Sleep(100);
                Console.WriteLine("Size of BlockingCollection: {0}", bc.Count);
            }
        }))
        {
            // Consumer: save each image and generate the thumbnails.
            using (Task task2 = Task.Factory.StartNew(() =>
            {
                foreach (ImageFileName imgfilename2 in bc.GetConsumingEnumerable())
                {
                    if (bw.CancellationPending == true)
                    {
                        e.Cancel = true;
                        break;
                    }
                    else
                    {
                        int numIterations = 4;
                        Image img2 = imgfilename2.Image;
                        for (int i = 0; i < numIterations; i++)
                        {
                            img2.Save("C:\\path" + imgfilename2.ImageName);
                            ZoomThumbnail = img2;
                            ZoomSmall = img2;
                            ZoomLarge = img2;
                            ZoomThumbnail = GenerateThumbnail(ZoomThumbnail, 86, false);
                            ZoomThumbnail.Save("C:\\path" + imgfilename2.ImageName + "_Thumb.jpg");
                            ZoomThumbnail.Dispose();
                            ZoomSmall = GenerateThumbnail(ZoomSmall, 400, false);
                            ZoomSmall.Save("C:\\path" + imgfilename2.ImageName + "_Small.jpg");
                            ZoomSmall.Dispose();
                            ZoomLarge = GenerateThumbnail(ZoomLarge, 1200, false);
                            ZoomLarge.Save("C:\\path" + imgfilename2.ImageName + "_Large.jpg");
                            ZoomLarge.Dispose();

                            //  progressBar1.BeginInvoke(ProgressBarChange);
                            int percentComplete = (int)(((i + 1.0) / (double)numIterations) * 100.0);
                            //if (progressBar1.InvokeRequired)
                            //{
                            //    BeginInvoke(new MethodInvoker(delegate{bw.ReportProgress(percentComplete)};))
                            //}
                        }
                        Console.WriteLine("This is Take part and size is: {0}", bc.Count);
                    }
                }
            }))
            {
                Task.WaitAll(task1, task2);
            }
        }
    }
}
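
The commented-out block inside the loop is where I want to report progress to the UI. Roughly this is the intent (a sketch only; bw is the BackgroundWorker running this handler):

// In place of the commented-out BeginInvoke block:
int percentComplete = (int)(((i + 1.0) / (double)numIterations) * 100.0);
bw.ReportProgress(percentComplete);   // ProgressChanged is then raised on the UI thread
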
LearningAsIGo
    Please share the code you're dealing with. It's difficult, if not impossible, to help with understanding code you've never seen. – Sean U Apr 09 '14 at 16:33
  • You came to the mechanic's shop saying "hey, my car is having some problem"... but where is the car? How can we solve your problem? Bring your car, please! (I mean, post your code.) – Sriram Sakthivel Apr 09 '14 at 16:34
  • Code added. Please check. @SeanU But is there any generalized answer to this question? Like, does this combination always kill speed? – LearningAsIGo Apr 09 '14 at 16:39
  • Why `Thread.Sleep`? Remove that and see the difference. Also, I'm not sure why you are disposing the tasks; I think you dispose them even before they finish. [Take a look at this: you don't need to dispose the Task](http://blogs.msdn.com/b/pfxteam/archive/2012/03/25/10287435.aspx) – Sriram Sakthivel Apr 09 '14 at 16:44
  • @SriramSakthivel You mean I should not put them in a using block? – LearningAsIGo Apr 09 '14 at 16:48
  • Yes, you can remove the using blocks; they are redundant. Also remove the `Thread.Sleep(100);` call in the first loop. – Sriram Sakthivel Apr 09 '14 at 16:51
  • `Thread.Sleep()`'s whole purpose is to stop the presses - and tie up resources in the process - for a while. If that's not what actually needs to happen (and it almost never is), then it shouldn't be used. There's almost certainly a better tool for the job. Also, note that Console.WriteLine() is a blocking call, and also needs to run synchronously - if you've got two different threads both trying to write to the console, they're going to be getting in each other's way. – Sean U Apr 09 '14 at 17:08
  • @SeanU But I am not getting any error. Or do you mean that because both threads are trying to access the console at the same time, there is a delay? – LearningAsIGo Apr 09 '14 at 17:28
  • Yes, because only one thread is allowed to access the console at a time. – Sean U Apr 09 '14 at 17:34

1 Answer


A better option might be to retrieve the data and write it to disk sequentially for each file, and instead use Parallel.ForEach() to allow multiple requests to be in flight at the same time. That should reduce the amount of waiting in a couple of spots:

  • No need to wait for one HTTP request to complete before issuing subsequent requests.
  • No need to block on that BlockingCollection.
  • No need to wait for one disk write to complete before firing off the next one.

So perhaps something more like this:

Parallel.ForEach(fileNames,
    (fileName) =>
    {
        string baseUrl = "http://some url";
        string url = string.Format(baseUrl, fileName);
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.ContentType = "application/x-www-form-urlencoded";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (var img = Image.FromStream(stream))
        {
            // Cutting out a lot of steps from the 2nd Task to simplify the example
            img.Save(Path.Combine("C:\\path", fileName.ToString()));
        }
    });

One possible problem you could run into with this approach is that it will start generating too many requests at once. That might cause resource contention issues, or the web server might interpret it as malicious behavior and stop responding to you. You can limit the number of simultaneous requests by setting MaxDegreeOfParallelism. The following example shows how to limit the operation to processing no more than 4 files at the same time.

var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
Parallel.ForEach(fileNames, options, (fileName) => { /* do stuff */ });

Note that in this overload the ParallelOptions argument comes before the loop body.
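
If you still need to drive UI elements (the progress bar from the original code), one option is to keep the BackgroundWorker purely as the host for this loop: run Parallel.ForEach inside DoWork and call ReportProgress from the loop body, since BackgroundWorker raises ProgressChanged on the UI thread for you. A rough sketch under those assumptions, reusing the options above (the completed counter and the bgWorker field are illustrative names, not from the original code):

int total = fileNames.Count();   // assumes System.Linq is available
int completed = 0;

Parallel.ForEach(fileNames, options, (fileName) =>
{
    // ... download and save the image as shown above ...

    // Interlocked keeps the shared counter correct across parallel iterations.
    int done = Interlocked.Increment(ref completed);

    // ReportProgress can be called from any thread; the ProgressChanged
    // handler then runs on the UI thread, where controls may be updated.
    bgWorker.ReportProgress(done * 100 / total);
});

Parallel.ForEach blocks until every item has been processed, so calling it from DoWork keeps the UI thread free while the downloads run.
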
Sean U