
I have two tasks, Job1 and Job2; the code is below. First I ran Job1 and Job2 in sequence and the stopwatch reported 844 milliseconds. Then I commented out the sequential calls and ran the parallel task version instead, and the stopwatch reported 11352 milliseconds. My assumption/expectation was that I should get roughly 422 milliseconds (844/2), or somewhere near it, but the result is the total opposite. I need to wait for both jobs to complete, hence the t1.Wait() and t2.Wait() calls. Please help me process these two jobs in half the sequential time. Please find my code below:

    public ActionResult Index()
    {
        Stopwatch s = new Stopwatch();
        s.Start();

        //Execute 2 jobs in a sequence
        Job1();
        Job2();

        //Execute 2 jobs in parallel
        //var t1 = Task.Run(() => { Job1(); });
        //var t2 = Task.Run(() => { Job2(); });
        //t1.Wait();
        //t2.Wait();

        s.Stop();
        Debug.WriteLine("ElapsedMilliseconds: " + s.ElapsedMilliseconds);
        return View();
    }

    private void Job1()
    {
        for (int i = 0; i < 1000; i++)
        {
            Debug.WriteLine(i);
        }
    }

    private void Job2()
    {
        for (int i = 2000; i < 3000; i++)
        {
            Debug.WriteLine(i);
        }
    }
user1396423
    Starting tasks can be very time consuming. You must factor in this overhead. You may never get your two tasks to be faster if you introduce parallelism. – Enigmativity Aug 03 '16 at 05:48
  • Debug.WriteLine may have locking/syncing code causing a lot of thread stopping and starting. http://stackoverflow.com/questions/2759301/debug-writeline-locks suggests it. Try the process with a Thread.Sleep, or something far simpler, to test the times. – White Dragon Aug 03 '16 at 05:54
  • Put a Thread.Sleep of about 1000 ms in both Job1 and Job2 and you will see that you actually are invoking both methods in parallel. But there is always an overhead in introducing parallelism, and because of that, together with the type of operations you are doing, you might not get the expected result. – Daniel Aug 03 '16 at 05:55
  • This isn't your whole code, and the code around it matters a LOT, because tasks behave totally differently in a GUI scheduler context than they do in the default scheduler context for console apps or services. – Ben Voigt Aug 03 '16 at 05:59
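[Editor's note] As two of the comments above suggest, a quick way to verify the jobs really do overlap is to swap the Debug.WriteLine loops for a fixed Thread.Sleep. A minimal stand-alone sketch (the class and method names here are illustrative, not from the question):

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class SleepCheck
{
    // Stand-in for Job1/Job2: a fixed 1-second delay instead of Debug.WriteLine loops.
    static void SleepJob() => Thread.Sleep(1000);

    // Runs both sleep jobs in parallel and returns the elapsed milliseconds.
    public static long MeasureParallelSleeps()
    {
        var s = Stopwatch.StartNew();
        var t1 = Task.Run(SleepJob);
        var t2 = Task.Run(SleepJob);
        Task.WaitAll(t1, t2);
        s.Stop();
        return s.ElapsedMilliseconds;
    }

    static void Main()
    {
        // Roughly 1000 ms rather than 2000 ms: the two sleeps overlapped.
        Console.WriteLine($"Elapsed: {MeasureParallelSleeps()} ms");
    }
}
```

If this prints close to 1000 ms, the tasks are running in parallel and the original slowdown is coming from Debug.WriteLine, not from Task.Run itself.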

2 Answers


Essentially, your second example is slower because of the overhead of Debug.WriteLine under concurrency.

You'd notice a big difference if you removed Debug.WriteLine (which calls OutputDebugString under the hood).

MSDN states:

Applications should send very minimal debug output and provide a way for the user to enable or disable its use. To provide more detailed tracing, see Event Tracing.

As Eran states here:

"if several threads call OutputDebugString concurrently, they will be synchronized".

There's no concurrency in your first example, which could be one reason it's faster.

Also, you might want to consider consolidating to a single wait.

Change your code from this:

    //Execute 2 jobs in parallel
    var t1 = Task.Run(() => { Job1(); });
    var t2 = Task.Run(() => { Job2(); });
    t1.Wait();
    t2.Wait();

...to this:

    //Execute 2 jobs in parallel
    var tasks = new List<Task>();
    tasks.Add(Task.Run(() => { Job1(); }));
    tasks.Add(Task.Run(() => { Job2(); }));
    Task.WaitAll(tasks.ToArray());
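[Editor's note] To see the parallel version actually win, the jobs need CPU-bound work with no shared lock. A hedged stand-alone sketch (class name, method name, and iteration count are all made up for illustration) that times both variants with a single Task.WaitAll:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading.Tasks;

class CpuBoundDemo
{
    // Pure CPU work: no shared lock, so two instances can truly run in parallel.
    public static long Job(int start, int end)
    {
        long sum = 0;
        for (int i = start; i < end; i++)
            sum += i;
        return sum;
    }

    static void Main()
    {
        const int n = 50_000_000;  // arbitrary size, large enough to measure

        var s = Stopwatch.StartNew();
        Job(0, n);
        Job(n, 2 * n);
        s.Stop();
        Console.WriteLine($"Sequential: {s.ElapsedMilliseconds} ms");

        s.Restart();
        var tasks = new List<Task>
        {
            Task.Run(() => Job(0, n)),
            Task.Run(() => Job(n, 2 * n))
        };
        Task.WaitAll(tasks.ToArray());  // single wait for both jobs
        s.Stop();
        Console.WriteLine($"Parallel:   {s.ElapsedMilliseconds} ms");
    }
}
```

On a machine with at least two free cores, the parallel time should approach half the sequential time, which is the result the question was expecting.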
Community

To understand why Tasks here have less to do with performance and more with responsiveness, you should read up on:

Context switching: https://en.wikipedia.org/wiki/Context_switch

The processor spends many cycles switching to another thread.

Mutexes: https://en.wikipedia.org/wiki/Mutual_exclusion

Every time you call Debug.WriteLine, the Task waits until the Debug.WriteLine call in the other Task is done, and a context switch is performed.
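[Editor's note] This effect can be simulated without Debug.WriteLine at all: a hypothetical sketch where both tasks contend for one shared lock on every iteration, standing in for the synchronization inside OutputDebugString (all names here are invented for the demo):

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class LockContentionDemo
{
    // One lock shared by both tasks, mimicking the serialization
    // inside OutputDebugString.
    static readonly object Gate = new object();

    public static long LockedJob(int iterations)
    {
        long counter = 0;
        for (int i = 0; i < iterations; i++)
        {
            lock (Gate)
            {
                counter++;  // trivial work done while holding the lock
            }
        }
        return counter;
    }

    static void Main()
    {
        var s = Stopwatch.StartNew();
        Task.WaitAll(
            Task.Run(() => LockedJob(1_000_000)),
            Task.Run(() => LockedJob(1_000_000)));
        s.Stop();
        // Two "parallel" jobs that constantly contend for one lock gain
        // little or nothing over running them back to back.
        Console.WriteLine($"Two contending jobs: {s.ElapsedMilliseconds} ms");
    }
}
```

Because each iteration must acquire the lock the other task is fighting for, the tasks spend their time blocking and context switching rather than computing, which is the same shape of slowdown the question observed.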

Serve Laurijssen