
It seems there are no free* .NET performance profilers that can profile on a line-by-line basis. Therefore, I am looking into using Stopwatch for profiling.

*free as in freedom, i.e. license includes commercial applications.

EDIT: In response to those who told me "buy a profiler", I would like to, but if I could spend that much money I would spend it on something else. I tried to convince my boss that a profiler is worth it, but haven't had much luck. This question is mostly based on curiosity. I would never consider Stopwatch as a replacement for a real profiler.

I have a little test app (written in C#) that measures performance differences when using a Stopwatch on a per-line basis. The test code is this:

int n = 100;
BigInteger f = 1;
for (int i = n; i > 1; i--)
{
    f *= i;
}

Here is the full code: http://pastebin.com/AvbQmT32

I have a Stopwatch for each line of code. This is my 'profiler'. I also have one Stopwatch for the whole program. This is my 'profiler profiler'.
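Roughly, the instrumentation looks like this (a simplified sketch, not the exact pastebin code; the array layout and names here are my own):

```csharp
using System;
using System.Diagnostics;
using System.Numerics;

class PerLineProfiler
{
    static void Main()
    {
        // One Stopwatch per profiled line (the 'profiler'), plus one
        // for the whole run (the 'profiler profiler').
        var swLine = new Stopwatch[4];
        for (int j = 0; j < swLine.Length; j++) swLine[j] = new Stopwatch();
        var swTotal = Stopwatch.StartNew();

        swLine[0].Start(); int n = 100;      swLine[0].Stop();
        swLine[1].Start(); BigInteger f = 1; swLine[1].Stop();

        swLine[2].Start();
        for (int i = n; i > 1; i--)
        {
            swLine[2].Stop();                // pause the loop-header watch...
            swLine[3].Start(); f *= i;       swLine[3].Stop();
            swLine[2].Start();               // ...and resume it for the next check
        }
        swLine[2].Stop();

        swTotal.Stop();
        long overhead = swTotal.ElapsedTicks
                        - (swLine[0].ElapsedTicks + swLine[1].ElapsedTicks
                           + swLine[2].ElapsedTicks + swLine[3].ElapsedTicks);
        Console.WriteLine("Total time: {0}", swTotal.ElapsedTicks);
        Console.WriteLine("Stopwatch overhead: {0}", overhead);
    }
}
```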

I have the program configured as Release mode, Any CPU (on an x64 machine), and optimizations disabled.

When I run the program with the profiler disabled, I get something like this:

             Line             |  Ticks
------------------------------|----------
                              |
Total time:                   |       359

When I run it with the profiler enabled, I get something like this:

             Line             |  Ticks
------------------------------|----------
                              |
int n = 100;                  |         3
BigInteger f = 1;             |        12
for (int i = n; i > 1; i--)   |       325
{                             |
    f *= i;                   |       539
}                             |
                              |
Total time:                   |      1710
Stopwatch overhead:           |       831

Ideally, the time spent on the code itself should be equal in both cases, but the Stopwatches clearly add overhead, and some of that overhead shows up inside their own elapsed times.

Now, it rarely makes sense to profile every line of a program; a divide-and-conquer approach usually works better. You can start by profiling chunks of code and then narrow in on any performance problems.

Also, in most applications, the average line of code will be a lot slower than the ones in the test program, which means the Stopwatch overhead will be proportionally smaller.

However, there is still overhead when using Stopwatches, especially if you use a lot of them.

So down to the question:

What is the most effective way to use Stopwatches for profiling? How can I minimize overhead? Is it even worth it to wrap a Stopwatch around a single statement?

I appreciate your feedback.

Kendall Frey
    Personally, the profilers I purchased (ANTS Profiler) were worth every cent. I would never ever think about "manual" profiling again. – Uwe Keim Mar 22 '12 at 15:18
    Sorry for asking, but why not check into using some of the System.Diagnostics classes? I believe it contains Classes to check out CPU and Ram Usage for example. – Eon Mar 22 '12 at 15:20
    The free version of Eqatec profiler allows its use in a commercial project: http://www.eqatec.com/Profiler/LicenseTerms.aspx – ken2k Mar 22 '12 at 15:22
  • @ken2k: Does it profile on a line-by-line basis? – Kendall Frey Mar 22 '12 at 15:24

7 Answers


Considering that we often intuitively program according to the single responsibility principle, not only for types but for functions too, I would say there is no sense in profiling an application line by line. You will be more interested in profiling a "single responsibility" than every single line.

There are, naturally, cases when you need that kind of information too. But then don't apply it across the whole application; use it in a single function, or in a single part of a function. In that case Stopwatch is the best choice. Keep in mind that Stopwatch is a .NET class, so even it has some minimal overhead. You shouldn't look at absolute values, but at relative ones.
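For example, a sketch of comparing two implementations of one "responsibility" relative to each other (the two Sum methods are placeholders of my own, not from the question):

```csharp
using System;
using System.Diagnostics;

class Compare
{
    // Two candidate implementations of the same responsibility.
    static long SumLoop(int n) { long s = 0; for (int i = 1; i <= n; i++) s += i; return s; }
    static long SumFormula(int n) { return (long)n * (n + 1) / 2; }

    static void Main()
    {
        const int iterations = 100000;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++) SumLoop(1000);
        sw.Stop();
        long loopTicks = sw.ElapsedTicks;

        sw.Restart();
        for (int i = 0; i < iterations; i++) SumFormula(1000);
        sw.Stop();

        // The ratio is what matters, not the absolute tick counts.
        Console.WriteLine("loop/formula ratio: {0:F1}",
                          (double)loopTicks / sw.ElapsedTicks);
    }
}
```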

Hope this helps.

menjaraz
Tigran

You shouldn't be looking at exact numbers; you should be looking for relative differences, so that you can identify problem areas. You are looking for the places where optimization will actually solve your performance problem.

If you are developing code at the level to where performance is actually an issue and you need to profile it to find the offending code, then purchasing a tool to perform this task for you will more than pay for itself in the time saved. The profiler that comes with Visual Studio Premium is what I would recommend.

Charles Lambert

Have you tried looking at CInject: http://codeinject.codeplex.com/documentation? It basically injects code into a DLL. Out of the box it seems to have performance monitoring, but it also lets you create your own code to inject. Seems perfect for what you're trying to do.

aquinas

First things first: your results are not at all surprising. If you used a commercial profiler, you would see something similar: your program takes a lot longer to run when it is being profiled than when it is not. The more granular you make the profiling, the longer you can expect it to take. When you consider that statements like "i > 1" and "i--" will likely be executed as single processor instructions, it becomes obvious why profiling the execution time of a particular line can take a lot longer than executing the line itself.

The fact that profiling increases the overall running time of your program should not be a concern; as several others have mentioned, what matters is not the absolute amount of time the program runs, but comparing the running times of the individual parts against each other to find the bottleneck. But there is another concern. Stopwatch will use the high-frequency timer from the underlying OS if one is available, but even that may not be high enough. The high-frequency timer on my Windows 7 64-bit i5-2400 (quad core, 3.10 GHz) ticks 3,020,556 times per second. That sounds like a lot, but at that rate the processor could execute a thousand instructions between ticks. That means if you are trying to measure the time it takes to execute a single instruction, you are going to either way undershoot or way overshoot.
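You can check the resolution on your own machine directly; a quick sketch using Stopwatch's static members:

```csharp
using System;
using System.Diagnostics;

class TimerResolution
{
    static void Main()
    {
        // Stopwatch exposes its tick rate directly, so you can see how
        // coarse a single tick is on your hardware.
        Console.WriteLine("High resolution: {0}", Stopwatch.IsHighResolution);
        Console.WriteLine("Ticks per second: {0}", Stopwatch.Frequency);
        Console.WriteLine("Nanoseconds per tick: {0:F1}",
                          1000000000.0 / Stopwatch.Frequency);
    }
}
```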

You would be better off profiling at the method level. Even then you will run into frequency issues, particularly if you have small well-factored methods. But the results will be more reliable than at the line level; and once you have identified the method that is causing the bottleneck, you can examine it directly to determine how to optimize it to perform better.

All of this leaves aside the many caveats that go along with performance profiling in general, which are outside the scope of this post. Make sure you do additional research on the subject to understand how you should interpret whatever results you get. As a quick example, your profiling may reveal that the majority of the time in your program is being spent in a particular method; but does that mean that the method itself is the problem, or that other methods are calling it too often? Answering that kind of question is where the real difficulty in profiling lies.

David Nelson

Have you compared the total time without your profiler to the time with your profiler in debug mode? My first guess is that the extra time appears because the extra statements for the stopwatch are preventing some register or looping optimization. So the extra time may be coming from a missing optimization and not from some loss of time internal to the stopwatch.

However, that being said, even though Stopwatch has a high resolution, I would think you would notice some timing error measuring things as small as one line of code. For example, 'int n = 100' is a single instruction in assembly, so in technical terms I would think it would take only one tick. How much variation do you notice in your numbers from run to run? If you are getting a variation that is a significant percentage of your average value, then that value really isn't giving you much information.

Sogger
  • Optimizations are disabled, so that won't be an issue. I sometimes see variation of about 1/3 the average value, in both cases. – Kendall Frey Mar 22 '12 at 15:43
  • With optimizations disabled, switching to debug mode probably doesn't reveal the issue, but it is still possible that the slowdown is caused by the extra overhead of the stopwatch calls. The stopwatch calls use several variables internally which may be forcing your main code to do things like reload registers after the stopwatch calls for example. – Sogger Mar 22 '12 at 16:11
  • Oh, and in terms of the 1/3 variation, it depends on what you are trying to determine the number for, but a variation that big means the number is only going to be good for order of magnitude comparisons (in terms of what other answers here are saying, only good for relative comparisons). Think about a 6ft tall person being measured with a 1/3 margin of error, 'Oh, you are somewhere between 4 and 8ft tall.' Kind of a big deal. – Sogger Mar 22 '12 at 16:16
  • However, profiling is often about 'order of magnitude comparisons'. Taking 1 second longer doesn't mean anything. Taking twice as long does. – Kendall Frey Mar 22 '12 at 16:24
    I totally agree. In terms of profiling this is almost always the case, except for some real-time applications that need to meet specific timing requirements. My comments about the variation are just trying to address your question of 'Is it even worth it to wrap a Stopwatch around a single statement?' The answer is if your margin of error is really small, then yes. If it isn't, it depends what you want to do with the value. Sometimes, the value won't even be accurate enough to do relative comparisons. – Sogger Mar 22 '12 at 16:55

Perhaps you should think in terms of profiling method call times, not single lines. After all, a method call is usually a single line in the calling code. Another interesting approach is to use a logging library to log method entries and exits, e.g., like this (using log4net as the logger of choice):

public string MyMethod(string arg)
{
    string result = null;

    if (logger.IsDebugEnabled) logger.Debug("->MyMethod(" + arg + ")");

    // do stuff
    result = "Hello world!";

    if (logger.IsDebugEnabled) logger.Debug("<-MyMethod(" + arg + ") = " + result);

    return result;
}

This will create a log file which is really useful for seeing what your application is doing, and it has the bonus of timestamping each line written to the log, so you can see if a particular method is taking a long time. With log4net you can easily change the configuration to switch the logging off, so when you don't need it you do not pay any performance penalty.

Polyfun

If you have Visual Studio Ultimate, try using Visual Studio Profiler. It's pretty good.

Peter Mortensen
Yaman