
I am running the graphics performance tests in the CETK (DirectDraw performance, GDI performance). Can someone explain how to interpret the test results? In detail, what do Rgn Count, SampleCount, Min, Max, Std Deviation, etc. mean? Is there any documentation available on these?

Thanks in advance. Regards, Timm

Here are some example results:

Rgn Count=0
SampleCount=10
Min=0.000000
Max=1000.00000
Mean=700.000000
Std Deviation=577.350269
CV%=82.478610

1 Answer


I hope you are familiar with DirectShow filters and pins.

Sample Count:

When a pin delivers media data to another pin, it does not pass a direct pointer to the memory buffer. Instead, it delivers a pointer to a COM object that manages the memory; this object is called a media sample. An allocator creates a finite pool of samples and uses reference counting to keep track of them. The GetBuffer method returns a sample with a reference count of 1. While a filter is using a buffer, it holds a reference count on the sample, and the allocator uses that count to determine when it can reuse the buffer. This prevents a filter from overwriting a buffer that another filter is still using. A sample does not return to the allocator's pool of available samples until every filter has released it; once its reference count drops to zero, it can be handed out by the next GetBuffer call. This is somewhat similar to counting semaphores.
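To make the reference counting concrete, here is a minimal sketch (my own illustration, not code taken from CETK or from the DirectShow samples) of how an output pin might obtain a media sample from its allocator, deliver it downstream, and drop its own reference. DeliverOneSample and its parameters are hypothetical names, and error handling is abbreviated.

    // Sketch: obtain a sample, fill it, deliver it, release our reference.
    #include <dshow.h>

    HRESULT DeliverOneSample(IMemAllocator *pAllocator, IMemInputPin *pInputPin)
    {
        IMediaSample *pSample = NULL;

        // GetBuffer returns a sample with a reference count of 1. If every
        // sample in the pool is still held downstream, this call blocks.
        HRESULT hr = pAllocator->GetBuffer(&pSample, NULL, NULL, 0);
        if (FAILED(hr))
            return hr;

        BYTE *pData = NULL;
        hr = pSample->GetPointer(&pData);     // buffer memory managed by the sample
        if (SUCCEEDED(hr))
        {
            // ... write media data into pData here (placeholder) ...
            pSample->SetActualDataLength(0);  // set to the bytes actually written

            // The input pin may AddRef the sample if it keeps it after Receive.
            hr = pInputPin->Receive(pSample);
        }

        // Drop our reference. The sample returns to the allocator's pool only
        // when the last holder (possibly a downstream filter) releases it.
        pSample->Release();
        return hr;
    }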

Standard Deviation:

The number of times the benchmark engine samples a particular test case depends on the standard deviation of an initial set of samples.
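As for the statistics themselves, the fields look like the usual descriptive statistics over the timings collected for one test case: SampleCount is the number of runs, Min, Max, and Mean are taken over those runs, and CV% (the coefficient of variation) is the standard deviation expressed as a percentage of the mean. Your own numbers bear this out: 577.350269 / 700 * 100 = 82.478610. Below is a minimal sketch of those definitions; whether CETK divides by n or n-1 when computing the standard deviation is an assumption on my part, and the timing values in main are made up.

    // Sketch of the usual definitions behind the reported fields.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Stats { double min, max, mean, stdDev, cvPercent; };

    Stats Summarize(const std::vector<double> &samples)
    {
        Stats s = {0};
        const size_t n = samples.size();

        s.min = *std::min_element(samples.begin(), samples.end());
        s.max = *std::max_element(samples.begin(), samples.end());

        double sum = 0.0;
        for (size_t i = 0; i < n; ++i) sum += samples[i];
        s.mean = sum / n;

        double sqDev = 0.0;
        for (size_t i = 0; i < n; ++i)
            sqDev += (samples[i] - s.mean) * (samples[i] - s.mean);
        s.stdDev = std::sqrt(sqDev / (n - 1)); // sample std deviation (assumed)

        // Coefficient of variation: spread relative to the mean, in percent.
        s.cvPercent = 100.0 * s.stdDev / s.mean;
        return s;
    }

    int main()
    {
        // Hypothetical per-run timings for one test case.
        std::vector<double> t;
        t.push_back(0); t.push_back(1000); t.push_back(800); t.push_back(900);

        Stats s = Summarize(t);
        std::printf("Min=%f Max=%f Mean=%f StdDev=%f CV%%=%f\n",
                    s.min, s.max, s.mean, s.stdDev, s.cvPercent);
        return 0;
    }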

I'm not sure about the remaining terms, such as Rgn Count. Maybe this link could shed some light.
