
I am modelling a GPU (I cannot disclose which) to estimate the performance of OpenCL and OpenGL applications. The model can reasonably estimate the FLOPS of the executing app/kernel/code. Is there a way to estimate frames per second from the FLOPS, or is it better to model the framebuffer and estimate FPS from that?
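To make the question concrete, one common way a model like this combines FLOPS with other bottlenecks is a roofline-style bound: frame time is limited by whichever resource (compute or memory traffic) takes longer. This is only an illustrative sketch; the function names and all numbers are hypothetical, not from any real GPU:

```python
def estimate_frame_time(flops_per_frame, bytes_per_frame,
                        peak_flops, peak_bandwidth):
    """Roofline-style lower bound on frame time.

    The frame cannot finish faster than either the compute work
    or the memory traffic allows, so take the larger of the two.
    All inputs are per-frame estimates from the model.
    """
    compute_time = flops_per_frame / peak_flops      # seconds spent on ALU work
    memory_time = bytes_per_frame / peak_bandwidth   # seconds spent moving data
    return max(compute_time, memory_time)


def estimate_fps(frame_time_seconds):
    """Upper bound on FPS implied by the frame-time estimate."""
    return 1.0 / frame_time_seconds


# Hypothetical workload: 2 GFLOP and 1 GB of traffic per frame
# on a device with 1 TFLOP/s peak compute and 200 GB/s bandwidth.
frame_time = estimate_frame_time(2e9, 1e9, 1e12, 2e11)
print(estimate_fps(frame_time))  # memory-bound here: 1 / 5 ms = 200 FPS
```

Note this gives an upper bound, not a prediction: it ignores CPU-side work, driver overhead, and synchronization, which is exactly what the comments below point out.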

Umair
  • frames per second doing what? In game programming the amount of draw calls is usually more deterministic of framerate than the GPU FLOPS. – AlexanderBrevig Sep 08 '14 at 02:50
  • You really need to model the entire system to get realistic results. There are a lot of other possible bottlenecks beyond raw GPU performance. – Reto Koradi Sep 08 '14 at 03:03
  • FPS of the display; there are certain use cases, one of them being games. The model contains CPU+GPU+Mem, and the bottlenecks for each are accounted for in the model. So, instead of using just FLOPS, can a reasonable estimate be made based on the OpenGL calls? – Umair Sep 08 '14 at 04:45

1 Answer


As FPS is also influenced by the code that is running on the CPU, there's no way to make an accurate FPS prediction based on FLOPS alone.

You have to execute the code and measure the application's FPS at runtime. Sorry!
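Measuring at runtime can be as simple as counting frames against wall-clock time. A minimal sketch (the class name is hypothetical, not from any particular library; you would call `tick()` once per rendered frame from your render loop):

```python
import time


class FPSCounter:
    """Counts frames against wall-clock time to report measured FPS."""

    def __init__(self):
        self.frames = 0
        self.start = time.perf_counter()

    def tick(self):
        """Call once per rendered frame (e.g. after the buffer swap)."""
        self.frames += 1

    def fps(self):
        """Average FPS since the counter was created."""
        elapsed = time.perf_counter() - self.start
        return self.frames / elapsed if elapsed > 0 else 0.0
```

For GPU-side timing specifically, OpenGL also offers timer queries (`GL_TIME_ELAPSED`), which measure how long the GPU spent on submitted work rather than wall-clock frame time.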

karlphillip
  • The CPU is modelled too, along with memory; the model is meant to predict performance in scenarios where the hardware is not available for some reason. To ask another way: for certain use cases, the model tracks each OpenGL call from the moment the CPU issues it, through the GPU processing time (including estimated memory transfers to GPU caches, GPU core scheduling, etc.), to the moment the result is written to the framebuffer. By monitoring the framebuffer updates, can a reasonable FPS estimate be made? – Umair Sep 08 '14 at 04:42