I'm trying to implement a point cloud in which the points vary in size based on an uncertainty value associated with each of them. If this value is zero, the size should stay constant; as it approaches 1, the radius of the point should vary more and more strongly over time. The points should first grow until they reach a maximum size, then shrink to a minimum, and so on. A function to describe this behaviour could be:
pointSize = x ± c * pointUncertainty

where x = standard point size and c = scaling constant.
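To make the oscillation explicit, I imagine adding a time-dependent term (this exact form is just a guess on my part):

pointSize(t) = x + c * pointUncertainty * sin(t)

With x = 10 and c = 5, a point with uncertainty 1 would then swing between 5 and 15, while a point with uncertainty 0.5 would swing between 7.5 and 12.5.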
From what I've read, this could be achieved by passing a uniform timer variable to my vertex shader and computing the point size there. However, all points should vary in sync: a point with an uncertainty of 1 and a point with an uncertainty of 0.5 should reach their minimum and maximum pointSize at the same time. Additionally, the whole animation shouldn't depend on the frame rate.
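Here is a minimal sketch of the vertex shader I have in mind, embedded as a GLSL string the way I'd store it in my C++ code. The attribute/uniform names (position, uncertainty, mvp, u_time, u_baseSize, u_scale) and the use of sin() are my own assumptions, not something I've confirmed works:

```cpp
// Hypothetical vertex shader: every point shares the same sin(u_time) phase,
// so all points would reach their extremes simultaneously; the uncertainty
// only scales the amplitude of the oscillation.
const char* vertexShaderSrc = R"glsl(
#version 420 core

layout(location = 0) in vec3  position;     // point position
layout(location = 1) in float uncertainty;  // per-point uncertainty in [0, 1]

uniform mat4  mvp;         // model-view-projection matrix
uniform float u_time;      // elapsed time in seconds (the uniform timer)
uniform float u_baseSize;  // x: standard point size in pixels
uniform float u_scale;     // c: scaling constant

void main()
{
    // Amplitude grows with uncertainty; the phase is identical for all points.
    float size  = u_baseSize + u_scale * uncertainty * sin(u_time);
    gl_PointSize = max(size, 1.0);  // avoid zero/negative point sizes
    gl_Position  = mvp * vec4(position, 1.0);
}
)glsl";
```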
I'm not sure what the best way of doing this is, how the increase-decrease-increase pattern is best implemented, and where to put the necessary OpenGL (4.2) commands.
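This is a sketch of the host-side part as I picture it, assuming GLFW for the window and timer; pointCloudProgram, pointCloudVAO, window and pointCount are placeholders for my existing objects. Reading a wall clock (glfwGetTime(), or any similar source) instead of accumulating a per-frame increment is how I'd try to keep the animation independent of the frame rate:

```cpp
// Hypothetical render-loop excerpt (C++ / OpenGL 4.2, GLFW assumed).
glEnable(GL_PROGRAM_POINT_SIZE);   // let the vertex shader set gl_PointSize
glUseProgram(pointCloudProgram);   // program built from the shader above

GLint timeLoc     = glGetUniformLocation(pointCloudProgram, "u_time");
GLint baseSizeLoc = glGetUniformLocation(pointCloudProgram, "u_baseSize");
GLint scaleLoc    = glGetUniformLocation(pointCloudProgram, "u_scale");

glUniform1f(baseSizeLoc, 10.0f);   // x
glUniform1f(scaleLoc,     5.0f);   // c

while (!glfwWindowShouldClose(window))
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Wall-clock time, not a per-frame counter, so the oscillation speed
    // does not depend on how fast frames are rendered.
    glUniform1f(timeLoc, static_cast<float>(glfwGetTime()));

    glBindVertexArray(pointCloudVAO);
    glDrawArrays(GL_POINTS, 0, pointCount);

    glfwSwapBuffers(window);
    glfwPollEvents();
}
```

In particular, I'm unsure whether updating the time uniform once per frame like this is the right place for these commands, or whether there is a more idiomatic approach.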
EDIT: I'm still hoping for an answer to this question, since the overall process of how such an animation effect can be achieved is still unclear to me.