I'm implementing a camera for a game and I'm using the lerp formula, (1 - t) * v0 + t * v1, for smooth chasing. However, if the target moves fast enough, the camera can never reach it unless the t value is high enough, and that is exactly the problem: some targets might still move faster than the current t value allows.
This leads to two problems:
- The camera never reaches the object if it's extremely fast
- The camera approaches the object very slowly, depending on the object's current speed
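Here's a minimal sketch of the behavior I'm describing (Python, with illustrative names; `t` and the speed are just example values):

```python
def lerp(v0, v1, t):
    # Standard lerp: (1 - t) * v0 + t * v1
    return (1 - t) * v0 + t * v1

camera = 0.0
target = 0.0
t = 0.1              # fixed chase factor
target_speed = 5.0   # target movement per frame

for frame in range(1000):
    target += target_speed          # target runs away
    camera = lerp(camera, target, t)  # camera chases

# With a fixed t, each frame the camera closes only a fraction t of the
# remaining gap while the target adds target_speed, so the gap settles at
# target_speed * (1 - t) / t instead of shrinking to zero.
print(target - camera)  # ≈ 45.0: the camera lags behind forever
```

So with a constant `t` the camera trails the target at a fixed offset rather than catching up.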
How do I scale up my t as the distance (abs(v1 - v0)) gets smaller, so that the camera starts at a slow chasing rate and speeds up as it gets closer (and therefore no target can run away)?