So I've been working on a project in Unity, and after writing this very simple code, I started wondering how computers with faster and slower frame rates would end up producing slightly different movement speeds.
The code:
for (int i = 0; i < distance; i++) transform.Translate(speed * Time.deltaTime, 0, 0);
Mostly, I was wondering whether on a slower computer the object wouldn't move as far, and whether, if I took out the deltaTime multiplication, the object would appear to move more slowly on a slow computer than on one with a higher FPS count.
If so, how would I solve this problem, if it's a problem at all?
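To make it concrete, here's a rough sketch of the two variants I'm comparing inside Update(). The class name and the speed value are just placeholders, not my actual project code:

using UnityEngine;

public class Mover : MonoBehaviour
{
    // Intended as units per second (placeholder value).
    public float speed = 5f;

    void Update()
    {
        // Variant A: scaled by Time.deltaTime, so each frame the object moves
        // speed * (seconds since the last frame) units. Over one second this
        // should add up to roughly the same distance regardless of frame rate.
        transform.Translate(speed * Time.deltaTime, 0, 0);

        // Variant B (without deltaTime): a fixed `speed` units every frame,
        // so a machine running at 30 FPS would cover half the distance per
        // second of one running at 60 FPS.
        // transform.Translate(speed, 0, 0);
    }
}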