I want to move an object from point A to point B over a fixed time (say 2 seconds) using a coroutine. To achieve this I used the following code:
IEnumerator _MoveObjectBySpeed()
{
    while (TheCoroutine_CanRun)
    {
        if (myObject.transform.position.y <= UpperBoundary.position.y)
        {
            myObject.transform.position = new Vector3(myObject.transform.position.x,
                myObject.transform.position.y + Step, myObject.transform.position.z);
        }
        yield return new WaitForSeconds(_smoothness);
    }
    TheCoroutine_CanRun = true;
    moveAlreadyStarted = false;
}
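(The coroutine itself is kicked off elsewhere in the script; that code is not shown here, but a minimal, hypothetical starter, reusing the same Jump button that also starts the timer below, would look something like this:)

void Update()
{
    // Hypothetical starter, not the exact code from my project:
    // begin one move per Jump press and recompute Step first.
    if (Input.GetButtonDown("Jump") && !moveAlreadyStarted)
    {
        moveAlreadyStarted = true;
        CalculateSpeed();
        StartCoroutine(_MoveObjectBySpeed());
    }
}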
The Step value is calculated like this:
private void CalculateSpeed()
{
    Step = _smoothness * allDistance / timeToReachTop;
}
Here allDistance is the distance between the bottom and the top boundary, and _smoothness is a fixed value: the time the coroutine waits between each move of myObject. A small value means smoother movement, but the odd thing is that the larger I make this value, the more accurate the total travel time becomes.
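To make the intended math concrete, here is a worked example with made-up numbers (my real values differ):

// Hypothetical numbers, purely to illustrate the intent:
// allDistance    = 10 units (bottom boundary to top boundary)
// timeToReachTop = 1 s
// _smoothness    = 0.01 s   (wait between iterations)
//
// Step = _smoothness * allDistance / timeToReachTop
//      = 0.01 * 10 / 1
//      = 0.1 units per iteration
//
// So the loop needs 10 / 0.1 = 100 iterations,
// and 100 iterations * 0.01 s wait = 1 s nominal travel time.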
The time is measured like this:
void FixedUpdate()
{
    DEBUG_TIMER();
}

#region DEBUG TIME
public float timer = 0.0f;
bool allowed = false;

public void DEBUG_TIMER()
{
    if (Input.GetButtonDown("Jump"))
    {
        StartTimer();
    }
    if (myObject.transform.position.y >= UpperBoundary.position.y)
    {
        StopTimer();
        Debug.Log(timer.ToString());
        //timer = 0.0f;
    }
    if (allowed)
    {
        timer += Time.fixedDeltaTime;
    }
}

void StartTimer()
{
    timer = 0;
    allowed = true;
}

void StopTimer()
{
    allowed = false;
}
#endregion
The results were: when I wanted the object to reach the top in 1 second and set _smoothness to 0.01, myObject actually took 1.67 seconds to get to the top. When _smoothness was 0.2 s, the actual time to reach the top was 1.04 s.
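For reference, assuming timeToReachTop = 1 s: with _smoothness = 0.01 the loop must run 100 iterations (nominally 100 × 0.01 s = 1 s), so the measured 1.67 s works out to roughly 16.7 ms per iteration; with _smoothness = 0.2 only 5 iterations are needed (5 × 0.2 s = 1 s nominal), and the measured 1.04 s is much closer to that.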
So why is this so inaccurate, and how can I make it hit the target time?