I found this really weird glitch with the timer behavior.
I have a game that uses a timer to count the seconds to the finish of the game. I recently switched it to hundredths of a second to be more precise, but now it's reading a totally different time!
So, when the timer was counting in seconds, I got a score of around 45-50 seconds. Now, with the more precise timer, it's reading things like 2800 hundredths of a second. If I am correct, 2800/100 = 28 seconds, which is nearly half of the time the 1-second timer showed…
I have no idea why this is happening. Is it game lag? Is it a problem with the behavior, or the engine, or my PC?
I've heard about the timer behavior having lag sometimes, but I don't really know if this is just lag. I mean, 28 vs. 45 is a big difference.
I can look and see. But it might be lag… It seems like lag would make the timer slower though.
My guess is that you went out of the tab to set the timer. Sometimes when you go out of the Flowlab tab and back in, it messes up the timer's sync.
Have you tried timing the game with an outside timer to see which time is more accurate?
I know Flowlab timers aren't very accurate at all: if the game is slow or lags, you'll get a much faster time than if the game runs smoothly, because for some reason when the game slows down, so does the timer.
The timer behavior uses delta time, so it's more accurate to real time than to game time.
Making your own timer will count by frames, so if you're running at 30 fps, you'd need to divide the frame count by 30 to get seconds. But an outside timer can also be affected by lag.
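To make the difference concrete, here's a minimal sketch (plain Python, not Flowlab behaviors; the 30 fps target and frame durations are made-up assumptions) comparing a frame-count timer with a delta-time timer when frames take longer than they should:

```python
TARGET_FPS = 30  # assumed target frame rate


def simulate(frame_times):
    """frame_times: real duration of each frame, in seconds.

    Returns (frame_count_timer, delta_time_timer) readings in seconds.
    """
    frame_count_timer = 0.0  # assumes every frame is exactly 1/30 s
    delta_time_timer = 0.0   # accumulates the real elapsed time per frame
    for dt in frame_times:
        frame_count_timer += 1.0 / TARGET_FPS
        delta_time_timer += dt
    return frame_count_timer, delta_time_timer


# 30 lagged frames that each actually take 0.1 s (3 s of real time):
frame_timer, dt_timer = simulate([0.1] * 30)
# The frame-count timer reads 1 s, the delta-time timer reads 3 s.
```

This is why a frame-count timer undercounts during lag: it credits only 1/30 s per frame no matter how long the frame really took, while a delta-time timer keeps tracking real elapsed time.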