Ok, so you know about "renderer.drawfps 1" in the console to see what FPS you're getting. But there are two numbers. It seems pretty obvious that the 1st one is your 'current' frames-per-second (FPS). But what's the second?
Yes, I googled, even asked on IRC, everyone seems to think it's "average FPS", but it can't be. Why do I say this?
I have the first number pretty consistently at 80-90, and the 2nd consistently between 10 and 12. If the 2nd were an average, starting that low due to 0fps during game loading or something, it would STILL rise up towards 80-90 over the course of playing. It doesn't. It just fluctuates between 10 and 20, whilst my actual FPS never goes below 40 even at the busiest of times.
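To illustrate why the "average FPS" theory doesn't hold up, here's a quick sketch (made-up numbers, not actual game data) of how a cumulative-average counter would behave: even if it starts near zero during loading, it climbs back towards your real framerate as samples accumulate.

```python
def cumulative_average(samples):
    """Running mean over FPS samples, as an 'average FPS' counter would report it."""
    total = 0.0
    averages = []
    for i, fps in enumerate(samples, start=1):
        total += fps
        averages.append(total / i)
    return averages

# Hypothetical session: 5 seconds at ~0 FPS while loading, then 10 minutes at ~85 FPS.
samples = [0] * 5 + [85] * 600
avgs = cumulative_average(samples)

print(round(avgs[9], 1))    # shortly after loading: average still dragged down (42.5)
print(round(avgs[-1], 1))   # after 10 minutes: climbed to ~84.3, nearly the real FPS
```

An average that sat at 10-20 for a whole session while the real framerate never dropped below 40 just isn't mathematically possible, which is why the second number must be something else.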
So, what is it? One guess is 'Server FPS', i.e. how many times per second the server is calculating the entire game state. It seems a reasonable figure for this as well.
I guess we'll never know for sure unless we get hold of a dev; remember to ask this one in the next dev chat.
-Ath, considering poking all the sites that erroneously say the 2nd number is the average.