“Great frame rate” report…and questions

Jimko
First, let me leave no doubt in anyone’s mind that the expression “a little knowledge can be dangerous” certainly applies to me when it comes to computer technology. In my past work life as a communications engineer, the applications I used were of primary interest and importance…the PC was just a tool that made them work, and I’ve never had more than a very rudimentary knowledge of PC technology. And I don’t stay very current with the technology until I have to investigate it for something like my recent upgrade.

So I tend to rely on other people, like my young technical friend whose family owns and operates a well-known local computer chain, and we upgraded my machine expressly for OFF.
I’m now running this configuration:

MSI P45 Neo3 motherboard
Intel Core 2 Duo E8400 3.0 GHz
1066 Ballistix 2×1 GB (2 GB of RAM)
NVIDIA GeForce GTX 260-216 SC
Zalman cooler on mb, Intel heat-pipe 10 cm fan, all in a server-size case, lots of fans, blah, blah.
Samsung SyncMaster 2443BW 24 in. LCD

I’m running this system with the E8400 comfortably overclocked at 3.6 GHz.

I configured for video using all the excellent info that is now on the OFF website for video set-up. What a time saver that is! Many thanks to all the contributors!
And my sliders are set at 5-4-4-5-3, as I wanted to make all changes while keeping my original settings so I could compare results…I can change them later if necessary.

Before the upgrade, my OFF P3 frame rates averaged from the low 20s to the mid 40s fps, often dropping into the teens and even to single-digit slide shows in some cases.

Just a note that when I reinstalled CFS3 and OFF after the upgrade, I noticed that the sound in OFF would stutter at times, and I soon had several CTDs and a couple of blue screen/reboots as well. Uninstalling and reinstalling seemed to cure it all.

After the upgrade, WOW…frame rates have jumped to an average of 60 to 80 fps, often exceeding 100 fps depending on the views and activity. I was delighted, but I’m cautious about hollering from the rooftops, so I’ve tested and tested in QC. I discovered that the high frame rates came at a price: I was getting definite tearing and white flashes with quick TrackIR head movement that I hadn’t had before.

Research told me that “Vsync” and “Triple buffering” would likely solve the problem.
The single change that did it was switching my Vsync from off to “Forced On”. My rates are now capped at 60 fps on the high end, but they also hold there for just about all flight scenarios, at least in QC, and only occasionally drop by a few frames. The video seems smoother than before as well. The scenery detail is excellent; in fact, all the detail is excellent.
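
For anyone curious why the cap lands at exactly 60: with Vsync forced on, a finished frame can only be shown at the monitor's next refresh, so a 60 Hz screen never displays more than 60 new frames per second. A minimal sketch of that timing (illustrative Python only, not actual driver code):

```python
import math

REFRESH_HZ = 60.0
VBLANK = 1.0 / REFRESH_HZ  # seconds between refreshes (~16.7 ms)

def display_time(render_done_at):
    """With Vsync on, a finished frame waits for the next vblank."""
    # Round the completion time up to the next refresh boundary.
    return math.ceil(render_done_at / VBLANK) * VBLANK

# A frame finished after only 1 ms still can't appear until t = 16.7 ms,
# so displayed frames never arrive faster than the refresh rate.
print(display_time(0.001))  # 0.01666...
```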

So, I’ve got to feel pretty happy with this configuration, but I’m also interested in hearing if others have had a similar experience with graphics in OFF P3.

And I’m curious to hear whether others think I should be content with this set-up, or whether I should try something else that may improve frame rates and game functions even more...? My tech friend thinks that more than 2 GB of this high-speed RAM won’t really help much, but then we don’t know for sure.

Comments, suggestions?
 
Hi Jimko: Do I read correctly that forcing Vsync on has reduced the TrackIR blue-jaggies? I'm going to give that a try.

p.s. I don't see how your frame rates could get much better...doesn't look like you need to do any more.
 
Hi neighbour...

Yes, Vsync on stopped the tearing and in my case the TrackIR "white-jaggies"!
 
Come to think of it, I haven't noticed any flashes since switching Vsync on either. And unless there's something really different about the way OFF calculates frame rates, isn't anything 30 and up gravy? Doesn't the eye see fluid motion in the mid 20s?
 
Nice looking rig. Not surprised you've had decent results with it. My only other comment (as mentioned in another thread): you're wasting your time enabling Triple Buffering through your driver control panel. It'll only work for OpenGL apps. To enable it in DX9, the easiest way is to use D3DOverrider, which comes with RivaTuner. It calls the function directly from the programming interface, and it works, but it doesn't interact very well with some other software. I've used it successfully with BHH and it does the trick. It got rid of the "tearing" that's typically associated with running without V-sync, and smoothed things out considerably. Be prepared to hear some odd little Windows sounds *bing-bing* if you're using it; I believe that's by design, to indicate that buffering is in fact enabled.

As with all other little black-magic graphics tricks, your mileage may vary. If you decide to experiment, you do so at your own risk. I'm not encouraging its use one way or another. Just thought I'd let you know it's out there.
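
For reference, here's the gist of what triple buffering buys you under V-sync, as a toy Python model (purely illustrative; this isn't how D3DOverrider actually hooks DirectX):

```python
import math

def displayed_fps(raw_fps, refresh=60, triple_buffered=False):
    """Approximate displayed fps with Vsync on (toy model).

    Double buffering: after finishing a frame, the GPU must wait for the
    next vblank to swap, so the rate snaps down to refresh/1, refresh/2,
    refresh/3, ... (60, 30, 20, 15 on a 60 Hz screen).
    Triple buffering: the GPU keeps drawing into a spare buffer instead
    of idling, so the displayed rate tracks the raw rate, capped at the
    refresh rate.
    """
    if triple_buffered:
        return min(raw_fps, refresh)
    return refresh / math.ceil(refresh / raw_fps)

for raw in (100, 55, 45, 25):
    print(raw, displayed_fps(raw), displayed_fps(raw, triple_buffered=True))
# 100 -> 60 vs 60;  55 -> 30 vs 55;  45 -> 30 vs 45;  25 -> 20 vs 25
```

That snap-down is also why fps can look noticeably lower with V-sync on but triple buffering off.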


Nice of you to have taken the time and effort to share what sounds like a real success story.


Cheers,

Parky
 
My card (8800GTX) draws the line at v-sync on. With triple buffering at the same time I don't go below 30 fps most of the time, but jerking and stuttering are noticeable. With v-sync on but no triple buffering the fps is around 10 lower.

4 GHz CPU.
2 GB RAM.
 
Siggi,

That IS using a third-party tool to enable Triple Buffering, right? Because if you aren't, Triple Buffering ISN'T enabled...not unless you're testing it in an OpenGL environment, which BHH doesn't allow for.

Cheers,

Parky
 
I read what you'd written and thought "Eh...?!"

I ran two campaign missions (with Sgt Test Pilot) back to back, same weather, same everything really. In the one with triple buffering enabled I was straight into the low thirties; in the next, with it disabled, I was straight into the low twenties. I have no special utilities running.

If what you say is correct it must have been caused by something else, but I'm buggered if I can think what.
 
Me either, but trust me, it had to be something else.


Cheers,

Parky
 
I think it must have been a weather/cloud variation. I've just tested with triple buffering on and off, putting myself alone on the field in QC with the same weather, and the difference was 5 fps. I think that difference was down to clouds directly in front of me.

Rain and dark clouds have a huge impact on fps. With dark clouds/sky and rain I got 30 fps; with a clear sunny day and a few fluffy white clouds I got 55 fps (the test was done with the latter).
 
Here's a fairly comprehensive explanation of Triple Buffering that also lists the 3 most common utilities that can be used to enable it.

Very useful site in general there. Good reading for any recovering Tweakaholics Anonymous members.....:friday:


Cheers,


Parky
 
Hi Parky, and thanks for the clarification...

I did read exactly what you've stated during my research, and I forgot to mention that triple buffering was already "on" in my system, so I just left it on. Obviously, then, Vsync made the only real difference, as I stated, but I wondered whether turning triple buffering off would change anything. I then forgot about it in my experimenting.

Forgot to mention that the OS is XP Home!

I also forgot to ask the more specific question...Is the quest for higher frame rates sensible once I've reached a certain rate and everything seems to run well? (Is more better?) There has to be a plateau given the HW/SW configuration.
 
To speak to the last question again, I'm still thinking that there is no visual difference between frame rates in the 30s and frame rates in the 60s, unless what frame rate means in OFF is something different from the number of individual frames per second. It may well mean something different. I have no idea what counts as a frame in computer-rendered graphics. I know that the movie you watch on television has a frame rate of 25-30.

I'm very curious to hear what the developers, or anyone more knowledgeable about computer graphics (which would likely be pretty much anybody), have to say about this.
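
One way to put numbers on the question: a frame in computer rendering is simply one complete image the card finishes and displays, so frame time is just the reciprocal of frame rate. A quick illustration in Python:

```python
# Milliseconds each frame is on screen at a given frame rate.
for fps in (24, 30, 60, 100):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 24 fps -> 41.7 ms (film), 30 fps -> 33.3 ms (roughly TV),
# 60 fps -> 16.7 ms, 100 fps -> 10.0 ms
```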
 
Take a look at this, which is one of many pages on the subject of game graphics:

http://www.tweakguides.com/Graphics_5.html

P.S. This is the research source that I've used, and the same guide that Parky recommended (back a few posts).
 
Thanks Jimko for that reference. That article, and several it references, were very informative about computer graphics. But they lead me to conclude something a bit contrary to the gist of their suggestions. I don't know much at all about computer graphics, but I do know a bit about physics and empirical testing, and the suggestion that what a person can identify from an image flashed in as little as 1/220th of a second means that the eye can distinguish between, say, 100 FPS and 200 FPS is nonsense (just to pick on one of what I believe to be the dubious claims). nVidia's "test" of a split frame run at 30/60 FPS is also suspect for me, although I don't know how they labeled the examples, and so can't be sure about bias. I'd like to see that myself, and will hunt around for it.

At any rate, though, some of the now classic experiments in the psychology of perception, repeated almost endlessly now, such as Köhler's famous experiments (which I bet you could find on the web), show pretty conclusively that even without blurring, the brain sees motion within a very definite range of intervals because of what is known as the beta phenomenon (referenced, albeit deeply, by the guide you refer to when it talks about persistence). Anyway, I don't want to sound persnickety (I'm terribly persnickety, actually, but I prefer not to sound as if I am), but the guide you referred to did say that FPS in games means, basically, individual frames per second. Since that's the case, I think FPS around 30, that STAYS around 30, of course, is plenty good, and certainly anything higher than 60 would seem to me to be chasing rapidly diminishing, if not imaginary, returns. But hey, who am I to stand in the way of another man's pursuit of bigger and better! If only they made a little pill for FPS....

Edit: I just wanted to make clear that my skepticism is aimed at the authors of those guides, who are, at least in part, in the business of selling us things to give us higher FPS. I like bigger and better as much as the next guy! Trust me, I'm lusting after Siggi's new monitor!
 
Forcing Vsync on has reduced TrackIR jaggies for me too. No doubt about it.
 
Hey Griphos,

A good commentary, and one which I would be at a loss to argue with...it's well out of my realm of pitiful knowledge on the topic...:isadizzy:

But there are many other sources that indicate, rightly or wrongly, that 60 fps is a valuable asset in today's computer game graphics. It makes me curious, to say the least, what the consensus is on that figure.

More to the point of my question, though, I wondered whether there might also be some other advantages I haven't even considered, beyond purely visual graphic quality (which admittedly is the prime criterion), that might improve game functionality with faster frame rates. I guess I used to have to "think outside the box" in my work, so it's kind of an old habit! (albeit, perhaps, an annoying one...:redface:)
 
Ah, well, there is a good question that goes way above my pay grade. I'll be curious to see what answers you get to it. It's kind of interesting, though, that in the FSX world people seem to lock the frame rate at 30 or whatever (sometimes 18 on slower computers) so that cycles can be spent drawing pretty pictures above that. I don't know if that works either, but perhaps it suggests that the ability to get higher FPS in a game comes at the cost of something else, although a system that can give you 100s can probably spare something for the pretty pictures too.
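
That FSX-style lock is easy to picture as a frame limiter: render, then sleep away whatever is left of each frame's time budget, leaving that headroom for other work. A minimal sketch in Python (the render stand-in is hypothetical, not anything from FSX):

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_limited(render_frame, n_frames=90):
    """Render n_frames, sleeping away any time left in each frame's budget."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                               # the actual drawing work
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # headroom a sim could spend on scenery, AI, etc.

run_limited(lambda: time.sleep(0.005))  # stand-in for a 5 ms render
```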
 
Is there any relation between game fps and monitor refresh rate? 60 Hz has always been the standard below which you get eye strain from flickering.
 
I think in ye olde days, Vsync on when you had 30 fps meant you had 1 frame on the screen for 2 syncs, so yeah, it would look worse I think.

60 fps means you can show 1 frame per sync.

More fps gave smoother mouse controls and feedback too, if I remember correctly.
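
The "syncs per frame" arithmetic on a 60 Hz monitor is just this (illustrative only):

```python
REFRESH_HZ = 60

# How many refresh cycles each frame stays on screen when the
# frame rate divides the refresh rate evenly.
for fps in (60, 30, 20, 15):
    print(f"{fps} fps -> each frame held for {REFRESH_HZ // fps} refresh(es)")
```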
 