Burma's Finest said:
At the risk of sounding pedantic I think you have this backwards: only the slowest of graphics cards render at only 40fps (especially on C2K5). I have three separate machines which all have completely different quality graphics cards and all of which render at a minimum of 60fps.
It's different in India.
You'd rarely find systems here with dedicated graphics cards, and when they do have one, it's generally three generations old, like the GeForce MX or FX series.
Those cards can't give 60 fps.
Burma's Finest said:
Here is how I have always understood it:
'... refresh rate also plays an important part in the 3D world...in the form of VSYNC. But what is VSYNC?
Well before we can answer that, a little discussion of how 3D works is required. In order to get a smoother transition between frames in 3D games, the video card puts the contents of the upcoming frame into its frame buffer. [The frame buffer is part of the local memory that resides on the video card itself] It then moves the contents of the frame buffer to the screen. When this is complete, the frame buffer gets the next frame. This process repeats itself over and over.
Now, what is VSYNC? Well, VSYNC is basically the synchronizing of buffer swaps with your monitor's refresh rate. With VSYNC enabled, frame rates will not exceed the monitor's current refresh rate for that particular resolution. For example, if your monitor is using a refresh rate of 85Hz at 800x600, with VSYNC enabled, you will theoretically never exceed 85fps. So the refresh rate creates an artificial barrier that limits the frame rate.
So what happens if you are playing on an older monitor that only supports a 60Hz refresh rate. Will you have to live with a maximum of 60fps (assuming that your system can generate more fps)? Not necessarily. Newer video cards give you the option of disabling VSYNC. What happens is that this allows the buffer swapping to occur without synchronizing with the monitor's refresh rate. If it really was the refresh rate limiting you, disabling VSYNC may allow you to obtain frame rates in excess of 60fps. This, unfortunately, can also cause what are called 'visual anomalies': image tearing and flashing polygons. Some games run fine with VSYNC enabled, while other games crumble when VSYNC is disabled.' [Source:
www.d-silence.com]
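The buffer-swap behaviour that quote describes can be sketched as a toy simulation. This is my own illustration, not real driver code; the function name and the numbers are assumptions. With VSYNC on, a finished frame is only shown at the next vertical blank, and the monitor shows at most one new frame per blank:

```python
import math

def frames_shown(render_fps, refresh_hz, seconds=1):
    """Count how many frames actually reach the screen per second
    when each buffer swap must wait for the next vertical blank."""
    shown = set()
    for k in range(1, int(render_fps * seconds) + 1):
        t = k / render_fps                   # moment the frame is ready
        vblank = math.ceil(t * refresh_hz)   # next vertical blank slot
        shown.add(vblank)                    # at most one swap per blank
    return len(shown)

print(frames_shown(400, 85))  # fast card, 85Hz monitor -> 85
print(frames_shown(40, 85))   # slow card: refresh rate is irrelevant -> 40
```

The second call is the point being argued in this thread: a card that can only render 40 fps shows 40 frames whether the monitor refreshes at 60 Hz or 85 Hz.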
Most importantly:
'Frame rates will appear higher with vsync off but the amount of information displayed is limited by your monitor in that instance. A monitor with an 85 Hz refresh rate will only display 85 frames per second even if the videocard is rendering 400 frames a second. The visual quality may decline as a result also because some frames could be skipped as a result of excess rendering, resulting in some visual anomalies, image tearing and flashing polygons.' [Source :
www.experts-exchange.com]
'Computer games often allow vertical synchronization as an option, but is sometimes disabled because it has the effect of limiting frame rates to the monitor's refresh frequency creating a psychological disadvantage although not a visible one. Games that use double buffering but cannot keep up with the refresh frequency are usually limited in frame rate to a divisor of the refresh rate, but this can be avoided by the use of more buffers albeit at the cost of more video memory.' [Source:
http://en.wikipedia.org/wiki/Vertical_sync]
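The "limited in frame rate to a divisor of the refresh rate" effect in the Wikipedia quote works out like this: with double buffering and VSYNC, a frame that misses one blank must wait for the next, so the frame time rounds up to a whole number of refresh intervals. A minimal sketch, assuming an idealised pipeline (function name and timings are my own illustration):

```python
import math

def vsync_fps(refresh_hz, render_time_s):
    """Effective frame rate of a double-buffered game under VSYNC:
    each frame occupies a whole number of refresh intervals."""
    interval = 1.0 / refresh_hz
    blanks_waited = math.ceil(render_time_s / interval)  # round up to next blank
    return refresh_hz / blanks_waited

print(vsync_fps(60, 0.010))  # 10 ms/frame fits in one interval -> 60.0
print(vsync_fps(60, 0.020))  # 20 ms/frame misses one blank -> 30.0
```

Note the cliff: rendering at 50 fps worth of work (20 ms) on a 60 Hz monitor drops you straight to 30 fps, which is why triple buffering is offered as the workaround at the cost of more video memory.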
They all mean the same thing, just what I said earlier.
However, if you notice, they all talk about what happens when a graphics card's frame rate crosses the monitor's refresh rate.
And graphics cards would only cross 60 or 85 fps when they are capable of it.
And if they did, there would hardly be any talk of increasing performance, because 60 fps is more than enough.
However, I can assure you that the 9200e doesn't have the ability to cross 40 fps, forget 60.
So, if your card isn't powerful enough, there is hardly any change in fps whether your refresh rate is 60 Hz or 85 Hz.
Burma's Finest said:
Two views that provide a balanced summation to the discussion:
"I don't have any clue why someone would disable VSync for gameplay. The only legit reason for this is to benchmark 3D card performance without the monitor's refresh rate skewing the results. Regarding a 'philosophical VSync difference between Direct3D and OpenGL', that's nutty. There is no visual benefit to having a game render more frames per second than your monitor is displaying." - Tim Sweeney, Epic Games
Notice he says "there is no visual benefit to having a game render more frames per second than your monitor is displaying." And I totally agree with him. However, unlike him, I don't have GeForce 7900 GTX cards in quad configuration powering my system.
Also, there is another fact I would bring to notice: many games put a max cap of 85 or 100 fps, to prevent cheating
(yes, you can cheat when you have 100+ fps. It happened in Quake III).
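For what it's worth, that kind of fps cap is usually just a timed wait in the main loop (Quake III exposes it as the com_maxfps cvar). A minimal sketch in Python; this is my own illustration, not any engine's actual code:

```python
import time

def run(max_fps, frames):
    """Run a loop capped at max_fps by sleeping off any time left
    over after each frame's work is done."""
    min_frame_time = 1.0 / max_fps
    for _ in range(frames):
        start = time.perf_counter()
        # ... render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < min_frame_time:              # finished early?
            time.sleep(min_frame_time - elapsed)  # wait out the cap
```

So ten frames at a 100 fps cap take at least a tenth of a second, no matter how fast the card is.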
Burma's Finest said:
"Although I would agree that there are no actual 'visual' benefits to disabling vsync (in fact tearing can make things look pretty god-awful), the ability to squeeze a couple more frames per second is a tweak I used quite often when playing graphically intensive games on lower performance systems. Was there tearing...damned right! Did it speed up the games? Sure did, but when you're getting 15-20fps in your favorite game, anything is an improvement, and the odd graphic glitch is a worthwhile tradeoff!" -
Paul Bonnette from MadOnion (the 3D Mark 2000 people)
[Source:
www.d-silence.com]
all sh*t.
I have been playing games and tweaking them since Quake III came out.
I have looked at and tweaked so many options. I have seen so many custom tweaked config files. These files are specially written by gamers to provide maximum gaming performance with an excellent tradeoff of visual quality.
And the one thing common in all of them is the comment about VSync. It says:
"Recommended: Disabled. Enable only if it causes visual artifacts."
However, DO remember one thing: a refresh rate of 60 Hz on a CRT is DAMAGING TO THE EYES.