8800gt + 1920x1200 + CS:S = ____ FPS?

Page 2 - AnandTech Forums

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

Yeah, I bet you can see 100fps on your 60hz monitor.... :roll:
 

Oakenfold

Diamond Member
Feb 8, 2001
5,740
0
76
Originally posted by: LOUISSSSS
that fps was with my 1680x1050 monitor. i'm asking about 1920x1200.

how about with vsync on? will my frames stay stable at 59-60?

Congrats on the new monitor. I figured that you must have upgraded. I did find a link to another forum; this is on a Mac with a Core 2 Extreme @ 2.8.
He claims he's getting 100fps.
With your setup you should be solid if that claim holds true.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
well i'm still waiting for NCIXus.com to ship it. i ordered it on xmas (Dec 25th) and it has now been a week and it still hasn't shipped (no tracking number)
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
Originally posted by: Modular
Just get a 9800GTX+ or an HD4850, seriously.

Why? If you are playing CS, an 8800GT is fine. I play it with absolutely everything cranked at 1920x1200 on a 2900XT and I don't get any stuttering or anything. Same with TF2, actually.

KT
 

thujone

Golden Member
Jun 15, 2003
1,158
0
71
at worst you might have to fiddle with what level of AA you can get away with.

at high resolutions that's about the only thing you have to worry about in a game this old.


i use an 8800gts and i usually just leave AA off in most games. but i'm impatient.
 

mxyzptlk

Golden Member
Apr 18, 2008
1,888
0
0
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

anything better than 12 frames per second is usually enough to fool your brain into seeing continuous movement instead of individual static images.
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
Originally posted by: mxyzptlk
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

anything better than 12 frames per second is usually enough to fool your brain into seeing continuous movement instead of individual static images.

lol. My eyes can see 1 million FPS. You weak "humans."

Get over yourself and enjoy the game :)
 

ussfletcher

Platinum Member
Apr 16, 2005
2,569
2
81
I believe humans can perceive somewhere in the realm of 500 fps. The reason movies and such look fluid at a lower fps is motion blur. Since most games don't render motion blur, the higher the fps, the better it will look.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
Originally posted by: thujone
at worst you might have to fiddle with what level of AA you can get away with.

at high resolutions that's about the only thing you have to worry about in a game this old.


i use an 8800gts and i usually just leave AA off in most games. but i'm impatient.

isn't it true that at a high res like 1920x1200, the level of AA (or was it AF? or both?) doesn't really matter and/or most people can't tell the difference anyway?
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

If you are talking about average fps, then 100fps isn't too far off, because frame rates fluctuate in a game. It's hard to tell the difference between a constant 60fps and 100fps.
 

OCNewbie

Diamond Member
Jul 18, 2000
7,596
25
81
Originally posted by: Azn
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

If you are talking about average fps, then 100fps isn't too far off, because frame rates fluctuate in a game. It's hard to tell the difference between a constant 60fps and 100fps.

I'm talking about Quake 2 with a maxfps of 112 or a maxfps of 90, for example. I can definitely tell the difference between maxfps 90 and maxfps 112. And with a GTX 260, or just about any current video card, Quake 2 is going to be pegged at 100+ FPS in any situation. The Q2 engine might behave a little differently than other engines, though; my comments were in regards to just this one game.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: LOUISSSSS
Originally posted by: thujone
at worst you might have to fiddle with what level of AA you can get away with.

at high resolutions that's about the only thing you have to worry about in a game this old.


i use an 8800gts and i usually just leave AA off in most games. but i'm impatient.

isn't it true that at a high res like 1920x1200, the level of AA (or was it AF? or both?) doesn't really matter and/or most people can't tell the difference anyway?

No. There's more to it than that.

If you really want to see what the difference is, choose a game where you can save at any point. Take screenshots of the saved game (so the image is consistent) at different levels of AA and AF, then zoom in about 400% on the images to compare the pixels.

The most important thing AA does, in my opinion, is reduce "edge crawling." You'll see this for example on the edge of a dark object with a bright background or the other way around. The edge will appear to "crawl" as your point of view changes.

AF basically sharpens textures that are viewed at an angle. When you look at a checkerboard straight on, you see perfect squares with alternating colors. When you tilt it, the "squares" are now rectangles with a smaller area... tilt it even more, and add complexity to the pattern, and eventually you completely lose some detail, creating a blurry image. AF attempts to analyze the texture and scale it appropriately for the angle at which it's viewed, without sacrificing important image detail.

Resolution has an indirect effect on both of those, as a higher resolution can display smaller pixels, making aliasing less visible, and texture blurring less noticeable, but they both still exist. And it's not necessarily resolution that's important, it's the pixel density. 1920x1080 on a 20 inch display is very dense and the images will look very smooth and detailed. 1920x1080 on a 60 inch plasma TV viewed at the same distance won't look nearly as smooth... you'll be able to focus on individual pixels easily.
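That pixel-density point is easy to sanity-check with some quick arithmetic (a Python sketch using the 20 inch and 60 inch 1920x1080 figures from the post above; the `ppi` helper name is my own):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1920x1080 image on two very different panel sizes:
print(round(ppi(1920, 1080, 20)))  # ~110 PPI on a 20 inch display
print(round(ppi(1920, 1080, 60)))  # ~37 PPI on a 60 inch plasma
```

Roughly three times the linear pixel density on the smaller panel, which is why aliasing is so much easier to spot on the big screen at the same viewing distance.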
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
so are the effects of AA/AF MORE visible on a 20.1in 1680x1050 panel or a 24in 1920x1200 panel?
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
oh, and btw, i just got my 1920x1200 monitor, and CS on it at ALL MAX SETTINGS in game gives me smooth gameplay. i have seen drops below 100, but nothing that wasn't smooth.

the in-game performance test gave me about 115fps with all max settings.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: LOUISSSSS
so are the factors of AA/AF MORE visible on a 20.1in 1680x1050 panel or a 24in 1920x1200 panel?

You can look at the pixel pitch (see: link) specification to find out rather than doing the math...

A 24 inch 1920x1200 display has a pixel pitch of .27mm
A 20 inch 1680x1050 display has a pixel pitch of .258mm

So the 20 inch display has a higher pixel density and the effects of aliasing and texture blurring will be slightly less noticeable.

You can actually search newegg for monitors using pixel pitch as the search criteria if you want to see what size/resolution displays have the highest density.

High pixel density is not always a good thing, though. Stolen from the LCD thread in the video forum:

Pixel Pitch/Dot Pitch: Dot pitch is the size (mm) of any given pixel on the matrix. Rarely do the width and height ever vary (i.e. non-square pixels). Smaller dot pitches will provide a finer picture with more accurate/sharp fonts. AA (antialiasing) and scaling will work better with a smaller dot pitch. It will also make fonts appear smaller unless you compensate with the rather shaky Windows DPI settings (they screw up a lot of dialogs). A larger dot pitch will give you bigger and slightly less sharp fonts along with bigger images. Those with eye trouble are generally advised to use bigger dot pitch displays since the overall image is easier to see.

In terms of gaming and AA, though... you may need to use 4X AA on a low density display to achieve the same level of "smoothness" you see as using 2X AA on a high density display. Hope that makes sense. :)
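For anyone who does want to do the math instead of looking up the spec sheet, the two pitch figures above check out (a small Python sketch; it assumes square pixels and uses a 20.1 inch diagonal for the 1680x1050 panel, as mentioned earlier in the thread):

```python
import math

def pixel_pitch_mm(diagonal_in, width_px, height_px):
    """Dot pitch in mm: physical diagonal divided by the diagonal pixel count."""
    return diagonal_in * 25.4 / math.hypot(width_px, height_px)

print(round(pixel_pitch_mm(24, 1920, 1200), 2))    # 0.27 mm
print(round(pixel_pitch_mm(20.1, 1680, 1050), 3))  # 0.258 mm
```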
 

Nik

Lifer
Jun 5, 2006
16,101
3
56
Originally posted by: videogames101
Originally posted by: OCNewbie
Originally posted by: I Saw OJ
Can't the human eye only see something like 30fps?

"the" human eye might be limited to that, but my eye can see much higher than 30. I believe I can see at least 100fps. I'm sure a lot of other people will say the same.

Yeah, I bet you can see 100fps on your 60hz monitor.... :roll:

My LCD is 59-60hz and I can tell the difference between 30fps and 100fps. You don't just stand still in video games, numbnuts. Simply moving your mouse around will show the difference between the two.
 

OCNewbie

Diamond Member
Jul 18, 2000
7,596
25
81
I run a 19" CRT btw at a 112Hz refresh rate. I'm one of those slow LCD adopters. Once OLED comes out, or something that for all intents and purposes eliminates input lag and can match a CRT's refresh rate, then I will buy a flat panel/LCD-type screen. I will say that the best LCD I've tried had a 5ms response time; I'm unsure of its input lag. Maybe one of those really nice LG 227's (unsure of exact model #) would be sufficient for me.