
We don't need no 9800 PRO or XT!!!

Originally posted by: VIAN
Nice. I like this one. That's why cards become obsolete so fast. Halo should be considered a tomorrow's game because: A. It is DX9, B. It was recently released, and C. It takes cards like the XT and gives them less than 60 frames. How's that, a $500 card and it barely runs Halo. I'm just a little angry.

Barely run Halo? I don't think so. My Ti4600 can push around 36fps @1024x768 with NO slowdowns. A 9800XT will probably be up around 45-50. Plenty smooth IMO. Play the game before you complain about it sucking on any kind of hardware.

Date / Time: 10/2/2003 4:14:12 AM (3545518ms)
2200MHz, 512MB, 128M nVidia GeForce4 Ti4600 (DeviceID=0x0250) Driver=6.14.10.4403 Shader=1.3
C:\Program Files\Microsoft Games\Halo\halo.exe -vidmode 1024,768,85 -timedemo (Version=1.0.1.580)
Frames=4700
Total Time=127.82s
Average frame rate=36.77fps
Below 5fps= 7% (time) 0% (frames) (10.087s spent in 12 frames)
Below 10fps= 8% (time) 0% (frames)
Below 15fps= 8% (time) 0% (frames)
Below 20fps= 20% (time) 6% (frames)
Below 25fps= 33% (time) 13% (frames)
Below 30fps= 39% (time) 18% (frames)
Below 40fps= 53% (time) 31% (frames)
Below 50fps= 75% (time) 59% (frames)
Below 60fps= 92% (time) 83% (frames)
Memory used Max=161MB, Min=139MB, Ave=151MB
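The "Below N fps" rows in a timedemo log like the one above can be reproduced from a raw per-frame time log with a short script along these lines (a sketch; `frame_times_ms` is hypothetical sample data, not the benchmark's actual frame log):

```python
# Sketch: how a "Below N fps" breakdown like the timedemo's can be
# computed from raw per-frame render times (in milliseconds).

def below_fps_breakdown(frame_times_ms, thresholds=(5, 10, 15, 20, 25, 30, 40, 50, 60)):
    total_time = sum(frame_times_ms)
    total_frames = len(frame_times_ms)
    rows = {}
    for fps in thresholds:
        cutoff_ms = 1000.0 / fps  # frames slower than this fall below the threshold
        slow = [t for t in frame_times_ms if t > cutoff_ms]
        rows[fps] = (100.0 * sum(slow) / total_time,    # % of time
                     100.0 * len(slow) / total_frames)  # % of frames
    return rows

# Hypothetical frame log: 98 frames at ~25ms (40fps) plus two 100ms hitches.
times = [25.0] * 98 + [100.0, 100.0]
pct_time, pct_frames = below_fps_breakdown(times)[15]
print(f"Below 15fps: {pct_time:.0f}% (time) {pct_frames:.0f}% (frames)")
```

This is why the log's "time" and "frames" percentages diverge so sharply: a handful of long hitches eats a large share of the total time while accounting for almost none of the frames.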
 
My Ti4600 can push around 36fps @1024x768 with NO slowdowns. A 9800XT will probably be up around 45-50.

For single player that's not so bad; no way I would want to play multiplayer at that framerate, though. An average of 100-150FPS is usually pretty good for a multiplayer title (although Halo is excellent on both ends, and everybody is pretty screwed on the framerate issue right now, so it shouldn't be that big of a deal for the moment; a year from now, not too many people on this board would consider a 9800XT good for Halo).
 
Then again, not too many people care about Halo anymore either, and certainly won't a year from now. Played and beat it on Xbox. Moving on.
 
Halo ruled on Xbox, and you know what, it's gonna rule some more on the PC. Better graphics, updated levels, and a mouse and keyboard. Come on, you must not have liked Halo. I own Halo for Xbox and I'm gonna buy it again 'cause it rocks. As far as frame rates are concerned, 60 frames is my bag. I'm satisfied with 30 frames minimum, but come on, who wouldn't want 60fps? And the fact is that the XT is also a new card; I would expect any game out now to run at 60fps minimum. Like 70fps - 80fps so I can have some headroom.

And this is without AA & AF at 1024x768. I play at 1280x960, losing some frames there - knock that 50fps down to about 45fps or less, just a crappy estimate. And what about those psycho people that go all the way to 1600x1200x32 with AA & AF? It's gonna look like crap for them when they play Halo with the XT at 15-20fps. So how great is the card anyway? It's crap. To me, it's no better than the 9700 Pro.
 
Guess I'm just not a Halo whore like you are. Half-Life whore, yes, that's what I am. Xbox ports? No. If you're kicking the 9800XT down for Halo, be sure to kick the 5900 Ultra down for not running Tomb Raider or Half-Life 2 worth a damn.
 
I think it's a good thing that new cards come out before the next batch of big games (CoD, HL2, D3, Q4, Deus Ex: IW, Thief III, etc.) do. It'll lower the prices of previous cards.
 
Originally posted by: VIAN
Halo ruled on Xbox, and you know what, it's gonna rule some more on the PC. Better graphics, updated levels, and a mouse and keyboard. Come on, you must not have liked Halo. I own Halo for Xbox and I'm gonna buy it again 'cause it rocks. As far as frame rates are concerned, 60 frames is my bag. I'm satisfied with 30 frames minimum, but come on, who wouldn't want 60fps? And the fact is that the XT is also a new card; I would expect any game out now to run at 60fps minimum. Like 70fps - 80fps so I can have some headroom.

And this is without AA & AF at 1024x768. I play at 1280x960, losing some frames there - knock that 50fps down to about 45fps or less, just a crappy estimate. And what about those psycho people that go all the way to 1600x1200x32 with AA & AF? It's gonna look like crap for them when they play Halo with the XT at 15-20fps. So how great is the card anyway? It's crap. To me, it's no better than the 9700 Pro.

What's crap is that Halo is a console port that they seem not to have really optimised.

According to other posters, it looks worse than Unreal 2 and is slower as well, hence the problem is not the graphics card, but is most probably the game not being optimised for PC.

GTA3 is a similar case.
 
Originally posted by: BenSkywalker
My Ti4600 can push around 36fps @1024x768 with NO slowdowns. A 9800XT will probably be up around 45-50.

For single player that's not so bad; no way I would want to play multiplayer at that framerate, though. An average of 100-150FPS is usually pretty good for a multiplayer title (although Halo is excellent on both ends, and everybody is pretty screwed on the framerate issue right now, so it shouldn't be that big of a deal for the moment; a year from now, not too many people on this board would consider a 9800XT good for Halo).

100-150fps???? What's the purpose of that? First off, you can't even perceive anything higher than 100fps.... Second, most monitors don't have a refresh rate that high, so you'll get vertical tearing unless you use vsync.... Third.... Why the hell do you think you need 100-150fps? That's just ridiculous. I play games at 1280x1024 w/ 4xAA and 8xAF. And if my frames drop down into the thirties, it doesn't bother me. It never goes above 60fps, because I use vsync.

In conclusion, I believe you're an idiot. I mean.... 150fps? For what?
 
Nebor... do you play online FPS games? Something along the lines of Quake 3 or UT2k3 or CS or BF1942, etc.? 30 FPS in a game like those is unplayable... 30 FPS in a game like Flight Sim, which doesn't have fast-moving objects on the screen and where you're not panning at a high rate, is just fine. But if you think 30 frames per second in an intense online FPS is acceptable, either you have low standards, or... dare I say, you are the idiot.
 
A 150fps average usually means a higher minimum fps than a 100fps average does. So higher is always better, particularly in twitch-reflex games like online multiplayer FPSs.

In conclusion, don't ad-hom, because it usually reveals the attacker as the fool. (Especially one who appears to have his card vsync'ed at 60Hz. I hope, for your eyes' sakes, you're using an LCD.)
 
Originally posted by: Lonyo
Originally posted by: VIAN
Halo ruled on Xbox, and you know what, it's gonna rule some more on the PC. Better graphics, updated levels, and a mouse and keyboard. Come on, you must not have liked Halo. I own Halo for Xbox and I'm gonna buy it again 'cause it rocks. As far as frame rates are concerned, 60 frames is my bag. I'm satisfied with 30 frames minimum, but come on, who wouldn't want 60fps? And the fact is that the XT is also a new card; I would expect any game out now to run at 60fps minimum. Like 70fps - 80fps so I can have some headroom.

And this is without AA & AF at 1024x768. I play at 1280x960, losing some frames there - knock that 50fps down to about 45fps or less, just a crappy estimate. And what about those psycho people that go all the way to 1600x1200x32 with AA & AF? It's gonna look like crap for them when they play Halo with the XT at 15-20fps. So how great is the card anyway? It's crap. To me, it's no better than the 9700 Pro.

What's crap is that Halo is a console port that they seem not to have really optimised.

According to other posters, it looks worse than Unreal 2 and is slower as well, hence the problem is not the graphics card, but is most probably the game not being optimised for PC.

GTA3 is a similar case.

Actually, Halo was originally planned for the PC. Bungie needed a publisher, and M$ stepped in; they didn't just publish it, they bought the entire Bungie company, then ported it over to Xbox. Now Gearbox Software (I think it's them) has ported it BACK over to PC, coz M$ wanted to wait 2 years before releasing it for the PC.

complicated, I know :\
 
100-150fps???? What's the purpose of that?

Always have fun answering this one 🙂

First off, you can't even perceive anything higher than 100fps....

Wrong. The USAF has conducted studies in which pilots shown images of jets for 1/200th of a second could not only identify that they had seen a jet, but could accurately state what type it was.

Second, most monitors don't have a refresh rate that high, so you'll get vertical tearing unless you use vsync..

My monitor does at the setting I play multiplayer FPSs at, 135Hz actually. Vertical tearing is no big deal when I'm focusing on not getting my @ss fragged.

Third.... Why the hell do you think you need 100-150fps?

Because with an average of 150FPS your minimum framerate will likely be above 70FPS most of the time, and even that isn't ideal.

I play games at 1280x1024 w/ 4xAA and 8xAF. And if my frames drop down into the thirties, it doesn't bother me. It never goes above 60fps, because I use vsync.

Don't know what kind of games you are talking about, so that doesn't mean too much. Having WC3 drop into the 30s is no problem at all; same with CivIII, C&C Generals, SimCity4, MS FS 2K4, etc. Talk about Quake3, though, and it is a major issue.

In conclusion, I believe you're an idiot. I mean.... 150fps? For what?

Let's assume your false assumption that 100FPS is the max you can see. You know the frame your video card is drawing isn't the one you are seeing? Frames get drawn to a back buffer first, and then, if you are using double buffering, get 'flipped' and displayed on your monitor. The mouse input you see on the screen is what you were doing the frame prior, not at that exact point in time. The situation is amplified if you use triple buffering, as you are two frames behind. With an average framerate of 100FPS your minimum is likely to be about 47FPS, give or take; the latency from frame buffer operations effectively gives you 23.5FPS from input to execution time with double buffering (if you have perfect reflexes) and a whopping 15.67FPS when using triple buffering. In layman's terms, if 100FPS were actually the maximum you could see, you would need a 200FPS minimum to eliminate perceptible input latency running double buffered and a 300FPS minimum running triple buffered. That means around 460FPS or so average. Check out the settings the pros run some time; 100FPS average is far too slow for any really competitive FPS gamer.

In the example you gave for how your rig runs, you are looking at an effective input rate of 15FPS running in double buffered mode and 10FPS in triple buffered. If you can't feel how slow that is, all the power to you. I certainly wouldn't be questioning the capacity of other people who state their desires for framerate if I were you 🙂
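The arithmetic in that post can be sketched in a few lines. To be clear, the 47% minimum-to-average ratio and the one-frame-of-latency-per-buffer model are the post's own assumptions, not measured constants:

```python
# Sketch of the post's rule-of-thumb arithmetic, not measured data:
# assume the minimum framerate is roughly 47% of the average, and that
# each buffered frame adds one full frame of input-to-display latency.

MIN_TO_AVG_PERCENT = 47  # the post's assumption, not a universal constant

def effective_input_fps(average_fps, buffered_frames):
    """Worst-case input-to-display rate: the estimated minimum framerate
    divided by the frames of latency the buffering scheme introduces."""
    minimum_fps = average_fps * MIN_TO_AVG_PERCENT / 100
    return minimum_fps / buffered_frames

# 100fps average, double buffered (2 frames of latency):
print(effective_input_fps(100, 2))            # 23.5
# 100fps average, triple buffered (3 frames of latency):
print(round(effective_input_fps(100, 3), 2))  # 15.67
```

Under this model the effective input rate scales linearly with the average framerate, which is how the post arrives at its 200-300FPS minimum (and ~460FPS average) targets.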
 
Originally posted by: VIAN
I don't know why all the fuss; the XT is only about 10 frames faster than the 9700 Pro. It's not worth the money to me. I think that anybody with a 9700 Pro is a fool to think about getting either of those; just wait for next gen. They don't offer that much performance. Next-gen cards are guaranteed to offer at least a 25% performance increase.

I went from an 8500 LE to a 9800 pro. I needed it.
 
Higher FPS is always useful, because if you have some to spare, you can take that chunk and convert it to higher image quality.
 
I don't even think the XT or the NV38 are "10" frames faster on average than their counterparts. They may just have higher peak points during moments of little action at medium quality settings. Nothing worth the extra 50-100 dollars, especially when you can just overclock the 9800 Pro to just about the same speeds.
 
Originally posted by: Phantron
Ben, I hate to say this, but you really need to get a life, dude.


and some better information. fps doesn't have much to do with being shown an image for extremely short periods of time...it's a bad example. the brain identifying an image that is basically still (you can't interpret motion from one flashed frame) is totally different from motion. the receptors in the eye only react at a certain speed...they react and send the image to the brain...they also don't "turn off" immediately, and keep sending information to the brain for longer than the light was actually hitting them, giving the brain more time to process the information. this leads to motion blur.

you can't have a discussion about framerate without discussing motion blur. movies are 24fps, tv is 30fps interlaced (60 fields per second...each frame is split into two interlaced fields, so the screen refreshes 60 times per sec)

movie cameras have shutters on them. they are open for a period of time, and moving images during that time are blurred. if you pause a dvd you will see this. if stopped at a specific frame during a scene with lots of motion, you'll see blurred images because of the shutter on the camera. computers do not have motion blur, and therefore need a higher framerate to produce smooth motion.

regardless of this, you simply cannot visually tell the difference over a certain limit, which, even assuming genetic differences between people's eyes, etc., is much lower than 100fps. it's more like 60-75. so visually, there is no benefit from 100+fps

(i'm not saying high fps isn't good...i know the reasons for it go beyond just what you're seeing on the screen, average framerates, games responding better...timing issues, etc i'm just saying that visually, there is no benefit.)
 
Ben, I hate to say this, but you really need to get a life, dude.
Nice convincing point.
and some better information. fps doesn't have much to do with being shown an image for extremely short periods of time...it's a bad example
And what have you offered as proof? The fact is, in the many discussions I've had about motion and the upper limits of human perception, it has never been determined. Using the minimum framerates used for film display says more about economics than the science of perception. It's been proven through the use of subliminal messages that the brain has a tremendous amount of processing power that is not easily measured. For example, I swear I can see the difference in the rotation of the fan in my old RX7 between idling at 1000 RPM and ramped up to 9000 RPM+...why is that? What is the highest framerate that a person can perceive? How do you prove it exactly?
 
Edit: Oops, I quit reading after you started the tired old movie comparison.

And to the jackass that told BenSkywalker he needs to get a life, you need to shut your cake hole pal. Ben knows more about graphics than you ever will.
 
so you thinking you see your fan turning faster is proof?
of course 24fps has to do with economics...it being the cheapest acceptable framerate. if the movie industry could get away with 23 or 20, of course they'd use it.

did you ignore what i said about motion blur? look it up on google. your brain blurs images together to produce motion...haven't you ever used a flip book?

a computer doesn't have motion blur, so the framerate needs to be higher, but there is still a limit.
i'll try to find some studies to find the truth...i don't care about right and wrong...
 
It's interesting though that you're trying to tell someone that you know better than they do what they perceive. If somebody can tell the difference between 60 and 120 fps, who are you to tell them they can't?
 
Maskirovka

Do you have your display set to 60 Hz? If not, perhaps you should practice what you preach.

That's why I refuse to use LCDs. The slow refresh drives me nuts.
 
i was referring to the fact that his perception is not proof...that's all. i didn't tell him he was wrong.

the more frames your computer produces per second, the better the on screen motion will be blurred (by your brain). so objects moving faster will appear to have more realistic movement. that's the reason high fps looks better. it doesn't change the fact that your eye/brain processes information based on motion, not still images. i'm just saying the pilot image flashing thing is a bad example.

yes it proves your brain has amazing processing power...and it proves that you can react to something very quickly (especially pilots who have been trained to react on instinct) but it doesn't mean the eye can tell the difference between rock solid 100fps and 1000fps

that's what i was saying...that i'm pretty sure there's a perceptual limit, and his example was a bad one. don't fkn flame me because he knows a lot about computer graphics...settle down, chief. you can prove people wrong without insults...christ.
 
Originally posted by: BoberFett
Maskirovka

Do you have your display set to 60 Hz? If not, perhaps you should practice what you preach.

That's why I refuse to use LCDs. The slow refresh drives me nuts.

when did i say 60fps was the limit of human perception? 60Hz doesn't bother some people...but very few people aren't satisfied with 75-85Hz...i haven't met anyone who can tell the difference between 85 and 100Hz, and if you can, fine...of course each person's visual perception is different. but there's a point where no human can really tell the difference.

edit: visual difference...i know there are gameplay advantages to having higher fps
 