OMG X800 only 30+ fps

cnhoff

Senior member
Feb 6, 2001
724
0
0
LMAO at all the dumb***es whining about whether they should sell their precious X800s and get a GT :D

It can never be enough, I guess... :frown:

I mean, we are talking about 30+ fps at 1600x1200 with 4*AA and 8*AF, are you guys crazy???

Other games will see the ATI cards ahead, so who cares about those frames in Doom 3 at these insane settings?
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
Let me know when you pay $400+ for a card only to have it crap out two months after you get it. People paying that much expect a lot; when I pay $400 for a GPU, I expect rock-solid 1600x1200 gaming for at least a year, preferably with some AF.
 

cnhoff

Senior member
Feb 6, 2001
724
0
0
Those are the same guys that upgraded their video card from a 9800XT to an X800 right after it came out... Deliberately "burning" that much cash is their decision to make, and although I am (or was) a hardware enthusiast myself and can understand why someone does this, swapping an X800 for a 6800 GT just because it performs "less excellently" in Doom 3 goes beyond me...

From what I have read, the 6800 and X800 are pretty much on par, so why shouldn't it be the other way around in the next blockbuster game?!
 

Slimline

Golden Member
Jul 19, 2004
1,365
2
81
From what I hear, the X800 will walk all over the GT in Half-Life 2... so it all depends on the game. And isn't Doom 3 OpenGL? ATI has never been THAT successful with OpenGL... Direct3D is their strong suit.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: GeneralGrievous
when I pay $400 for a GPU, I expect rock-solid 1600x1200 gaming for at least a year, preferably with some AF.

Yeah, that's reaching a little high, since newer releases within that time frame may bring your setup to its knees and make it crawl... I can see 1280x1024 being a little more realistic... but that's my opinion.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
"From what I hear, the x800 will walk all over the GT in half life 2..."


I thought this had been disproved already? Unless you're listening to Gabe Newell.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: GeneralGrievous
Looks like the 6800 is doing fine.
Do you really want to pay so much money for a card and then run it without AA/AF or at lower resolutions?

There are some people who do prefer 10x7 over anything higher, man... just because 16x12 is there doesn't mean it HAS to be used... besides, it's his money; he can do with it what he pleases.
 

cnhoff

Senior member
Feb 6, 2001
724
0
0
Actually, I just wanted to post a reply in the Doom 3 benchmark discussion thread, but somehow I pushed the wrong button. Starting a new thread for this is a bit lame, I know... well, s**t happens. :roll:

I was in India for three months and saw the new forum design for the first time yesterday.

Anyway, I have been reading every little scrap of Doom 3 coverage I could get my hands on since I got back, and when I read that it was, IMHO, very playable at 1600x1200 with 4*AA and 8*AF on ANY CURRENT SYSTEM, I was just amazed!!!

Then reading about that guy asking "Should I sell my X800 Pro to some poor soul and get a 6800 GT?" was just too much for me :D

I'll be the POOR SOUL anytime...
 

FluxCap

Golden Member
Aug 19, 2002
1,207
0
0
I wish we really knew how close or far apart the cards are in Half-Life 2. I have a feeling Nvidia will be so close that it won't matter which card you play on. Currently, you can't say the same for Doom 3.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: GeneralGrievous
Looks like the 6800 is doing fine.
Do you really want to pay so much money for a card and then run it without AA/AF or at lower resolutions?

Noticed you didn't comment on the Xbit graphs there. So what "collusion" do you think Nvidia pulled off to score better in the HL2 beta, according to Xbit? Did Nvidia pay them off, do you think? Were the latest Nvidia drivers used while really old Cats were used on the ATI parts? Hmm. How about optimizations that sacrifice image quality? That's always a good one.

I am not trying to be smug. It's just that someone slammed what you said about ATI's 10% lead in the HL2 beta by providing an actual benchmark showing ATI behind by a few percent. Current GT owners should be pleased that they can run "the beta at least" at 16x12 and still get near 50 fps with either ATI or NV.

Comment?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: aggressor
God forbid we listen to the guy who made the game.....

According to Digit-Life, the XT is 10% ahead of the Ultra in the HL2 beta @ 1600x1200. Adding AA/AF will likely increase that lead.

All I've seen is:

http://www.ixbt-labs.com/articles2/digest3d/0604/itogi-video-hl2-wxp.html

Looks like the 6800 is doing fine.

I really didn't want to add to this thread, given the incendiary nature of the first post, but I had to comment on that linked page. I've seen that link bandied about in several threads here, and I just have to say that the numbers on that page don't add up. At all.

A Radeon 9200 (non-SE, 128-bit memory) is nearly the same speed as a much higher-clocked 9800SE (still 128-bit memory), and actually faster than a 9600 (again, 128-bit memory, but the 9600 is clocked higher).

What is up with that? I have a 9200, and while I would love to say that the card is as fast as a 9600 or a 9800SE, believe me, I know that it's not. Those numbers are so wrong it's not even funny.

Actually, looking much more carefully, it appears that the 9200 is the only non-DX9-capable card in that entire list. So it actually could be competitive, frame-rate wise, against some of the other normally-faster cards, if it has a much lighter workload to render.

That calls into question the competence of the tester, though - did they honestly think that the 9200 is a DX9-class card rather than DX8.1-class?

Considering the numbers shown, and XBitLabs' reputation for "accuracy", I would definitely take that page with more than the average-sized grain of salt and consider it rather worthless for "proving" anything, IMHO.

Also, their webmaster is a bit incompetent; there is no link from that page back to the article it was part of.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: VirtualLarry
Originally posted by: aggressor
God forbid we listen to the guy who made the game.....

According to Digit-Life, the XT is 10% ahead of the Ultra in the HL2 beta @ 1600x1200. Adding AA/AF will likely increase that lead.

All I've seen is:

http://www.ixbt-labs.com/articles2/digest3d/0604/itogi-video-hl2-wxp.html

Looks like the 6800 is doing fine.

I really didn't want to add to this thread, given the incendiary nature of the first post, but I had to comment on that linked page. I've seen that link bandied about in several threads here, and I just have to say that the numbers on that page don't add up. At all.

A Radeon 9200 (non-SE, 128-bit memory) is nearly the same speed as a much higher-clocked 9800SE (still 128-bit memory), and actually faster than a 9600 (again, 128-bit memory, but the 9600 is clocked higher).

What is up with that? I have a 9200, and while I would love to say that the card is as fast as a 9600 or a 9800SE, believe me, I know that it's not. Those numbers are so wrong it's not even funny.

Actually, looking much more carefully, it appears that the 9200 is the only non-DX9-capable card in that entire list. So it actually could be competitive, frame-rate wise, against some of the other normally-faster cards, if it has a much lighter workload to render.

That calls into question the competence of the tester, though - did they honestly think that the 9200 is a DX9-class card rather than DX8.1-class?

Considering the numbers shown, and XBitLabs' reputation for "accuracy", I would definitely take that page with more than the average-sized grain of salt and consider it rather worthless for "proving" anything, IMHO.

Also, their webmaster is a bit incompetent; there is no link from that page back to the article it was part of.

Could it be they just threw it in there so 9200 owners can see what they are in for? Suddenly, because a 9200 non-DX9 card was thrown into the mix, the reviewer's competence is called into question by you? I think you're looking for a reason, any reason at all, to discredit this review. Wow.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: VisableAssassin
Originally posted by: GeneralGrievous
Looks like the 6800 is doing fine.
Do you really want to pay so much money for a card and then run it without AA/AF or at lower resolutions?

There are some people who do prefer 10x7 over anything higher, man... just because 16x12 is there doesn't mean it HAS to be used... besides, it's his money; he can do with it what he pleases.

And if that is the case, the whole thread is pointless, because the X800Pro seemed to do OK at 10x7 with AA/AF, IIRC.
But if not, one card is good for one game, one for the other, purportedly.

In any case, we should wait until we have numbers from both games from more than one source.
 

Fuchs

Member
Apr 13, 2004
160
0
0
So, to recap: both cards have their pros and cons, and anyone can have whatever opinion they want. :p
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Doom 3 will be playable at 30 fps, so I don't really see what the issue is. Sure, Nvidia cards are stellar performers in Doom 3, but it's not like the X800s are going to be utter crap when the game is said to be playable on a 64MB GeForce 3.
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
So what "collusion" do you think Nvidia pulled off to score better in the HL2 beta, according to Xbit?
A better score at 1024x768 is meaningless to me. Here's your 10%. It's not like the Ultra Extreme is relevant when you can't buy one anywhere without signing up for a raffle at evga.com.

Current GT owners should be pleased that they can run "the beta at least" at 16x12 and still get near 50 fps with either ATI or NV.
I am. Forgive me for waiting for AA/AF benches before getting all worked up and excited.

There are some people who do prefer 10x7 over anything higher, man... just because 16x12 is there doesn't mean it HAS to be used... besides, it's his money; he can do with it what he pleases.
Agreed, but you really don't need such a fancy graphics card for running at 10x7, do you? I'd say the majority of people spending this sort of money are playing at 1600x1200, or at 1280x1024 if their monitor doesn't support the former. I can do 1024x768 on a $100 used 9700. It's certainly up to the end user, of course.
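For some rough perspective, going by raw pixel counts alone: 1600 x 1200 = 1,920,000 pixels per frame versus 1024 x 768 = 786,432, so 16x12 pushes about 2.4x the pixels of 10x7 before AA/AF even enter the picture.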
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: GeneralGrievous
So what "collusion" do you think Nvidia pulled off to score better in the HL2 beta, according to Xbit?
A better score at 1024x768 is meaningless to me. Here's your 10%. It's not like the Ultra Extreme is relevant when you can't buy one anywhere without signing up for a raffle at evga.com.

Current GT owners should be pleased that they can run "the beta at least" at 16x12 and still get near 50 fps with either ATI or NV.
I am. Forgive me for waiting for AA/AF benches before getting all worked up and excited.

There are some people who do prefer 10x7 over anything higher, man... just because 16x12 is there doesn't mean it HAS to be used... besides, it's his money; he can do with it what he pleases.
Agreed, but you really don't need such a fancy graphics card for running at 10x7, do you? I'd say the majority of people spending this sort of money are playing at 1600x1200, or at 1280x1024 if their monitor doesn't support the former. I can do 1024x768 on a $100 used 9700. It's certainly up to the end user, of course.

Whether you can buy the card or not is irrelevant, as I hear the X800XTs are having clocking problems in production and it's slim pickings for them as well. And since you only choose to acknowledge the Ultra, then yes, the XT at 16x12 is a whole 5 fps faster, 50 vs. 55, putting it 10% ahead of Nvidia's second-fastest card. Not the fastest.
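(For the record, using those figures: (55 - 50) / 50 = 0.10, so the XT's 55 fps is exactly 10% above the Ultra's 50 fps.)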
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
From what I hear, the X800 will walk all over the GT in Half-Life 2... so it all depends on the game. And isn't Doom 3 OpenGL? ATI has never been THAT successful with OpenGL... Direct3D is their strong suit.

Doubt it, to be honest. The HL2 beta being used by Digit-Life is probably the worst-case scenario for the 6800. That build hasn't been optimized worth a damn, and whatever optimization it does have was aimed at the 5900.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
This isn't surprising in the slightest - we've known all along that Nvidia is much better at OpenGL than ATI, and that in an apples-to-apples OpenGL comparison of cards that perform similarly in DirectX, the ATI card will lose.

This is right in line with most other OpenGL comparisons: Nvidia in the lead by 15-20% with their 6800U vs. the X800XT.

With that said, 1600x1200 with AA and AF is pretty taxing; minimum framerates are probably doggish on everything, and IMO AA is not necessary at this resolution anyway.

However, 1600x1200 with AF should look beautiful, and Nvidia still leads comfortably.

This is the predicted coup for Nvidia; now we get to wait for HL2, which so far looks to run about 10% faster on the X800XT than on the 6800U.

There's nothing really more to say - Nvidia has caught up with ATI (and surpassed them in features with SM3) this generation, and their OpenGL is the same as always - solid.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,090
32,626
146
I'll rock the boat and agree with the OP. Why get bummed that your R420-based card isn't as fast as the NV40 ones in D3 if you still get playable frame rates at high res and settings? We have all speculated for at least the last 24 months that NV = D3 and ATI = HL2 for the best playing experience. However, none of us imagined this game would be so well coded that it could be made to play acceptably on even the low-end systems of that time period!

Carmack is an F'in artist! And we all get to enjoy his masterpiece in our own homes :beer: Now stop whining, or gloating as the case may be, and accept the fact that no one loses out with this game ;)

To conclude, there are only varying degrees of stoke to be obtained in the D3 experience: it's all good.</logical conclusion>