
Woot! 5800 Ultra lands in box, games cower....

Why do you feel Rollo has the right to define what is an acceptable resolution but I can't? Or more specifically, why do you feel it's acceptable for him to pick the resolution that his favourite GPU fetish just happens to match the performance of superior cards at?

again, this sounds more like a personal vendetta or something.. i mean, really.. what diff does it make? it's not like he's making the entire world conform to play at "his" resolution. why do you feel it's "unacceptable" for him to compare stuff at the resolution he plays at?

the generally accepted "acceptable resolution" is 1024x768 anyways. it's the most commonly used (by far, tho i think 1280 will replace it soon) and the testing standard.

Says who? You? Again why are you allowed to make resolution calls but I'm not?

i'm not making a call. but the observation is pretty simple... there is a huge disparity in visual quality between 320 and 1024.. @320 the pixelation is so extreme many objects are unrecognizable. no character details are visible whatsoever, other than general shape and color. the difference between 1024 vs 1600 is much less extreme. i'd post examples, but the problem is comparing the 320 shot to a 1024 shot would be like comparing a thumbnail to a screenshot.

Quite honestly I don't really care.

if that's true, then quit posting; in saying you "don't care", yet taking time to post on the subject, your actions contradict your words.
 
Originally posted by: Pete
BFG, not every monitor can game at 16x12 at 85+Hz. My 19" only does a semi-uncomfortable 70Hz,...
And mine only does 60Hz at 16x12. That's not an option for me. The highest res. with a decent refresh rate on my monitor is 1280x1024 and I can get 75Hz there but the picture is grainy.

1152x864 is my preferred resolution now.
 
Originally posted by: VIAN
Because no cards are powerful enough to run UT2004 at 16X12
Anything from the 5900 and the 9700 Pro up can run UT 2004 at 1600x1200. You don't even need AA at that point, so why bother. Not many monitors look nice at that resolution, however. 22" monitors harbor that resolution as optimal, but on any smaller monitor you're not getting the best picture quality.

I play at 10x7 too, 2xaa, 4xaf.

I do have a 22" monitor that can run 16X12X32 at 85Hz, so I gave it a try just to make the oddly silent BFG happy. I found that UT2004 still ran pretty well at that resolution (50s and 60s) with no AA/AF, but putting on the Quincunx AA and 4 X AF dropped the performance to the 40s-50s, but it would drop into the 30s when I was fighting the bots. So I found pretty much the same thing those "unknown" guys over at Firing Squad found: that UT2004 is pretty slow at 16X12 with some AA/AF going on.

I didn't play at those settings with my 9800 Pro either, because I like to win more than make my card stutter in big battles.

To the guys who say the 5800 is noisy:
It is totally silent in 2d; the fan doesn't run.
In 3d it's definitely noticeable; it sounds like a case tricked out for OCing with a big fan or two. With the audio of the game going it's not that noticeable.
 
Originally posted by: CaiNaM
Why do you feel Rollo has the right to define what is an acceptable resolution but I can't? Or more specifically, why do you feel it's acceptable for him to pick the resolution that his favourite GPU fetish just happens to match the performance of superior cards at?

again, this sounds more like a personal vendetta or something.. i mean, really.. what diff does it make? it's not like he's making the entire world conform to play at "his" resolution. why do you feel it's "unacceptable" for him to compare stuff at the resolution he plays at?

the generally accepted "acceptable resolution" is 1024x768 anyways. it's the most commonly used (by far, tho i think 1280 will replace it soon) and the testing standard.

Says who? You? Again why are you allowed to make resolution calls but I'm not?

i'm not making a call. but the observation is pretty simple... there is a huge disparity in visual quality between 320 and 1024.. @320 the pixelation is so extreme many objects are unrecognizable. no character details are visible whatsoever, other than general shape and color. the difference between 1024 vs 1600 is much less extreme. i'd post examples, but the problem is comparing the 320 shot to a 1024 shot would be like comparing a thumbnail to a screenshot.

I think his point is not that you shouldn't game at 10x7, but that if you're going to compare GPU performance, 10x7 with no AA and no AF aren't exactly the best settings to do so with.
 
I think his point is not that you shouldn't game at 10x7, but that if you're going to compare GPU performance, 10x7 with no AA and no AF aren't exactly the best settings to do so with.
That can't be his point because I say right there in the first post:

BTW- for those who nominated it for "Worst Card Ever" in Videoclone's thread, I can tell you that when you can't tell the difference between the 5800 U and a 9800 Pro at UT2004, and you fire it up with no problems, head online and start winning games at either 10X7 4X8X or 10X7 Quin/4X, you're probably not playing with the worst card ever.

10X7 4X8X is a perfectly valid setting to test video card performance at UT2004. I have surmised this by seeing a wide variation in the performance in botmatch at 10X7 4X8X.

High end cards at 10X7 4X8X

Uh oh Midrange cards on a faster cpu much slower at 10X7 4X8X

So you see, the video card you have does make some difference at UT2004 10X7, 4X8X.


 
I think his point is not that you shouldn't game at 10x7, but that if you're going to compare GPU performance, 10x7 with no AA and no AF aren't exactly the best settings to do so with.

i understand the concept, however AA and AF are being used, at least in the context of this discussion. as far as "comparing gpu performance using 10x7", what's wrong with comparing at the settings that are used for playing? perhaps 1280 would be a bit more appropriate, but the point isn't that this card is better than that card (the 9800pro is obviously more powerful; no one is arguing that here), rather that the 5800u has perfectly acceptable performance at the resolutions the game is most often played at. i don't understand all the fuss over this simple, obvious fact.
 
Rollo 03/14/2004 2:16 PM
"Look at that crap. A Radeon 9800XT losing 58% of its performance in PS2, at a crap setting like 10X7, no AA/AF. Pathetic"

LOL, At least you're consistent.
 
Originally posted by: rbV5
Rollo 03/14/2004 2:16 PM
"Look at that crap. A Radeon 9800XT losing 58% of its performance in PS2, at a crap setting like 10X7, no AA/AF. Pathetic"

LOL, At least you're consistent.


How is that inconsistent?!

I do think 10X7, no AA/AF is a crap setting for a 9800XT. I think the 10X7 4X8X I was playing at, and posted about, is a pretty reasonable setting for all cards in the 9700/9800, 5800/5900 level to play at.

The point I was making in the quote you tried to use out of context is that BFG never would accept 10X7 no AA/AF on any other game, but he tried to use it as evidence of the ATI PS2 "superiority".
 
LOL and out trot the ATI fans.
ATi fans? I'm not the one producing a soap opera around the 5800. What's your next 5800 thread going to be called, "my 5800's fan spun up for the first time today?"

Need even more proof BFG is an ATI sales rep?
If I'm an ATi sales rep then you must be a 5800 sales rep.

Q: Why don't I run UT 2004 at 16X12, ATI fans?
A: Because no cards are powerful enough to run UT2004 at 16X12.
So there are no resolutions between 1600 x 1200 and 1024 x 768?

The stupidity of your logic is increasing every day Rollo.

LOL you have no concept of reality anymore BFG.
LMAO. Pot-kettle-black. Again Rollo, what happened to 1152 x 864 and 1280 x 960? Did those resolutions slip away from your reality?

Or wait, don't tell me: there's no difference between them and 1024 x 768 so that means they don't count, right?
:rolleyes:


Not only that but your own links show that your previous card - the 9800 Pro - is beating the 5800 Ultra at the very same settings you use. So after coming back here to proclaim equality when the benchmarks clearly show otherwise, you start accusing me of losing touch with reality?

Poor BFG. More fan nonsense.
Another thing of interest: first your 9700 Pro was too slow at 1024 x 768 with 0xAA and 0xAF. Then you picked up your first 5800 Ultra and your preferred setting mysteriously increased to 4xAA and 4xAF. Then after complaining about your 9800 Pro being too slow you again came back to a 5800 Ultra and your setting has yet again increased to 4xAA and 8xAF.

So after complaining about a lack of speed on both faster ATi cards, you have twice picked up even slower nVidia cards and then proceeded to increase the detail settings on them.

Is that what you'd call rational and non-zealot behaviour Rollo? Is that what you class as having a grasp on reality?

You are nothing more than a troll and I don't buy your "I'm only doing this because I like trying different cards" argument. Not for one second. You can't even produce one consistent or valid argument to explain your behaviour in this whole fiasco.

10X7 4X8X is a perfectly valid setting to test video card performance at UT2004
Except when we did this the first time you claimed the 9700 Pro was too slow with 0xAA and 0xAF. Then each time you picked up an even slower nVidia card and you changed your tune to make it fit into your warped logic.

I do think 10X7, no AA/AF is a crap setting for a 9800XT.
Yet it was perfectly acceptable when you were trying to prove that a 5800 is equal to a 9700 Pro, right?

The point I was making in the quote you tried to use out of context is that BFG never would accept 10X7 no AA/AF on any other game, but he tried to use it as evidence of the ATI PS2 "superiority".
ATi's PS2.0 superiority doesn't exist in your reality, much like the missing resolutions between 1600 x 1200 and 1024 x 768 don't.
 
why do you feel it's "unacceptable" for him to compare stuff at the resolution he plays at?
Because he's trolling, that's why. He's not content to simply play at those settings; instead he feels the need to start a new thread every week and proclaim equality using ridiculous and invalid premises.

If I started posting "a GF3 is equal to a 5800 because 640 x 480 is an acceptable resolution" would you feel that's a valid post to make? Because that's exactly the logic that Rollo uses - use CPU limited resolutions to make ridiculous claims to try to justify his purchases.

the generally accepted "acceptable resolution" is 1024x768,
It's also generally accepted that the 5800 is a failure and that nobody would want to trade it for a 9700/9800. So where does that leave us then? Do we use public opinion or not?

there is a huge disparity in visual quality between 320 and 1024..
(1) Just like there's a huge disparity between 1600 x 1200 and 1280 x 960.
(2) That disparity doesn't make the 5800 equal to the 9800.

If that's true, then quit posting; in saying you "don't care", yet taking time to post on the subject, your actions contradict your words.
No I mean I don't care about public opinion and I instead work with facts.

If you care about public opinion then you'd have to conclude that Rollo's behaviour is both illogical and abnormal. So which is it? Are we using public opinion or not?
 
i really don't see where he stated anywhere the 5800 is supreme.. just that it runs current games at acceptable framerates, and that it's not the "piece of shite" many like to claim it is. he's stated many times the 9800pro he gave up was the better card; that at higher resolutions and higher aa/af settings it's much faster, and so on...

it just baffles me why this has to turn into such a pissing match. basically his whole premise is that it's not as fast, but it's kinda cool and it plays today's games just fine at reasonable settings. what's wrong with that? you guys make it out to be a much bigger thing than it is. he's hardly attacking your beloved ati.. tho he's been put on the 'defensive' by others. i just don't see why some in the ati camp have to take offense at that...
 
i really don't see where he stated anywhere the 5800 is supreme
He didn't - he said it was equal using his CPU limited settings as evidence.

basically his whole premise is that it's not as fast, but it's kinda cool and it plays today's games just fine at reasonable settings.
That isn't what he's saying. Look at his first post - that sets the stage for what has been his attitude in the entire saga.

Basically he states something ridiculous and then tries to argue with everyone who points out that his argument is illogical and invalid.
 
Originally posted by: BFG10K
i really don't see where he stated anywhere the 5800 is supreme
He didn't - he said it was equal using his CPU limited settings as evidence.

basically his whole premise is that it's not as fast, but it's kinda cool and it plays today's games just fine at reasonable settings.
That isn't what he's saying. Look at his first post - that sets the stage for what has been his attitude in the entire saga.

How is "Can't tell the difference" saying that his card is better?

He even admitted that it clearly ran better on FC.

I surmise that you only like to shoot down anything positive about an nVidia card, BFG.
 
How the mighty have fallen. 🙁

I used to respect your knowledge, BFG, now you're just a parody. You've gone completely fan boy.

Where to begin?

ATi fans? I'm not the one producing a soap opera around the 5800. What's your next 5800 thread going to be called, "my 5800's fan spun up for the first time today?"
Pardon me for living BFG. I was happy to finally get a 5800 Ultra, which I've wanted to try for a year.

So there are no resolutions between 1600 x 1200 and 1024 x 768?
The stupidity of your logic is increasing every day Rollo.
The question was asked why don't I play at 1600X1200. I responded to it. With your "logic", should I have told him why I don't play at 12X10?
:rolleyes:


Again Rollo, what happened to 1152 x 864 and 1280 x 960? Did those resolutions slip away from your reality?
Again, no one asked about them, and even if they did, I was responding to a post I quoted about 16X12.

Or wait, don't tell me: there's no difference between them and 1024 x 768 so that means they don't count, right?
I never said that, why are you implying I did?

Not only that but your own links show that your previous card - the 9800 Pro - is beating the 5800 Ultra at the very same settings you use. So after coming back here to proclaim equality when the benchmarks clearly show otherwise, you start accusing me of loosing touch with reality?


I am accusing you BFG. You are right that the links I posted show the 9800 Pro beating the 5800 Ultra at the resolutions I play at. You'll note, however, that I said in my original post I can't tell the difference, not that my 5800 Ultra beats the 9800 Pro.
Could you tell the difference BFG? In my link, at my settings, the 9800 Pro is running at an average 78.2 fps and the 5800 Ultra is at 71.1 fps. (your 9700 Pro trails at 69.6, BTW) Too close to call with no counter on, ol' buddy.

Another thing of interest :- first your 9700 Pro was too slow at 1024 x 768 with 0xAA and 0xAF. Then you picked up your first 5800 Ultra and your preferred setting mysteriously increased to 4xAA and 4xAF. Then after complaining about your 9800 Pro being too slow you again came back to a 5800 Ultra and your setting has yet again increased to 4xAA and 8xAF.
You've said this a lot of times, and I've never understood why. A. I said my 9700 Pro was too slow to run at the settings you wanted me to run it at (inexplicably; you'll note I've never told you what settings to use), and at the time I was playing 10X7 0X0X. I had a 19" monitor then and an XP1600+, followed by a P4 2.53. I only played online, and wanted the fastest framerates possible. B. I only ran the 5800 (OCd to Ultra speed) with the AA/AF settings to prove to you it could do it about as fast as your 9700 Pro, which you doubted for unknown reasons.

Because he's trolling that's why.
How am I "trolling" BFG? I paid $212 shipped for a 5800 that turned out to be DOA and got stiffed by the college kid who sold it to me. I traded my excellent 9800 Pro that I paid $379 for to get a 5800U, a rare card I've wanted for over a year. You don't think that's worth a post? Who made you the post police? I didn't post any flame bait, I only posted that I like my new card and that for the way I use it, it's been about as good.

Because that's exactly the logic that Rollo uses - use CPU limited resolutions to make ridiculous claims to try to justify his purchases.
Of course, you still haven't bothered to prove that 10X7X32 4X8X is a cpu limited situation by showing that some cards below the 5800/9700 level perform the same at that setting; you just keep saying it is like a tape recorder. I say you don't prove it because you can't.

It's also generally accepted that the 5800 is a failure and that nobody would want to trade it for a 9700/9800. So where does that leave us then
The 5800 has a following. Like I said, are there any other cards with a 500MHz GPU? No. Any others with 1.8ns DDR2? No. Any others with that elaborate heatpipe system that brings cool air in and blows heated air out? No.
Beyond that, why am I "nobody"? I went to college a few times, work at a big company in IS, and try to be a good husband/father/neighbor in my suburb. I earned the money for the cards, so why do you care if I buy them? Are you the board police on who should buy what cards now too?

That disparity doesn't make the 5800 equal to the 9800.
No, but at 10X7, 4X8X, the 5800 Ultra is the functional equivalent of a 9800 Pro at UT2003. No card below the 9700/9800/5900 level can make that claim, so it's in the top few cards.

If you care about public opinion then you'd have to conclude that Rollo's behaviour is both illogical and abnormal. So which is it?
You're pretty good at ignoring whatever contradicts you. There are people in this thread who understand my trade.
More importantly, what is this vendetta you have against me and my cards? Does it anger you that much that I'd trade a better card than you have for one you consider not as good? Why?
It's a free country BFG. I'm starting to think you're the video card Taliban.

 
Some games "cower" with a GF2, so what? The 5800 Ultra isn't nearly as good as other cards in its price/performance category.

Try enabling AA/AF and see the 5800 Ultra cower to games. But Rollo thinks ATi's AA isn't better. Even nVidiots realize that it is..
 
Originally posted by: Ackmed
Some games "cower" with a GF2, so what? The 5800 Ultra isn't nearly as good as other cards in its price/performance category.

Try enabling AA/AF and see the 5800 Ultra cower to games. But Rollo thinks ATi's AA isn't better. Even nVidiots realize that it is..


You mean like in the links I posted where it's running 4X AA 8X AF as fast as a 9700 Pro, the card it was meant to compete against, Ackmed?
:rolleyes:


LOL some people can't seem to breathe if someone is apparently questioning whether or not their stuff was the best, even when the guy doing the questioning has already admitted their stuff has a slight edge.
 
Pardon me for living BFG. I was happy to finally get a 5800 Ultra, which I've wanted to try for a year.
I see. So when you praise a card you're "living" but when I praise it "I'm an ATi fan that has lost your respect"?

Can you at least make an effort to post something legitimate for a change?

The question was asked why don't I play at 1600X1200.
Nice attempted backpedal. The fact is you've always touted 1024 x 768 and ignored everything else solely because the 5800 just happens to run reasonably equal with the faster cards you've voluntarily given up.

I never said that, why are you implying I did?
Because your equality comments have always been founded around running at 1024 x 768 and about 1600 x 1200 being too slow. Therefore in your reality those resolutions simply don't exist because they don't fit anywhere into your story.

Could you tell the difference BFG?
That isn't even remotely the issue here so stop clouding it. The issue is your proclaimed equality at 1024 x 768 and hence that the cards are equal. They are not equal; they are merely the same when CPU limitations are a factor. That doesn't make them equal any more than a Ti4200 is equal to a 5900 when running at 320 x 240 x 16.

You've said this a lot of times, and I've never understood why.
I've understood perfectly. Each time you appear to change your settings to correspond to the highest possible setting on the 5800 that still shows reasonable equality. In other words you shift the goal-posts each time so that the 5800 comes out looking like it's equal to faster cards.

I've read the rest of your explanation but quite frankly it holds no water. First you complain about speed issues and then you pick up slower cards, only to then crank up the settings so that they run even slower than your original cards did. I'm sorry but that behaviour simply cannot be explained logically.

How am I "trolling" BFG?
By saying things like "a GF3 is equal to a 5800 because I run at 640 x 480".

Of course, you still haven't bothered to prove that 10X7X32 4X8X is a cpu limited situation by showing some cards that are below the 5800/9700 level perform the same at that setting,
Am I also required to prove 4xAA and 4xAF are CPU limited? What about 0xAA and 0xAF? Should I be proving every setting every time you shift the goal-post so it suits your equality argument?

I say you don't prove it because you can't
It's actually very easy to prove - look at the comparison of the graphs. Each time the resolution is raised beyond 1024 x 768 the faster cards distance themselves further and further from the slower cards. It's benchmarking 101, something you don't even appear to understand yet you continually make equality comments.

Would you run a CPU test of an Athlon 64 against a P3 using a GF2 MX running at 2048 x 1536? Of course you wouldn't because it's an invalid test. The setting is not suited for testing the performance you are trying to test and it's also not suited for making comparison claims later on. And whether you like running at 2048 x 1536 or not has absolutely no bearing on the fact that saying both processors are equal is invalid.

Exactly the same thing applies to 1024 x 768 using most games out there today on high-end cards.
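The CPU-limit argument above can be sketched with a toy model. All numbers here are invented for illustration (they are not real 5800/9800 figures): the displayed frame rate is capped by whichever is slower, the CPU preparing frames or the GPU filling pixels, so at low resolutions a fast and a slow card both hit the same CPU cap and look "equal".

```python
# Toy model of CPU-limited benchmarking. Numbers are made up for
# illustration only; they are not real benchmark data.

CPU_CAP = 60.0  # frames/s the CPU can prepare, independent of resolution

# Hypothetical GPU fill rates (pixels per second) for two cards
GPU_FILL = {"fast_card": 80.0e6, "slow_card": 55.0e6}

RESOLUTIONS = {
    "1024x768": 1024 * 768,
    "1280x960": 1280 * 960,
    "1600x1200": 1600 * 1200,
}

def effective_fps(fill_rate, pixels, cpu_cap=CPU_CAP):
    """Frame rate is limited by the slower of the CPU and the GPU."""
    gpu_fps = fill_rate / pixels
    return min(cpu_cap, gpu_fps)

for res, px in RESOLUTIONS.items():
    fast = effective_fps(GPU_FILL["fast_card"], px)
    slow = effective_fps(GPU_FILL["slow_card"], px)
    print(f"{res}: fast={fast:.0f} fps, slow={slow:.0f} fps, gap={fast - slow:.0f} fps")
```

With these made-up numbers both cards report 60 fps at 1024x768 because the CPU cap hides the GPU difference; only above that resolution does the gap appear, which is exactly the behaviour being argued about.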

The 5800 has a following.
That's lovely but that fact doesn't make it equal to the certain cards you claim it's equal to.

why do care if I buy them? Are you the board police on who should buy what cards now too?
Buying the card has never been the issue; the issue is your antics and the comments you make after you buy the card.

It's like me buying a P3 and then starting posts that it's equal to an A64 because I ran tests @ 2048 x 1536 on my GF2 MX and both processors came out "equal".

No, but at 10X7, 4X8X, the 5800 Ultra is the functional equivalent of a 9800 Pro at UT2003.
At 640 x 480 the 5600 is "equal" to a 5800.
At 320 x 240 a 5200 is "equal" to a 5800.
At 2048 x 1536 on a GF2 MX the P3 is "equal" to an A64.

So what?

More importantly, what is this vendetta you have against me and my cards? Does it anger you that much that I'd trade a better card than you have for one you consider not as good? Why?
You always try to turn this around and claim I'm attacking you as a person. I'm not; I simply object to your behaviour.
 
Originally posted by: Ackmed
Some games "cower" with a GF2, so what? The 5800 Ultra isn't nearly as good as other cards in its price/performance category.

Try enabling AA/AF and see the 5800 Ultra cower to games. But Rollo thinks ATi's AA isn't better. Even nVidiots realize that it is..

deja moo

😀
 
You win BFG, the 5800 is junk. As a matter of fact, I don't know why stores even sell products other than Athlon FXs and Radeon 9800 XTs. :brokenheart:
 
Originally posted by: Acanthus
You win BFG, the 5800 is junk. As a matter of fact, I don't know why stores even sell products other than Athlon FXs and Radeon 9800 XTs. :brokenheart:
LOL That's about what it comes down to. If you're not using BFG's stuff, get set to be flamed.

BFG:
It's actually very easy to prove - look at the comparison of the graphs. Each time the resolution is raised beyond 1024 x 768 the faster cards distance themselves further and further from the slower cards.

Apparently one of us doesn't understand it. I keep asking you to prove it because:
A. I've posted a link to a reputable website that shows lesser cards like 9600 Pros running UT2003 botmatch at much slower speeds than a 5800 or 9700, even with a faster cpu, which tells me that setting isn't cpu limited. Wouldn't they all be the same, or the 9600 Pro faster with its faster cpu, if that were the case?

B. In the link I already posted, look how your theories seem untrue

UT2003, 3GHz cpu, 9700Pro/9800Pro/5800 Ultra, all at 4X8X

800X600
9700P 83.1fps
9800P 83.9fps
5800U 79.6fps

10X7
9700P 69fps
9800P 78fps
5800U 71 fps

12X9
9700P 44 fps
9800P 52 fps
5800U 50 fps

16x12
9700P 30fps
9800P 36fps
5800U 30 fps

LOL Yeah big trend going on there over 10X7 BFG. Let's see: Your 9700 Pro goes from losing by 2fps to losing by 6 fps to breaking even!

Maybe if we raised the resolution some more and got the framerate down in the teens your 9700 Pro would start winning!

What about the "superior" 9800 Pro?
Well we go from 78/71 at 10X7 to 52/50 at 12X9, so the indiscernible lead drops from 10% to less than 5%, but shoots up to 20% at 16X12! Too bad you really wouldn't be able to play UT2003 at an average 36 or 30fps, isn't it BFG? It might have been the only scenario that would have come close to making your point, but both cards are so slow at that point it really doesn't matter at all. And you call me insane?
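The percentage leads argued over here can be rechecked directly from the fps pairs quoted above (assuming "12X9" means 1280x960; the numbers are the averages posted earlier in the thread):

```python
# 9800 Pro vs 5800 Ultra average fps, copied from the UT2003 4xAA/8xAF
# numbers quoted above. "12X9" is assumed to mean 1280x960.
results = {
    "800x600":   (83.9, 79.6),
    "1024x768":  (78.0, 71.0),
    "1280x960":  (52.0, 50.0),
    "1600x1200": (36.0, 30.0),
}

for res, (r9800, fx5800) in results.items():
    lead = (r9800 / fx5800 - 1) * 100  # 9800 Pro's percentage lead
    print(f"{res}: 9800 Pro leads by {lead:.0f}%")
```

That works out to roughly 5%, 10%, 4%, and 20%, which matches the "drops from 10% to less than 5%, shoots up to 20% at 16X12" reading of the table.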
:rolleyes:
 
Originally posted by: i82lazyboy
rollo, do drop us a few pics here if you gots the time: my email. I find the 5800 Ultra very intriguing.

Hope you have broadband, 1MB coming at you. The stomped.jpg shows the Abit one I bought on Ebay, and the sleaze sent it in an envelope, told me he couldn't give me my money back because he needed it for Spring Break, and wanted me to ship it back to him so he could try to defraud Abit by saying it died in use, under warranty. Most of you have seen what I think of fraudulent RMAs, when my wife suggested I wait and take him up on it, I made sure no one would be RMAing the card.

The other three are back, top and back of case views. Check out the big Zalman fan in front of it. That thing is mounted to a 3com NIC I hacksawed to make it a slot fan, it's supposed to mount above the card.

There is one for sale for $220 over at nVNews if you'd like to play around with one too.

Hey BFG- my father in law showed me a 1960s Browning Auto 5 12 gauge today that he just inherited. I told him if he ever wanted to get rid of it, I'd give him the $800 the gun shop told him it's worth. Why would I do that BFG?!?!?! I already have a '96 Browning Gold semi auto that is technically superior, my behavior is illogical and objectionable! No sane man would downgrade shotguns just to own one that's rarer, would they?!?!?!

LOL I think you must be a very literal and practical person BFG. The world needs them too, more power to you. 😉
 
Awsome, thnx.

As much as I'd like to get my gritty fingers on one, I'm saving my money for a NV40. For now I'll just settle for having a look @ some pics of "the beast" instead of going out and getting 1.
 