Is it me or is [H]ardOCP an ATI Fanboy site!!!!

GeneralGrievous

Apr 14, 2004
1,599
0
0
The posts over at nVNews say the $300 6800s are shipping next week; I'm personally interested to see where those fall in the hierarchy. For a buyer who has to watch his money, would it be worth $100 more to get 10% more performance (and brilinear, SM2 performance at that) from an X800 Pro? (if it turns out to be that small of a difference)
Next week, huh? Yeah, let's all snatch up this generation's version of the FX5200.
 
Apr 17, 2003
37,622
0
76
Originally posted by: GeneralGrievous
The posts over at nVNews say the $300 6800s are shipping next week; I'm personally interested to see where those fall in the hierarchy. For a buyer who has to watch his money, would it be worth $100 more to get 10% more performance (and brilinear, SM2 performance at that) from an X800 Pro? (if it turns out to be that small of a difference)
Next week, huh? Yeah, let's all snatch up this generation's version of the FX5200.

It's not that bad at all; performance is supposed to be better than a 9800 XT, which would be nice for $300.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: GeneralGrievous
The posts over at nVNews say the $300 6800s are shipping next week; I'm personally interested to see where those fall in the hierarchy. For a buyer who has to watch his money, would it be worth $100 more to get 10% more performance (and brilinear, SM2 performance at that) from an X800 Pro? (if it turns out to be that small of a difference)
Next week, huh? Yeah, let's all snatch up this generation's version of the FX5200.

You have to remember, General, that not everyone has as much money as you do and can preorder X800XTs.

To them, a card that's 10-15% slower than an X800 Pro (if it is that much slower) for $100 or more less might be a big deal.

In any case, I hardly think the performance delta of the 6800 will be comparable to the 5200's.

I'm starting to think you work for ATI....
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: videoclone
Yep Rollo, the X800XT is a very nice card... pity about the feature set. I think the R500 and NV50 are going to be something cool... do you think they will be DX10?

Don't know. Has anything more reliable than the Inquirer been said about them?
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
To them, a card that's 10-15% slower than an X800 Pro (if it is that much slower) for $100 or more less might be a big deal.
10-15% slower? Right. :roll: If someone wants a card with crap 128MB memory and slow-ass 2002 clocks, they can go ahead and pick up that card.

I guess this is a victory for Nvidia though. Finally after 2 years they have a card that beats the 9700 Pro!

Here are some benchmarks from a Chinese site that managed to get its hands on one. That Far Cry benchmark looks promising.
 

eastvillager

Senior member
Mar 27, 2003
519
0
0
Originally posted by: Childs
Originally posted by: Acanthus
Hocp has had problems with ATi bias for a while.

I think that huge permanent ATI ad on their homepage has something to do with it.



...or, as I've already mentioned, ATI is WINNING, so of course the hardware enthusiast site likes ATI.


Do a bit of research and try to pick when you think HardOCP became 'biased' towards ATI. I think you'll find a direct correlation with the time period when ATI stole the crown from NVIDIA.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Of course, this card isn't really priced or meant to compete against X800 Pros, General Grievous. :roll:

You really do like to pimp ATI's aged tech though, I'll give you that.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Any moron who thrives on whether or not ATI's vaporware is a little more dense than the vapor Nvidia launched deserves that overpriced 12-pipe stopgap design they get in the X800 Pro.

I have 3 months until I build a new machine. As long as the 6800 GT is out by then, I won't care.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: Rollo


I'd also be really interested to know what you armchair rocket scientists who are bashing nVidia for their paper launches do for a living.
I'm guessing it's not quite as meaningful as what nVidia is doing:
"D'oh Mr. Rollo I am an A+ box jockey screwing down motherboards! I am well qualified to disrespect the most successful GPU engineering firm on the planet- I passed the A+ cert!"

Uh huh.

nVidia is 3rd in GPU sales and market share behind Intel and ATI.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: ZimZum
Originally posted by: Rollo


I'd also be really interested to know what you armchair rocket scientists who are bashing nVidia for their paper launches do for a living.
I'm guessing it's not quite as meaningful as what nVidia is doing:
"D'oh Mr. Rollo I am an A+ box jockey screwing down motherboards! I am well qualified to disrespect the most successful GPU engineering firm on the planet- I passed the A+ cert!"

Uh huh.

nVidia is 3rd in GPU sales and market share behind Intel and ATI.

But NVIDIA makes a lot more than ATi does... hrm.
 

Viper96720

Diamond Member
Jul 15, 2002
4,390
0
0
Just read a review of the VisionTek X800. Didn't seem biased to me. Heck, there's a "win a BFG 6800 Ultra" thing flashing on the screen. Guess they're Nvidia lovers now.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Well, just chalk up another win for the X800 Pro against the 6800U. The X800 Pro beat the 6800U in the 8-game hardware.fr* review, so I'd hardly call HardOCP biased. The X800 Pro was fast in the DriverHeaven and Nordic reviews too.

The X800 Pro is looking very fast where it really counts -- DX9, AA/AF, and shader-intensive games. In new games (where NV hasn't had a chance to "hack/cheat" somehow) like FIFA 2004 and Colin McRae 04, the X800 Pro is looking like a good match for the 6800U. If you like OGL gaming without AA/AF, the 6800U is faster. But the vast majority of new games are DX, and AA/AF usage is pretty well a given.

I also think it is a pretty big oversight of NV to only offer "usable" AA up to 4xAA (xbit). Anyone who is using 6xAA on a 9800, or likes high levels of AA, is going to feel like they downgraded by going to a 6800.

------
*Note -- the 6800U and X800 Pro tied in that chart, but the 6800U was running a lower shader path in Far Cry, so the X800 Pro would have been faster than the 6800U on even terms. The 8 games: UT3, Far Cry, Tomb Raider, Splinter Cell, IL-2 FB, Warcraft III, Colin McRae 04, FIFA 2004.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: Acanthus
Originally posted by: ZimZum
Originally posted by: Rollo


I'd also be really interested to know what you armchair rocket scientists who are bashing nVidia for their paper launches do for a living.
I'm guessing it's not quite as meaningful as what nVidia is doing:
"D'oh Mr. Rollo I am an A+ box jockey screwing down motherboards! I am well qualified to disrespect the most successful GPU engineering firm on the planet- I passed the A+ cert!"

Uh huh.

nVidia is 3rd in GPU sales and market share behind Intel and ATI.

But NVIDIA makes a lot more than ATi does... hrm.

Makes a lot more what?
 

Diablo6178

Senior member
Aug 23, 2000
448
0
0
What is most bothersome about this whole debate is that nVidia only has one feature to hold over ATi, SM3.0, and even then, from the tech descriptions, it's not a big advance over SM2.0, only a more optimized path that takes less of a performance hit.

So you're basing your purchase decision on one feature as if it were the be-all and end-all of graphics. It's not. It doesn't truly make the game look that much better. Ultimately it just seems ridiculous.

nVidia was forced to go back and re-evaluate the product line after the performance of the GeForce 4; they came back with the 6800 Ultra, but even from an engineering perspective the part is too power hungry and generates too much heat. Everyone is lambasting Intel for Prescott. Imagine these two in one box without overclocking. Now nVidia is scrambling to release the 6800 Ultra Extreme, which is even faster, i.e. more power hungry and hotter. It just doesn't seem like a good design decision to me. Maybe the engineering side of me can't get over that one aspect to appreciate the part.

ATi didn't include that one feature in their "next gen" part.

Let's hold on here for just one second. Neither of these parts is "NEXT GEN"; neither is prequalified for DirectX 10, and DirectX 10 doesn't even have a release date. Let's hope it's before Longhorn in 2006-2007; 9.0c is just around the corner.

So ultimately you're looking to spend $400 on any of these cards to get the same thing you have now, just with AA/AF, with varying degrees of "betterness". Image quality is a moot point: pop in a game and either card will display an acceptable representation of what the developer intended, and if not, they WILL fix it. I say buy what works best in your system, because these two parts are as equal as the two companies can make them.

***Update***
Upon reading some of the X800 owner statements, it too is running very hot. That makes the last statement even more true.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
What is most bothersome about this whole debate is that nVidia only has one feature to hold over ATi, SM3.0, and even then, from the tech descriptions, it's not a big advance over SM2.0, only a more optimized path that takes less of a performance hit.

Dynamic branching
Geometry instancing
Texture lookups in the VS, which will make displacement mapping faster
A much more friendly programming model

There are a few other things, but these are the big ones.
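
For anyone wondering what dynamic branching actually buys you, here is a rough CPU-side analogy in C (not real shader code, just the idea): SM2.0 hardware has no real flow control, so compilers flatten an if by evaluating both sides and masking one out, while SM3.0 can skip the untaken side entirely.

#include <stdio.h>
#include <math.h>

/* Stand-in for a costly per-pixel lighting computation. */
static float expensive_lighting(float n_dot_l)
{
    float s = 0.0f;
    for (int i = 0; i < 64; i++)       /* pretend multi-light/specular loop */
        s += powf(n_dot_l, (float)(i + 1));
    return s;
}

/* SM2.0 style: no real flow control, so the costly path runs for
   EVERY pixel and a mask picks which result survives. */
static float shade_sm2(float n_dot_l)
{
    float lit   = expensive_lighting(n_dot_l);  /* always executed */
    float unlit = 0.1f;                         /* ambient only */
    float mask  = (n_dot_l > 0.0f) ? 1.0f : 0.0f;
    return mask * lit + (1.0f - mask) * unlit;
}

/* SM3.0 style: a dynamic branch lets pixels facing away from the
   light skip the costly path entirely -- that's the whole win. */
static float shade_sm3(float n_dot_l)
{
    if (n_dot_l <= 0.0f)
        return 0.1f;                            /* early out */
    return expensive_lighting(n_dot_l);
}

int main(void)
{
    printf("lit pixel:   sm2=%.3f sm3=%.3f\n", shade_sm2(0.8f), shade_sm3(0.8f));
    printf("unlit pixel: sm2=%.3f sm3=%.3f\n", shade_sm2(-0.5f), shade_sm3(-0.5f));
    return 0;
}

For a mostly shadowed scene, the SM3.0 path skips the expensive work on most pixels; the SM2.0 path pays for it everywhere and only saves the performance hit by hiding the result.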

nVidia was forced to go back and re-evaluate the product line after the performance of the GeForce 4; they came back with the 6800 Ultra, but even from an engineering perspective the part is too power hungry and generates too much heat. Everyone is lambasting Intel for Prescott. Imagine these two in one box without overclocking. Now nVidia is scrambling to release the 6800 Ultra Extreme, which is even faster, i.e. more power hungry and hotter. It just doesn't seem like a good design decision to me. Maybe the engineering side of me can't get over that one aspect to appreciate the part.

It may eat 15-20% more power than the X800XT, and that is just a maybe. It "might" eat as much power as the 9800 Pro. If you think the 6800 Ultra is a power hog, then you must feel the same about the 9800 Pro.

Let's hold on here for just one second. Neither of these parts is "NEXT GEN"; neither is prequalified for DirectX 10, and DirectX 10 doesn't even have a release date. Let's hope it's before Longhorn in 2006-2007; 9.0c is just around the corner.

I agree the ATI card isn't much of a next-gen card. But the 6800 is at least a mini next-gen card, as it now supports SM 3.0, a shader model that will be used until DirectX Next comes out with Longhorn.
 

Diablo6178

Senior member
Aug 23, 2000
448
0
0
Originally posted by: Genx87

Dynamic branching
Part of the pixel shader improvements and included in SM3.0.
Geometry instancing
An interesting and useful feature, but only for identical objects left in the frame buffer. How this relates to object movement is a very interesting question, and whether or not it treats objects at different points of motion as the same (see the sketch below).
Texture lookups in the VS, which will make displacement mapping faster
Seems like more of an architecture optimization to me than a true feature.
A much more friendly programming model
nVidia released their new programming model long before the release of this card. I don't really consider this a feature of the card anyway.

There are a few other things, but these are the big ones.
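
On the instancing question above: the point of the feature is amortizing per-draw-call overhead for identical meshes, and since the per-instance stream (transforms, colors) is rewritten each frame, moving objects are handled naturally. A conceptual C sketch, with hypothetical draw functions standing in for the real API:

#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Per-instance data: rewritten every frame, so instances can move. */
typedef struct { Vec3 position; float rotation; } InstanceData;

/* Hypothetical stand-ins for the driver's draw entry points. */
static void draw_mesh(const char *mesh, const InstanceData *inst)
{
    printf("draw call: %s at (%.0f, %.0f, %.0f)\n",
           mesh, inst->position.x, inst->position.y, inst->position.z);
}

static void draw_mesh_instanced(const char *mesh,
                                const InstanceData *inst, int count)
{
    /* One submission: the GPU replays the same geometry `count`
       times, fetching a fresh transform for each instance. */
    printf("ONE draw call: %s x%d, transforms read per instance\n",
           mesh, count);
    (void)inst;
}

int main(void)
{
    enum { N = 4 };
    InstanceData trees[N];
    for (int i = 0; i < N; i++)        /* updated each frame */
        trees[i] = (InstanceData){ { i * 10.0f, 0.0f, 0.0f }, 0.0f };

    for (int i = 0; i < N; i++)        /* without instancing: N calls */
        draw_mesh("tree", &trees[i]);

    draw_mesh_instanced("tree", trees, N);  /* with instancing: one */
    return 0;
}

The win is CPU-side: one submission instead of N, with the same per-object freedom of movement.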

It may eat 15-20% more power than the X800XT, and that is just a maybe. It "might" eat as much power as the 9800 Pro. If you think the 6800 Ultra is a power hog, then you must feel the same about the 9800 Pro.

9800 Pro = 75 W
X800XT PE = 76 W
6800 Ultra = 120 W


Not to mention it chews up an extra PCI slot with stock cooling.

I agree the ATI card isn't much of a next-gen card. But the 6800 is at least a mini next-gen card, as it now supports SM 3.0, a shader model that will be used until DirectX Next comes out with Longhorn.

So you advocate using hardware with features that might be invalidated by the next version of DirectX? ATi did that with the first Radeon; their pixel shaders were broken.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
9800 Pro = 75 W
X800XT PE = 76 W
6800 Ultra = 120 W
Not to sound like some NV fanboy, but I remembered that those numbers are nowhere near correct (unless you are buying some PR nonsense), so I just checked out the old Tom's review... the 6800U idles at 6 watts less than the 9800XT, and although the 6800 at full load draws 25 W more than the X800XT (not PE), your numbers don't even make sense... C'mon, don't post that the X800's top of the line uses only a single watt more than a 9800 Pro when it's functionally the same core doubled. We all know the 6800s are gonna eat more power, but this is a discussion about a purported fanboy site and you post numbers that even the ATi PR department couldn't stomach?
 

Diablo6178

Senior member
Aug 23, 2000
448
0
0
Tom's Hardware:

"The power consumption of the X800 XT is about the same as that of its predecessors in 3D applications."

Trusted Reviews:

"But amazingly, despite the higher clock speeds, ATI has managed to keep the power consumption to a minimum. Even the X800 XT card draws less power than the Radeon 9800XT."


I apologize for the misnomer, but the intent of the comparison is still valid. 25 watts, as you pointed out, is a big difference, especially considering that on semiconductor chips it is roughly 90% converted to heat.

ATi changed their manufacturing process from 130nm to 110nm, which is where the speed scaling and power reduction came from, along with the lower-power memory.

The thing to point out about Tom's article is that 3DMark 2003 may not have hit the peak for either card. Doom 3 or Half-Life 2 may be more stressful to these cards on their release, consuming more power. Where one or the other tops out is a very subjective thing and can't be determined by monitoring the power draw of a PC from the wall.

What happens if the actual power use of the Radeon is 70 watts and the GeForce is at 95 watts? What if Doom 3 causes a 20-30% increase in draw for both?

***Update***
Also, at what point do we ignore the power and heat components of a product for the sheer goodness it has?
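
Running those what-if figures through the arithmetic (hypothetical numbers from the post above, not measurements): if a heavier title raises both cards' draw by the same percentage, the ratio holds but the absolute gap widens.

#include <stdio.h>

int main(void)
{
    /* Hypothetical draws from the post above, not measured values. */
    const float radeon = 70.0f, geforce = 95.0f;
    const float scales[] = { 1.0f, 1.2f, 1.3f };   /* +0%, +20%, +30% load */

    for (int i = 0; i < 3; i++)
        printf("+%2.0f%%: Radeon %5.1f W, GeForce %5.1f W, gap %4.1f W\n",
               (scales[i] - 1.0f) * 100.0f,
               radeon * scales[i], geforce * scales[i],
               (geforce - radeon) * scales[i]);
    return 0;
}

That prints a gap growing from 25 W to 32.5 W at +30% load, so a more stressful game would make the difference larger, not smaller.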
 

TimisoaraKill

Senior member
Dec 17, 2000
510
0
0
Some Nvidiots are mad because [H] did not bench some useless 3DMark2003, Aquamark, and tons of obsolete OGL games; instead they benched only popular releases. Is this wrong? ... You don't buy an expensive card to play Pac-Man 3D.
The ATi X800 Pro performs very well in D3D and can actually hold up very well against the 6800U. I don't say it is better, but it keeps up very tight.
The guy gave his personal opinion, which is for sure not compatible with the Nvidiot point of view, but still, I have read a lot of good reviews from him.
All reviewers are more or less biased; none of them are an exception, but this is the nature of the human been.
In this Anand forum there are a lot of guys who love Nvidia, very aggressive posters; you can't really say anything good about ATI here. When that ATI trouble appeared (trilinear cheat, optimization, or whatever), I read some 4-5 posts here about the same thing, and some guys just kept bumping them like mad only to have a top post.
So is this forum useless because some hot boys don't let you post your opinion? ... I don't think so. There is some good and some bad as well; you just need to pick up what makes sense to you.
 

ZobarStyl

Senior member
Mar 3, 2004
657
0
0
This is seriously off topic, but what the hey...

Just because D3 or HL2 look prettier doesn't mean the power draw will be any different... if the card is running at full capacity (which practically any quality benchmark tries to show), then the card is at or near its peak power consumption. Basically, any program that isn't completely static in its FPS (an old game with an FPS limit, for instance) and never drops below that set point is probably pushing your video card as hard as it can. Your card in any newer game is likely working at max capacity, but that may mean it gets 60 fps, whereas in Doom 3 the card working as hard as it can gets only 25 fps... but both are running at the card's top speed, so why would the power draw change? And no, total power draw comparisons on the same system with different video cards are not subjective; they're scientific: you have a control (the base system and all its components) and a single variable (the video card), thus any difference you record can be attributed entirely to the variable. Oh, and you might want to check out the other thread on the X800 Pro's heat: all the X800 Pro owners in the forum are reporting ~73C under load, while nvnews (take that as you may) puts the 6800U at 57C under load.
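
That control-and-variable argument, reduced to arithmetic (the wall readings and PSU efficiency here are made-up illustrative numbers):

#include <stdio.h>

int main(void)
{
    /* Same base system (the control), only the card swapped
       (the single variable); numbers are illustrative only. */
    const float wall_card_a = 285.0f;  /* W at the wall, card A loaded */
    const float wall_card_b = 260.0f;  /* W at the wall, card B loaded */

    /* Everything else is identical, so the whole difference is
       attributable to the cards. */
    const float wall_delta = wall_card_a - wall_card_b;

    /* A ~70% efficient 2004-era PSU means the DC-side difference
       is smaller than what the wall meter shows. */
    const float psu_eff = 0.70f;
    printf("wall delta %.0f W -> approx. DC delta %.0f W\n",
           wall_delta, wall_delta * psu_eff);
    return 0;
}

You never learn either card's absolute draw this way, only the difference between them, which is exactly the number the argument is about.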

P.S. to TimisoaraKill...you almost had a point, but this
the nature of the human been
...well, it just ruined it.

Reread some of the posts; I personally dislike HardOCP's methods, as opposed to being mad at the results.
 

Diablo6178

Senior member
Aug 23, 2000
448
0
0
Ummm... OK. Most of what you said is baseless.

If any card were being used to 100% of its capacity in "most" all 3D apps, then how could nVidia and ATi possibly optimize anything in their drivers?

They couldn't. There would be no room for performance increases. Period.

Since driver releases usually help performance to varying degrees in 3D apps, the only logical conclusion is that NO 3D app is 100% efficient or using 100% of a card's capacity. What ATi and nVidia do is reorder the execution of rendering to better use the floating-point pipeline and get a scene out faster. As a result, one can conclude that, as nice as 3DMark 2003 is, if it were using every last bit of a card's performance then driver improvements would have no effect on FPS, since both would be effectively maxed. What they are in reality is maxed for that particular order of execution. ATi and nVidia make changes in drivers to improve this, thus using more of a card's capabilities.

Now, if a card's rendering can be altered to improve the FPS for a scene, then the card will heat up as more of the pipeline is used to generate the scene. Since Doom 3 and Half-Life 2 will use very complex rendering, the cards are going to be more taxed rendering a scene and the power requirement will go up as more of the pipeline is used. This is exactly why the X800 Pro with 12 pipelines running requires less power than the X800XT, which has 16 running; they are the same die. The more a pipeline is used and optimized for a 3D app, the more power it requires.

Drivers for either card are not fully optimized yet, so power consumption and heat output have nowhere to go but up.

73C on-die may be an acceptable range for the X800. How do I know how this person's case is set up, and whether he has any active cooling at all? It could be a Dell 4600 with one exhaust fan.

You are relying on the assumption that in Tom's review the 6800 Ultra is actually using 110 watts. There is no scientific proof of that, just his guess. Is that very scientific?
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
Anyone who is using 6xAA on a 9800, or likes high levels of AA, is going to feel like they downgraded by going to a 6800.
6xAA is worthless. The 9800 is far too slow to run 6xAA in newer games. Even my X800 XT couldn't run 1600x1200 6xAA/16xAF quite as well as I would have liked, though now that it is overclocked I probably can.

Regardless, at higher resolutions you can barely even notice 6xAA. And if you aren't playing at 1600x1200, or at least 1280x1024, it's not like you need a new card.