ATI x800 XT vs Nvidia 6800 Ultra Poll


imported_obsidian

Senior member
May 4, 2004
438
0
0
Originally posted by: Insomniak
Anyone who thinks ATi has better drivers needs their head examined.

NVIDIA drivers absolutely rock. Performance and bugs aside, the feature set in them is amazing, and the layout is sooo much better than ATI's. I just started playing with the new nView a little while ago and I love it.
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
fanATIc

I'm getting my next card for HL2, and as far as I can tell the XT is holding the lead in DirectX games. Though the picture may change in time, I doubt it will.
 

BugsBunny1078

Banned
Jan 11, 2004
910
0
0
Originally posted by: i82lazyboy
not another repost... sigh
Someone already whined about people reposting in this forum somewhere. Why don't you use the search function before posting?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I voted for the X800pro just because it's faster in Far Cry, which is the game I play most right now and is my only reason for upgrading my graphics card.

That said, I'm waiting to ask Duvie how the 6800U performs for professional 3D work. I have a feeling it's going to be hella good. ATi can't seem to produce good workstation drivers unfortunately.

It's great to see things so competitive again; it gives everyone more choice and will hopefully drive down prices in the long run.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,739
156
106
both cards rock, but i think i would rather overclock the nvidia card and run UT all day :)
i think these new nvidia cards will offer more future compatibility and performance in upcoming games that can use all the junk that's taking up die space :)
reminds me of a P4EE haha

but either way the price/performance is so close now it might just come down to which runs best in your favorite game :)

btw nvidia's drivers rock
but those beta ones didn't work with directx in Unreal tournament (the original)
and my computer locked up when running it in opengl mode
 

Bucksnort

Golden Member
Aug 17, 2001
1,062
0
0
With the power draw of the 6800, you've got to figure the basic design is nowhere near as optimized as ATI's. ATI has a lower power draw than last generation with performance right there with the NV part. Now that's optimized and efficient!
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Bucksnort
With the power draw of the 6800, you've got to figure the basic design is nowhere near as optimized as ATI's. ATI has a lower power draw than last generation with performance right there with the NV part. Now that's optimized and efficient!

It's not design or engineering that's reducing their power draw and heat, it's low-k silicon.
 

ZombieJesus

Member
Feb 12, 2004
170
0
0
And Nvidia's card draws so much because of its high transistor count and all the features it's loaded up with.
 

Bucksnort

Golden Member
Aug 17, 2001
1,062
0
0
"It's not design or engineering that's reducing their power draw and heat, it's low-k silicon."
Any other intelligent comments? So the silicon and other components have nothing to do with design and optimization?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Bucksnort
With the power draw of the 6800, you've got to figure the basic design is nowhere near as optimized as ATI's. ATI has a lower power draw than last generation with performance right there with the NV part. Now that's optimized and efficient!

Hey Crazyman, that's what happens when you shrink a die, have mucho fewer transistors, and use low-k. ;)
 

g3pro

Senior member
Jan 15, 2004
404
0
0
I should be getting the 6800U for the SM3.0, faster graphics, better IQ, and better support.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Remember, go BFG. They did a lot of people right and I have no doubt they will continue to do so.
They are usually priced aggressively as well. ASUS is always waaaaaay overpriced, as is Gigabyte.
Just a heads up. Disclaimer: Not a BFG sales rep. :)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I was surprised at how almost all the reviews declared the X800 XT PE the winner, some with much enthusiasm. Nvidia used to control sites like Tom's. What has happened here? Either the ATI product has a much better feel (in something that doesn't show up on the benchmarks), or Nvidia has created a lot of ill will. Personally I thought the benchmarks seemed even, but I don't have a clear understanding of what each game is testing. Still, I see ATI as having a clear win at the very top of the line. The 6800 Ultra draws just too much power at idle for my taste; that said, I wish both cards had been designed to use less power when idle. I tend to leave my computer running at night (bad habit, not green). Using over 100 watts to listen to internet radio is a super waste in my mind. The 6800 vanilla looks very promising to me: 16 pipes, a good overclock, and decent power use.

Anyways the debate has changed from ATI crapping their pants to features, cost and nationality. This is good. As a Canadian I can gloat for a day, even if none of this stuff is designed or made in Canada. :beer:
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: ronnn
I was surprised at how almost all the reviews declared the X800 XT PE the winner, some with much enthusiasm. Nvidia used to control sites like Tom's. What has happened here? Either the ATI product has a much better feel (in something that doesn't show up on the benchmarks), or Nvidia has created a lot of ill will. Personally I thought the benchmarks seemed even, but I don't have a clear understanding of what each game is testing. Still, I see ATI as having a clear win at the very top of the line. The 6800 Ultra draws just too much power at idle for my taste; that said, I wish both cards had been designed to use less power when idle. I tend to leave my computer running at night (bad habit, not green). Using over 100 watts to listen to internet radio is a super waste in my mind. The 6800 vanilla looks very promising to me: 16 pipes, a good overclock, and decent power use.

Anyways the debate has changed from ATI crapping their pants to features, cost and nationality. This is good. As a Canadian I can gloat for a day, even if none of this stuff is designed or made in Canada. :beer:

Don't get too proud to be a Canadian... EA is based entirely in Canada now...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Bucksnort
"It's not design or engineering that's reducing their power draw and heat, it's low-k silicon."
Any other intelligent comments? So the silicon and other components have nothing to do with design and optimization?

I think it has to do with manufacturing, which is not done directly by either ATi or nVidia.
 
Feb 28, 2004
72
0
0
If money was no object, the X800XT of course.

However, back in the real world, I'm going to have to see how retail X800pro / 6800GT boards & drivers compare. Judging from the benchies from different sites, they're aboot equal across the board performance-wise, with the X800pro taking the biggest lead in Far Cry & Painkiller. Interesting that xbitlabs' tests on the HL2 beta put nVidia in the lead now, though; don't know if that was still partial precision.

I'm leaning towards the GT since I plan on keeping it at least a couple of years (still got my GF2!) and I figure SM3.0 will come in handy by then, if the GT has the horsepower to really use it when it's needed. On the other hand, the X800pro will be available sooner, and I'm getting tired of waiting and having to hijack my mate's PC to play Far Cry and Painkiller.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Jeff7181
Don't get too proud to be a Canadian... EA is based entirely in Canada now...


All Canadians are very smug about nationality, which is why we wear the maple leaf on our packs.
:D
 

ShinX

Senior member
Dec 1, 2003
300
0
0
ATi's cooling secret is the 0.13-micron low-k process, which lets the card run at high frequencies at lower voltages.
0.13u low-k transistors + more stuff packed into a smaller area + low voltages = cool runnings and more processing. That's also a very good thing for overclocking potential.

Now no more damned questions about the cooling :)
 

SilverTrine

Senior member
May 27, 2003
312
0
0
The 6800 chokes on the only fairly intense PS2.0 game out there: Far Cry. When even more effects are thrown in, it will just slow down even more.
 

Link

Golden Member
Jan 10, 2000
1,330
0
0
Originally posted by: ShinX
ATi's cooling secret is the 0.13-micron low-k process, which lets the card run at high frequencies at lower voltages.
0.13u low-k transistors + more stuff packed into a smaller area + low voltages = cool runnings and more processing. That's also a very good thing for overclocking potential.

Now no more damned questions about the cooling :)

nVidia's heating secret is also the .13u process.

Why then does the NV40 run hotter than the R420?
Because the NV40 has more transistors than the R420, and it has those extra transistors for its extra features?
But current games and benchmarks can't use those features.
Also, I think the NV40 die was made in the U.S.A. (by IBM), while the R420 was made in Taiwan. :D
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: ronnn
Originally posted by: Jeff7181
Don't get too proud to be a Canadian... EA is based entirely in Canada now...


All Canadians are very smug about nationality, which is why we wear the maple leaf on our packs.
:D

I wasn't referring to your nationality exactly... it was more of a slam against EA =)
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: SilverTrine
The 6800 chokes on the only fairly intense PS2.0 game out there: Far Cry. When even more effects are thrown in, it will just slow down even more.

Assuming SM 3.0 doesn't replace SM 2.0 before many new games are released.
 

Wolfdog

Member
Aug 25, 2001
187
0
0
I must say that it is good to have all this competition this year, although there are no Doom 3 or HL2 benchmarks, both of which rank highly on the new stuff I would like to see. When it comes down to it, though, both cards are way too expensive. The real winner in my mind will come from the price/performance market that won't go live until summer: you know, the cards you can actually afford. It will be truly interesting to see, as both cards mature and bugs get worked out, where it all stands in six months. Oh wait, then they'll do their next product refresh.