Why are people going crazy over the 512 MB GTX?

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
If you look at HardOCP's review (specifically BF2), the performance delta between the two at high IQ settings (e.g. HQ AF/TRSSAA vs HQ AF/Adaptive AA) is minimal, yet the X1800XT is $150 cheaper. Couple that with the fact that the X1800XT offers better AF filtering in addition to 6x adaptive AA, and I see no reason to get excited about the new nVidia card.


Edit:

Although throughout the thread I may have used "high IQ" and "best possible IQ" interchangeably, this is exactly what I mean when I say high IQ: the use of transparency SSAA for nVidia cards and Adaptive AA for ATi cards, both using high quality AF. Furthermore, I consider 4xTRSSAA/4xAAA with HQ AF to be the best possible IQ when testing these cards, because nVidia lacks a 6xTRSSAA mode in their drivers. I'm well aware there are better IQ settings (e.g. 8xAA), but this is the best IQ that can be used when comparing the two in an apples-to-apples fashion.

The benchmarks I picked below are selective for a very good reason: not many sites utilize TRSSAA/AAA and HQ AF when comparing these two cards. So if you feel my choice of benchmarks in this thread is selective, feel free to find other benchmarks utilizing similar quality settings (TRSSAA/AAA) that do not show the XT/GTX being close in performance.

CoD 2 1280x1024:

X1800XT using 4xAAA/16xHQ: 26 min, 36 avg, 57 max
7800GTX using 4xTRSSAA/16xHQ: 24 min, 34 avg, 44 max


CS: Source: 1600x1200

X1800XT using 4xAAA/16xHQ: 28 min, 115 avg, 271 max
7800GTX using 4xTRSSAA/16xHQ: 45 min, 136 avg, 170 max (nVidia kicked ass w/min).

Tiger Woods: 1600x1200

X1800XT using 4xAAA/16xHQ: min 16, avg 26, max 59
7800GTX using 4xTRSSAA/16xHQ: min 13, avg 26, max 47

HardOCP BF2 Results (apples to apples) 1600x1200:

X1800XT using 4xAAA/16xHQ: min 28, avg 58.6, max 93
7800 GTX using 4xTRSSAA/16xHQ: min 37, avg 58.3, max 90
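For anyone who wants the averages above as percentages, here's a quick back-of-the-envelope script. The numbers are copied straight from the figures quoted in this post (only average FPS is compared; this is a rough sanity check, not new benchmark data):

```python
# Percentage difference in average FPS between the two cards,
# using the figures quoted above ("xt" = X1800XT, "gtx" = 7800 GTX 512).
benches = {
    "CoD 2 (1280x1024)":       {"xt": 36.0,  "gtx": 34.0},
    "CS: Source (1600x1200)":  {"xt": 115.0, "gtx": 136.0},
    "Tiger Woods (1600x1200)": {"xt": 26.0,  "gtx": 26.0},
    "BF2 (1600x1200)":         {"xt": 58.6,  "gtx": 58.3},
}

for game, avg in benches.items():
    # Positive delta = GTX 512 faster, negative = X1800XT faster.
    delta = (avg["gtx"] - avg["xt"]) / avg["xt"] * 100
    print(f"{game}: GTX 512 vs XT average delta = {delta:+.1f}%")
```

Outside of CS: Source (where the GTX leads by about 18% on average), the gap in average FPS between the two cards is within a few percent either way.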
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
Yes, we needed another thread on the exact same topic the other 1000 are on.


:cookie: for you sir.
 

TecHNooB

Diamond Member
Sep 10, 2005
7,458
1
76
<-- will wait for prices to go down, then get caught up on the R580's and forget all about the 512 GTX's.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I don't consider the difference minimal; I consider it a spanking by Nvidia.

Besides, X1800XT was $699.99 on day of launch, just give it a couple of weeks.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Because it's the fastest consumer GPU in the whole world (and this is, after all, the Video forum)? You'll see auto forums getting happy over a new Porsche being released too. It also represents a leap in technology. Now let me put my panties in the dryer.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Also, I think Nvidia's AA looks a lot better than ATI's. Just look at the IQ comparison in the BFG 7800GT SLI review at Rage3D.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
No, we didn't need another thread, really. As far as cost is concerned, it's the first day, for goodness sakes. eTailers gouge what they think will be a hot product, because they can, and people will pay for it. It will, however, even out fairly quickly. What say you then? There is no disputing the superiority of the card, and when talking in pairs, the comparison becomes moot (since X1800 Crossfire cards are nonexistent).
 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Anandtech painted a different picture in their review of the 7800GTX 512.
They found it to be a lot faster than the X1800XT in "ALL" games tested. I trust Anandtech more than HardOCP and their stupid style of benchmarking. In the end the 7800GTX 512 is the king of the graphics market; right now the X1800XT sits alongside the 7800GTX 256, maybe a little above. But the 7800GTX 512 is alone at the top as the undisputed king, without question!

This is a really good thing!!! ATI have the R580 ready to rumble, and this will force them to bring that baby out a lot sooner than they had planned. Yay!

^__^ Wish the CPU market was this heated up!
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Ronin
No, we didn't need another thread, really. As far as cost is concerned, it's the first day, for goodness sakes. eTailers gouge what they think will be a hot product, because they can, and people will pay for it. It will, however, even out fairly quickly. What say you then? There is no disputing the superiority of the card, and when talking in pairs, the comparison becomes moot (since X1800 Crossfire cards are nonexistent).


I don't care about Crossfire or SLI. Only about 0.001% of gamers will spend $1500 on an SLI setup. What I am concerned with is single-card performance, IQ and price. Of those 3 factors, ATi wins 2. Yes, the 512 MB GTX is the fastest card, there's no denying that. However, it still offers inferior AF (shimmer shimmer), no 6xAA (so you go straight from 4x to 8x without a real middle ground) and costs $150 more. It's funny, now nVidia fans are sounding like ATi fans did on the X1800XT's release: "it just got released, just you wait, it will get cheaper, I promise!" Yeah, so will the X1800XT over time, and I doubt the 512 GTX will ever have price parity with the X1800XT, let alone be cheaper. For the cost, the 512 GTX is simply not worth the price of admission, especially since it brings nothing new technologically over the 256 MB version.

Something nVidia fans keep brushing aside is high-IQ performance. If you're going to pay $750 for a card, you're going to want to turn on TRSSAA/HQ AF all the time. When you compare the GTX to the XT at those settings, the difference really isn't that large - at least not large enough to justify the $150 premium.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Matt2
Also, I think Nvidia's AA looks a lot better than ATI's. Just look at the IQ comparison in the BFG 7800GT SLI review at Rage3D.

That depends on the monitor you're using. On an LCD, NV's 2x and 4x AA look smooth while ATi's looks like a chain link on every polygon edge. On a CRT, ATi's AA looks crisp and clean, while NV's looks like a smudged mess.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Because it's the monster and a kick in ATI's already self-kicked face.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
HardOCP is using the highest playable settings, which means they're not comparing the two cards fairly.
Anandtech and FiringSquad show the reality of the two cards.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Who are you trying to convince anyways? Us or you?

As far as I'm concerned, the performance difference between the X1800XT and the 7800 512 is the same as between the 7800GT and the 7800GTX 256, thus the $100 (speaking MSRP) difference between the X1800XT and the 7800 512 is warranted. A new tier in performance equals a new tier in price. I personally think that anyone willing to spend $600 on an X1800XT is selling themselves short when you can get a 7800GTX 512 for $650 (again, speaking MSRP).

The price of the 7800GTX 512 will fall according to the price of the X1800XT.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
If you look at HardOCP's review, the performance delta between the two at high IQ settings (e.g. HQ AF/TRSSAA vs HQ AF/Adaptive AA) is minimal, yet the X1800XT is $150 cheaper
HardOCP's testing methodology is flawed for multiple reasons. Instead of testing apples vs apples benchmarks at a range of settings they simply present their opinion of what is playable. That's great and all but it severely restricts their ability to accurately test the cards.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: moonboy403
HardOCP is using the highest playable settings, which means they're not comparing the two cards fairly.
Anandtech and FiringSquad show the reality of the two cards.


Well, look at their BF2 numbers: both use high IQ settings and you see the XT has performance parity with the 512 GTX. The problem with a lot of reviews is that they use standard AA/AF for performance figures on top-end cards, when they should supplement those reviews with max IQ settings, since that's what you should expect to play with on such expensive cards.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: BFG10K
If you look at HardOCP's review, the performance delta between the two at high IQ settings (e.g. HQ AF/TRSSAA vs HQ AF/Adaptive AA) is minimal, yet the X1800XT is $150 cheaper
HardOCP's testing methodology is flawed for multiple reasons. Instead of testing apples vs apples benchmarks at a range of settings they simply present their opinion of what is playable. That's great and all but it severely restricts their ability to accurately test the cards.


Oh I agree, I'm not a huge fan of their methodology either. However, their review is one of the few that includes TRSSAA and Adaptive AA from what I've seen. I'd mention DriverHeaven, but then a lot of people would cry bias even though their reviews tend to be really good.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
Originally posted by: 5150Joker
Originally posted by: VIAN
Because it's the monster and a kick in ATI's already self-kicked face.

Yeah a $150 price premium monster.

If you're spending $600 on a video card already, what's another $150?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Matt2
Who are you trying to convince anyways? Us or you?

As far as I'm concerned, the performance difference between the X1800XT and the 7800 512 is the same as between the 7800GT and the 7800GTX 256, thus the $100 (speaking MSRP) difference between the X1800XT and the 7800 512 is warranted. A new tier in performance equals a new tier in price. I personally think that anyone willing to spend $600 on an X1800XT is selling themselves short when you can get a 7800GTX 512 for $650 (again, speaking MSRP).

The price of the 7800GTX 512 will fall according to the price of the X1800XT.


I'm not out to convince anyone but I am voicing my objective opinion.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rage187
Originally posted by: 5150Joker
Originally posted by: VIAN
Because it's the monster and a kick in ATI's already self-kicked face.

Yeah a $150 price premium monster.

If you're spending $600 on a video card already, what's another $150?

An additional 25% isn't a small amount. While we're at it, why not raise it another 50%? I mean, what's another few hundred dollars, right?
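The 25% figure is just the premium as a fraction of the X1800XT price quoted in this thread ($150 on top of $600), with the rhetorical extra 50% thrown in for scale:

```python
# Price premium math using the prices quoted in the thread.
xt_price, premium = 600, 150
gtx512_price = xt_price + premium          # $750
pct = premium / xt_price * 100             # 25% over the X1800XT
print(f"${gtx512_price} total, {pct:.0f}% over the X1800XT")

# The rhetorical "why not another 50%?" on top of that:
print(f"+50% more would be ${gtx512_price * 1.5:.0f}")
```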
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Oh I agree, I'm not a huge fan of their methodology either
Then why start a thread based on their faulty data?

However, their review is one of the few that includes TRSSAA and Adaptive AA from what I've seen.
That's great, but it doesn't change the fundamental issue at hand. Middling resolutions coupled with shifting goalposts don't really show much.
 

rcabor

Member
Nov 2, 2004
73
0
0
Originally posted by: Matt2
I don't consider the difference minimal; I consider it a spanking by Nvidia.

Besides, X1800XT was $699.99 on day of launch, just give it a couple of weeks.

I paid $599 on launch day.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: BFG10K
Oh I agree, I'm not a huge fan of their methodology either
Then why start a thread based on their faulty data?

However, their review is one of the few that includes TRSSAA and Adaptive AA from what I've seen.
That's great, but it doesn't change the fundamental issue at hand. Middling resolutions coupled with shifting goalposts don't really show much.


There is nothing inherently faulty about their data; I said I don't like their methods, that's all. You pretty much answered your own question with the second quote, though. I neglected to mention DriverHeaven because nVidia fans cry bias whenever that site is brought up. But just for the sake of argument:

CoD 2 1280x1024:

X1800XT using 4xAAA/16xHQ: 26 min, 36 avg, 57 max
7800GTX using 4xTRSSAA/16xHQ: 24 min, 34 avg, 44 max


CS: Source: 1600x1200

X1800XT using 4xAAA/16xHQ: 28 min, 115 avg, 271 max
7800GTX using 4xTRSSAA/16xHQ: 45 min, 136 avg, 170 max (nVidia kicked ass w/min).

Tiger Woods: 1600x1200

X1800XT using 4xAAA/16xHQ: min 16, avg 26, max 59
7800GTX using 4xTRSSAA/16xHQ: min 13, avg 26, max 47

HardOCP BF2 Results (apples to apples) 1600x1200:

X1800XT using 4xAAA/16xHQ: min 28, avg 58.6, max 93
7800 GTX using 4xTRSSAA/16xHQ: min 37, avg 58.3, max 90

Feel free to point to more reviews that show high IQ comparisons; the more the better. Like I keep saying, for video cards of this caliber and price, max IQ settings are what should be considered the true benchmark.