
Why are people going crazy over the 512 MB GTX?

Originally posted by: 5150Joker
It's funny, now nVidia fans are sounding like ATi fans did on the X1800XT's release: "it just got released, just you wait, it will get cheaper, I promise!" Yeah, so will the X1800XT over time, and I doubt the 512 GTX will ever have price parity with the X1800XT, let alone be cheaper. For the cost, the 512 GTX is simply not worth the price of admission, especially since it brings nothing new technologically over the 256 MB version.
Q F T

Being honest is a double-edged sword. 😉

Originally posted by: 5150Joker
Something nVidia fans keep brushing aside is high IQ performance. If you're going to pay $750 for a card, you're going to want to turn on TRSSAA/HQ AF all the time. When you compare the GTX to the XT at those settings, the difference really isn't that large - at least not enough to justify the $150 premium.
Yep, I think bit-tech's review was one of the few that took that into account.
 
Not always the case, 5150Joker... The X1800XT just doesn't have the memory or fill rate to compete once Unreal 4 comes out... AA and AF won't be an option with that game unless you want a slideshow, and people buying a new high-end card want the best. And it's the 7800GTX 512.

24 pipelines at 550 MHz vs 16 pipelines at 625 MHz
1800 MHz RAM vs 1400 MHz

This is no competition; ATI's only hope is the R580.

5150Joker, STOP looking at that HardOCP review... ALL the other reviews show the 7800GTX 512MB obliterating the X1800XT in every game benchmark.

HardOCP is wrong and the zillion other sites are right... in this situation the majority WINS. PS: HardOCP are ATI fanboys! I've seen countless reviews from them that are completely different from other websites; they are full of crap!
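For reference, here is a rough back-of-the-envelope sketch of what those paper specs work out to. This is just arithmetic on the numbers quoted above, not data from any review, and it assumes "pipelines" means pixel pipelines, that both boards use a 256-bit memory bus, and that the quoted RAM speeds are effective (DDR) rates:

# Back-of-the-envelope paper specs from the figures quoted above.
# Assumptions: "pipelines" = pixel pipelines, 256-bit memory bus on both
# boards, and the RAM speeds are effective (DDR) rates.
cards = {
    "7800 GTX 512": {"pipes": 24, "core_mhz": 550, "mem_mhz_eff": 1800},
    "X1800 XT":     {"pipes": 16, "core_mhz": 625, "mem_mhz_eff": 1400},
}

BUS_WIDTH_BITS = 256  # assumed for both cards

for name, c in cards.items():
    fill_gpix = c["pipes"] * c["core_mhz"] / 1000                 # Gpixels/s
    bandwidth_gbs = c["mem_mhz_eff"] * BUS_WIDTH_BITS / 8 / 1000  # GB/s
    print(f"{name}: {fill_gpix:.1f} Gpixel/s fill, {bandwidth_gbs:.1f} GB/s bandwidth")

On those assumptions the GTX 512 works out to roughly 13.2 Gpixel/s and 57.6 GB/s against roughly 10.0 Gpixel/s and 44.8 GB/s for the XT, i.e. about 30% more raw fill rate and bandwidth on paper; whether that shows up in frame rates is what the benchmarks in this thread argue about.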
 
Originally posted by: videoclone
Not always the case, 5150Joker... The X1800XT just doesn't have the memory or fill rate to compete once Unreal 4 comes out... AA and AF won't be an option with that game unless you want a slideshow, and people buying a new high-end card want the best. And it's the 7800GTX 512.

24 pipelines at 550 MHz vs 16 pipelines at 625 MHz
1800 MHz RAM vs 1400 MHz

This is no competition; ATI's only hope is the R580.

5150Joker, STOP looking at that HardOCP review... ALL the other reviews show the 7800GTX 512MB obliterating the X1800XT in every game benchmark.

HardOCP is wrong and the zillion other sites are right... in this situation the majority WINS. PS: HardOCP are ATI fanboys! I've seen countless reviews from them that are completely different from other websites; they are full of crap!



lol... 😕
 
There is nothing inherently faulty about their data
Yes, there is. You can't use it to compare GPUs because it tends to be CPU limited and/or because it's not apples vs apples.

I neglected to mention driverheaven because the nVidia fans cry bias whenever that site is brought up.
Uh, why are you comparing ATi MSAA modes to nVidia SSAA modes? The two aren't even comparable!

For that matter, why are you cherry-picking games and settings where the difference isn't large (not to mention unplayable) and ignoring anything that contradicts your flawed view?

Doom 3, 2048x1536, 8xAF, 4xAA
7800: 61.8
X1800: 47.4
7800 = ~30% faster.

HL2, 1600x1200, 16xAF, HDR
7800: 62.0
X1800: 48.0
7800 = ~29% faster.

FEAR, 1280x960
7800: 82.7
X1800: 71.7
7800 = ~15% faster.

Click.

Like I keep saying, for video cards of this caliber and price, max IQ settings is what should be considered the true benchmark.
Yes, they should, but that doesn't mean you should do something loony like compare SSAA to MSAA.
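For clarity, the "% faster" figures above are just the ratio of the two cards' average frame rates; a minimal sketch, using only the numbers quoted in this post:

# How the "% faster" figures are derived: relative speedup of averages.
# The fps pairs are the ones quoted in the post above (7800, X1800).
benchmarks = {
    "Doom 3, 2048x1536, 8xAF, 4xAA": (61.8, 47.4),
    "HL2, 1600x1200, 16xAF, HDR": (62.0, 48.0),
    "FEAR, 1280x960": (82.7, 71.7),
}

for game, (fps_7800, fps_x1800) in benchmarks.items():
    speedup = fps_7800 / fps_x1800 - 1  # relative speedup of the 7800 over the X1800
    print(f"{game}: 7800 is {speedup:.0%} faster")

This prints roughly 30%, 29%, and 15% respectively.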
 
Although the 512 GTX is a fantastic card... I don't see the real need for it as of now.
I'll wait till more demanding games come out =P
 
No, we are not sounding like Nvidia fanboys; we are just basing our opinion on the Anandtech review of the card. Please, if you don't trust Anandtech and their ability to judge the performance of a video card, then why are you posting in this forum? Maybe you should head over to HardOCP.

I still remember when HardOCP reviewed the 6800 Ultra... lol, it had a 9800XT beating it in some tests and coming pretty close to even frame rates in others.

Thinking about it now, it just goes to show how sad HardOCP are and how much they love ATI. They even changed the way they benchmark products to make ATI look better.

6800 Ultra vs 9800XT... HAHA, what a joke, even a 6600GT blows it away... but HardOCP made the 9800XT look like a very competitive video card at the time.
 
Originally posted by: BFG10K
There is nothing inherently faulty about their data
Yes, there is. You can't use it to compare GPUs because it tends to be CPU limited and/or because it's not apples vs apples.

I neglected to mention driverheaven because the nVidia fans cry bias whenever that site is brought up.
Uh, why are you comparing ATi MSAA modes to nVidia SSAA modes? The two aren't even comparable!

For that matter, why are you cherry-picking games and settings where the difference isn't large (not to mention unplayable) and ignoring anything that contradicts your flawed view?

Doom 3, 2048x1536, 8xAF, 4xAA
7800: 61.8
X1800: 47.4
7800 = ~30% faster.

HL2, 1600x1200, 16xAF, HDR
7800: 62.0
X1800: 48.0
7800 = ~29% faster.

FEAR, 1280x960
7800: 82.7
X1800: 71.7
7800 = ~15% faster.

Click.

Like I keep saying, for video cards of this caliber and price, max IQ settings is what should be considered the true benchmark.
Yes, they should, but that doesn't mean you should do something loony like compare SSAA to MSAA.


I quoted numbers that used AAA vs TRSSAA; both are comparable even if they aren't exactly alike. 😕 The numbers you listed do not use the highest IQ possible; way to miss the point.
 
Originally posted by: videoclone
No, we are not sounding like Nvidia fanboys; we are just basing our opinion on the Anandtech review of the card. Please, if you don't trust Anandtech and their ability to judge the performance of a video card, then why are you posting in this forum? Maybe you should head over to HardOCP.

I still remember when HardOCP reviewed the 6800 Ultra... lol, it had a 9800XT beating it in some tests and coming pretty close to even frame rates in others.

Thinking about it now, it just goes to show how sad HardOCP are and how much they love ATI. They even changed the way they benchmark products to make ATI look better.

6800 Ultra vs 9800XT... HAHA, what a joke, even a 6600GT blows it away... but HardOCP made the 9800XT look like a very competitive video card at the time.


I just laughed at your post because typically it's ATi fans accusing them of being in love with nVidia. BTW guys, if the 512 GTX falls to the $600 range and is less than $100 more than the XT, I'll gladly take back what I said and declare the GTX worth the money. Right now price is playing a huge part in my dissatisfaction with the card for what it offers relative to the XT.
 
^_^ I got rid of my Gainward 6800GT GS clocked at 400/550 (AGP) and got an X800GTO PCI-E when I upgraded to an X2 3800. I'm not an NV fan, I'm an Anandtech FAN!! I go by their reviews a lot more than any other site, and like I said... when Unreal 4 comes out, the people who got the 7800GTX 512 will be a lot better off than those with X1800XT cards.
 
I agree, $750 for a card is absurd, especially given that the Xbox 360 launches in a week for $350 less but with much more capability. You do not need to spend $750 on a video card, period. I got my 6800NU last year for $200 at BF and it has been driving my 2405 wonderfully (which, by the way, should be on your list before the 7800 512). I'll just sit out this generation and laugh when my $300 GeForce 8 pwns this 🙂.
 
I'm going to do the same thing, touchmyichi... and I love my 2405FPW; the X800GTO runs it fine too... games at native rez 1920x1200. I'll be looking more for a DX10 card; if the GeForce 8600GT midrange card is any good it will be on my to-buy list ^_^ and it may be just as fast as the high-end cards we are talking about now!
 
I quoted numbers that used AAA vs TRSSAA, both are comparable.
Not necessarily.

The numbers you listed do not use the highest IQ possible, way to miss the point.
  • So testing a Voodoo1 at 800x600 compared to a modern card at 2048x1536 is valid because both cards are using their highest IQ possible?
  • You want highest possible IQ yet you produce benchmarks from middling/low resolutions like 1280x1024 and 1600x1200? If you want to troll, at least be consistent about it.
  • The highest settings usually produced slideshows in modern games, as is shown by your "useful" data. Wow, you've shown us one card can do 34 FPS and another can do 36 FPS. This means what exactly?
  • As for your BF2 edit, 1600x1200, 4xAA is 66.2 FPS vs 57.3; the nVidia card is ~16% faster, again using playable settings.
 
Originally posted by: BFG10K
I quoted numbers that used AAA vs TRSSAA, both are comparable.
Not necessarily.

The numbers you listed do not use the highest IQ possible, way to miss the point.
  • So testing a Voodoo1 at 800x600 compared to a modern card at 2048x1536 is valid because both cards are using their highest IQ possible?
  • You want highest possible IQ yet you produce benchmarks from middling/low resolutions like 1280x1024 and 1600x1200? If you want to troll, at least be consistent about it.
  • The highest settings usually produced slideshows in modern games, as is shown by your "useful" data. Wow, you've shown us one card can do 34 FPS and another can do 36 FPS. This means what exactly?
  • As for your BF2 edit, 1600x1200, 4xAA is 66.2 FPS vs 57.3; the nVidia card is ~16% faster, again using playable settings.
5150Joker = Owned !!! :beer::thumbsup:😉:thumbsup::beer: hmmm good stuff. This post is over!
 
Originally posted by: BFG10K
I quoted numbers that used AAA vs TRSSAA, both are comparable.
Not necessarily.

That's it? Why not? At least provide a reason.


So testing a Voodoo1 at 800x600 compared to a modern card at 2048x1536 is valid because both cards are using their highest IQ possible?

Seems to me you just confused yourself there, buddy. I'm talking about high IQ settings, not resolution; resolution will vary depending on how much the game taxes a video card, but you can keep the high IQ settings consistent. Speaking of which, who cares about 2048x1536? How many people do you know that game at that resolution or even have monitors capable of going that high? The majority of gamers (including enthusiasts) game at 1600x1200 or below. So taking numbers at 1600x1200 or below with high IQ settings is much more relevant than 2048x1536. :roll:

You want highest possible IQ yet you produce benchmarks from middling/low resolutions like 1280x1024 and 1600x1200? If you want to troll, at least be consistent about it.

Highest IQ settings possible and if you look at the numbers for 1280x1024, they break the >30 fps barrier for smooth play. Since when is 1600x1200 considered a mid-end resolution for the average gamer? I didn't know everyone had already upgraded to a Dell 2405-caliber monitor. 😕

The highest settings usually produced slideshows in modern games, as is shown by your "useful" data. Wow, you've shown us one card can do 34 FPS and another can do 36 FPS. This means what exactly?

CoD 2 is a taxing game and with the highest IQ settings it is still very much playable, that's what it means. What does using 2048x1536 mean to most gamers? Nothing.

As for your BF2 edit, 1600x1200, 4xAA is 66.2 FPS vs 57.3; the nVidia card is ~16% faster, again using playable settings.


Now either you're trolling or you're just slow. The BF2 results I quoted at 1600x1200 using high IQ proved my point that there isn't a large distinction between the two cards when using those settings, especially for the price. I never disputed that the GTX was faster (at highest IQ or not). Seems to me you're arguing with yourself.

Just to reiterate to you in layman's terms, since you can't seem to grasp the concept of what I'm discussing:

I want to see numbers comparing the two cards using TRSSAA/HQ AF vs AAA/HQ AF, and that is what I consider high IQ settings. Now, what is so hard to understand about that? Seems to me you completely ignored all my posts where I kept mentioning this and decided to start trolling, good job! :thumbsup:
 
Originally posted by: BFG10K
Oh I agree, I'm not a huge fan of their methodology either
Then why start a thread based on their faulty data?

However, their review is one of the few that includes trssaa and adaptive aa from what I've seen.
That's great but it doesn't change the fundamental issue at hand. Middling resolution settings coupled with shifting goal-posts doesn't really show much.


:thumbsup:
Your logic is overwhelming... hence why it seems to be flying over his head!!!!



5150... Nice, you can find 1 benchmark out of the many in the review to come up with the terrible conclusion known as this thread...
 
Originally posted by: Duvie
Originally posted by: BFG10K
Oh I agree, I'm not a huge fan of their methodology either
Then why start a thread based on their faulty data?

However, their review is one of the few that includes trssaa and adaptive aa from what I've seen.
That's great but it doesn't change the fundamental issue at hand. Middling resolution settings coupled with shifting goal-posts doesn't really show much.


:thumbsup:
Your logic is overwhelming... hence why it seems to be flying over his head!!!!



5150... Nice, you can find 1 benchmark out of the many in the review to come up with the terrible conclusion known as this thread...



There is no logic behind his post at all. He's provided benchmarks for settings I said were not representative of high IQ settings. As for one benchmark, that's the only apples-to-apples comparison I could find in their review. I did provide several more from driverheaven.
 
Originally posted by: 5150Joker
Originally posted by: Matt2
Who are you trying to convince anyways? Us or you?

As far as I'm concerned, the performance difference between the X1800XT vs 7800 512 is the same as the 7800GT vs 7800GTX 256, thus the $100 (speaking MSRP) difference between the X1800XT and the 7800 512 is warranted. New tier in performance equals new tier in price. I personally think that anyone willing to spend $600 for an X1800XT is selling themselves short when you can get a 7800GTX 512 for $650 (again, speaking MSRP).

The price of the 7800GTX 512 will fall according to the price of the X1800XT.


I'm not out to convince anyone but I am voicing my objective opinion.


LOL, had to laugh at
the Objective Opinion
 
Originally posted by: CalamitySymphony
Originally posted by: 5150Joker
Originally posted by: Matt2
Who are you trying to convince anyways? Us or you?

As far as I'm concerned, the performance difference between the X1800XT vs 7800 512 is the same as the 7800GT vs 7800GTX 256, thus the $100 (speaking MSRP) difference between the X1800XT and the 7800 512 is warranted. New tier in performance equals new tier in price. I personally think that anyone willing to spend $600 for an X1800XT is selling themselves short when you can get a 7800GTX 512 for $650 (again, speaking MSRP).

The price of the 7800GTX 512 will fall according to the price of the X1800XT.


I'm not out to convince anyone but I am voicing my objective opinion.


LOL, had to laugh at
the Objective Opinion


Where have I shown any bias? I've presented data to back up what I said and clearly outlined what I consider high IQ settings (which the majority of sites fail to use), although BFG decided to twist that into a resolution argument. Guess he considers midgrade AA + shimmering AF at high resolution to be high IQ.
 
That's it? Why not? At least provide a reason.
Because nVidia has two modes of TrAA.

I'm talking about high IQ settings, not resolution
Riiiiight, because we all know resolution has no impact on IQ.:roll:

Speaking of which, who cares about 2048x1536?
Anyone that cares about testing GPUs properly.

How many people do you know that game at that resolution or even have monitors capable of going that high? The majority of gamers (including enthusiast) game at 1600x1200 or below.
Oh goodness, not this crap again. "The cards are equal, except for the settings most people don't use, but those don't count so the cards are equal! Haha! You lose!"

This is typical childish reasoning, using an appeal to popularity logical fallacy in the face of hard facts.

So taking numbers at 1600x1200 or below with high IQ settings is much more relevant than 2048x1536.
That's your opinion. What isn't opinion is that low/middling resolutions are generally useless for testing video cards, regardless of what most people can or can't do.

Highest IQ settings possible and if you look at the numbers for 1280x1024, they break the >30 fps barrier for smooth play.
How the hell can a low resolution like 1280x1024 be "highest IQ settings possible"? And 30 FPS is a slideshow, especially if it's an average.

CoD 2 is a taxing game and with the highest IQ settings it is still very much playable,
No it isn't. Your settings aren't even "highest IQ" and even those settings aren't playable.

What does using 2048x1536 mean to most gamers? Nothing.
What it means to most gamers is irrelevant. What it means to testing GPUs accurately is very relevant.

The BF2 results I quoted at 1600x1200 using high IQ proved my point that there isn't a large distinction between the two cards
Your selective benchmarks simply show what you want to see. Look at any website and they'll back Anand's numbers.

It seems to me your definition of "best quality IQ" is 4xAAA/4xTrAA, which is quite laughable given it isn't even the best AA possible, much less the best possible IQ. You're simply trolling and cherry-picking benchmarks while being blatantly inconsistent with your own standards.
 
There is no logic behind his post at all. He's provided benchmarks for settings I said were not representative of high IQ settings.
And yours were?

1280x1024 is a low resolution.
4xTrAA/4xAAA is a low AA setting.

Again I'll ask, where are these "high IQ settings" you keep harping about?
 
BFG, no offense, but you're starting to sound a bit like BenSkywalker. I do agree with your point that 20xx by 15xx is an important and relevant way to test GPUs, but you're asking too much out of this generation's GPUs if you consider both 1280x1024 and 4xTrAA/4xAAA to be low settings. Put those two together and performance is... well, just plain low (or at least mediocre) 😉 Especially on anything but an SLI setup, and forget anything below a 7800 series in that case.

----------

And to the OP, I'm neither wearing panties at the moment, nor getting said hypothetical panties wet. And even if I were wearing something soft and silky, it would strictly be for comfort 😛 😉.
 
Originally posted by: videoclone
Not always the case, 5150Joker... The X1800XT just doesn't have the memory or fill rate to compete once Unreal 4 comes out... AA and AF won't be an option with that game unless you want a slideshow, and people buying a new high-end card want the best. And it's the 7800GTX 512.

24 pipelines at 550 MHz vs 16 pipelines at 625 MHz
1800 MHz RAM vs 1400 MHz

This is no competition; ATI's only hope is the R580.

5150Joker, STOP looking at that HardOCP review... ALL the other reviews show the 7800GTX 512MB obliterating the X1800XT in every game benchmark.

HardOCP is wrong and the zillion other sites are right... in this situation the majority WINS. PS: HardOCP are ATI fanboys! I've seen countless reviews from them that are completely different from other websites; they are full of crap!

Actually, memory bandwidth is the least of ATi's worries, because even with slower mem the X1800XT takes a smaller hit from enabling AA (don't remember which review it was, but it must have been either Anand's or TechReport's). The GTX does have it beat handily in fill rate, but that's just one part of the equation. The GTX does not beat the XT by such a margin that a game that wasn't playable on one would be playable on the other (assuming the game does not favor ATi or NV). Also, speaking of Unreal 3 (not 4), by the time it comes out the R580 might already be available.
 
Originally posted by: BFG10K
Because nVidia has two modes of TrAA.

So now you're saying TRMSAA = AAA? LOL.

Riiiiight, because we all know resolution has no impact on IQ.:roll:

Sure it does, but not in the context of this thread and the point that I've had to reiterate over and over, yet some people still fail to understand something so simple.

Anyone that cares about testing GPUs properly.

What relevance does that resolution have to the average gamer? None.

Oh goodness, not this crap again. "The cards are equal, except for the settings most people don't use, but those don't count so the cards are equal! Haha! You lose!"

Again, resolution has nothing to do with the high IQ settings that I've talked about (TRSSAA/AAA and HQ AF). That's something you decided to throw in there along with midgrade AA/AF and decided to call the highest IQ possible. Yeah, most people that don't own a $750 card don't use those settings. However, the ones that do pay for it might want to use them, don't ya think?


That's your opinion. What isn't opinion is that low/middling resolutions are generally useless for testing video cards, regardless of what most people can or can't do.

Again, why do you keep harping on resolution? If resolution is all that matters, why bother with AA/AF at all? After all, insanely high resolutions that most gamers don't have access to are the end-all, be-all of IQ according to you.

How the hell can a low resolution like 1280x1024 be "highest IQ settings possible"? And 30 FPS is a slideshow, especially if it's an average.

And that's your opinion. 30 fps is perfectly fine in CoD 2 and is far from a slideshow. What's the point of a higher resolution with midgrade quality settings? Why even bother with a $750 card then?

No it isn't. Your settings aren't even "highest IQ" and even those settings aren't playable.

More opinion.

What it means to most gamers is irrelevant. What it means to testing GPUs accurately is very relevant.

I see, so gamers that are actually going to buy this card and play it at the resolution and IQ settings they want don't matter. What matters is that the card scales to a resolution most people can't use? Nice logic.

Your selective benchmarks simply show what you want to see. Look at any website and they'll back Anand's numbers.

They have to be selective since there aren't very many benchmarks that use TRSSAA/AAA and HQ AF. I was very clear about this throughout the thread.

It seems to me your definition of "best quality IQ" is 4xAAA/4xTrAA, which is quite laughable given it isn't even the best AA possible, much less the best possible IQ. You're simply trolling and cherry-picking benchmarks while being blatantly inconsistent with your own standards.

It is the highest IQ setting you can use to compare the two cards, since nVidia lacks 6xTRSSAA. If I demanded the best IQ possible, nVidia would probably lose (performance-wise). If you're going to reply, at least use some logic and common sense. Going to go have dinner; I'm sure I'll see more circular arguments from you that have no relevance to this thread.
 
High rez is to show the POWER of a video card...

My X800GTO runs Quake 3 at 400 FPS at 800x600, and a 7800GTX 512 would run it at 400 FPS at that same rez too... so the only way to show how much more powerful the card can be is to go to crazy resolutions.

I played Half-Life 2 at 1280x1024 with 8xAA and 16xAF, and it looks 10 times better at 1920x1200 with no AA and no AF... same goes for BF2, it looks a lot better at 1920x1200 than at a lower rez with AF and AA ^_^ Rez comes first, then comes AA and AF; it will always be like this, get over it.
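The CPU-limit point being made here can be illustrated with a toy model: frame rate is capped by whichever of the CPU or GPU takes longer per frame, so at low resolutions two very different cards can post identical numbers. All per-frame costs below are made up purely for illustration:

# Toy model: frame rate is limited by the slower of the CPU and GPU per frame.
# All costs are hypothetical, chosen only to illustrate the point.
def fps(cpu_ms, gpu_ms):
    # the frame can finish no faster than the slower unit allows
    return 1000 / max(cpu_ms, gpu_ms)

CPU_MS = 2.5  # hypothetical CPU cost per frame, roughly constant across resolutions

# hypothetical GPU cost per frame (card A, card B) at two resolutions
gpu_cost = {"800x600": (1.0, 1.3), "2048x1536": (10.0, 13.0)}

for res, (a_ms, b_ms) in gpu_cost.items():
    print(f"{res}: card A {fps(CPU_MS, a_ms):.0f} fps, card B {fps(CPU_MS, b_ms):.0f} fps")

At 800x600 both hypothetical cards sit at the same CPU-limited 400 fps; only at the higher resolution does the faster card pull ahead, which is the point about needing crazy resolutions to show the difference.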
 