
Why are people going crazy over the 512 MB GTX?

The best style of benchmarking by far is done by The Tech Report. Other sites just put up average fps, when in fact the framerate could be dipping into the 20s and the average is distorted by 100+ fps stretches where the reviewer is staring at a wall. What The Tech Report does is record a 60-second FRAPS demo during heavy gameplay and then post the average fps as well as the low median fps. This not only tells you how fast the card is when staring at a wall, but also gives a good comparison of the worst-case scenario. My 7800 GT can hit 100+ in CoD2 when nothing is going on (throwing off the average), but that doesn't mean $hit when the smoke and explosions go off and it dips into the 20s. Read this review of the 7800 GTX 512: http://techreport.com/reviews/2005q4/geforce-7800gtx-512/index.x?pg=2
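Not part of the original post, but to make the point concrete: a minimal Python sketch (using made-up frame times rather than a real FRAPS capture) of the kind of summary that method produces - the overall average fps plus a "low" fps figure taken from only the slowest frames, which is what actually exposes the dips a plain average hides.

```python
# Illustrative only: summarize a frame-time capture (e.g. a 60-second
# FRAPS run) as average fps plus a "worst case" low fps.
# The frame times below are invented for the example.

frame_times_ms = [8.5, 9.1, 10.2, 35.0, 41.7, 12.3, 9.8, 38.2, 11.0, 9.4]

def avg_fps(times_ms):
    # Average fps over the whole run: total frames / total seconds.
    return len(times_ms) / (sum(times_ms) / 1000.0)

def low_fps(times_ms, worst_fraction=0.1):
    # Average fps of only the slowest N% of frames - the dips that the
    # 100+ fps "staring at a wall" stretches hide in a plain average.
    worst = sorted(times_ms, reverse=True)
    n = max(1, int(len(worst) * worst_fraction))
    slowest = worst[:n]
    return len(slowest) / (sum(slowest) / 1000.0)

print(f"average fps:         {avg_fps(frame_times_ms):.1f}")
print(f"low (worst 10%) fps: {low_fps(frame_times_ms):.1f}")
```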
 
So now you're saying TRMSAA = AAA?
What on Earth are you talking about?

Sure it does, but not in the context of this thread
You mean not in the context of your trolling. You asked for "best possible IQ" but now you're telling me resolution doesn't count because it's in the wrong "context". Again it's nothing to do with "context", it's simply your childish trolling.

What relevance does that resolution have to the average gamer? None.
What relevance do either of these cards have to the average gamer? None.

Again, resolution has nothing to do with high IQ setting
If this isn't trolling then you quite simply must be an absolute idiot. There is no other way to describe your actions.

That's something you decided to throw in there along with midgrade AA/AF and decided to call it the highest IQ possible.
Excuse me? You're the one producing mid-range AA as evidence, not me. And you then have the gall to turn around and claim I'm the one that presented those settings?

Yeah most people that don't own a $750 card don't use those settings. However, the ones that do pay for it might want to use them don't ya think?
I see, so when it comes to high resolution it doesn't count because most people don't have it. But when it comes to $750 video cards it doesn't matter that most people don't have those because there are some that will use them?

How old are you? Are you over the age of say...twelve?

Again, why do you keep harping on resolution?
Because you demanded "best possible IQ" and resolution is integral to IQ.

If resolution is all that matters, why bother with AA/AF at all?
That's nothing more than a ridiculous strawman. You're the one that demanded "best possible IQ" but then started arguing the resolution doesn't count because it doesn't fit into your delusional idea of image quality.

1280x1024 is not "best possible IQ".
4xTrAA/4xAAA is not "best possible IQ".

This entire thread was constructed around the reasoning level of a six-year-old, and your arguments since have done nothing but follow that trend.

What's the point of a higher resolution with midgrade quality settings? Why even bother with a $750 card then?
You tell me - you're the one producing mid/low settings as "evidence" for your "high IQ" settings.

More opinion.
You think resolution increasing IQ is opinion? Again I'll ask are you trolling or are you just a clueless simpleton?

I see, so gamers that are actually going to buy this card and play it at the resolution and IQ settings they want don't matter.
Who says they want to play at your settings?

They have to be selective since there aren't very many benchmarks that use TRSSAA/AAA and HQ AF.
And? This has what to do with requesting "highest IQ settings" and being a blatant hypocrite afterward?

It is the highest IQ setting you can use to compare the two cards since nVidia lacks 6xTRSSAA.
No, the highest IQ setting you can compare is 2048x1536. Of course accepting this would mean you'd have to retract this entire trainwreck of a thread and retract your three cherry picked benchmarks.
 
Originally posted by: BFG10K
No, the highest IQ setting you can compare is 2048x1536. Of course accepting this would mean you'd have to retract this entire trainwreck of a thread and retract your three cherry picked benchmarks.

I can't believe I wasted time replying to you. Anyone with an iota of intelligence could understand exactly what I'm describing when I refer to high IQ since I clearly outline it - over and over. What's even funnier is that you started trolling this thread by turning it into a resolution argument and then try to claim I'm trolling. :roll: Furthermore, you resort to childish insults - a hallmark of defeat.
 
but you're asking too much out of this generation's GPU's if you consider both 1280X1024 and 4x TrAA/4xAAA to be low settings.
Unless you're running something very new like Fear or CoD2 then those settings are a cake-walk for those video cards.

Regardless, I'm not advocating those settings per se. What I advocate is best possible IQ while maintaining a playable framerate.

The premise of this thread was founded on some cherry-picked benchmarks running at a cherry-picked setting and "concluding" there's "no difference" between the cards.

What's worse is the OP demanded "best possible IQ" but then employed hypocritical double standards - going as far as claiming resolution has nothing to do with IQ - when it obviously didn't fit into his delusional idea of reality.

Anyone with half a brain can see that at playable settings a 7800 GTX 512 is significantly faster than a X1800 XT in most games.
 
I can't believe I wasted time replying to you.
The feeling's mutual, I assure you.

Anyone with an iota of intelligence could understand exactly what I'm describing when I refer to high IQ since I clearly outline it - over and over.
Yes, I think we can all see the hypocritical double standards you employed to create this farce of a thread.

What's even funnier is that you started trolling this thread by turning it into a resolution argument and then try to claim I'm trolling
Yes, because we all know resolution has nothing to do with IQ and therefore I must be trolling if I bring it up. :roll:

Furthermore, you resort to childish insults -
Curious minds want to know if this thread is the product of trolling or something else. I know I certainly do.
 
bfg what he is getting at is the "high quality" tab in the games, rather than best all around image quality....we all know rez will determine this the most, but to have them at the same res, and one on "high quality" and another on "medium quality", there will for the most part, be a huge difference there
 
Originally posted by: BFG10K
Regardless, I'm not advocating those settings per se. What I advocate is best possible IQ while maintaining a playable framerate.

That's hilarious, you advocate best possible IQ while maintaining playable framerates, something hardocp tests for, yet you blast them for having flawed testing. :roll:
I never demanded the "best possible IQ"; I reasoned that benchmarks do not show the whole picture by ignoring high IQ settings. Of course you, being who you are, decided to twist that, troll this thread, and flood it with personal attacks.
 
Originally posted by: Snakexor
bfg what he is getting at is the "high quality" tab in the games, rather than best all around image quality....we all know rez will determine this the most, but to have them at the same res, and one on "high quality" and another on "medium quality", there will for the most part, be a huge difference there


Trust me he knows that. He's trying to twist what I said into "best possible IQ" and a resolution argument. Then when he knows he's wrong, he resorts to childish insults (for which he was reported).
 
This thread could enter the Guinness World Records for most abuse of the quote button.

Seriously, it's day one. If you can hold your water for a little while, the price will drop.
 
5150Joker, the best image quality would be super high rez with full AA and AF... sure, these games would run at 3-5 FPS, but I bet a zillion dollars that the 7800GTX will be the one running at 5 FPS and the X1800XT will be the one running at 3... = 7800GTX512 pwns the X1800XT... stop trying to prove to everyone that the world is flat... we all know it's round!
 
Wow...another thread about this....we didn't need it.

5150, I've read this thread and it seems to be you saying all the time that the 7800 just sucks....you darn well look like a fanbiotch to me. This thread is really going nowhere, proving nothing other than that people here can create USELESS threads, and abuse the quote button....

This whole thread is flawed, simply because you want to bash ATi, when they are just as bad on launch day. If you want to bash them, wait for the prices to stabilize, then see if you can still come on here and post random crap.

Can we get a lock?
 
What will you be saying next... a Pentium D is better than an Athlon X2 because it costs $150 less? What you're saying here is just as crazy!

Why is ATI shitting itself if you're so right? I'll tell you why... because they know something that you don't!

100 websites say the 7800GTX512 BLOWS the hell out of the X1800XT and 2 of them say it doesn't...

And let's not all forget this great pro-ATI post of yours lol hehe

WAKE UP and read the zillion reviews that all say you're wrong and also have the people at the ATI PR department running around like headless chickens... give up already!
 
Wow, Nvidia went out for blood. When will this be available for laptops? 😀 Seriously, I have an XPS2 and it's friggin slow now. Still, nothing can beat my Voodoo5 6000!
 
Originally posted by: moonboy403
Hardocp is using the highest playable setting which means that they're not comparing the two cards fairly
anandtech and firingsquad shows the reality of the two cards

Hardocp shows what you're really getting by paying the premium. As a consumer, I'd like to know what it translates to. Sure I'll get better frames... but can I really crank the IQ up? So far it looks like it can run TR SSAA over TR MSAA on a single card. Which is about the only difference. That's what matters. That's what I'm going to base spending $$$ to upgrade on.
 
That's hilarious, you advocate best possible IQ while maintaining playable framerates, something hardocp tests for, yet you blast them for having flawed testing.
If you're going to compare anything it has to be done in a valid fashion. What I play at or don't play at has absolutely nothing to do with this.

I never demanded the "best possible IQ"
OMFG, this is beyond comical now. Have you now added lying to your repertoire?

Some direct quotes from you:
Like I keep saying, for video cards of this caliber and price, max IQ settings is what should be considered the true benchmark.
The numbers you listed do not use the highest IQ possible, way to miss the point.
Highest IQ settings possible and if you look at the numbers for 1280x1024, they break the >30 fps barrier for smooth play

You even go as far as to contradict yourself within the same sentence. Truly amazing. Most trolls would've given up by now but you just carry on.

bfg what he is getting at is the "high quality" tab in the games, rather than best all around image quality....we all know rez will determine this the most, but to have them at the same res, and one on "high quality" and another on "medium quality", there will for the most part, be a huge difference there
Trust me he knows that. He's trying to twist what I said into "best possible IQ" and a resolution argument. Then when he knows he's wrong, he resorts to childish insults (for which he was reported).
So tell me Joker, which games enable 4xTrAA/4xAAA when you click the high quality tab? And which games also automatically enable 1600x1200 as well?

I can't wait to hear your answer so that you can prove I'm "twisting" things.
 
Originally posted by: IntelHydralisk
Hardocp shows what you're really getting by paying the premium. As a consumer, I'd like to know what it translates to.

Their methodology is flawed from the beginning, as "best IQ" is subjective, as is "playability". I do appreciate their graphs showing min/max fps; however, they make that point moot by not making an apples-to-apples comparison.

Many people prefer lower res and higher AA/AF, or some combination thereof. Many people also "tweak" settings to be able to "dial in" their preference better. Having to limit yourself to their preference to accept their conclusion is ridiculous.

It's much more logical to determine a baseline from an "apples to apples" comparison, where consumers can then draw their own conclusions rather than those of the HOCP reviewers.

At its core, it's nothing more than them applying their own preferences.

 
Videoclone said it right: 5150 is trying to convince us that $750 is not worth it even though this is at least $50 above the Nvidia MSRP. It's the first release day; within 2 weeks it's 99% sure to fall at least to the MSRP, if not below it.

BFG is selling at 580/1750 MHz (core/memory)
XFX is selling at 580/1700 MHz

Anandtech's and Firingsquad's reviews are based on the default clocks of 550/1700, and at those speeds the 512MB GTX is clearly better than the X1800 XT. BFG's standard factory overclock would be even faster.

5150 picked a different AA sampling to compare, which isn't an identical comparison. Since Anandtech's benches put the X1800 XT in an inferior position and Joker refuses to list benches from AT, we'll try Firingsquad with identical AA.

With 4xAA 8xAF COD2 1280
GTX 512 = 42.7 (10.6% increase)
X1800XT = 38.6

With 4xAA 8xAF COD2 1600
GTX 512 = 33.5 (8.4% increase)
X1800XT = 30.9

http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx_512mb/page13.asp

Half Life2 (same engine as CS source) 1600x1200 4xAA 8xAF
GTX 512 = 85.2 (21.3% increase)
X1800XT = 70.2

BF2 Results (apples to apples) 1600x1200: 4xAA 16xAF
GTX 512 = 54.7 (12.3% increase)
X1800XT = 48.7

Since 5150Joker likes hardocp so much:

What this means is that this cooler's (512MB GTX) loudest level is still only about as loud as the BFGTech GeForce 7800 GTX OC's and Radeon X1800 XT's FIRST level. The first and second fan speed levels of the 512 MB GeForce 7800 GTX are actually more quiet than the GeForce 7800 GTX cooler and the Radeon X1800 XT cooler.

The X1800 XT is 13 dB louder than the GTX 512 at full speed, and 7-8 dB louder at medium speed.
http://www.hardocp.com/article.html?art=ODg1LDEx
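(Not from the original post: the percentage figures above are just the GTX 512's frame rate divided by the XT's, minus one. A small sketch that reproduces them to within rounding, and also turns the 13 dB fan-noise gap into a rough perceived-loudness ratio using the common "+10 dB sounds about twice as loud" rule of thumb.)

```python
# Sketch only: recompute the gaps quoted above from the FiringSquad
# numbers, and approximate how much louder a 13 dB gap sounds.

results = {
    "CoD2 1280, 4xAA/8xAF":      (42.7, 38.6),
    "CoD2 1600, 4xAA/8xAF":      (33.5, 30.9),
    "HL2 1600x1200, 4xAA/8xAF":  (85.2, 70.2),
    "BF2 1600x1200, 4xAA/16xAF": (54.7, 48.7),
}

for name, (gtx512, x1800xt) in results.items():
    gain = (gtx512 / x1800xt - 1) * 100
    print(f"{name}: GTX 512 ahead by {gain:.1f}%")

db_gap = 13  # X1800 XT vs GTX 512 at full fan speed, from the figures above
loudness_ratio = 2 ** (db_gap / 10)  # rough psychoacoustic rule of thumb
print(f"13 dB louder is roughly {loudness_ratio:.1f}x as loud to the ear")
```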
 
The GTX 512 is a nice card but I can see why someone would be underwhelmed by its performance/price ratio right now. It beats the XT by 10-20% on average but costs 20-25% more. And the XT is a little more feature-rich and boasts somewhat better IQ. The GTX 512 has a nice cooler on it that is reportedly quieter than the XT's, however the GTX is more power hungry. So in the end, at its current pricing, I don't think it's the slam dunk everyone is making it out to be. This could change if the price goes down and the price of the XT remains the same. But I have a feeling the GTX 512 is always going to cost 20-25% more than an XT.
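To put that value argument in rough numbers (a quick sketch using the post's own ranges, not measured prices or fps): pairing the low ends of both ranges together and the high ends together, the GTX 512 comes out a little behind on performance per dollar either way.

```python
# Sketch only: relative performance per dollar implied by "10-20% faster
# but 20-25% more expensive". Figures are the post's ranges, not data.

for perf_gain, price_gain in [(0.10, 0.20), (0.20, 0.25)]:
    value = (1 + perf_gain) / (1 + price_gain)
    print(f"+{perf_gain:.0%} perf at +{price_gain:.0%} price "
          f"-> {value:.2f}x the fps per dollar of the X1800 XT")
```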
 
^_^ ATI better bring out the R580 sooner rather than later, before Nvidia start to slack off and feel like they have won the war... I like how these companies dance... and right now Nvidia is doing a really nice jig... although ATi can shake their arse with the best of them...

Videocards are so crazy these days.
 
When this gets down to the $550-600 range, I'm going to be all over it. 512MB of video RAM is for real, people; I've seen Call of Duty 2 tests where a 512MB X800 XL beats a 7800GTX 256. Nvidia is really flexing its muscle right now; the CEO said that their margins are going to hit 40% this quarter and beyond (that's insane). I don't want to flame - I owned a Radeon 8500 LE 128, a 9700 Pro, and an X800 XL - but the only thing the X1800 XT has going for it is the 512-bit programmable memory controller, as evidenced by the somewhat smaller penalty for AA over Nvidia, but Nvidia beats them with pure muscle. Plus they have much better drivers, better IQ, PureVideo is way better than whatever ATI came up with, SLI is now a proven and rock solid tech (where's missfire?), and their executives don't rip off the investors by engaging in insider trading. We'll see what ATI does with R580, but I gotta feeling that Nvidia is gonna hit them even harder next round.
 
I have to stand up for hardocp - their reviews aren't so good if you want to bash the opposition with meaningless numbers but they are very useful for those who want to buy cards to play games instead of all this e-penis measuring.

If I have card X, hardocp tells me what change going to card Y will make to my actual gameplay. If it's a small difference, then it doesn't matter how much faster some other site's timedemo says it runs; when I actually play the game, that's all the extra money will have bought me.
 
Originally posted by: videoclone
^_^ ATI better bring out the R580 sooner rather than later, before Nvidia start to slack off and feel like they have won the war... I like how these companies dance... and right now Nvidia is doing a really nice jig... although ATi can shake their arse with the best of them...

Videocards are so crazy these days.

This here is the best post in this thread... I would rather read it over and over again than have to read this thread again. He made the ONLY logical point...
 
Originally posted by: BFG10K
Again it's nothing to do with "context", it's simply your childish trolling.

If this isn't trolling then you quite simply must be an absolute idiot. There is no other way to describe your actions.

How old are you? Are you over the age of say...twelve?

This entire thread was constructed around the reasoning level of a six-year-old, and your arguments since have done nothing but follow that trend.

You think resolution increasing IQ is opinion? Again I'll ask are you trolling or are you just a clueless simpleton?

BFG, you could have made your points in this thread without totally disregarding the forum rules and resorting to name calling.

Whether you agree with 5150Joker or not, in my eyes a person slinging insults and calling another member names is in no position to claim the high ground on maturity.

You should have backed your opinions with citations from experts, rather than attacking the poster.

This is the kind of thing we don't need in the video forum.
 
All I gotta say is I'm glad I bought 2 of these 7800GTX 512 monsters. They are super quiet, I can't even hear them running. Furthermore, there is a significant difference over my old setup of SLI XFX 7800 GTX-OCs. If ATI gets a faster solution, whether it's one card or 2, I will go with it, but right now Nvidia is kicking ARSE. If people don't believe it, they might be the same people that like Intel CPUs over AMDs when it comes to gaming as well.
 
Originally posted by: munky
Originally posted by: Matt2
also, I think Nvidia's AA looks a lot better than ATI's. Just look at the IQ comparison in the BFG 7800GT SLI review at Rage3D.

That depends on the monitor you're using. On an LCD NV's 2x and 4x AA look smooth and Ati's looks like a chain link on every polygon edge. On a CRT Ati's AA looks crisp and clean, while Nv's looks like a smudged mess.

Totally and completely untrue. Why would you state such garbage? AA/TAA looks pristine on my CRT. It's a 19" Dell. I truly don't know how LCDs look, but it looks terrific on my CRT. 7800GTX.

 