Try to sway me to ATI...

Page 2

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Gamingphreek
As of right now, there is no reason to buy an X1800XL. The 7800GT outclasses it... not by much, but it still beats it in most cases.

As for high end: well, you have to give ATI the nod. They just KILL with AA and AF enabled. While Nvidia arguably maintains min framerates almost as well, when comparing max and avg, ATI simply runs away with the benchmark. So for the high end, it is a tossup with the nod towards ATI for the better IQ and much better AA performance. Reasons to get Nvidia: killer performance with no or minimal AA and AF, lower power consumption, and I guess you could argue SLI.

Purevideo and AVIVO are merely extras, with the nod to purevideo right now until ATI finalizes their drivers.

Performance:
Mid-High: 7800GT>X1800XL
High: X1800XTPE>=7800GT

IQ:
ATI>Nvidia

Power/Heat:
Nvidia>ATI

Pure Performance (No AA and AF)
Nvidia>ATI

Quality Performance (AA and AF)
7800GT>X1800XL
X1800XTPE>7800GTX

Purevideo vs AVIVO (As of right now, when ATI releases new drivers it may well change):
Purevideo>AVIVO

-Kevin

Another thing to consider is that the X1800XL claims it can do external HDR + AA... That is something to consider if it is true, and it would affect my purchase.

Nvidia can do the same in all titles that are not EXR HDR. ATi can't even do EXR HDR as far as I know.

You sure? As far as I know, they can. But, I could be wrong.
 

Gamingphreek

Lifer
Mar 31, 2003
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Gamingphreek
As of right now, there is no reason to buy an X1800XL. The 7800GT outclasses it... not by much, but it still beats it in most cases.

As for high end: well, you have to give ATI the nod. They just KILL with AA and AF enabled. While Nvidia arguably maintains min framerates almost as well, when comparing max and avg, ATI simply runs away with the benchmark. So for the high end, it is a tossup with the nod towards ATI for the better IQ and much better AA performance. Reasons to get Nvidia: killer performance with no or minimal AA and AF, lower power consumption, and I guess you could argue SLI.

Purevideo and AVIVO are merely extras, with the nod to purevideo right now until ATI finalizes their drivers.

Performance:
Mid-High: 7800GT>X1800XL
High: X1800XTPE>=7800GT

IQ:
ATI>Nvidia

Power/Heat:
Nvidia>ATI

Pure Performance (No AA and AF)
Nvidia>ATI

Quality Performance (AA and AF)
7800GT>X1800XL
X1800XTPE>7800GTX

Purevideo vs AVIVO (As of right now, when ATI releases new drivers it may well change):
Purevideo>AVIVO

-Kevin

Another thing to consider is that the X1800XL claims it can do external HDR + AA... That is something to consider if it is true, and it would affect my purchase.

Nvidia can do the same in all titles that are not EXR HDR. ATi can't even do EXR HDR as far as I know.

Exactly. I had a thread on this a while back. The amount of logic required to enable HDR + AA outweighs the benefit. Neither ATI nor Nvidia can do OpenEXR HDR + AA.

With other methods, such as the one used in Lost Coast, both companies are able to run HDR + AA.

-Kevin
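As a rough illustration of what this capability question boils down to on the Direct3D 9 API of that era: OpenEXR-style HDR renders into an FP16 (A16B16G16R16F) target, and HDR + AA additionally requires that the same format accept multisampling. The sketch below is only a capability probe under those assumptions (default adapter, 4x sample count chosen arbitrarily; link against d3d9.lib), not anyone's claim in this thread.

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Probe the default adapter for the two capabilities debated above.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DDISPLAYMODE mode;
    d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

    // 1) Can we create an FP16 (16 bits per channel, floating-point) render target at all?
    HRESULT fp16rt = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, mode.Format,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);

    // 2) Can that same format also be multisampled, i.e. HDR + AA in one pass?
    HRESULT fp16msaa = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, NULL);

    std::printf("FP16 render target (OpenEXR-style HDR): %s\n", SUCCEEDED(fp16rt) ? "yes" : "no");
    std::printf("FP16 target + 4x MSAA (HDR + AA):       %s\n", SUCCEEDED(fp16msaa) ? "yes" : "no");

    d3d->Release();
    return 0;
}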
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: Gamingphreek
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Gamingphreek
As of right now, there is no reason to buy an X1800XL. The 7800GT outclasses it... not by much, but it still beats it in most cases.

As for high end: well, you have to give ATI the nod. They just KILL with AA and AF enabled. While Nvidia arguably maintains min framerates almost as well, when comparing max and avg, ATI simply runs away with the benchmark. So for the high end, it is a tossup with the nod towards ATI for the better IQ and much better AA performance. Reasons to get Nvidia: killer performance with no or minimal AA and AF, lower power consumption, and I guess you could argue SLI.

Purevideo and AVIVO are merely extras, with the nod to purevideo right now until ATI finalizes their drivers.

Performance:
Mid-High: 7800GT>X1800XL
High: X1800XTPE>=7800GT

IQ:
ATI>Nvidia

Power/Heat:
Nvidia>ATI

Pure Performance (No AA and AF)
Nvidia>ATI

Quality Performance (AA and AF)
7800GT>X1800XL
X1800XTPE>7800GTX

Purevideo vs AVIVO (As of right now, when ATI releases new drivers it may well change):
Purevideo>AVIVO

-Kevin

Another thing to consider is that the X1800XL claims it can do external HDR + AA... That is something to consider if it is true, and it would affect my purchase.

Nvidia can do the same in all titles that are not EXR HDR. ATi can't even do EXR HDR as far as I know.

Exactly. I had a thread on this a while back. The amount of logic required to enable HDR + AA outweighs the benefit. Neither ATI nor Nvidia can do OpenEXR HDR + AA.

With other methods, such as the one used in Lost Coast, both companies are able to run HDR + AA.

-Kevin

Good information, I'll have to update the brain. It's getting difficult to stay on top of this stuff when rumors stated as fact are listed under every new thread. In light of this new evidence, I would say the X1800XL has no place in your rig; go with the 7800 GT instead.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Gamingphreek
Neither ATI nor Nvidia can do OpenEXR HDR + AA.
According to the official specs, the x1k cards can do EXR HDR + AA. But none of the games right now allow AA with HDR running, so we'll have to wait and see how it works.

With other methods, such as the one used in Lost Coast, both companies are able to run HDR + AA.

Yeah, when the developers, instead of taking the lazy way, manually create shaders to simulate HDR. It works just as well for the most part, except in some cases, like when doing refractions. And the primitive x850 could also run HDR the same way, but somehow I don't remember any of the SM3 supporters mentioning that last year.

 

Gamingphreek

Lifer
Mar 31, 2003
According to the official specs, the x1k cards can do EXR HDR + AA. But none of the games right now allow AA with HDR running, so we'll have to wait and see how it works.

Links? AFAIK that is false.

Yeah, when the developers, instead of taking the lazy way, manually create shaders to simulate HDR. It works just as well for the most part, except in some cases, like when doing refractions. And the primitive x850 could also run HDR the same way, but somehow I don't remember any of the SM3 supporters mentioning that last year.

What does SM3 have to do with HDR?? Actually, that is backwards. OpenEXR is the hard way. The Lost Coast is the easy way, which has slightly worse IQ. The advantage is that you are not using FP blending, so you do not need FP32 support (X8 series).

-Kevin
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Gamingphreek
According to the official specs, the x1k cards can do EXR HDR + AA. But none of the games right now allow AA with HDR running, so we'll have to wait and see how it works.

Links? AFAIK that is false.

http://www.anandtech.com/video/showdoc.aspx?i=2552&p=2
"Antialiasing supported on MRT and fp16 output "

http://www.xbitlabs.com/articles/video/display/radeon-x1000_7.html
" The RADEON X1000 GPUs also allows you to use HDR along with full-screen antialiasing"

http://www.techreport.com/reviews/2005q4/radeon-x1000/index.x?pg=2
"HDR support with AA - R520 and the gang can do filtering and blending of 16-bit per color channel floating-point texture formats, allowing for easier use of high-dynamic-range lighting effects, just like the GeForce 6 and 7 series. Unlike NVIDIA's GPUs, the R500 series can also do multisampled antialiasing at the same time, complete with gamma-correct blends"

Yeah, when the developers, instead of taking the lazy way, manually create shaders to simulate HDR. It works just as well for the most part, except in some cases, like when doing refractions. And the primitive x850 could also run HDR the same way, but somehow I don't remember any of the SM3 supporters mentioning that last year.

What does SM3 have to do with HDR?? Actually, that is backwards. OpenEXR is the hard way. The Lost Coast is the easy way, which has slightly worse IQ. The advantage is that you are not using FP blending, so you do not need FP32 support (X8 series).

AFAIK, EXR is the easier way from a coding perspective, because you can rely on the FP16 buffer to store HDR color values. Valve said they tried 4 different HDR models before finding one that works well enough, and one of their goals was to have HDR + AA. You can't do that with the FP buffer, so they had to implement their own way - I don't see how this is easier. It does have the advantage of running on more hardware, and allowing for AA, but unless other developers also implement their own HDR model instead of using the FP buffer, don't expect to run HDR + AA in any game not based on the Source engine.
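To make the contrast concrete, here is a minimal CPU-side sketch of the two storage strategies being described. The Reinhard-style curve is purely an assumed example for illustration, not Valve's actual shader code.

#include <algorithm>
#include <cstdint>
#include <cstdio>

// (a) FP16-style path: the render target itself holds values above 1.0,
//     so tone mapping can happen later as a post-process.
struct Fp16Texel { float r, g, b, a; };   // stand-in for a half-float texel

// (b) "Roll your own" path: compress the value into 0..255 before it reaches
//     an ordinary integer render target (which multisampling already handles).
std::uint8_t compress_to_ldr(float hdr)
{
    float mapped = hdr / (1.0f + hdr);   // simple Reinhard-style curve, illustration only
    return static_cast<std::uint8_t>(std::min(mapped, 1.0f) * 255.0f + 0.5f);
}

int main()
{
    float sun = 4.0f;                          // an HDR intensity well above 1.0
    Fp16Texel fp = { sun, sun, sun, 1.0f };    // path (a): stored as-is
    std::uint8_t ldr = compress_to_ldr(sun);   // path (b): stored range-compressed
    std::printf("FP16-style path keeps %.1f; integer path stores %u/255\n",
                fp.r, static_cast<unsigned>(ldr));
    return 0;
}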
 

n7

Elite Member
Jan 4, 2004
What's the rest of your system?

The 7800GT is currently the best-value higher-end video card out there, so I won't try to convince you to buy ATi.

Now if you were comparing an X800XL vs. 6800GT, that's another story.
 

Duvie

Elite Member
Feb 5, 2001
Read the reviews and keep the pundits from both sides from starting a flame war... Everything you need to know has already been written...
 

Gamingphreek

Lifer
Mar 31, 2003
"HDR support with AA - R520 and the gang can do filtering and blending of 16-bit per color channel floating-point texture formats, allowing for easier use of high-dynamic-range lighting effects, just like the GeForce 6 and 7 series. Unlike NVIDIA's GPUs, the R500 series can also do multisampled antialiasing at the same time, complete with gamma-correct blends"

Yes, but isn't the EXR standard 32-bit blending rather than 16-bit (hence the reason the 6 and 7 series could run EXR but not the X8 series)? So they still cannot execute AA with HDR when running Far Cry.

-Kevin
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Gamingphreek
"HDR support with AA - R520 and the gang can do filtering and blending of 16-bit per color channel floating-point texture formats, allowing for easier use of high-dynamic-range lighting effects, just like the GeForce 6 and 7 series. Unlike NVIDIA's GPUs, the R500 series can also do multisampled antialiasing at the same time, complete with gamma-correct blends"

Yes, but isn't the EXR standard 32-bit blending rather than 16-bit (hence the reason the 6 and 7 series could run EXR but not the X8 series)? So they still cannot execute AA with HDR when running Far Cry.

-Kevin


www.openexr.com
"Support for 16-bit floating-point, 32-bit floating-point, and 32-bit integer pixels."

Apparently it is both 16-bit and 32-bit.

http://www.nvidia.com/object/feature_HPeffects.html

Nvidia claims support for 64-bit textures, which adds up to 16 x 4 (RGBA).
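As a quick check of that arithmetic (bits per channel times the four RGBA channels):

#include <cstdio>

int main()
{
    const int channels = 4;                                  // R, G, B, A
    std::printf("FP16 texel: %d bits\n", 16 * channels);     // the "64-bit textures" figure
    std::printf("FP32 texel: %d bits\n", 32 * channels);     // the 128-bit equivalent
    return 0;
}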
 

xtknight

Elite Member
Oct 15, 2004
The new ATI X1000s can do OpenEXR HDR (from fp16 to fp64) with both multisampling and supersampling, and so can run Far Cry with HDR+AA. The NVIDIA 7 series can not. Don't call the pixel shader emulation methods HDR, because they are not official HDR. I believe Valve calls that LDR. And that mode just emulates bloom. No tone mapping...
 

imported_Dhaval00

Senior member
Jul 23, 2004
Most of you guys forgot to mention the alleged "super-overclockability" of the new ATI cards. If you are not desperate to finalize your build yet, I'd say wait for ATI to release their cards next month, observe the performance, and then build your system.

Remember, 90nm > 110nm. There will be driver improvements, and there will be newer games. If you are going to air-cool your system efficiently, I'd say wait for someone to officially report how well the 90nm cards perform. You don't want to regret building a rig 6 months down the road only because you were desperate to get an nVIDIA card and didn't want to wait a month for an ATI part.

Also, there have been RUMORS about a better implementation of HyperMemory in ATI's X1K series cards. If you could get an X1800XL with super-overclockability and better HyperMemory usage, it could be a pearl of a product. There might even be a 512MB module for X1800XL @ $449 that kicks nVIDIA 7800GT's butt.

Here is a story from our "rumored source," TheInquirer:

"Single X1800XT card can beat SLI - Finnish overclocker Maki managed to overclock the card to 12278 in 3DMark05. He overclocked the card from a default 625MHz core/1500MHz memory to a magnificent 877.5MHz core and 1980MHz memory. He used dry ice, and he cooled the FX-57 to 3617.5MHz to reach this score. This score is higher than two normal, non-overclocked GeForce 7800GTXs in SLI, as we tried to play with those two cards here. This little experiment shows one totally unexplored part of the R520 chip, as it can be overclocked sky high. Maybe this can help you make a buying decision. We've even heard you can reach 10000+ in 3DMark05 with normal air cooling. Stay tuned, we're trying to do just that."

They have a picture @ http://theinquirer.net/?article=26818

Right now, you are in a moral dilemma. I've been there, done that. You can't wait, but you must wait. Fudging Technology makes me speak NERDY. I can't sleep, and I can't swallow.

Hope this helps to improve your situation a bit. I'd say wait for a month and a half to finalize your purchase.

 

xtknight

Elite Member
Oct 15, 2004
Originally posted by: Dhaval00
Remember, 90nm > 110nm.

If anything, 90nm is less than 110nm. That doesn't have anything to do with it. I'm not trying to come off as rude, but seriously, it doesn't mean a thing. Power and thus heat can still be leaked, so the smaller process in itself doesn't mean a thing. Maybe, clock for clock and assuming the efficiency is the same, the 90nm part can clock higher, but that doesn't matter either. There are pipelines and everything else that factor into that. The only thing we can be sure of is that 90nm takes less of a wafer than 110nm.
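For what it's worth, the wafer-area point really is just arithmetic: die area scales roughly with the square of the linear feature size, ignoring pad area, yield, and edge effects. A back-of-envelope sketch under those assumptions:

#include <cstdio>

int main()
{
    const double old_node = 110.0;   // nm
    const double new_node = 90.0;    // nm
    // Linear shrink squared gives the approximate area ratio for the same design.
    const double area_ratio = (new_node / old_node) * (new_node / old_node);
    std::printf("Same design at 90nm: ~%.0f%% of its 110nm area\n", area_ratio * 100.0);
    std::printf("Roughly %.1fx as many candidate dice per wafer\n", 1.0 / area_ratio);
    return 0;
}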
 

Gamingphreek

Lifer
Mar 31, 2003
Here is a story from our "rumored source," TheInquirer:

"Single X1800XT card can beat SLI - Finnish overclocker Maki managed to overclock the card to 12278 in 3DMark05. He overclocked the card from a default 625MHz core/1500MHz memory to a magnificent 877.5MHz core and 1980MHz memory. He used dry ice, and he cooled the FX-57 to 3617.5MHz to reach this score. This score is higher than two normal, non-overclocked GeForce 7800GTXs in SLI, as we tried to play with those two cards here. This little experiment shows one totally unexplored part of the R520 chip, as it can be overclocked sky high. Maybe this can help you make a buying decision. We've even heard you can reach 10000+ in 3DMark05 with normal air cooling. Stay tuned, we're trying to do just that."

And it was proven that an overclocked 7800GTX (single card) could reach those same 3DMarks. Additionally, 3DMark scores do not necessarily translate into real-world performance.

and better HyperMemory usage

The XL doesn't use HyperMemory. That is for the low end models.

There will be driver improvements, and there will be newer games.

Wouldn't that apply to Nvidia too?

Most of you guys forgot to mention the alleged "super-overclockability" of the new ATI cards. If you are not desperate to finalize your build yet, I'd say wait for ATI to release their cards next month, observe the performance, and then build your system.

Last I checked, the X1800XL and XT were released a couple of days ago. Unless you are saying to wait for the low-end competitors, I have no idea what you are talking about.

There might even be a 512MB module for X1800XL @ $449 that kicks nVIDIA 7800GT's butt.

512MB will not sway the benchmarks like that.

Right now, you are in a moral dilemma.

What does this have to do with morals? Is choosing one company over the other immoral or something? :p

-Kevin
 

Duvie

Elite Member
Feb 5, 2001
Here is something I have noticed for the last 3-4 years... when it comes to drivers and driver updates, Nvidia cards usually get more of them and generate more performance increases, and often IQ increases...

That may be a double-edged sword...

On one side you can say they are selling us underperforming items at the start, or they are not optimised correctly to begin with (like perhaps ATi)...

On the other side you can say those easter eggs are a bonus for longer longevity...

I can see both sides, but when it comes to who may gain more over time I would root for Nvidia's Forceware drivers... They will have more dual-core driver updates, and that could be huge...
 

imported_Dhaval00

Senior member
Jul 23, 2004
Originally posted by: xtknight
Originally posted by: Dhaval00
Remember, 90nm > 110nm.

If anything, 90nm is less than 110nm. That doesn't have anything to do with it. I'm not trying to come off as rude, but seriously, it doesn't mean a thing. Power and thus heat can still be leaked, so the smaller process in itself doesn't mean a thing. Maybe, clock for clock and assuming the efficiency is the same, the 90nm part can clock higher, but that doesn't matter either. There are pipelines and everything else that factor into that. The only thing we can be sure of is that 90nm takes less of a wafer than 110nm.

Read some VLSI-related articles. You'll know what advantages you can gain by building a chip @ 90nm rather than @ 110nm. You are right to say that the current ATI chips may not have anything to do with it, but the smaller you go, the better implementation you can have for transistors, pipelines, etc. Of course that has nothing to do with the ATI chips, but my ultimate point was to make him (or her) wait until ATI released their cards.

Gamingphreek:
Let me see your source for a "single 7800GTX reaching a score of 12278." And I haven't claimed anything; I consistently said that the article appeared on TheInquirer, which has a history of reporting rumors. The comment about HyperMemory has appeared on quite a few forums, and it could be completely wrong. When I said "there will be driver improvements...," I meant it with both parties in perspective.

The X1800XT & X1600 cards make their launch on the 5th of November. Show me a source that is selling those cards currently (no pre-orders).

Again, I am no ATI fan-boy. My current system uses a 7800GT, and my Linux machine has an X800XT. The whole point, again, was to make this guy (or girl) wait, before he/she takes the leap.

And I said "a 512MB module MIGHT kick nVIDIA's butt." I never said "it will." Don't play with words; you are confusing the people who speak NERDY.

And surely it is a moral dilemma if you are a fan-boy. Do you think a person who loves ATI (or nVIDIA) will not feel guilty when he/she makes the switch for the first time? It is most definitely a right/wrong scenario. This is some deep analogy, but I'd rather not get into it - I need some sleep.
 

Gamingphreek

Lifer
Mar 31, 2003
Read some VLSI-related articles. You'll know what advantages you can gain by building a chip @ 90nm rather than @ 110nm. You are right to say that the current ATI chips may not have anything to do with it, but the smaller you go, the better implementation you can have for transistors, pipelines, etc. Of course that has nothing to do with the ATI chips, but my ultimate point was to make him (or her) wait until ATI released their cards.

Well, more chips per wafer, more die space, and lower power consumption, barring (inevitable) leakage problems. I don't need to read an article; I have already read plenty.

Let me see your source for a "single 7800GTX reaching a score of 12278."

OK: Link. I would say that is pretty damn close, wouldn't you? It can be found in this thread.

When I said "there will be driver improvements...," I meant it with both parties in perspective.

Then why did you tell him to wait? If you are waiting for a driver to be released before buying, you are completely wasting your time.

The X1800XT & X1600 cards make their launch on the 5th of November. Show me a source that is selling those cards currently (no pre-orders).

The X1800XL cards are already out. The X1800XT isn't yet, because (I'm not sure, but I would bet that) yields are still low.

And I said "a 512MB module MIGHT kick nVIDIA's butt." I never said "it will." Don't play with words; you are confusing the people who speak NERDY.

It would seem to me that you are the one who is twisting words.

There might even be a 512MB module for X1800XL @ $449 that kicks nVIDIA 7800GT's butt.
#1. I don't see a "might" in there.
#2. Don't try to BS this; that placement of the word "might" makes all the difference.

And surely it is a moral dilemma if you are a fan-boy. Do you think a person who loves ATI (or nVIDIA) will not feel guilty when he/she makes the switch for the first time? It is most definitely a right/wrong scenario. This is some deep analogy, but I'd rather not get into it - I need some sleep.

Video cards have nothing to do with morality or immorality. Fanboy or not, it is just being torn away from a company you have been loyal to. Moral/immoral would be something like pre-marital sex or not (if you are a strong Christian).

-Kevin
 

imported_Dhaval00

Senior member
Jul 23, 2004
I don't know what you are trying to do. You surely are an nVIDIA fanboy, aren't you? I have nothing against nVIDIA; heck, I made enough money on its stock. I said a bunch of times already in my posts that my whole point was to make the original party wait. He/She is not going to lose anything if he/she can afford to wait for a month. If desperate, go buy nVIDIA now.

We could go on and on about our lousy, pointless arguments, but the OP isn't going to gain anything.

Oh, also, you do need to read some Ethics books. You will then be able to differentiate between morality and immorality. It doesn't have to be just sex. 6 months down the road, I may conclude "I made a wrong choice, I made a mistake." That is more than enough to be called a dilemma, and a moral one, too.
 

ZobarStyl

Senior member
Mar 3, 2004
You can play the waiting game forever with ATi and purported shipping dates. I personally take them with a grain of salt. As for the "super-overclockability" of x1800XT, that's utter fanboyism. The sheer percentages speak for themselves...the link above has a GTX at 810 core from 430...vs the 625 stock XT that only hits 865 on dry ice (you do the math), or 685 (DriverHeaven) on stock cooling. Barely getting 10% out of a video card is not 'super overclockable'...in fact, it's almost substandard.
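Doing that math with the clocks quoted in this thread (the 877.5MHz dry-ice figure comes from the Inquirer piece above, so the percentages are only as trustworthy as those reports):

#include <cstdio>

// Percentage gain of an overclock relative to the stock clock.
static double gain_percent(double stock_mhz, double oc_mhz)
{
    return (oc_mhz / stock_mhz - 1.0) * 100.0;
}

int main()
{
    std::printf("7800GTX core, 430 -> 810 MHz:             +%.0f%%\n", gain_percent(430.0, 810.0));
    std::printf("X1800XT core, 625 -> 877.5 MHz (dry ice): +%.0f%%\n", gain_percent(625.0, 877.5));
    std::printf("X1800XT core, 625 -> 685 MHz (stock air): +%.0f%%\n", gain_percent(625.0, 685.0));
    return 0;
}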
 

Gamingphreek

Lifer
Mar 31, 2003
It doesn't have to be just sex.

Can you say EXAMPLE?! :roll: Did you want me to list each and every thing that was moral or immoral?

Why am I an Nvidia fanboy? I merely proved each and every one of your points wrong.

-Kevin
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: munky
Originally posted by: Acanthus
Originally posted by: ArchAngel777
Originally posted by: Gamingphreek
As of right now, there is no reason to buy an X1800XL. The 7800GT outclasses it... not by much, but it still beats it in most cases.

As for high end: well, you have to give ATI the nod. They just KILL with AA and AF enabled. While Nvidia arguably maintains min framerates almost as well, when comparing max and avg, ATI simply runs away with the benchmark. So for the high end, it is a tossup with the nod towards ATI for the better IQ and much better AA performance. Reasons to get Nvidia: killer performance with no or minimal AA and AF, lower power consumption, and I guess you could argue SLI.

Purevideo and AVIVO are merely extras, with the nod to purevideo right now until ATI finalizes their drivers.

Performance:
Mid-High: 7800GT>X1800XL
High: X1800XTPE>=7800GT

IQ:
ATI>Nvidia

Power/Heat:
Nvidia>ATI

Pure Performance (No AA and AF)
Nvidia>ATI

Quality Performance (AA and AF)
7800GT>X1800XL
X1800XTPE>7800GTX

Purevideo vs AVIVO (As of right now, when ATI releases new drivers it may well change):
Purevideo>AVIVO

-Kevin

Another thing to consider is that the X1800XL claims it can do external HDR + AA... That is something to consider if it is true, and it would affect my purchase.

Nvidia can do the same in all titles that are not EXR HDR. ATi can't even do EXR HDR as far as I know.

You mean the same way that an x850 could do HDR as long as it's not the EXR method? With AA too.

Yup, and Nvidia can do AA + HDR if it's not EXR HDR.
 

Acanthus

Lifer
Aug 28, 2001
Originally posted by: Gamingphreek
"HDR support with AA - R520 and the gang can do filtering and blending of 16-bit per color channel floating-point texture formats, allowing for easier use of high-dynamic-range lighting effects, just like the GeForce 6 and 7 series. Unlike NVIDIA's GPUs, the R500 series can also do multisampled antialiasing at the same time, complete with gamma-correct blends"

Yes, but isn't the EXR standard 32-bit blending rather than 16-bit (hence the reason the 6 and 7 series could run EXR but not the X8 series)? So they still cannot execute AA with HDR when running Far Cry.

-Kevin

I believe Far Cry doesn't support 16-bit blending for its EXR HDR, so it would not be compatible with the X1800 for HDR without a patch.
 

moonboy403

Golden Member
Aug 18, 2004
Although I'm an nVidia guy myself (for now),
just look at the benchmarks of the 7800 GTX and X1800 XT and see for yourself.
It makes me wanna buy the X1800 XT.
lol