Firingsquad Publishes HDR + AA Performance Tests


ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Wow, way to go off topic lol... so now we need to bring CPUs into the discussion to prove ATI cards can't do HDR + AA with playable frame rates? After Wreckage is proven wrong (he was already, but heck) what's next? "But EVEN if you have all that, ATI cards can't do it smoothly with an Ageia PhysX card?" Oh c'mon... cut the crap already.

I can BET you will be the first to brag about how great the G80 performs with HDR + AA, even if it is equal to an X1900 :roll:
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
Originally posted by: Wreckage
You quoted an article that said there was not much difference between a $630 CPU and a $800 CPU. :roll:

So you don't think that a CPU affects gaming performance?

You are sooooo deep in denial, I guess that your imagination must get pretty good frame rates.
http://www.techreport.com/reviews/2006q3/core2/index.x?pg=5

Those benchmarks are run at 1024x768 and 800x600...OF COURSE they are CPU limited...you really try very hard to troll and derail a thread, don't you?
 

CelSnip

Member
Jun 27, 2006
188
0
0
Originally posted by: Wreckage
Originally posted by: CelSnip
He never said a CPU didn't affect gaming performance. Just that a $1000 CPU is completely unnecessary for a single card.

Actually he said....

Originally posted by: CelSnip
You do not need a $1000 CPU to push SLI or CF.

Which I never disputed. I was referring to single card performance. Although someone with a 1900 said it was unplayable for him and was ridiculed for it. No, no bias at all.

Originally posted by: thilan29
It didn't go below 25fps with HDR on at 1280x1024 with my lowly X1800XL...I DOUBT you would have to go to 800x600 to get playable framerates with an X1900...that's being a bit dramatic.

Your system outperformed a Radeon X1800 XT 512MB + FX-62 in the FS article.

Chances are he was playing at lower settings.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
Originally posted by: Wreckage
Originally posted by: thilan29
It didn't go below 25fps with HDR on at 1280x1024 with my lowly X1800XL...I DOUBT you would have to go to 800x600 to get playable framerates with an X1900...that's being a bit dramatic.

Your system outperformed a Radeon X1800 XT 512MB + FX-62 in the FS article.

And my X1800XL is overclocked to 675MHz/600MHz so you can't compare it to a stock X1800XT, as well as my CPU being overclocked. Furthermore it's guaranteed that the settings I used were different from what they used. I already mentioned that I turned self shadows off as that seemed to hit my fps noticeably.

Why do you even bother Wreckage?? Troll and derail is the order of the day huh??
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed
Originally posted by: Wreckage
You can use Transparency AA and SLI AA for superior image quality.

Have you even looked at SLI AA benchmarks? Let alone with TRAA added on top of it? It was far from playable for me, and ATi's SuperAA is much, much faster.

that's because SLI AA is better quality :confused:
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: thilan29
Why do you even bother Wreckage?? Troll and derail is the order of the day huh??
Instead of valid responses, you just cast more insults.

I have posted my views, with valid links and facts to back them up. I do not expect certain members of this forum to ever agree with me no matter what the reality is. If it were not the same handful of one-sided individuals I would feel the need to post more links and facts. But I know nothing will change that particular group's mindset.

So continue to call names, accuse me of working for someone, or just outright lie. This is just a hobby for me, while others appear to have more at stake than that.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:

Click

Now, here is what happened when I found those deer :) :

Click

I flubbed up and accidentally took out my torch :eek: but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.

4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync+Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-Texture mod, Natural Environment mod (many other mods, but none else that would affect frame performance, i.e. weapons and armor and such)

Did........um.......no one see.......those................clips..............

My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.
 
Jun 14, 2003
10,442
0
0
Originally posted by: ShadowOfMyself
Originally posted by: Nightmare225
I'll still be happy with my 7950GX2, right? I heard it's a great card, despite all of this HDR+AA talk...


Sure, the GX2 IS a great card, as long as you only want raw performance and don't care about image quality; if you do, then go ATI.


He's only missing HDR and AA together. If he runs that card in HQ, and he's gone through with Coolbits on and disabled the optimizations in the CP, then IQ is very, very similar in most games.

I have to admit though, I haven't seen anything like ATi's HQ AF. I had 16xHQAF on the other day in HL2 (a game which I think likes its filtering a lot) and it was just amazing how sharp things were, even at distance. QAAA seems to be doing its bit very well too, but I think NV's TSAA has the edge there.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
I'll tell you, what nvidia needs to do is make the signal they send to the monitor a higher quality. The first thing I noticed about the XTX is that I could see new details everywhere, a much sharper picture... I couldn't believe that my $599 (new) BFG 7800 GTX had some cheaper/inferior device sending that signal... which frankly still pisses me off to high hell, and is why I'm not sure if I'll go back to nvidia soon, because there is no way to measure this by pictures so I can see whether they have fixed it.

I can see stuff in games, 2D pictures, even the lettering in my BIOS; I can see each pixel (that's how I know it isn't software) in a way that wasn't apparent before. I didn't buy a 600 dollar card and a 600 dollar 20.1 inch LCD to have some cheap-ass signal quality degrading the picture. The idea of it is offensive to me given the premium I've paid, and I find it complete bullshit because I think it's nvidia just trying to maximize profit, where a little more "love/pride" seems to go into ATI designs. (hardware anyway)

I know someone else here observed what I'm talking about and I think he said of installing the ATI card after the nvidia one "it was somewhat like getting a new LCD panel."


Anyway, aside from that I still think the GX2 is a good bet. I do wonder how it would compare if image quality settings were identical to the XTX (like that legit reviews article where a 700mhz/1800mhz 7900GTX gets trounced by the XTX once image Q settings are equalized). If you add about 10-15% to the XT score in anand's 7950 review you have the XTX equaling the GX2 in most games. (but you can always OC a GX2 ;) ) That and the fact that nvidia loses a fair amount of performance at HQ +

But unless G80 is the shizzlenit compared to r600 or I can get some kind of confirmation that Nvidia has improved their signal quality I won't be buying from them again.

As far as HQAF and TAA/AAA, I loved HQAF, TRAA is pretty good too, haven't really used AAA yet so who knows
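The "add about 10-15% to the XT score" adjustment above is just a multiplicative estimate, so it is easy to apply to any result from the review. A minimal sketch of that back-of-the-envelope math, using a made-up 40 fps XT score as the input rather than any figure from the article:

```python
# Hypothetical example of the "add about 10-15% to the XT score" estimate.
# The 40 fps input is made up; substitute any XT result from the review.
xt_score = 40.0
xtx_low = xt_score * 1.10   # +10%
xtx_high = xt_score * 1.15  # +15%
print(f"Estimated XTX range: {xtx_low:.1f} - {xtx_high:.1f} fps")
```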

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: josh6079
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:

Click

Now, here is what happened when I found those deer :) :

Click

I flubbed up and accidentally took out my torch :eek: but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.

4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync+Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-Texture mod, Natural Environment mod (many other mods, but none else that would affect frame performance, i.e. weapons and armor and such)

Did........um.......no one see.......those................clips..............

My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.

I did, but I couldn't actually see anything because it was too dark, thus I didn't comment :confused:

But if the point is that it's really smooth, well we already knew that :p
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: josh6079
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:

Click

Now, here is what happened when I found those deer :) :

Click

I flubbed up and accidentally took out my torch :eek: but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.

4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync+Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-Texture mod, Natural Environment mod (many other mods, but none else that would affect frame performance, i.e. weapons and armor and such)

Did........um.......no one see.......those................clips..............

My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.


Hey is that "Lord of the Rings" music? How'd you get that?
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal
I'll tell you, what nvidia needs to do is make the signal they send to the monitor a higher quality. The first thing I noticed about the XTX is that I could see new details everywhere, a much sharper picture... I couldn't believe that my $599 (new) BFG 7800 GTX had some cheaper/inferior device sending that signal... which frankly still pisses me off to high hell, and is why I'm not sure if I'll go back to nvidia soon, because there is no way to measure this by pictures so I can see whether they have fixed it.

I can see stuff in games, 2D pictures, even the lettering in my BIOS; I can see each pixel (that's how I know it isn't software) in a way that wasn't apparent before. I didn't buy a 600 dollar card and a 600 dollar 20.1 inch LCD to have some cheap-ass signal quality degrading the picture. The idea of it is offensive to me given the premium I've paid, and I find it complete bullshit because I think it's nvidia just trying to maximize profit, where a little more "love/pride" seems to go into ATI designs. (hardware anyway)

I know someone else here observed what I'm talking about and I think he said of installing the ATI card after the nvidia one "it was somewhat like getting a new LCD panel."


Anyway, aside from that I still think the GX2 is a good bet. I do wonder how it would compare if image quality settings were identical to the XTX (like that legit reviews article where a 700mhz/1800mhz 7900GTX gets trounced by the XTX once image Q settings are equalized). If you add about 10-15% to the XT score in anand's 7950 review you have the XTX equaling the GX2 in most games. (but you can always OC a GX2 ;) ) That and the fact that nvidia loses a fair amount of performance at HQ +

But unless G80 is the shizzlenit compared to r600 or I can get some kind of confirmation that Nvidia has improved their signal quality I won't be buying from them again.

As far as HQAF and TAA/AAA, I loved HQAF, TRAA is pretty good too, haven't really used AAA yet so who knows

I can't tell the difference between the integrated ATI graphics (Xpress 200) on my mobo and my 7600GT in terms of signal quality.

And why haven't you used AAA yet!??!?!?! :Q It should give a big boost to IQ, especially in games with lots of vegetation like Oblivion or BF2 (no more aliased leaves), and your card is more than fast enough to handle it :)
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: Wreckage

You quoted an article that said there was not much difference between a $630 CPU and a $800 CPU. :roll:

So you don't think that a CPU affects gaming performance?

You are sooooo deep in denial, I guess that your imagination must get pretty good frame rates.
http://www.techreport.com/reviews/2006q3/core2/index.x?pg=5

Yeah, at the time, the FX-60 was the top end AMD CPU, and about $1000. Again, you do not need a $1000 CPU to push a single card.

I never said CPU speed doesn't affect game performance. Do not put words in my mouth. What I said was that a single card does not "require" a $1000 CPU, as you claimed.

Also, your link supports me, not you. Not sure why you even posted it. It is also at a very CPU-bound res. As I said, anything close to high res will see next to no difference. A $400 A64 will run Oblivion at virtually the same speed as an FX-62 at 1280x1024 or higher. A CPU costing much, much less than $1000 holds its own against much more expensive, over-$1000 CPUs. So thanks for supporting me and proving yourself wrong.
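For what it's worth, the scaling described above is easy to sanity-check: a roughly 44% CPU clock increase moving the X1900XT only about 11% (and the X1800XT not at all) is the classic signature of a GPU-bound test. A minimal sketch of that arithmetic, assuming the quoted figures are accurate:

```python
# Sanity check on the CPU-scaling figures quoted above (values as cited;
# treat them as approximate).
cpu_low, cpu_high = 1.8, 2.6          # GHz, A64 clock range
x1900_low, x1900_high = 28.8, 32.1    # X1900XT fps at the two clocks
x1800_low, x1800_high = 25.0, 25.0    # X1800XT fps: flat across the range

print(f"CPU clock: +{cpu_high / cpu_low - 1:.0%}")      # ~ +44%
print(f"X1900XT:   +{x1900_high / x1900_low - 1:.1%}")  # ~ +11.5% -> mostly GPU-bound
print(f"X1800XT:   +{x1800_high / x1800_low - 1:.1%}")  # +0.0%    -> fully GPU-bound
```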

Originally posted by: Wreckage

Which I never disputed. I was referring to single card performance. Although someone with a 1900 said it was unplayable for him and was ridiculed for it. No, no bias at all.

Actually, he never even played Oblivion, and was just speculating. More of your misinformation refuted.

Originally posted by: schneiderguy
Originally posted by: Ackmed
Originally posted by: Wreckage
You can use Transparency AA and SLI AA for superior image quality.

Have you even looked at SLI AA benchmarks? Let alone with TRAA added on top of it? It was far from playable for me, and ATi's SuperAA is much, much faster.

that's because SLI AA is better quality :confused:

Wrong. It's because of the way NV supplies SLI AA. At least have some facts to support your claims. FYI, just because it's 16x doesn't mean it's better than 14x.

For those who can't seem to admit it, ATi can run HDR+AA in Oblivion with "budget cards", at very playable settings. Get over it, and move on.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Ackmed, don't worry about it. You're not going to convince him and that's okay. If he wants to believe that it isn't possible, let him.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: schneiderguy
Originally posted by: Frackal
I'll tell you, what nvidia needs to do is make the signal they send to the monitor a higher quality. The first thing I noticed about the XTX is that I could see new details everywhere, a much sharper picture... I couldn't believe that my $599 (new) BFG 7800 GTX had some cheaper/inferior device sending that signal... which frankly still pisses me off to high hell, and is why I'm not sure if I'll go back to nvidia soon, because there is no way to measure this by pictures so I can see whether they have fixed it.

I can see stuff in games, 2D pictures, even the lettering in my BIOS; I can see each pixel (that's how I know it isn't software) in a way that wasn't apparent before. I didn't buy a 600 dollar card and a 600 dollar 20.1 inch LCD to have some cheap-ass signal quality degrading the picture. The idea of it is offensive to me given the premium I've paid, and I find it complete bullshit because I think it's nvidia just trying to maximize profit, where a little more "love/pride" seems to go into ATI designs. (hardware anyway)

I know someone else here observed what I'm talking about and I think he said of installing the ATI card after the nvidia one "it was somewhat like getting a new LCD panel."


Anyway, aside from that I still think the GX2 is a good bet. I do wonder how it would compare if image quality settings were identical to the XTX (like that legit reviews article where a 700mhz/1800mhz 7900GTX gets trounced by the XTX once image Q settings are equalized). If you add about 10-15% to the XT score in anand's 7950 review you have the XTX equaling the GX2 in most games. (but you can always OC a GX2 ;) ) That and the fact that nvidia loses a fair amount of performance at HQ +

But unless G80 is the shizzlenit compared to r600 or I can get some kind of confirmation that Nvidia has improved their signal quality I won't be buying from them again.

As far as HQAF and TAA/AAA, I loved HQAF, TRAA is pretty good too, haven't really used AAA yet so who knows

I can't tell the difference between the integrated ATI graphics (Xpress 200) on my mobo and my 7600GT in terms of signal quality.

And why haven't you used AAA yet!??!?!?! :Q It should give a big boost to IQ, especially in games with lots of vegetation like Oblivion or BF2 (no more aliased leaves), and your card is more than fast enough to handle it :)


It could depend on the monitor... its potential and/or maybe compatibility. Which monitor are you using?

I can't get AAA to work in BF2 for some reason; I think BF2 doesn't "do" FSAA or whatnot, I read about it.

Yeah, the XTX will probably handle AAA in Oblivion, but I'm not sure it will with all these visual mods + 4xAA + HQ16xAF + 1680x1050 + HDR + the 2048x2048 Quarl texture mod (more than 512MB of vid mem).

I might try 2xAA + AAA or something though, good idea
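As a rough illustration of why a 2048x2048 texture pack plus 4xAA and HDR at 1680x1050 can spill past 512MB, here is a minimal sketch of the texture-memory arithmetic; the compression format and texture counts are assumptions for illustration, not figures from the mod:

```python
# Rough texture-memory arithmetic for a 2048x2048 replacement pack.
# Compression format and texture counts below are assumptions for illustration.

def texture_mb(width, height, bytes_per_texel, mipmaps=True):
    """Approximate size of one texture in MiB, including a full mip chain."""
    base = width * height * bytes_per_texel
    return (base * 4 / 3 if mipmaps else base) / (1024 * 1024)

rgba8 = texture_mb(2048, 2048, 4)  # uncompressed 32-bit: ~21 MiB each
dxt5 = texture_mb(2048, 2048, 1)   # DXT5 (1 byte/texel): ~5.3 MiB each

print(f"2048x2048 RGBA8 + mips: ~{rgba8:.1f} MiB per texture")
print(f"2048x2048 DXT5  + mips: ~{dxt5:.1f} MiB per texture")
# Even compressed, only ~96 such textures fit in 512 MiB, before counting the
# frame, depth, and 4xAA buffers at 1680x1050 with HDR.
print(f"DXT5 textures fitting in 512 MiB: ~{int(512 // dxt5)}")
```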


 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
Originally posted by: Wreckage
Originally posted by: thilan29
Why do you even bother Wreckage?? Troll and derail is the order of the day huh??
Instead of valid responses, you just cast more insults.

I have posted my views, with valid links and facts to back them up. I do not expect certain members of this forum to ever agree with me no matter what the reality is. If it were not the same handful of one-sided individuals I would feel the need to post more links and facts. But I know nothing will change that particular group's mindset.

So continue to call names, accuse me of working for someone, or just outright lie. This is just a hobby for me, while others appear to have more at stake than that.

My last two posts have been valid responses to your posts. You cut out the rest of my post, quote my last sentence, and call it invalid.

As I stated before, just in the Techreport link YOU provided, the games are being benched at 1024x768 and 800x600, so of course a difference in CPU shows up. Did you even bother to read my post??

What's being discussed here is resolutions of 1280x1024 and higher, or have you forgotten about the topic already? You somehow tried to derail this into a thread about CPUs.

I accused you of trolling and derailing the thread topic (which you have done already); I never said you were affiliated with anyone. And nice try with the BS of taking some non-existent moral high ground. Everyone knows what you came in here to do. As I've said before...keep posting the BS; there will always be people to refute any BS you post.

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Ackmed

Wrong. It's because of the way NV supplies SLI AA. At least have some facts to support your claims. FYI, just because it's 16x doesn't mean it's better than 14x.

AFAIK SLI AA is supersampled; Crossfire AA, or whatever it's called, isn't. That explains the performance difference + IQ difference between the two.

 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: Frackal

It could depend on the monitor... its potential and/or maybe compatibility. Which monitor are you using?

I'm using an HP f1905 19-inch LCD, 1280x1024 native resolution.

AAA should work in BF2; TRAA works fine for me with my 7600GT in BF2.

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Wreckage
Originally posted by: Ackmed
Originally posted by: Wreckage


I was pointing out that the results for single card performance required a $1000+ CPU. Something everyone seemed to overlook. If you don't think that gaming performance is also tied to the CPU then you are either a liar or don't know enough about computers to be posting here.

They do not require a $1000 CPU.

With two of the cards in the FS article, the one you claim shows that you need a $1000 CPU to push a single card, let's look at some numbers.

AT shows that an X1800XT gets the same 25fps with a 1.8GHz, 2.0GHz, 2.2GHz, 2.4GHz, and a 2.6GHz A64. The X1900XT goes from 28.8 up to 32.1 at the same speeds. What does that tell you? That a $1000 CPU is not "required" for a single card, as you claimed. And that's even using the game in question, Oblivion, and two cards from the FS article. Other parts of the game do better with more MHz, but not nearly enough to "require" a $1000 CPU.

Not enough proof? Let's look at another.

We found that sometimes even the AMD Athlon 64 3800+ will bottleneck your video card. In most of our gameplay testing though, the AMD Athlon 64 3800+ did not bottleneck our video cards at all allowing our GPUs to reach their full potential. There were literally no real gameplay advantages between an Athlon FX-60 and Athlon X2 4800+ in our testing.

Case closed. You do not need a $1000 CPU to push a single card, as you claimed. Speaking of "not knowing enough to post here"... your words, not mine.

You quoted an article that said there was not much difference between a $630 CPU and a $800 CPU. :roll:

So you don't think that a CPU affects gaming performance?

You are sooooo deep in denial, I guess that your imagination must get pretty good frame rates.
http://www.techreport.com/reviews/2006q3/core2/index.x?pg=5

Wreckage, come on man, we're talking about gaming at 12x10 with AA & AF on a single X1900 XT, High Settings, not 8x6 Medium Quality with only HDR. Obviously the second situation will stress the CPU subsystem more than the GPU.

The point Ackmed is trying to make is that for 12x10 with AA & AF the CPU is going to have a relatively low impact on Oblivion; any dual-core Athlon 64 X2 or Core 2 Duo will be sufficient.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: schneiderguy
Originally posted by: Frackal

It could depend on the monitor... its potential and/or maybe compatibility. Which monitor are you using?

I'm using an HP f1905 19-inch LCD, 1280x1024 native resolution.

AAA should work in BF2; TRAA works fine for me with my 7600GT in BF2.


Sort of. Crossfire 14x AA and 10xAA have 2xSSAA thrown in.
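A rough way to see why the speed gap between the two dual-GPU AA modes does not by itself prove a quality difference is to lay out the sample arithmetic. The mode breakdowns below are assumptions for illustration, consistent with the description above, not official vendor specifications:

```python
# Back-of-the-envelope sample counts for dual-GPU AA modes: each GPU renders
# its own pattern and the two frames are blended. Mode compositions here are
# assumptions for illustration, not vendor specifications.

def blended_samples(per_gpu_msaa, extra_ss=0, gpus=2):
    """Approximate color samples per pixel after blending both GPUs' frames."""
    return gpus * per_gpu_msaa + extra_ss

print("SuperAA 10x:", blended_samples(4, extra_ss=2))  # 2 x 4x MSAA + 2x SS -> 10
print("SuperAA 14x:", blended_samples(6, extra_ss=2))  # 2 x 6x MSAA + 2x SS -> 14
print("SLI AA 8x:  ", blended_samples(4))               # 2 x 4x MSAA         -> 8
print("SLI AA 16x: ", blended_samples(8))               # 2 x 8x per GPU      -> 16

# If a mode's per-GPU pattern includes a supersampled component, its fill-rate
# cost rises with resolution much faster than pure MSAA, which would explain a
# large speed difference independently of which pattern looks better.
```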
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: schneiderguy
Originally posted by: Ackmed

Wrong. It's because of the way NV supplies SLI AA. At least have some facts to support your claims. FYI, just because it's 16x doesn't mean it's better than 14x.

AFAIK SLI AA is supersampled; Crossfire AA, or whatever it's called, isn't. That explains the performance difference + IQ difference between the two.

They look virtually the same to me. I have used both personally, have you? Even reviews that compare them show they look to be about the same. In fact, some even give ATi the edge.

http://www.firingsquad.com/hardware/ati_super_aa_vs_nvidia_sli_aa/page3.asp

There were a few more I remember, but I couldn't find them via a quick search. I think Xbit did a comparison several months ago? I forget.

But that doesn't even matter. You claimed SLI AA is slower due to it having better quality. Even if SLI AA does have better IQ, that's not the reason. The reason is the way NV does it. SLI AA was not intended when they put out the card. After ATi announced it, NV did the same, and actually had it available through drivers before ATi did. It's good that NV gives the option for it, but it's much, much slower than Super AA.

But the point I was responding to was, Wreckage said, "use Transparency AA and SLI AA for superior image quality". Use? Sure. Playable? Doubtful. I couldn't even get BF2 to be playable with 8x SLI AA, and TRAA made it crawl. Granted that was with 7800GTX SLI.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
I tried AAA Quality Mode in BF2 (other settings 1680x1050, 4xAA, 16xHQAF w/ Full Trilinear, all BF2 settings High, Cat AI High, Mipmap/AF on High, 690/880 (really hot day) clocks for XTX)

I still get over 110fps average, I'm really impressed with this card. That's why I've been so pro X1900XTX lately. When I bought the GTX, it constantly underperformed IMO, at least until about 8 months later when drivers gradually managed to improve performance. The 78/79 series takes way too big a hit with AA and has far too crappy minimum frames... The XTX has better image quality, takes very little hit with AA apparently (especially with high mem OC), and has great minimum frames as well that give a very smooth experience. Sorry, but IMO having used both cards (although 1st was a 7800GTX) the XTX is way better this round in almost every category.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Frackal
I'll tell you, what nvidia needs to do is make the signal they send to the monitor a higher quality. The first thing I noticed about the XTX is that I could see new details everywhere, a much sharper picture... I couldn't believe that my $599 (new) BFG 7800 GTX had some cheaper/inferior device sending that signal... which frankly still pisses me off to high hell, and is why I'm not sure if I'll go back to nvidia soon, because there is no way to measure this by pictures so I can see whether they have fixed it.

I can see stuff in games, 2D pictures, even the lettering in my BIOS; I can see each pixel (that's how I know it isn't software) in a way that wasn't apparent before. I didn't buy a 600 dollar card and a 600 dollar 20.1 inch LCD to have some cheap-ass signal quality degrading the picture. The idea of it is offensive to me given the premium I've paid, and I find it complete bullshit because I think it's nvidia just trying to maximize profit, where a little more "love/pride" seems to go into ATI designs. (hardware anyway)

I know someone else here observed what I'm talking about and I think he said of installing the ATI card after the nvidia one "it was somewhat like getting a new LCD panel."

...

But unless G80 is the shizzlenit compared to r600 or I can get some kind of confirmation that Nvidia has improved their signal quality I won't be buying from them again.
Signal quality affects analog connections, not digital connections.

video card (digital) -> DVI-D cable (digital) -> LCD monitor (digital)

I see two possibilities for what you experienced. Either you screwed up and used a VGA or DVI-I cable, making the video card's DAC (digital to analog converter) a factor, or you enabled digital vibrance, which makes everything more colorful and yet muddy at the same time by shifting color into a higher but smaller range.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Neither of those is valid in this case.


The same cable from monitor to video card (it's DVI) is being used in both cases

And I am familiar with digital vibrance / saturation, and it's not the case either.


How exactly is it that the ATI card gives a sharper picture in my BIOS (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance"?

Read the post man.

I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.